- Movin' on Up
- New Blog Link
- OH MY GOD!!!!! IT'S THE SWINE FLU!!!!!!!
- Some Definitions (And Maybe Even Some Practical Examples)
- It's Hard to Stop a Moving Train: A Primer on Inertia in Modern Medicine
- Spam Blog
- Winding Down
- If Only it Were a Joke
- A Quick Run Down of Why the Economy is Performing So Terribly
- Stimulus for the Soul
- And So it Begins......
- Politics and Human Nature: Sometimes It Doesn't Pay to Take Over Completely
- When Morality Meets Scarcity in Medicine
- Mandated Health Insurance Isn't a Capitalist Solution (Or a Solution At All for That Matter)
- Duty Hours/Regulation/The IOM/The Cost/The Logic/AHHHHHHHH?
- A Note From the Interview Trail
- It Passed
- A Little Bit More on the State of the Financial System
- An Economy in Crisis
- Every Centralized Idea to Fix Healthcare Will Fail, and Here's Why.
- Medical Malpractice is a Symptom: And We All Know That You Can't Cure a Disease By Treating Its Symptoms
- A quick thought
- Marginal Utility Becomes Mainstream (The Right and the Wrong Way to Do It)
- How This is Going to Happen
I'm just letting everyone know that I haven't stopped blogging. I am in the process of completing a move across the country, so I'm a little slow. Check back in a week or two at the latest for more.
As some of you may have noticed, I am a little bit stingy with handing out links on my blog. After recently speaking with a blog author, however, I think that I've found a good one to add to the list. The blog is entitled "Are You A Doctor?" It is written by a physician assistant who works in the ED at the Mayo Clinic in Rochester, MN. We agree on a number of things, though healthcare policy unfortunately seems not to be one of them. Still, I respect his perspective. He is intimately familiar with some of the recent developments in US healthcare policy, and his position at the Mayo Clinic gives him first-hand knowledge as to how some of the ideas are being handled by the country's healthcare bigwigs. Plus, it's always good to have a spy at the mother ship. All joking aside, I hope you check him out. You can find the link to his blog in my sidebar with the others.
I have been reading so much media hype over the swine flu that my head is beginning to hurt. For my readers who have been living under a rock, there is apparently a horrible flu in Mexico that has killed a few people while simultaneously sending the entire world into a frenzy. Infectious disease is certainly not my forte, but I can explain a little bit about influenza. I can also explain a little bit about my personal opinion on this disease. What lay people consider the flu is actually a variety of similar ailments. The most common causes are a couple of different types of influenza virus. Similar symptoms may come from the parainfluenza virus, as well as a couple of other outliers. The only "flu" that is actually the flu comes from influenza virus. The particular virus causing the swine flu is of the Influenza A type. This type of virus may express a variety of different antigens. The antigens to which antibodies are commonly made are generally known as H and N antigens. The variation within these antigens is why there are so many different strains of flu and why, unlike say the chicken pox, you can get infected with the flu virus repeatedly. Different influenza viruses can exchange genetic material with each other (the influenza genome is segmented RNA, and the segments can reassort) to produce genetically distinct daughter strains and promote variation. Humans are not the only species that can be infected by the flu. Strains of the flu can affect other animals, including species of birds and pigs. Generally, bird flu and pig or swine flu are not infectious to humans. However, a flu that can afflict these species may exchange some genetic material with human flu. Occasionally, a flu that is infectious to another species may, either by spontaneous mutation or by exchange of genetic material with a human virus, become infectious to humans. Most of the time, this infection is isolated to a specific person or group of people who have contact with the infected animal or animals.
However, there are cases where this new infection will further mutate to allow transmission from human to human. This introduces a bunch of new genetic material into the human influenza pool. Humans usually walk around with some immunity to the flu. We have a part of our immune system that fights viruses in general, but most humans have been exposed to flu before, and thus most people walk around with additional immunity. The different strains of influenza are distinct enough that they can re-infect the same host. However, they are similar enough that a previously exposed host can usually mount a partial response to the infection. We could think of this as damage control. You're sick with the flu, but your immune system can hold it at bay with relative ease due to its similarity to other strains that you have been exposed to. You are also part of the way toward developing immunity to the new flu, so the response is quicker. In cases where there is exchange of genetic material from an animal to a human flu (or worse yet, when an entirely new flu virus crosses over from animal to human), people will lack this immunity. This gives the infected person fewer resources to hold the disease at bay up front, and it takes longer to develop a specific response to the virus. The VAST majority of the time, this makes for a bad, long-lasting flu. It is usually NOT fatal. However, occasionally an extremely virulent strain will cross over, and the previously unexposed human population will have very few big guns to stop it up front. Major flu pandemics (such as the flu of 1918) can be the worst-case-scenario result of this sort of event. There are a few more facts to keep in mind about the flu. Common everyday influenza is already the bane of the very young and old, the immunocompromised, and the sickly. Every year, 30,000 people will die in the US from the common everyday flu.
Most of these people will fall into one of the above categories, though there will be some who end up susceptible in the not-obviously-sick adult population. To put it into perspective, the flu kills as many people every year as several World Trade Center attacks. The fear and panic over the swine flu seems to come in large part from the fact that a number of young people have died in Mexico, and the strain seems to be a swine strain of the flu that has crossed over to humans and mutated so as to be transmissible from human to human. I agree that this little setup is exactly how most pandemic flu starts. However, there are some things which make me seriously doubt the current degree of crisis, and more importantly, make me doubt whether this will really turn into a global pandemic spreading death in its wake. The official number of deaths in Mexico currently sits at ~7 per the WHO, with estimates of up to 200 actual victims. There are over 27 known cases of flu in the country, but we all know that the majority of cases probably never sought medical attention or made an official statistic. This 27 number is only laboratory-confirmed cases. These are people who made it to the doctor, had significant symptoms, were suspected, were tested, and came back positive. We have no way to assess these numbers. Were these really healthy people? Is the number really 7 or 200? Is it possible that there is a group of immunocompromised patients that got hit with the flu? The flu has currently spread, per the WHO website yesterday, to 9 different countries. The CDC this morning has confirmed 109 cases of swine flu in the US, with one solitary death. This death, though unfortunate, was in a toddler in Texas, at the age where children may be susceptible to death from any flu. No one outside of the US and Mexico has died of this flu. The fact that no one is dying of this flu outside of Mexico alone makes it suspect. Are we dealing with two different strains of swine flu?
Is there some other contributing public health disaster in Mexico to which we are not privy? Are the official deaths all immunocompetent hosts? Lastly, this whole thing is essentially starting after flu season. The flu just doesn't generally reach peak virulence when it starts this late. Here's what we should do (if someone were to ask me, of course). We should work on a vaccine for this particular strain to offer to individuals who are likely to be susceptible before next flu season. This flu very well may attack at the beginning of the next flu season. We should also make sure that there are sufficient stockpiles of antiviral drugs to address the possibility of a bigger problem. We should really only use these when people are very sick or start dying in real numbers. The last thing we want to do is breed resistance in a virus that seems to be doing minimal damage in our country now, only to find our antivirals useless if it becomes more virulent later. We should also watch the virus and its spread. A sudden rise in flu-related deaths should necessitate further investigation. Here's what we should NOT do. There is zero reason to panic at this point. We should not be stockpiling Tamiflu and N95 masks. We should not be cancelling events. We should not be living in fear. Every few years, we panic over a new strain of flu (remember the whole avian flu debacle). These sorts of things happen all the time, and it would really be ridiculous if we all shut down every couple of years over the small possibility that a flu could become a pandemic. Most flu pandemics come from the crossing over of flu from species to species, but most crossing over doesn't result in a pandemic. As I have always maintained on this blog, one of the few legitimate roles of government in healthcare is the control of infectious disease. It is appropriate for them to watch this. Remember, however, that the vast majority of infected people have probably never been tested or even sought medical attention.
The death rate among known cases (which are probably the worst cases) is still less than 1%. Let's keep this in perspective. Right now, the rate of death from this swine flu in the US really isn't any higher than the rate from regular flu (something we do not panic about). It is smart to stay vigilant, but we cannot panic over every potential problem, because the possibilities are endless. If I see one more person walking around in a surgical mask (a mask that probably doesn't protect the wearer) here in my state, where we have no cases of the swine flu, I may lose it. The panic is ridiculous. Could a swine flu pandemic occur? Sure. So could a nuclear war with China. At this point, I don't see a reason to panic about either of them.
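For what it's worth, that back-of-the-envelope rate is easy to check. Here is a quick sketch using the figures reported in this post (1 US death among 109 confirmed cases; these are the post's numbers from April 2009, not authoritative epidemiology):

```python
# Crude case-fatality rate from the post's reported US figures.
# Confirmed cases undercount true infections, so the real rate is lower still.
us_deaths = 1
us_confirmed_cases = 109

crude_cfr = us_deaths / us_confirmed_cases * 100  # as a percentage

print(f"Crude case-fatality rate among confirmed US cases: {crude_cfr:.1f}%")
```

Since laboratory-confirmed cases are only the people sick enough to get tested, the true denominator is far larger and the true rate far smaller, which is the point the paragraph above is making.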
In case it isn't obvious, I've had a little bit more time to post lately. As I wind down my medical education, my responsibility is approaching zero. I don't start residency orientation until June, so I've really encountered an unprecedented amount of time off. With some of the responses to my posts in the past, I realize that there is some distinct confusion as to the meaning of words related to politics and economics. Consider this post a bit of a dictionary for economics, government, and healthcare. I've got the time to clear it all up, so why not.

Economic Schools and Political Terms

Capitalism- Private individuals own the ways and means of production. Property is all or largely private, with the individual owners having sovereignty over the use of what they own. There is zero central directing of production or the use of resources. Free individuals trade without any central interference with the rules or prices, with all of these things being set on a case-by-case basis by uncoerced, mutually agreed-upon contracts between individuals. There is a lot of capitalism in the US system, and it is not dependent on large corporations or well-connected business entities. Every time you walk into a store and buy something that the owner decided to sell without being coerced, you are engaged in capitalism. There are no good examples of capitalism in modern healthcare.

Fascism- Individuals own the ways and means of production, but the government controls what they can and cannot do. Usually, companies are directed to engage in activities for the "social good." If a factory owner is ordered by the government to build tanks, but the government then reimburses him and lets him keep the money, that would be fascist. In medicine, the best example of a system that is de facto fascist is the Canadian healthcare system.
Individuals (private physicians) own the ways and means of production, but the government controls what they must produce and how much they get paid based on the "social good."

Socialism/Communism- Socialism and communism are really two peas in a pod, with one simply being further along the spectrum than the other. In a socialist system, the government owns some of the ways and means of production, whereas in a communist system, the government usually owns all of the ways and means of production (though it supposedly does so on behalf of the workers). The British healthcare system is a form of socialism, in which the government owns the healthcare system but private competition is allowed. The Cuban healthcare system is a communist system, in which the government owns the healthcare system and private competition is not allowed. These systems often rely on strong unions as representatives of the workers' collective interest.

Austrian School- The Austrian School of Economics is a purely libertarian school of economic thought. Though there have been a number of famous economists who subscribe to this school, the most famous is Ludwig von Mises, an Austrian Jew who fled Hitler to the US during WWII. This is the only school that supports completely unhampered free-market capitalism. Austrian economists tolerate zero government involvement in the economy. This school subscribes to the theory that peaceful trade between willing participants, with no coercion from the outside, is the way to prosperity. Austrian economists tend to be extremely fond of open borders, no tariffs, and private production of everything. They vehemently oppose the existence of a central bank. The Austrian School blames the Federal Reserve for the business cycle and would like to see the bank dismantled and replaced with a gold standard.

Chicago School- The Chicago School has a lot in common with the Austrian School.
It was named after the group of economists who founded it, who were largely based out of Chicago's universities in the mid-20th century. The Chicago School supports an economy mostly based on capitalism and free trade. They do differ in the sense that they are much more tolerant of a central bank, and there is no striking desire to return to a gold standard. The Chicago School also tends to be more tolerant of collectivization in both bargaining and production. You could still probably call the Chicago School capitalist, but it isn't as pure in that respect as the Austrian School. Probably the most notable Chicago economist is Milton Friedman.

Keynesian School- This school is based on the writings of John Maynard Keynes. Keynes was a somewhat flamboyant academic economist who lost a significant amount of money in the stock market crash of 1929. Keynes believed that the business cycle (the boom-bust cycle) was caused by an inappropriate supply of money. The Keynesian system is completely dependent on a central bank. Keynesian economists believe that a recession is caused by too little money in the economy, often due to hoarding of capital. It is then the central bank's responsibility to increase the money supply during a recession (lower interest rates, print money, etc.). On the flip side, inflation is caused by too much money supply according to Keynesians, so the bank must decrease the money supply during a period of inflation. This is the general system by which most modern economies operate, though there is some pressure to change it. This is NOT a capitalist system, though it clearly relies on some elements of capitalism to function. It is sort of like capitalism blanketed on top of central control. Individuals own the ways and means of production, but all trade goes through the filter of a very strictly controlled money supply. When everyone keeps talking about how capitalism has failed, this is really the system that is in place.
Marxian School (Communist Theory)- This economic school is based on the writings of Karl Marx. Marx believed that the ways and means of production had been co-opted by a few wealthy individuals who were taking advantage of everyone else. He believed that the way to economic prosperity (and the natural course of man's evolution) was a world where everything was collectivized and owned by the workers who worked within it. There is no real free trade under Marxian theory. Prices are set at the "appropriate" price. All businesses are owned by the collective of the workers who work within the business. There are no entrepreneurs, and there is no private ownership of property or business. There is no real reason, from a purely economic perspective, that a strong government would have to be involved in Marx's theory, though all attempts at creating it on a national scale have produced exactly that.

The US Healthcare System

The US healthcare system is a strange hybrid of a number of different systems. It is NOT a capitalist system, though it does incorporate some capitalist elements. It is capitalist in the sense that people can pay cash for services, and that a lot of contracting with insurance companies is private and uncoerced. However, all of these entities are heavily regulated. Insurance companies are private but heavily regulated and directed by the government. This makes the insurance system fascist. Medicare/Medicaid is really a socialist system. These are owned by the government. The fact that they contract with outside firms for both the distribution and occasionally the regulation of healthcare places them a bit in the fascist category as well. The government owns Medicare, funds it with tax dollars, provides it as a regulated benefit for whom it sees fit, but then uses the money as a way to force providers who take those funds to adhere to a plethora of government regulations. The government also directs, controls, and largely funds medical training.
I will repeat: it is NOT a capitalist system.

Medical Training Terms

The Path to Becoming a Licensed Physician- In the US, the most common path to practicing medicine after high school is as follows: College (4 yrs) --> Medical School (4 yrs) --> Residency/Fellowship (3 to 10 yrs). There are some variations on this path, with some people completing differing numbers of years in college or medical school (though all medical school is AT LEAST 4 years).

College- There is no requirement as to what someone must major in to become a physician. As a general rule, applicants to medical school must have completed one year of general chemistry, one year of organic chemistry, one year of biology, and one year of physics. Common extra requirements that vary between medical schools are a course in biochemistry, calculus, or some amount of English or literature. Students at the completion of college have no medical training, but they should have the scientific background that allows medical training to make sense. Those who wish to apply to medical school must complete an exam called the MCAT. In general, the applicant pool is reasonably self-selecting, and from year to year, 1/3 to 2/3 of self-selected applicants will fail to get into medical school.

Medical School- To be accredited, a medical school must offer 2 years of basic and medical science and 2 years of clinical training. Some schools offer or require additional years of research. The first two years are largely classroom based. Different schools will also teach clinical skills in a variety of different ways during this time. The curriculum may be based on broad concepts (ex: courses in anatomy, physiology, pathophysiology, etc.), on specific organ systems (ex: cardiac system, renal system, etc.), or on a hybrid of the two. All of the same material must be covered, however it is presented.
After the first two years, a medical student must take, and pass, the first step of a three-part licensing exam called the USMLE. In the second two years, students work at the hospital. While curricula vary, ALL schools require some amount of medicine, pediatrics, surgery, psychiatry, obstetrics and gynecology, and general primary care or family medicine. Electives are also usually available to tailor the education to the needs of the student. Students then apply to residency through a process called the match. When a medical student graduates from medical school, he is given the title of Medical Doctor (or Doctor of Osteopathy at a limited number of institutions). This, along with the completion of the second step of the USMLE, gives him the right to practice medicine only under the supervision of a residency program.

Residency- All residents are physicians. Postgraduate training used to require an internship, but this internship has largely been incorporated into residency. The rules vary by state, but as a general rule, physicians may complete one to three years of residency and drop out to practice independently as a general practitioner (assuming that they pass the third part of the USMLE). No one ever does this anymore, for practical reasons. Upon completion of a residency (which is 3-7 years depending on the specialty), physicians become board eligible or board qualified in a specialty. At this point, the physician may practice independently as a specialist, though physicians increasingly need to complete a separate specialty board to qualify for compensation as a specialist. Residents work in hospitals and practice medicine under supervision. Depending on the program, they can prescribe medication, perform surgery, and complete documentation.

Fellowship- For those who want further training as a sub-specialist, fellowship training is also available. This generally takes 1-3 more years.
For example, if a general surgeon wants to become a cardiothoracic surgeon, he may train for 2 more years in a fellowship dedicated to teaching this type of surgery. The same would apply to an internist who wanted to become a nephrologist or a cardiologist.

USMLE- Each state licenses its own physicians, but all states now accept a single licensing exam called the USMLE. The USMLE is broken down into three parts (or steps), with the second part having 2 sub-parts:

Step I- This is an ~300-question exam taken at a computer center that covers all basic science as it pertains to medicine. The minimum passing score is changed every so often, but at this point in time, a score of 185 is required. This does NOT mean answering 185 questions correctly. The test is on a scale that no one really knows or understands. As of now, a score below 200 is passing but poor, a score of around 215-220 is average, and a score over 240 is really good. Residencies rely heavily on Step I to screen applicants. It is taken between the second and third years of medical school.

Step II- This test is broken down into two individual parts, CK (clinical knowledge) and CS (clinical skills). The CK portion is similar to Step I, though it runs ~350 questions, and those questions are more clinical. CS is administered at one of 5 testing centers in the entire country. Students have to interact with 12 different standardized patient actors, treat them in a manner deemed appropriate, come up with a differential diagnosis, and write a coherent note on each actor. This portion of the test is a bit subjective and is generally abhorred by the students required to take it. It was originally started as an exam for foreign medical graduates coming to the US to prove that they could speak English. These tests are completed some time in the fourth year of medical school.

Step III- This test is also similar to Steps I and II CK, per my understanding (I'm not quite at this one yet).
It tests patient management, and its completion is the last testing step toward medical licensure. If you've completed all of the steps and your state's residency requirements, you can be licensed as a doctor. I hope all of that helped. Feel free to ask any questions if you still have 'em.
As a senior medical student, I recently went through the match. The match is the process by which senior medical students (or anyone else seeking a residency) are accepted to residency positions. Most schools release a match list after the process is over. This allows everyone to see who is going where and which specialties are being pursued. A match list is more than just a list. It is largely a reflection of the interests, philosophies, priorities, and successes of a medical school class. It is also interesting to see how these things change over time. Specialties come in and out of vogue, priorities change, and what was once the pinnacle of achievement becomes a dumping ground for those who couldn't get into some specialty that was itself a dumping ground not too many years before. While these things wax and wane, it is clear within my own institution (and really among medical school graduates at large) that the last decade has ushered in a paradigm shift. The traditional medical school class is full of extreme type-A overachievers who have been at the top of everything forever. The traditional rank order list of the years of yore reflected a desire among the top applicants to continue this type of function. Once upon a time, and this was before any work hour restrictions or attention to resident health, students fought over the right to work 120-hour weeks in surgery pyramid programs or grueling schedules at large academic medical centers as internists. Those days are clearly gone. In those same years of yore there was a cadre of specialties which brought a more friendly lifestyle, sometimes without a significant pay cut. Specialties like Dermatology or Radiology were a way out for those who were done with the intense pace that characterizes life in the hospital. In the medical community at large, these jobs have often been considered secondary. I know an old internist who referred to essentially all radiology as "scut."
No one speaks like that anymore. In fact, the current shift has made these specialties the most attractive to the crème de la crème. The classic lifestyle specialties are known as the ROAD specialties (Radiology, Ophthalmology, Anesthesiology, and Dermatology). My class, which is graduating around 170 people, had over 20 people apply to radiology in the match. 5 more matched (with several not matching) into Dermatology. We had 15 match into anesthesia (which has also become more attractive to the lifestyle conscious). We had 9 match into Ophthalmology, with at least 2 more taking time off for research to apply more competitively next year. If you add it up, this puts 49 people (or nearly 30% of the class) in the four most lifestyle-friendly specialties. We have more people in all but one of the ROAD specialties than in General Surgery. We have more people going into these specialties than into internal medicine. The other divergence of members of the top of the class is to direct surgical specialties. It seems that in many cases, the broad training of general surgery is no longer required or desired. It really does make some sense. You can cut a year or more off of your training, or enter a field which lets you become a specialist in the same number of years it takes to train as a generalist. Applications to Neurosurgery, Orthopedics, ENT, and Urology are stable to way up. Direct Plastic Surgery and Direct Vascular Surgery are insanely competitive, with the former being the most difficult residency to attain in all of medicine. So why the paradigm shift? We could blame it all on our lazy generation, but I think that there's much more to it. This is still the extreme Type-A hardworking group at the top of the academic curve that it was in generations past, and I'm not sure that the true underlying ethic is extraordinarily different in this group.
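The class-wide arithmetic above tallies up quickly. Here is a sketch using the match numbers reported in this post (note that the radiology figure is applicants, counted here as 20, while the others are matches, per the post):

```python
# ROAD-specialty counts from a graduating class of ~170, per the post's figures.
road_counts = {
    "Radiology": 20,       # "over 20" applied in the match; counted as 20 here
    "Dermatology": 5,
    "Anesthesiology": 15,
    "Ophthalmology": 9,
}
class_size = 170

total = sum(road_counts.values())
share = total / class_size * 100

print(f"{total} of {class_size} graduates ({share:.0f}%) headed for ROAD specialties")
```

That works out to just under 30% of the class in the four classic lifestyle specialties, which is the figure quoted above.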
I've heard different explanations:

1. There are more women in medical school today, and women tend toward more lifestyle-friendly specialties. This is, however, nowhere remotely true in a universal sense (in fact, the fields with the fastest-growing percentages of women residents are the difficult surgical fields, and the field with the highest percentage of women is OB/GYN, which is hardly lifestyle friendly). This also doesn't explain why a higher percentage of men are going after lifestyle-friendly or direct specialty fields.

2. Medical school isn't as hard as it once was. This is probably true in the sense that the total number of hours required of a student is probably less than it once was. On the flip side, the amount of standardized testing and the overall knowledge requirement has probably increased. This is certainly not a slam dunk.

3. Medical students aren't socially conscious anymore. This one is almost hilarious. Students today flippantly bounce from one cause to the next, with what has to be record enrollment in every social justice group in existence. Medical missions and the "underserved" are the thing du jour. Students are so busy being socially conscious that it is sometimes hard to figure out when they have time to study medicine. Students in the 70s were probably far less likely to intentionally pursue something in order to fall on the grenade.

So you've probably guessed that I've got a theory. You're right. You may have also guessed that it boils down to simple economics. You're also right. At least, it boils down to a combination of respect and simple economics. Physicians (nearly all physicians) used to tower above their communities in a financial sense. From the ushering in of Medicare until the HMOs of the 90s, physicians were usually among the wealthiest in town. This was true of all specialties. Student loans were relatively low, and training time was MUCH shorter in many specialties than it is today.
Money, while important, was really much less of an issue. Everyone could do well. Today, through a combination of declining physician reimbursement and everyone else getting richer, the relative wealth of medicine is much less. This makes the financial factor more important in specialty selection. Students are often sitting on a student loan mortgage or two. My personal student loan payments will be approximately 5 times my mortgage payment, and still 3 times my mortgage payment if paid off over the same 30 years. Additionally, specialty physicians (while often of questionable added value compared to their generalist counterparts) are virtually always better compensated. Lifestyle specialties are usually much better compensated on an hour-to-hour basis. Today, there is an inverse relationship between pay and value. The most critical jobs, or at least the most logistically complicated important jobs (middle-of-the-night emergencies and such), pay much less than most of their elective outpatient counterparts and come with the added sucker punches of higher rates of being sued and an inability to select your patients. In the past, the difficult specialties like primary care or general medicine were respected. They are clearly not respected in the same way today, and they are clearly not respected above their counterparts in friendlier specialties at all. To be a generalist today is to be hit with a mountain of paperwork (with no reimbursement for its completion), lower reimbursement, higher rates of lawsuits, and generally poor public opinion. No wonder people are running from it. Reading CT scans from home at 10:00 AM is a heck of a lot less demanding than poring through dead bowel at 2:00 AM, and you get paid a lot more to read the scans. There will always be people who find some sort of inherent satisfaction outside of direct specialization or lifestyle-friendly specialties.
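As an aside, the loan-versus-mortgage comparison above falls out of the standard amortization formula, M = P*r / (1 - (1 + r)^-n), where r is the monthly rate and n the number of monthly payments. A quick sketch (the principal amounts and rates below are hypothetical round numbers for illustration, not my actual loans):

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortization: M = P*r / (1 - (1 + r)**-n)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical figures for illustration only.
loans = monthly_payment(200_000, 0.068, 10)      # student loans on a 10-year term
mortgage = monthly_payment(100_000, 0.06, 30)    # a modest 30-year mortgage

print(f"Loans: ${loans:,.0f}/mo vs. mortgage: ${mortgage:,.0f}/mo")
```

The shorter repayment term is what makes the monthly student loan burden a multiple of the mortgage even on a comparable principal, which is the dynamic the paragraph above describes.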
I for one chose general surgery because I am personally satisfied with the idea of being able to handle everything, having a broad scope, and being the last line of defense. There is a growing number who agree with me, but that number is dwarfed by those running for the hills away from all that really matters. Here's the problem. You can't run a medical system in which everyone is a radiologist. You cannot operate a system in which everyone wants to perform cool new endovascular or robotic surgical procedures, but no one is willing to do the midnight appy. People really need a PCP to be the first line of evaluation in many cases. I am in no way denigrating specialists or radiologists, but we need all types. The problem is that the train has left the station and is moving. The medical system will continue to move people away from where they are needed most. Our skewed payment system, in concert with a confused legal system and laws limiting physicians' ability to be compensated in a relatively appropriate manner, has taken the medical train on a one-way track towards a cliff, and we all know it's hard to stop a moving train.
I have apparently been identified as a spam blog. I can't begin to speculate as to why this is (I have no links to commercial sites, I post consistently but rarely, etc.). If you are receiving a message that this is a spam blog, please disregard it. I have no idea how what I write could be identified as spam, but I've contested the label. I sincerely hope that Blogger corrects whatever flaw is in its software that would identify my blog as spam.
I saw my last patient as a medical student on Wednesday of last week. I'm going out on a two-week rotation in pathology (so I'm not quite finished yet), but I have seen my last live patient as a medical student. As I approach graduation, I have some time to reflect on the experiences that have defined the last four years of my life. If I had tried to guess four years ago how all of this would end, the only thing I know is that I would have been off the mark, possibly by about 180 degrees. Four years isn't that long, and it seems that every four-year cycle in my life takes just a little bit less time than the one before. Yet the changes in my own life are profound. I've more than doubled the size of my family, changed career trajectory, watched people die, been exposed to tuberculosis (fortunately I never converted to positive myself), and I'm ending the whole thing with a move from a hot, steamy, flat metropolis to a small, cold mountain town. I would never have guessed it, but I couldn't be happier with how it all worked out. It's strange how a few years in medicine changes your perspective. I suspect that this was uniquely exacerbated in my case by my near omnipresence in a large county hospital and level I trauma center. There was once a point in my life when being cursed at in Spanish by a drunk guy who showed up at my door via helicopter with numerous pieces of long bone protruding through the skin would have been a bit odd. Now it feels far too normal. In fact, I started this journey by dissecting a decaying corpse. I watched my wife lie in the same beds on the same wings of the hospital where I rounded on the nameless, nearly faceless morass, and I prayed to God that someone who knew more than me was watching. I watched my son go into full respiratory arrest and drop his O2 saturation to 19% after extubating himself in the ICU. The second time I saw it, it seemed eerily normal.
As those close to me suffered, I was still bombarded by strokes, gunshots, heart attacks, cancers, and more. I feel as though I ought to be suffering from some sort of post-traumatic stress syndrome. Yet it all feels quite normal. It's as though it was meant to be the way it was, and I've made peace with the whole thing to the point that I'm starting to forget how disturbing it all once was. It's not that I'm jaded. I'm not. I appreciate the gravity of what I'm seeing and what I saw. Some of my patients still tug on my heartstrings in a way that makes me reflect on the meaning of it all. It's just that I'm used to it. I guess that this is one of the successes of my training. I have also learned such an incredible amount that I can no longer remember what it's like not to know some of it. I've picked up some amazing skills. I can safely pull fluid out of a swollen belly or an infected spinal canal. I am comfortable closing relatively complex lacerations and stab wounds. I feel comfortable assisting a surgeon with those that are even more complex. I actually know what all of those weird numbers written between strangely constructed lines mean, and I can identify whether they signify a problem. I've had to hold the hands of patients when I was in the unfortunate position of telling them that the problem those numbers signified was severe. I've gathered and delivered this data in English and Spanish. I've also done so with written notes, through interpreters, with impromptu sign language, and sometimes with nothing to go by but old notes on an indigent comatose patient with no family to be found. Different people come out of this experience with extremely different perspectives. People enter medicine for a variety of reasons (save the world, make money, love science, etc.). People's expectations for medical school are all over the place and rarely on the mark. There is no consensus on the quality or value of this education.
I have classmates who would take any offer of student loan repayment and a job at Starbucks over another day in the hospital. I am one of the people who would absolutely do it all over again. If I knew before medical school what I know now, I would still absolutely sign up. The experience is incredible. The extraordinary becomes ordinary. Even with all of the paperwork, bureaucracy, physical strain, and student debt, there are still very few other fields where every day is part of an epic struggle between life and death. It's not all exciting, not all of your patients are good people, and it seems like all of the problems of society have been dumped down upon the decaying structure where you spend 80+ hours a week, but the upside is incredible. Every encounter brings a true window into someone's life. People trust you, often because they have no choice. Whether those patients are sitting in a clinic for a medication adjustment or flailing, screaming, naked and bleeding in the trauma bay, they have at that moment put some portion of their lives into your hands. It is an awesome responsibility. I do not regret taking it. I'm looking forward to the next step. While the location is quite different, much of the struggle will be the same. I will continue to compete in this epic struggle as long as the patients continue to bring me something worth fighting for. When I speak with my next patient, it won't be as a med student (or student doctor, or trainee, or whatever). I will introduce myself as Dr. Miami, and the title will be appropriate. I've earned it, and I never intend to lose it.
The following quote is directly from an MSNBC article: "France and Germany especially have suggested that the better response is not more government spending but tighter regulation. The Obama administration has urged European nations to do more to restart their economies through financial stimulus. Mr. Obama is hoping that by showing a serious commitment to tighter regulation he can more easily persuade other countries to increase government spending and stimulate demand by consumers and businesses that would help pull the global economy out of a serious decline." This would almost be hilarious if it weren't true. The argument goes something like this:

Obama: You all should print a bunch of fake money.
Europe: We would much rather strangle the economy to make sure it never totally recovers.
Obama: If I strangle the economy too, will you print a bunch of fake money?
OK, OK, so I said that I wouldn't write about any of this. I won't get too specific. Yet I really couldn't write about much of anything if I didn't address some current issues, so I'm going to write a primer on why booms and busts occur.

How Money Works

There is no such thing in this world anymore as sound money. I would define sound money as a tool of trade, available in a relatively stagnant quantity, that retains similar value amongst all participants in the economy and allows goods and services to be valued relative to one another as they are traded. Sound money is usually tied to a resource in relatively static supply (often gold) to maintain this stagnant quantity. The actual numbers are irrelevant in a sound money system. If Paul and John each have one dollar, and it costs one dollar to buy a loaf of bread, each of them can afford a loaf of bread. If they each have ten dollars, and a loaf of bread is ten dollars, then they can each still afford a loaf of bread. In this case, ten times the amount of currency in the system didn't impact the relative value of each man's wealth. The trade for bread could be carried out just as easily in a one-dollar world or a ten-dollar world. If the value of the money is unchanging, the bread's value may fluctuate over time due to changes in supply and demand, but the specific digits don't matter. Similarly, if the US suddenly declared that every dollar is now ten dollars, and in real time multiplied everyone's savings by exactly 10x, along with everyone's debts, obligations, salaries, etc., the result would be inflation to 10x current prices with no real change in the economy, since everyone would have 10x as much money. When we think about it logically, there is no reason to ever multiply everyone's money by 10x. The simple act of creating more digits does absolutely nothing to improve the economy.
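The bread example above can be reduced to a toy sketch (mine, not part of the original post): if every balance, price, and obligation is scaled by the same factor at the same instant, nobody's purchasing power changes.

```python
# Toy illustration, not an economic model: a uniform 10x redenomination
# of balances and prices leaves purchasing power exactly where it was.

def purchasing_power(balance, price):
    """How many loaves a given balance buys at a given price."""
    return balance / price

balance, bread_price = 1.0, 1.0
scale = 10.0  # multiply every balance, price, and debt by 10 simultaneously

before = purchasing_power(balance, bread_price)
after = purchasing_power(balance * scale, bread_price * scale)

assert before == after  # still exactly one loaf either way
```

The names and numbers here are just the post's Paul-and-John example restated; the point is that the scale factor cancels out of the ratio.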
In the above scenario, printing 10x the money supply will simply cause goods and services to increase in cost by 10x. It doesn't make everyone 10x richer. Let's take what happens in the US. There is no sound money. The US dollar is not attached to a gold, silver, platinum, or any other standard. The money supply can change on a whim (and with the quick turning on of the printing press by Ben Bernanke at the Federal Reserve). If all of the money that they printed hit the market at the same time, equally in proportion to current wealth, and all current obligations were adjusted accordingly, the only impact on the economy would be the excessive waste of paper. However, this doesn't resemble the current situation. Currently, money enters the system at certain points. It goes to banks through the system of fractional reserve banking, and it goes to the government. This gives both banks and the government (along with those well connected to the government) a real-time financial advantage. To use the previous example, the banks and the government see their 10x increase before the cost of the bread goes up 10x, giving them greater relative purchasing power. Eventually the money will even out in the economy, and the bread will rise by whatever percentage increase has occurred, but not before the banks and government have had the opportunity to purchase bread at the lower price with the new money. This allows them to accumulate resources with greater ease than other players in the economy. The other thing that does not change in the current system is debt obligation. If I make a loan and charge a 10% interest rate, the balance of the loan isn't impacted by relative inflation. In other words, if I lend you $100, and the money supply goes up by 2x, you still only have to pay back $100, which is now only half as much money in a relative sense.
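That last point, that a fixed nominal debt shrinks in real terms when prices rise, can be made concrete with a small sketch (my own illustration of the $100 example, assuming the price level simply tracks the money supply):

```python
# Toy sketch: an unindexed $100 loan repaid after the money supply
# (and with it the price level) has doubled is repaid with "half" the money.

def real_value(nominal_dollars, price_level):
    """Deflate a nominal amount by the prevailing price level."""
    return nominal_dollars / price_level

loan = 100.0
level_when_lent = 1.0
level_when_repaid = 2.0  # money supply doubled between lending and repayment

assert real_value(loan, level_when_lent) == 100.0   # what the lender gave up
assert real_value(loan, level_when_repaid) == 50.0  # what the lender gets back
```

The lender is repaid the same nominal digits but only half the original purchasing power, which is why lenders must anticipate inflation.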
Lenders cannot know exactly how the government will mess with the money supply, but an adjustment for inflation must be built into any bank's lending portfolio if it wants to stay in business. In other words, if I determine that, with all risk and cost accounted for, I can profitably lend at 5%, and I assume 3% inflation, I must actually lend at 8%. Another point is that the relative change in wealth due to the entry points of a non-sound money supply causes distortions in the values of different goods. Things for which the banks lend money, or which the government supports, will increase in cost relative to other goods. This can cause all sorts of confusion in the economy. As the money makes its way into the economy at large, inflation will occur in some sectors, while deflation occurs in the sectors where the money first hit. In other words, let's say that the government prints an extra 10% of the money supply, which goes primarily to banks through the Federal Reserve. Banks use this to invest in abandoned mines. The relative increase in money initially going towards abandoned mines will be much greater than 10%, as the entire 10% increase in the money supply is going towards something that was much less than 100% of the economy. This may triple, quadruple, or quintuple the cost of an abandoned mine. However, assuming no further distortions, that 10% will eventually work its way into the whole economy, and the value of all other goods and services should increase in relative proportion by 10%, while severe deflation hits the abandoned mine industry and knocks prices down to only about 10% above the original price (and possibly even lower for a time, as it's hard to tell exactly where this bottom price is).

How Does This Apply to the US?

Ever since the creation of the Federal Reserve, the lack of sound money has caused ebbs and flows of economic distortion.
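The 5% + 3% = 8% lending arithmetic above is the additive approximation of what economists call the Fisher relation; a sketch of both forms (my addition, not from the post):

```python
# Back-of-envelope lending math from the text: a lender needing a 5% real
# return who expects 3% inflation must quote roughly 8% nominal.

def required_nominal_rate(real_rate, expected_inflation, exact=False):
    """Nominal rate a lender must charge to clear a given real return."""
    if exact:
        # full Fisher relation: (1 + nominal) = (1 + real) * (1 + inflation)
        return (1 + real_rate) * (1 + expected_inflation) - 1
    # additive approximation used in the text
    return real_rate + expected_inflation

approx = required_nominal_rate(0.05, 0.03)              # about 0.08
exact = required_nominal_rate(0.05, 0.03, exact=True)   # about 0.0815
```

For small rates the two agree closely, which is why the simple addition in the text is good enough for the argument being made.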
As I've already pointed out, increasing the money supply eventually inflates prices by a similar amount, with distortions brought about along the way as the money balances out in the economy. The concept that the government could in some way increase productivity and wealth by printing lots of money is based on the writings of the economist Keynes. It has never worked in practice, but it is the basis for everything that we've done to bring us to this point. The economic distortions, followed by their corrections, are the likely cause of the business cycle, and the current situation is nothing but an overblown business cycle. After 9/11, the economy went into a recession. It was not severe, and it made sense. Total economic activity declined in that environment. A sound money economic theory would tell you that the recession was due to a real decline in national productivity and that the solution was to wait for productivity to pick up again. Instead, Alan Greenspan (chairman of the Federal Reserve at the time) implemented a loose monetary policy. This means that he essentially flooded the market with new money to "jump start" the economy. The tools that he used were primarily aimed at banks. Banks make loans, and the increase in economic activity went towards the large capital investments that banks make loans for. This caused a relative expansion in business (expansion that demand may not have justified), an increase in the cost of higher education, and, most conspicuously, began to drive up the price of real estate. This policy became a runaway disaster. At one point, 8% of the entire money supply was being printed annually. As with my example of the abandoned mines, the relative increase in cost in certain industries that are primarily financed by lending was MUCH greater, as the distortion hit those industries first.
In other words, instead of everyone getting 8% more money and paying 8% more for goods, we saw huge increases in wealth amongst a few people and hyperinflation in real estate (and, to a lesser extent, higher education and large-scale business investment products). People well connected to the government or close to the banks made HUGE amounts of money. Meanwhile, regular individuals became priced out of many housing markets, because they were now trying to purchase the more expensive houses without any increase in personal wealth from the original dollars. The only way to access these things became to borrow money. Real estate debt skyrocketed. Student debt skyrocketed. The relatively loose lending standards that came to exist due to the excess of money at the banks led to easy loans, and this brought the distortion into other luxury industries, as people were able to borrow against their hyperinflating real estate values in the form of HELOCs (home equity lines of credit). Labor was also misallocated, as jobs in real estate, construction, lending, etc. took workers who might have worked in other industries. Then it all ended. As the money has made its way into the economy at large, we are seeing inflation in some goods with severe deflation in real estate. This is just like the example of the abandoned mines. In many ways this is GOOD. Houses will no longer be out of reach for normal people without exotic bank loans. People will be able to work in industries where they produce things for which the demand is not artificial. The problem is that there is a period in which these distortions have to work themselves out. These distortions were severe, and thus the correction is also severe. Many of the people who were employed in real estate, construction, lending, etc. are now unemployed. They are now not producing anything, which causes a real loss of productivity and growth.
This reduces demand, which causes other industries to suffer, and the whole thing ripples like a wave through the economy. This doesn't even take into account the simultaneous correction of distortions in business investment, or the loss of the luxury distortions as HELOCs fall out of favor with declining real estate prices. The point really is that the economy was severely distorted by the departure from sound monetary practices, and now the correction is severe. The end result of all of this should eventually be the new money working its way through the economy and a return to normal relative valuations, with the natural fluctuations due to changes in supply and demand. The recession is not permanent. The loss of productivity is not permanent. People will eventually find work again, and the recessionary cycle will unwind. Except....

The Wrong Solution

The current philosophy of the US government is to attempt to print more money to "jump start" the economy again. Bail-out, stimulus, TARP, whatever; it all means the same thing. These pork-laden bills, funded again with fake printed money, will only cause distortions in favor of whatever is in them, which will lead to future recessions. Instead of a real estate bubble, we could have a green jobs bubble, or an infrastructure bubble, or a health IT bubble. It's all the same concept. Attempts to save the artificially inflated housing market will only distort the market further and delay a recovery to natural prices. One only needs to look at what other nations have already done. Japan spent trillions trying to save itself from its own real estate bubble in the 80s, and parts of its economy still haven't recovered. In many European nations, with France being my favorite example, bad monetary policy coupled with severe restrictions on business, wages, hiring, and firing creates a distorted environment in which the economy isn't even allowed to adapt.
This leads to perpetually high unemployment, as cycles and corrections lie one on top of the other, and nothing resembling a steady state is ever reached. We should be looking in the exact opposite direction for an effective public policy. STOP DISTORTING THE MARKETS. STOP PRINTING NEW MONEY. Let the economy correct. Instead, the distortion continues. That's what's happening in the US, and that is why the economy is performing so terribly.
I've been busy lately. Really busy. Home is chaos, the match (the date on which I'll figure out where I'm going to move for the next few years) is less than two weeks away, and I've been pretty busy on my rotations. This has of course occurred just as everything that I've been predicting on this blog for a couple of years has started to happen. I could write 100 posts on what has occurred in the last six months. We've essentially nationalized the banking industry. The president's forum on healthcare was nothing more than a room full of yes-men parroting back to him the same tired garbage. Things are changing. The inevitable recession is starting to heat up, and we've got at least 3 different stimulus or bailout packages, aimed at doing everything from bailing out autoworkers to developing healthcare IT mandates, encompassing trillions of dollars of non-existent money. I could write about these things, but I won't. It's frankly overwhelming. I couldn't explain it all in any reasonable period of time, and I really don't want to throw up a bunch of rushed, half-thought-out posts on really complex material. Instead, this is going to be a little bit personal and very non-scientific. As a man, I've always wanted to control my own destiny. You could say that I respond poorly to authority as well. I want to live in the real world. This is a world of consequences and rewards. It is a world in which one has the capacity to reach the peaks of the highest mountains as well as fall into the depths of despair. It is a world in which some people are great, not everyone is created the same, and a purpose drives the actions of people. The maddening descent into socialism that has been slowly gripping this country, with an ebb and flow, for a century is disturbing to me on multiple levels. It is a failed system by every objective and nearly every subjective standard imaginable. Yet this sort of logic becomes circular in a way.
It's like saying that the only thing wrong with socialism is that it's bad for society. That sort of argument leaves the door open for attempt after attempt to create a more perfect society. No. The reason that socialism bothers me is that it takes away my independence and strains all of my personal autonomy. My work, in both action and reward, is never really my own in such a system. It is dehumanizing. The real world isn't pretty and perfect. No amount of central planning, "hope," or "change" can begin to make it so. The real world is dirty. It is a place where people lie, steal, and cheat. It is a place in which mothers get cancer, people become dependent on drugs, and storms rip away homes and communities. The real world is a world in which people face hardships, struggle, and sometimes fail. Yet the real world is also wonderful. Greatness is born out of overcoming hardship and adversity. Generosity and charity are endemic to the human condition. We create new technologies to fight cancers and disasters. We can treat the sick. We can rebuild what is lost. Humans, individual humans, can take this initiative. People fight to better their condition, and they fight to better the condition of their fellow man. I love living in this horrible, twisted, beautiful world. What disturbs me so much about the events of the last few months is not that I don't know what is going to happen, but that I know exactly what is going to happen. History is riddled with the corpses of attempts to create more perfect societies. We may persistently redefine what it is to be perfect, but ostracism and extermination inevitably follow any attempt at perfection. Many people will inevitably be outside the mold. The USSR, under the premise of trying to create perfect equality, destroyed wealth, killed a few million people, and then became a nation of haves (the politically well connected) and have-nots (the not so well connected).
The Nazis tried to create the perfect race. A drive for equality inevitably becomes an attempt to create sameness, and because no one is the same, chaos inevitably follows. Individuals can never rise to that level of terror, in all of their zeal, without the backing of a coercive state. In my personal life, I have always strived to do well. I seek excellence. It's not that I'm perfect, but I want to be better. I want to make my situation better, and I want to improve both my personal lot and that of my family in this world. The party line of any socialist government claims to want exactly that, but the actions speak differently. Higher earnings become something that needs to be redistributed for equity. Any desire to accumulate wealth or luxury is seen as evil. It's as though the failure of someone else to achieve what I have achieved somehow makes me indebted to him. In this world, the only person who is truly free is the one who has nothing. In fact, he is much more free than nature would let him be, because he is fed from the labor of others. Only in this world is the man who works for what he is given judged a villain, an outsider, the one who needs to be cleansed, in favor of every glamorized dreg of society who has wasted his life. Socialism IS wealth redistribution on all levels, monetary and personal. It is throwing success down on the rocks in favor of elevating the bottom of the barrel. In this system, I have been a hero while I have been poor, receiving bounties and gifts for my struggles, all the while knowing that I will soon be the villain if all of my struggles allow me to succeed. Likewise, I have found a passion in medicine, which has become the center of EVERYTHING that is wrong with the way things are done in this country. The only person who is not allowed to make decisions in the current healthcare system is the one who trained specifically to be able to make them. Autonomy, in both training and practice, is disappearing at an alarming rate.
Physicians have become so worried about their incomes that they've lost their souls. This was once a profession in which greatness was expected. It was not flawless by any means, and it suffered from all of the problems that training and practice monopolies create, but greatness was expected. Arrogance was common, but it was also often deserved. In a few generations, we went from giving people arsenic for malaise to being able to successfully sew in synthetic pieces of an aorta. I will defend my desire to earn a good income, and I agree with most of my colleagues that declining pay is a problem. Yet we cower on the steps of the capitol lawn every year begging for a pittance. Our masters usually give, usually with an attached condition that further erodes the profession. Our training now teaches us to pass along responsibility, and we've taken on the mentality of employees. No longer the masters of the hospital domain, we will soon be employees of a government bureaucracy led by a nurse or some bureaucrat who will dictate exactly what we do in order to continue to collect that ever-shrinking pittance. I want a stimulus. I don't want money beyond what I deserve. The only thing that I want from the government is for it to stop interfering and let me do my job. If I must be robbed, make it predictable. Take some consistent percentage of my income. At least stop trying to make me an outcast for working hard enough to earn something worth taking. I am a charitable person, but let me decide where to be charitable, and stop creating a situation where generous people cannot afford to be generous because their extras are being confiscated for the "public good." Let me pursue my passion. Let me work towards higher levels of success, discipline, ability, and function without telling me how it must be done. I want to impose nothing on anyone. I'm sick of being a slave to student loans and government payments.
I simply want to be able to contract with my patients and engage in mutually beneficial agreements. That would be American. That would be antithetical to socialism. That would truly be a stimulus for my soul.
I realize that this is an opinion piece, but it gets the point across quite nicely: http://www.bloomberg.com/apps/news?pid=20601039&refer=columnist_mccaughey&sid=aLzfDxfbwhzs To summarize, written into the new "stimulus" package is money to create what is essentially a big government electronic medical tracking system. Now your doctor (as well as any court, politician, computer hacker, etc.) will have access to all medical records on all Americans at any hospital. There will be significant penalties for not adopting this electronic system within a limited time span. In addition, the bill creates a new position, a sort of Director of Health Information Technology. This individual will eventually oversee a large bureaucracy that oversees all healthcare administration and has the right to impose financial penalties on physicians who dare to stray (regardless of patient preference). Beyond how scary a centralized database of everyone's personal information is, there are also a lot of unforeseen consequences of this setup. I was speaking with an attorney I know quite well, who mentioned that there may be serious implications for the "standard of care." Prepare for lawsuits whenever anything goes wrong with medical care delivered without access to these electronic systems. What, Grandma had a drug reaction in your makeshift clinic inside the hurricane disaster zone? Had you read through all of the data in her electronic medical record at some place with electricity, you might have noticed the same reaction ten years ago in a physician's note from another state. Lawsuit! The danger of the director and his minions overseeing all clinical decision making is self-explanatory. I don't have a lot of time to write today, but I thought that this needed to be exposed further. This is scary stuff.
While this post may be only peripherally related to some of the content on this blog, I'm going to admit that this is me taking some liberty and putting down some sleep-deprived thoughts. It's my blog. I'll do what I want. Before I entered the hallowed halls of what has become the modern hospital, I lived another life. After some youthful impetuousness (and a lack of capacity to get rich quick in my late teens), I decided to go to college. I was a pretty big screw-up in my earlier years, and I had to spend a couple of years at a community college to develop the credentials necessary to attend any sort of quality university. Going into medicine was not something that I really thought about, and I would have questioned whether such an undertaking was even possible considering my background. In those early years of college, I studied some programming, some music, and some other hard disciplines. In the end, however, I pursued a degree in anthropology. My wife earned a degree with a double major in anthropology and classical studies. This made for interesting dinner conversation, absurdly intellectual with very limited life experience to back up anything that we talked about. I will now, dear reader, impose some more of that esoteric armchair theory on you. At least I've now got some genuine experience to back it up.

The Giants of History

Leadership is really a very treacherous thing. If you think about it, the rate of assassination amongst kings, presidents, etc. is so high that one wonders why anyone would want that sort of job. These individuals are often in very complex political positions. They have to promise things to attain their positions that they have limited capacity to deliver. They become figureheads for blame when things go wrong. Leadership styles and the power in the hands of an individual leader vary, from the dictators to the "presidents" of countries largely in anarchy.
The thing about being a dictator is that you get to have what is essentially absolute power. This isn't just economic. Dictators largely attempt to alter the very fiber of the culture over which they rule. Most first-generation dictators encounter a people with some measure of independence interwoven into the common fabric of society, and this is the sort of thing that an iron-fisted ruler would want to eliminate. If not, these people might very well take it upon themselves to form a revolution, and that sort of thing is looked down upon by the type of person who wants absolute power. This is of course the downside to being a dictator, because you're the only guy with enough power to blame when things go wrong. This can easily be contrasted with the modern democracy, where who gets the blame is not so clear. Every errant happening results in a chorus of "it's not my fault," followed by a game of circular finger pointing. Who actually did what becomes secondary. People take the fall for things that they had no capacity to control, and people who can be directly connected to a crisis often walk free. There's always a way out, but no one is safe. Staying in power is usually a struggle, even for one individual. When a similar line of rulers keeps power for an extended period of time (or when a single kingdom maintains dominance over other kingdoms for an extended period of time), it usually comes about as the result of some unique circumstances. There are some common threads that unite these sorts of rulers. We'll call them the giants of history.

The Romans

The Romans (of the famed Roman Empire) started as little mini-kingdoms dominated by hovel-dwelling farmers on an earthquake-prone peninsula. The degree of dominance exerted by the Romans was so profound and longstanding that one might argue that we still haven't really outlived it.
The Romans eventually came to develop this international power with the convergence of a couple of unique philosophies and the right circumstances. Roman prestige was largely tied to military success. There was also a dominant sense of the local culture. The Romans, like all great conquering empires, were extreme xenophobes. This success, however, did not drive a huge amount of micromanagement. Content to simply be the dominant culture, the Romans never really attempted to make all of their conquered territories Roman. Conquered states became sources of revenue and sometimes slaves. Romans often set up local figureheads who shared a common culture with the local population (think King Herod of Israel). Over time, Roman influences worked their way into some of the farthest reaches of the empire, but this was a natural acculturation, developed through the relatively easy and free flow of ideas along the safe avenues behind the Roman front line. In this respect, the empire made everyone more Roman precisely by not trying to make them so. Future conquerors were successful for a time using variations of this technique. Genghis Khan used to present an ultimatum to cities as he passed: submit to me and pay some taxes, or die. Those that agreed would live life the same as they did before, accepting the mild hardship of a foreign-imposed tax. Those that disagreed usually found themselves decapitated, with a large pile of skulls set at the front of their respective villages. After each scenario played out a few times, people decided that they preferred the former, and the empire exploded. The relatively short life span of the Mongol Empire was largely due to the inability to pass on leadership. The nomadic pastoralist ideals of what makes a good leader, and of succession, were simply not in keeping with controlling a vast, stable empire.

The British

The British Empire took control of vast lands through military conquest in the same manner as the Romans.
These lands were even more spread out and vast, owing to advances in technology between the two time periods. The primary difference between these two empires was in what happened after the conquest. While the Romans were relatively hands-off, the British were very hands-on. The stated goal was to "make the world England." While the Romans taxed remote regions to bring wealth back to Rome, the British would largely use pilfered resources to continue campaigns to make more of the world British. In keeping with the general social shake-ups of the early industrial revolution, London became full of sick people living in filth. As the capital got sicker, Britain got bigger. As the saying went, "The sun never sets on the British Empire, but it never rises over the streets of London." Unlike the Romans, who could often maintain amazing degrees of social harmony between conquered groups behind the battle lines for centuries, the British were constantly dealing with uprisings. As illustrated by the American Revolution, British social policy was stifling enough to turn colonists of British descent against England. Thus, the very large British Empire really didn't last all that long by historical standards. Remnants of British culture are unmistakable in countries from the US, to South Africa, to India, but these places did not remain England.

The Leaders Themselves

Living in South Florida, I have spent the majority of my life very close to one of the most enigmatic dictators in recent history. Fidel Castro led an internal revolution, consistent with values (right or wrong being irrelevant) found within portions of the local population. In this case, Fidel attempted to internally change pieces of the Cuban culture, but there was no attempt to impose an outside view on the people. In a funny way, his leadership remained so dominant because the aggressive sorts who might have challenged him escaped here to Miami.
As he has recently become frail and passed off some of his power to his brother (also old, and not too far from becoming frail himself), it will be interesting to see how it all plays out. I suspect that change is brewing, though only time will tell. One can come up with a veritable laundry list of ruling parties or dictators that came to power, attempted to control the dominant culture, and fell as a result. Sometimes they never really took power completely; other times it took 50 years, but the result was the same. The USSR, the Taliban, etc...

The United States

There is one final approach that we haven't mentioned. At its inception, the US began to grow rapidly. The US largely didn't conquer. They displaced. Early American conquest had less to do with subjugating people and more to do with killing them or moving them out of the way. Good or bad, this is a relatively effective approach if you have a population with enough size (or growing at a rapid enough rate) to control the lands you've taken. It is very clear that much of the US will never return to being Seminole land or part of the Iroquois Nation. The modern US approach to conquest (both internally and externally) is very different. The export of American ideals (democracy by force) and the internal conflict between similar (though not so similar) sub-cultures within the US have created a very different America in the last 60 years than the one that existed before. We implement policy from Washington, DC, where we attempt to tell people on the other side of the world when they can walk outside. We implement policy from Washington where we tell people in Spokane, Washington how much water they can use when they flush the toilet. The degree to which we attempt to centrally micromanage every facet of daily existence is frankly unheard of in human history. Historically, most dictators just didn't care enough about these little nuances of daily living.
The Conclusion to the Ramble

Just remember that the Romans had wild success with economic subjugation backed by military force. The British (and the Russians), on the other hand, couldn't hold social subjugation together. While people will complain about, but tolerate, virtually every tax or economic burden that does not impair their ability to attain a reasonable standard of living, people defend their culture vehemently. The rambling point is that unless the US intends to annihilate everyone, both internally and externally, who disagrees with the prevailing politically correct point of view, history tells us that our current policy might not bring us down the road of ongoing leadership.
The only thing universal about individual morality is that it is in fact individual. Regardless of whether you believe in absolute truth, a religious code, the golden rule, etc., the fact of the matter remains that everyone interprets these things differently. What is or is not truth is secondary in the scheme of what actually happens. This becomes even more complex when considering that certain codes diametrically oppose each other, and some codes clearly sacrifice other people who do not subscribe to the code. There's an interesting conversation about abortion, for example, over at http://studentdoctor.net in the Topics in Healthcare forum debating this very topic. One side essentially claims that failure to perform abortion, or to refer for it, is in some way a direct violation of medical ethics. The other side claims that doing these exact same things is a violation of the will of God. Often, neither side believes in an allowance for a differing opinion. There is one thing, however, that holds universally true, and no amount of denying it changes it. Resources are scarce and limited. They may be used ever more efficiently and/or prudently, but there is only so much matter in the universe. This applies to medicine. In its current state, medicine, in its attempt to be all things to all people, is rapidly becoming nothing to anybody. It's becoming a sort of service black hole, and an expensive black hole at that. Rising from ~5-6% of GDP in the days before Medicare, healthcare costs are now estimated at about 16% of GDP. Sure, there have been some technological advances, but there are clearly some changes in our approach to healthcare distribution that perpetuate these deficits. Let's consider some of the paradoxes into which we are plunging ourselves: Many people advocate doing everything that people need. This is all well and good, except that defining "need" is not all that easy, and there are many times when the "needs" of two individuals conflict.
Let's look at the following example. Two middle-aged men walk into a community emergency room. One has terrible crushing chest pain radiating to the left arm. He has specific V2-V5 ST-segment EKG changes consistent with an anterior wall MI (heart attack), and his cardiac enzymes are through the roof. The other man has a non-compound humeral fracture (a broken arm) and is in considerable pain. The ER is currently full of other patients. What to do? In this case, the usual response is that we have to triage. Even with the full ER, we will make room for the man having the heart attack. We'll start treatment. The man with the broken arm will have to wait. In fact, some of the patients who were already brought back will have to wait. In this example, it is clear that the man with the heart attack is receiving a benefit at the expense of the man with the broken arm. In this particular example, we've sort of attempted to minimize "badness." The long-term outcome of a delay of a few hours in treating the fracture is probably nothing, while a similar delay in treating the heart attack is potentially fatal. Triage is sort of the original approach to the collision of scarcity and morality. If we can't do right by everyone, we'll attempt to get the most right out of the situation. The original concept was that we would go after the sickest who could be saved first, and then proceed in order of decreasing severity, so as to maximize the chances of the most good outcomes. In modern-day medicine, we have no real triage outside of the emergency room. We attempt to just give everyone everything, the cost be damned. It's the equivalent of building another room onto the ER and hiring another doc every time another broken arm comes in the door. It's incredibly expensive, and it doesn't necessarily improve anyone's long-term outcome. We also tend to ignore the most important rule of triage, in which we do not waste scarce resources on individuals beyond repair.
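For the programming-inclined, the triage rule described above (treat the sickest salvageable patients first, and spend nothing on those beyond repair) is essentially a priority queue. Here is a minimal sketch in Python; the 0-10 severity scores and patient labels are hypothetical illustrations, not any real clinical scale:

```python
import heapq

def triage_order(patients):
    """Return the order of treatment: sickest salvageable patients first.

    Each patient is a (name, severity, salvageable) tuple, where severity
    is a hypothetical 0-10 score. Patients judged beyond repair are
    excluded entirely, per the classic triage rule of not spending scarce
    resources on them.
    """
    # Negate severity so the min-heap pops the highest-severity patient first.
    heap = [(-severity, name)
            for name, severity, salvageable in patients
            if salvageable]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

# Hypothetical example: the MI patient is seen before the broken arm,
# and the patient beyond repair is not queued at all.
patients = [
    ("broken arm", 3, True),
    ("anterior wall MI", 9, True),
    ("beyond repair", 10, False),
]
print(triage_order(patients))  # → ['anterior wall MI', 'broken arm']
```

The point of the sketch is the exclusion in the list comprehension: a system that queues everyone, regardless of salvageability, is the "give everyone everything" model described above.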
Every severely demented nursing home resident who takes a two-week vacation in the ICU costs the system tens of thousands of dollars in order to "save" a person who is beyond repair. This is partly why healthcare is 16% of GDP. Another question is how far we go to ensure 100% accuracy. In other words, how much money are we spending for progressively smaller benefits in diagnosis and treatment? This is the question that comes up in the current predatory legal environment. In an environment in which every individual miss imposes a nearly insurmountable burden on the system, every low-yield test gets ordered in order to avoid error. Is it right to tax a large segment of the population solely to fund CT scans of the brain for low-risk falls in alcoholics? We surely pay for them now. It's very simple. In any type of collectivist system, the "morality" of what is being provided will eventually bump into scarcity. There has to be some sort of rationing. There has to be some sort of triage. Someone's rights will have to take a back seat to the more critical, or to those with better long-term potential. Period. If not, the system goes bankrupt and no one benefits. The alternative is a non-collectivist system. In this system, individual choices determine what happens. Individual morality may factor in, but there is no "morality of the system." This is where competition comes into play. In our example above, the man with the broken arm may go to a different hospital. He may have to pay more. Perhaps the number of people waiting for treatment with broken arms leads a group of entrepreneurial orthopods to open a special orthopedic ER that caters to broken arms and doesn't treat heart attacks. Then there's no conflict at all. If someone felt morally compelled to treat heart attacks, they could go around treating all heart attacks, regardless of ability to pay.
This is rationing in a way, but the rationing is done by individual preference and morality. In this particular approach, no one individual can impose a specific morality on everyone else within the system. Regardless of how you approach it, no code or philosophy on who is entitled to treatment can overturn the laws of nature. Matter can neither be created nor destroyed. Resources have to be rationed. The only question is whether that rationing occurs through an imposed centralized system or through the individual codes of the individuals involved.
Everyone from my classmates to some of my family members has recently been talking to me about universal mandated health insurance as the solution to our healthcare woes. Apparently, we can solve our healthcare crisis by making everyone give money to insurance companies. The two things that I find most amazing about this argument are:

1) It is almost universally proposed by individuals with socialist leanings.
2) The argument as to why it should work is usually that it is a capitalist or free market solution.

I think we need to get a couple of things straight. There is no such thing as a federal free market mandate. By being a federal mandate, it automatically ceases to be free market, which is essentially defined as being devoid of government interference. You cannot mold capitalism to your personal whims. Capitalism is why some people don't buy health insurance now. Forcing everyone to buy it is a strangely crossed socialism/fascism hybrid in which we force everyone to subsidize each other while simultaneously creating a profit for a private financier that is controlled by public regulatory bodies. This system cannot work effectively. I'll explain why. As a precursor, let me point out a couple of universal points that are argued to achieve the mandated health coverage utopia:

1) People who can "afford to pay" are required to buy health insurance or face stiff tax penalties.
2) People who cannot "afford to pay" are subsidized to some degree in purchasing insurance, with some groups inevitably being fully funded.
3) Insurance must cover pre-existing conditions.
4) There is some control on insurance rates.
5) Your insurance can't "drop" you.
6) There is some continuing tie to employer-funded insurance.

Here's why these things don't work:

1) It doesn't address the overall cost at all.
2) By forcing those who can "afford to pay" to also pay the taxes that subsidize those who can't "afford to pay," you are creating socialized medicine with two middlemen.
You have the government AND the insurance company. Far from being a free market solution, you get a government bureaucracy and a company that largely generates profits by lobbying the government bureaucracy and denying payment for things that the government gives it money to pay for.

3) If insurance covers pre-existing conditions, rates have to go up.
4) If rates are controlled, they cannot go up to cover pre-existing conditions. Companies will have to lobby for a rate hike that no one can afford, receive subsidies (a second knock against those who can "afford to pay"), or operate in the red.
5) By requiring insurance companies to keep patients for life, you require them to charge everyone more up front to deal with the inevitable risk factors that will appear later.
6) You continue to rely on the employer-based insurance model, which is itself a relic of the New Deal era, born as an attempt to avoid wage control policies.
7) The new system does NOTHING to address malpractice problems.
8) The new system does NOTHING to ration expensive care.
9) The new system becomes a hindrance to any better system, because it is now a mandate.

If people had to pay for their own care, there would be no medical cost crisis, because contrary to popular belief, the cost of medicine would over time come down to the price that people could pay. If you eliminate some of the malpractice incentives to over-test, people will simply not want to pay for low-yield testing. Insurance companies couldn't sell insurance unless it was cost effective. In a system without EMTALA, the cost effectiveness of the insurance system would be better assessed, as people would actually have an incentive to buy it. Something doesn't become capitalism because a corporate entity is making a profit from it. Something is capitalism when it is the product of the natural adaptations of the free market based on the individual preferences of the people within the system.
I am actually going to take a stance here that will shock most people. I believe in a two-tiered system. I believe in a locally funded county system that adapts to the local needs of its area and provides safety-net care to prevent the spread of infection and control disease. I believe that the proper role of a lot of residency training is in these institutions, as it was originally designed to be. I want the feds out of 99% of medicine beyond the prevention of nationwide disease epidemics and bioterrorism. I want the local governments out of the private medical system. A private system with a small safety net is better than the hybrid mess we have created, and seem to want to perpetuate, any day of the week. That would be a true capitalist solution to healthcare. Entitlements at the public institution would be limited by budget considerations, and private healthcare would function like all capitalist systems, providing what it can based on the preferences of the people, with the most cost-effective solution that meets those demands outshining the others and taking market share. Only in a dream.
So it appears that the IOM has released its recommendations to Congress regarding duty hour restrictions for medical residents at programs receiving federal funding (AKA every program). I'd like to take a minute to go over some of these recommendations, discuss the potential impact, and then explain why this is a bad idea. I do not have a comprehensive list. All of my information is second-hand, as the report itself is rather expensive to access. If I write anything inaccurate, or anyone else finds something in the report worth mentioning, please let me know.

1. Maintain the 80-hour work week. There is no recommendation in the report, as some program directors feared, limiting resident work hours to 56 hours a week. The cap, however, is a strict 80 hours a week, which would eliminate averaging. I guess the big change here is that a congressional mandate would be accompanied by stiffer penalties than the RRC can possibly impose. It would also make some efforts to balance rotations a bit more complex, as a strict 80-hour max is currently not the norm at many surgical programs, which have residents cover more than 80 hours some weeks and make up for it during others.

2. Maintain the 30-hour shift max. There is no change here, except that they now want to require a 5-hour mandatory sleep break within any shift that exceeds 16 hours. This would be the first required nap at any adult job in the history of working in the United States. It also makes it virtually impossible to cover a night call with only one resident.

3. One full day off per week with NO AVERAGING. For those who suddenly found that weekends could exist again as part of the 80-hour work week, no more. There are no more golden weekends under this report without extra days off (something hard to give in a system in which multiple residents are required for each call and residents can't alternate going above and below 80 hours each week).
Whereas now some programs have residents alternate weekend call, this system will soon be a thing of the past if the recommendations are implemented.

4. Call no more often than q3. There will be no more q2 call, even for a small stretch. This means that you cannot alternate q2 call to cover vacations. You also cannot do a Friday-Sunday call to give another resident a weekend off.

5. No more than 4 night shifts in a row. Of all the recommendations, this really makes the least sense. The obvious adaptation to all of the above call requirements is to establish a night float. This would only be possible under the recommendations by having people switch onto and off of the night float every few days, making establishing a circadian rhythm impossible.

6. Interns can't be the only MD in-house. This is really ridiculous, as many hospitals currently have no physician in-house. In other words, it is legal to have the intern go home, leaving no one in-house, but it would be illegal for him to stay alone. It also makes the call schedule even harder, as junior call interns couldn't cover a potential nap break for the senior residents in a junior-senior joint call system adapting to the required nap under the new recommendations.

There are some more, but they're escaping me at the moment. Feel free to post them. Why are we here? For those of us in the trenches, this really makes no sense on so many levels. I am going to be the resident who should theoretically be receiving the benefit of these work-hour restrictions. By continuing to have 80-hour weeks with a bunch of crazy rules implemented on top of them, compliance costs will go through the roof, AND it will do nothing to solve resident fatigue. There is no evidence base for any of these changes. Why would we even consider implementing these changes as a requirement for every program with no reason to believe that they will work? Again, this smells an awful lot like what happens when we start to dismantle the free market.
Medical licensing is a hot-button issue to bring up in some groups, so I will not go all the way back to that point in terms of market intervention. If we assume that the government should require some minimal level of proficiency in order to practice medicine, then we have to question how that level should be obtained. In other professions with similar requirements, the usual course of action is to have some degree of professional schooling as a primary requirement. This is the case in every profession from law to architecture. After these different types of schooling, there is usually a state-sponsored exam. The bar exam is an example of this. There is some variability between states and professions, but the concept is the same. This is also true in medicine with the USMLEs, which every state has now simply adopted as its state licensing exam. After this, certain professions require different things. Most people agree that hanging a shingle immediately after school is difficult. In law, this is legal but relatively rare. Most people will go work for a firm, where they will get real-world experience. The firm can be a solo attorney or a Devil's Advocate-styled enterprise. Nevertheless, it is not formalized. The formal background is found in the schooling itself, and everything thereafter is variable, creating a vibrant, heterogeneous market. Pay ranges are wide, with some new associates demanding >$100k/year and some receiving $25k. People migrate towards positions that meet their needs for work, pay, hours, training, environment, location, etc... Everyone is a potential employer and a potential trainer. In medicine of course, we have a formalized residency requirement. Without belaboring the entire history of residency training, some of the following things are true:

1. Every physician who wants a license has to complete somewhere between 1-3 years of formal residency, depending on the state.
2.
Every physician who wants to specialize must, for the most part, fit his interests into a series of predefined specialties that require anywhere from 3-10 years of formal training, depending on the specialty.
3. The government funds almost all of these positions through Medicare.
4. Residents receive a surprisingly similar percentage of this money at most programs, with almost all salaries falling between $40k and $60k depending on location, specialty, and post-graduate year.

This set-up is why we are currently dealing with the IOM report. The government funds these positions, which gives it excessive amounts of power in regulating them. Work hour restrictions get tricky on a constitutional level when the government isn't funding things. When it is, it's very simple. They can simply take the money away. It is also very clear that different people want different things from their post-graduate training. Some people like to work hard and often. Others have different priorities. Different people have different tolerances for stress, labor, sleep deprivation, etc... Yet every program is regulated to be similar. As an attorney looking for a niche, let's say tax law, there are a million different ways to fall into the niche. You could get a master's in tax law. You could go work for a big tax firm. You could work for the tax division of a big firm. You could enter a small firm looking to expand into tax law and become the specialist by doing the job. You could hang a shingle. It's all possible. In many ways, the only difference between one of the big firm jobs and a medical residency is that the big firm usually pays better up front with fewer long-term guarantees. However, all of these entities are possible. They are regulated by their ability to get people willing to work in the way that they require, and they are limited by their ability to provide for their clients. A few botched cases, and a big firm doesn't remain big.
In residency, the only way to change things is to lobby Congress. You can't quit easily. You can't find a different training model for your specialty. You can't just go out and work with someone who performs your specialty until you start to feel proficient. One might say that this isn't so bad. I mean, at least we'll set a floor on quality. Close reflection (and the recent IOM report) shows what the problem is. There is no natural adaptation. Residencies functioned until the original RRC changes in nearly the same manner in which they worked 50 years before, with the only major change being the progressive addition of more years in order to be qualified to do the same thing. The 80-hour work week was the first real new step in 50 years. Normal markets, including those in training programs, adapt slowly over time. These changes were abrupt, and rather than being based on mutual preferences, they were imposed from the outside. There was an extremely heterogeneous response. Many couldn't or wouldn't adapt at first. Many residents loved it. Many hated it. Many ignored the rules. Many programs made them ignore the rules. Some of the biggest programs in the country went on probation. Eventually, people mostly adapted to the new rules, but there is no evidence that they've done anything to improve outcomes. Many residents are happy with the changes, but some are certainly not. One must question the decision to force people who want to be more productive to work less. The new restrictions are not based on anything. There is no logic, no evidence, and no real objective standard behind them. However, we are a federally regulated and funded enterprise. Arbitrary rules and compliance are the lot in life of such an entity. Medicine has evolved over so many years within the system of federal funding and control that it is hard to see how it would work without it. This doesn't, however, mean that it couldn't. Residency would probably have to continue to exist in some form.
No one is ready to be a surgeon at the end of medical school, but the rules would have to be different. That may not be so bad. We might eliminate some of the conflicts that we have now, where a PA straight out of school can bill as a first-assist on an operation where a chief resident with over 1000 cases cannot. We might change the system in which it costs more money to hire a secretary to do scut work than an intern. Many programs are dedicated to training, but the natural evolution of a training system would slowly lead to a proper balance between training, service, and the reimbursement proper to achieve those balances. Service could be exchanged for teaching, and resident reimbursement could be based on what residents actually produce. Some could forgo residency, though hospital credentialing for all but the simplest practice would require some demonstrated competency, setting a natural floor on the system. That was probably a long, incoherent ramble. Whatever the case, I am scared. As a new trainee, I know that the system is flawed, but it does produce competent physicians in large part. If implemented, these new rules will hit hard. Because we've let ourselves become beggars at the foot of Congress, we are essentially powerless to stop them.
I do sincerely apologize for being absent for so long. After completing some difficult clerkships, I've basically been living out of town. You see, I'm currently on the interview trail. I'm looking for a surgical residency. The current system is a bizarre one, in which I pay huge sums of money to fly around the country and try to impress people. In return, they usually put together a pretty nice-looking package with which to impress me. A typical interview goes something like this.

1. The night-before event: All but one of my interviews has been associated with a night-before event, at which applicants meet the residents and possibly some of the program leadership. These events are highly variable. They range from being told to show up at a bar and buy your own drink to a formal reception complete with Chardonnay and filet mignon.

2. The interview day: You usually come to the hospital, where there will be some sort of introduction to the program. You will then attend some sort of academic conference (a way for the program to show off its academic credentials). After this usually comes some combination of talks, one-on-one interviews, and hospital tours. Then there is usually a lunch. Residents are usually invited to the lunch for one last chance to ask questions. Some programs also spice lunch up a bit. One program gave lunch against the backdrop of a talk from an eminent trauma surgeon. One program broadcast a live gastric bypass during lunch (probably the most amusing). There may or may not be an afternoon activity. There are some variations on this, but it really is pretty consistent.

In keeping with the major theme of this blog, I'll tie a little bit of practical economics into the discussion. This process is expensive, especially if you are applying to a competitive specialty. More competitive specialties require more interviews and are less likely to help you financially.
In other words, some of the less competitive programs will partially fund flights or pay for a hotel. This is often not the case at the more competitive programs. Thus, a plan is paramount in figuring out a way to go where you need to go. If you are applying to a less competitive specialty, a few interviews at dream programs, a couple of programs of even qualifications, and perhaps a safety school is all that is really necessary. I would recommend no fewer than 10 if you are applying to surgery or emergency medicine (15 is better). For Plastics, Ortho, Derm, etc... I would take every invitation that I received. There are two major expenses in the process. The first is the cost of travel. With gas down to under $2/gallon, driving is often a very viable option. If you've got a bit of time off and friends in multiple places with a couch, driving can easily make your trip cheaper. If you are staying in state, it really makes no sense to do anything else. I pulled off one interview in which I drove across the state and stayed at a friend's house for under $100. If you absolutely must fly, comparing flights is very important. Orbitz is sort of my favorite site, because it is pretty convenient to book flights and car reservations at the same time. I also find it to be a little more user-friendly than Priceline or Travelocity, though these are also viable options. If possible, non-traditional carriers such as Southwest or JetBlue can be an amazing alternative. Southwest has the best service of any airline in the country. It is not close. Southwest also sometimes has some amazing flight deals online at Southwest.com, known as "wanna get away" fares. I've actually flown across the country for $70, including fees, this way. It may or may not be more expensive to link your rental car reservation to your flight, so you should check both ways. It may also be cheaper to fly into a regional center and drive to a nearby program than to fly to a local airport.
As an example, flying into Detroit and renting a car in order to drive to Lansing or Ann Arbor is a lot cheaper than flying direct and finding local transportation. The second major expense is lodging. As previously alluded to, staying with friends or family can really cut down on costs. If this doesn't work, cheap hotels are a must. Many programs will recommend a hotel. With a couple of exceptions, this is usually NOT the best place to stay. Most cities outside of the biggest will have numerous hotels in the $50/night range, taxes and fees included. I made the mistake of staying at the recommended hotel once. Then I found the hotels.com deals to be excellent and never looked back. Other price-saving tips might include using only carry-on bags. Many airlines will charge fees to check baggage. If you are willing to sacrifice a bit of comfort, a suit can often be worn onto the plane, freeing up storage space. I took week-long trips with multiple one-way flights, doing no laundry, out of a carry-on roller bag using this method. Airport food is also expensive. That being said, you may find yourself stuck in a terminal for 3 hours waiting on a delayed flight. An overpriced hamburger or drink may not be too much of an expense at that point. For those in the early stages of medical school, I would recommend that you start putting money away early. I managed to underspend by a couple of thousand dollars each year, and that has basically funded my residency travel. Even though you'll be paying some extra interest, these loans are often deferrable, often qualify for a forbearance, and will definitely be included in any calculations of income-based repayment. This makes them VERY friendly compared to conventional loans. There is also an interview and relocation loan available for up to $25k. This is a private loan that requires good credit on the part of the borrower. It also comes with variable interest rates and no promise of future deferment.
In other words, it is not the best loan. It's also not easy to obtain in today's credit market. All in all, though, it is better than not doing enough interviews to match. It's about time for Thanksgiving dinner, so I'll be going. If anyone has any questions, feel free to post them and I'll do my best to answer. Happy Thanksgiving.
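As a postscript, the trip-cost comparisons above lend themselves to a quick back-of-the-envelope calculation. Here's a minimal sketch in Python; every number (mileage, mpg, fares, hotel rates) is a hypothetical stand-in, so plug in your own quotes:

```python
# Back-of-the-envelope comparison of driving vs. flying to an interview.
# All inputs are hypothetical placeholders, not real quotes.

def drive_cost(miles, mpg=30, gas_price=2.00, nights=1, lodging=0):
    """Round-trip driving cost; lodging=0 if you're crashing on a friend's couch."""
    return 2 * miles / mpg * gas_price + nights * lodging

def fly_cost(fare, fees=20, rental_per_day=40, days=2, nights=1, hotel=50):
    """Flight plus rental car and a cheap hotel."""
    return fare + fees + rental_per_day * days + nights * hotel

in_state = drive_cost(miles=250)                      # friend's couch, no lodging cost
cross_country = fly_cost(fare=70)                     # e.g. a "wanna get away" fare
print(f"In-state drive:      ${in_state:.0f}")
print(f"Cross-country fly:   ${cross_country:.0f}")
```

Even with a rental car and a $50 hotel thrown in, the cheap-fare trip comes in around a couple hundred dollars, while the in-state drive is well under $100; the point is just to run the numbers both ways before booking anything.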
It's official. The US Congress passed a $700 billion bailout package today, and it was later signed into law by President Bush. In one of the most interesting moves by government that I've ever seen, this thing actually failed earlier this week. I was a little shocked, though a little bit cynical. I suspected that it would lie dormant until after the election, after which it would be shuttled through in pieces with little fanfare. Instead, more than thirty congressmen changed their votes in less than a week, with the only changes being some random tax cuts and changes to the FDIC limits on accounts that will barely impact anyone (who the heck with >$100k in liquid cash keeps most of his money in a savings account?). Apparently two more days of talking about "greedy" people and a couple of woe-is-me moments on the stock market are enough to convince more than half of the representatives of the people to do something that is both stupid AND clearly against the will of most of the people. This isn't even a case of the people being stupid. In other news, Wachovia is attempting to ditch its still-incomplete deal with Citigroup in favor of a deal with Wells Fargo of San Francisco. The Citigroup deal required a large influx of capital from the FDIC (aka the taxpayer). The Wells Fargo deal gives Wachovia and its shareholders more money and more autonomy, saves everyone's deposits, AND requires ZERO government intervention. Wells Fargo, an entity that didn't invest heavily in sub-prime mortgages and is not going under, is currently in a market position to gain market share, largely due to more intelligent financial decision making. The FDIC is challenging the change, apparently ever eager to waste taxpayer money. First we blame the market for central bank failures, then we prevent the market from fixing the problems that we blame on it. Brilliant. I'm still not optimistic.
I'm sorry for the last couple of posts, which have very little to do with medicine. We're in the midst of one of the craziest reform periods in modern business, and I feel compelled to document what is wrong, what will go wrong, and why it will happen. As of right now, the Republican (supposedly conservative) plan to deal with the economic "crisis" is a $700 billion bailout package. This package, in perspective, amounts to more than $2,000 for every man, woman, and child in the US. It is more money than the entire cost of the war in Iraq. One might say that this would be a logical point for the Democrats to crack down on corporate welfare, which is one of the few places where I tend to agree with Democratic economic philosophy. Forget it. The Democratic plan seems to be, "OK, we'll give corporations $700 billion that we don't have, but only if we make even more fake money and give it to people who took out stupid loans over the last 5 years to buy houses that they couldn't afford." As I said at the end of my last post, I'm not optimistic. There are three highly irresponsible parties in this mess. Number one is the government, which created the housing bubble with fake money and cheap credit from the Federal Reserve. Number two is the individuals running the companies on Wall Street that used the fake money to make bad loans. Number three is the individuals who took the bad loans and used them to overpay for housing. Everyone agrees that this failure is largely secondary to the housing market failures. The solution seems to be to bail out parties number two and three with party number one, using money from people who are not responsible. Brilliant. However, this is becoming far more sinister. Rather than letting companies fail, the government is instead buying them with taxpayer money. At first, Fannie Mae and Freddie Mac were nationalized. Because they started as nationalized institutions that were later privatized, I wasn't sure what the effect would be.
However, we have now effectively nationalized AIG, and as part of the bailout plan, we may look at a significant amount of nationalization on Wall Street, primarily in the insurance and financial sectors. This puts the government in charge of the majority of home mortgages and the largest insurer in the country. As opposed to allowing a short-term shake-up in the market, with some firms going under, we are going to ensure that the government competes within the private sector, using taxpayer money to prop itself up. In other words, this is the end of anything resembling true free market capitalism in the financial markets. For those of you who think that this is a good thing, I will point out that the two periods with the greatest amount of new government intrusion into the market were the Great Depression and the Stagflation Seventies. Rescuing the market the first time took a world war. The second time took deregulation. What we are witnessing is an Atlas Shrugged-style takeover of the financial markets, with Henry Paulson playing the role of Wesley Mouch (a reference for my Objectivist readers). As the inevitable slide continues, we will continue to do things to "fix it" that will in fact promote future slides. The right thing to do is let the insolvent firms fail. This will lead to short-term chaos but long-term stability. It will also make people much less likely to engage in this sort of behavior in the future, because everyone who does it now expects a bailout (we seem never to disappoint). It will also prevent the government from destroying the future of the financial system by nationalizing companies and then competing with an unfair advantage against private firms (causing greater distortions in the market). As I said before, I'm not optimistic, and nothing in the last week has done anything to change my feelings on the matter.
You know, as much as I really detest both major political parties, I never thought that the first steps toward true nationalization of the economy would be undertaken by the Republicans.
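To put the headline number in perspective, the per-person arithmetic I used above works out roughly like this (the ~305 million figure for the 2008 US population is my own rough assumption):

```python
# Rough per-capita cost of the $700 billion bailout package.
bailout = 700e9          # the $700 billion package
population = 305e6       # rough 2008 US population (assumption)

per_person = bailout / population
print(f"About ${per_person:,.0f} for every man, woman, and child")
# prints "About $2,295 for every man, woman, and child"
```

Which is where the "more than $2,000 for every man, woman, and child" figure comes from; it lands just under $2,300 per person.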
Ok, Ok, so I haven't published in a long time. Between the kids and a surgical sub-internship, I've been a little bit preoccupied. However, I think that it's time for another post. This will lean a little more towards economics than healthcare, but I think that it's crucial at the current juncture to understand what is happening in the economy. Yesterday, the Dow Jones Industrial Average (DJIA) crashed down over 500 points, the worst loss since trading resumed after 9/11. Financial giants Lehman Bros. and Merrill Lynch essentially fell apart, with one going into bankruptcy and the other being bought out. AIG and Wachovia are still looking a little shaky. The government has actually seized control of Fannie Mae and Freddie Mac, the largest government takeover of companies ever. Gas spiked again, even in the face of plummeting commodity prices, especially oil. What's the deal?

The History

I think that many people fail to grasp the significance of what has just happened. If you put all of the above problems in with the current real estate crisis, you've got a recipe for financial disaster not seen since the late 1920s. I'm not saying that I think we're in for another Great Depression, though I don't think that anyone thought we were in for a great depression until the Great Depression occurred either. It is clear that something is terribly wrong. That something is a little bit complex, but I will do my best to elaborate on the problem. In the early part of the 1990s, as Bill Clinton was taking office and the DJIA lived below the 3000 mark, a couple of interesting things were happening. It was the beginning of a revolution in the way everyone did everything. Computers were becoming integrated into the fabric of day-to-day business. The internet was becoming available to private individuals, and communications were becoming cheaper than ever thought possible. We were also in a small recession.
The Fed reacted to this small recession by pumping liquidity into the financial markets (fancy jargon for lowering the rate at which banks can borrow money, or allowing banks to keep a smaller percentage of their total portfolios on hand so that more money can be lent out). This coincided with a progressive explosion in communications and then the rise of the dot-com boom (anyone remember Silicon Valley?). It was the beginning of a recipe for an economic boom. There was cheap money, new technology hitting the business world, and, after the failure of the Democratic Party to keep Congress after the first two years of Clinton's leadership, government gridlock with minimal regulation on the booming economy. Oil went down to close to $10/barrel. The DJIA soared over the decade, jumping from less than 3000 to well into the five-digit range. Small booms rippled through many industries. There were small spikes in real estate. Some of the biggest transformations outside of the technology sector, however, occurred in the financial sector. Banks, which had been deregulated in the 1980s, were now able to fuel the growing demand for start-up capital. They were aided by cheap money from the Federal Reserve, which allowed them to operate outside of the normal boundaries that contain risky practices in a true free market economy. That being said, a great part of the boom was the result of natural market forces reacting to unprecedented changes in efficiency afforded by changes in technology. Around the turn of the century, this boom hit its peak, and a small recession took place afterwards. This was nothing catastrophic. Central bank liquidity always fuels a boom-bust cycle in business, as markets have to adjust to the distortions imposed by newly printed money injected into a market with no economic foundation to support it. Then it happened: 9/11.
A couple of big explosions at the financial center of the US caused a number of big Wall Street players to become very concerned about the economic future of the country. When trading eventually resumed, there was a swift drop in virtually all indicators of the health of the US stock market. What to do? If you ask me, this is the turning point where things went bad. All damage done to the economy before this was correctable in a relatively painless manner. The market wasn't more distorted than the usual state of affairs. The average person was able to afford virtually everything he needed. However, Americans are not very patient people. We may have set up an economy in a hurricane zone before this point, but the next maneuver was when we finally started building on the sand. Alan Greenspan, as head of the Federal Reserve, announced drastic rate cuts, allowing banks to borrow money at incredibly cheap rates. This essentially meant that banks could get money at far below market rates and lend it out at a profit. The Fed also began to print money like the printing presses were gonna go out of style, reaching a rate of printing 8% of the entire US currency in circulation per year. This had two drastic effects on the market, and it was the sentinel moment in setting up the current conundrum. Effect number one was to distort the relative location of money within the economy. Financial institutions were receiving this money first, giving them greater relative wealth than other businesses (and individuals). The extra cash caused inflation, but not until it was spent by the big financial institutions and infused into the market. Contrary to popular belief, this phenomenon is what caused the "rich get richer while the poor get poorer," or the "squeezing the middle class," conflict. The big guy gets money for cheap or free and spends it at pre-inflation prices.
By the time that money gets to the middle class, it has to be spent at post-inflation prices, and the value of the assets owned by the middle class has been devalued by inflation. Thus, effect number one caused a maldistribution of overall capital into the financial sector while squeezing the average person. Effect number two was to put a huge amount of money into the market with zero justification. There was simply no logical place for the money to go. Interest rates were low, however (a side effect of too much money), so the borrowing commenced. People bought SUVs, personal watercraft, fancy vacations, etc... However, the more unpredictable effect was on the real estate markets across the country. The cheap money went into housing. At first, prices rose at a level proportional to the cheaper cost of borrowing, but it soon turned into a spiral. As prices rose, more money was pumped into the market to allow people to buy more houses. Speculative fervor took over, riding solely on the back of fake money. At the end of 2005, most people couldn't afford to buy housing in the major metros of the US, and some of the speculators were left holding a financial hot potato. Prices couldn't rise forever. In response, the Fed dropped interest rates again (yes, again). Inflation started to become a huge problem. Food was up, healthcare was way up, education was up, housing was up. However, as most Americans were now deeply in debt, the new fake money failed to have the same economic impact as previous infusions. The new money did find its way into the economy, though. In concert with irresponsible printing across the world, the new money chased commodity prices. Oil and foodstuffs went through the roof. However, the beleaguered economy didn't have the strength to support that boom long term, and we're already seeing it fizzle out. One more impact of rising real estate prices:
The government originally created Fannie Mae to provide mortgages to those who couldn't get them on the private market. This, of course, has had the obvious effects of raising real estate prices (increased demand) and encouraging more irresponsible lending. When the government decided to privatize Fannie, they decided that competition was good, so they created a fake company out of thin air to compete in the bad loan business. This company is called Freddie Mac. As the fake money piled on and loans got more exotic, Fannie and Freddie went from being involved in ~20% of real estate in the US to the majority. Whew.... That was long winded.

What Happens Now

The government takeover of Fannie Mae and Freddie Mac has now had the unintended (or perhaps intended) consequence of making the majority of mortgaged real estate holdings in the US indebted to the US government. It has also exposed the US taxpayer to losses sustained by these companies as they attempt to discharge all of the bad debts that they accumulated over the 3-5 years of real estate boom. This exposes every responsible investor to the excesses of the last 5 years. It also gives the Feds a huge amount of control over the private real estate market. Many of the crashing financial giants were also heavily exposed to sub-prime real estate. If you ask me, we should let them crash. The market is a mess, and they are at least in part responsible. However, we have seen a strange combination of letting them fail, industry bailouts, and bailouts of private firms by the Feds. The crashing of oil is simply a correction of the short-lived commodities bubble. We are really at a crossroads. The economy is distorted, but there is a lot of fundamental goodness in the US economy. We really do create and produce. There is a foundation from which to recover. However, we have five years of the distortion of capital towards financial companies and real estate holdings. Jobs have to be lost.
Companies have to disband, and the economy as a whole needs to rearrange itself. Over a few years, in a good capitalist system, that should happen. Let the recession occur, let unemployment go up for a couple of years, and let business slowly adapt to the reality of non-distorted market demands. It really is better in the long run. We don't need to waste manpower that could be driving the economy forward on building unnecessary houses in the Arizona desert. Removing the market distortions (by removing the infusions of fake money) will accomplish that. All of that being said, I'm not optimistic. The Fed is already talking about lowering interest rates again to "stimulate the economy." The average American seems to want a short-term bailout a whole lot more than a correction to a sound economy. The fake money might as well be green cocaine, and we are seriously addicted. It felt good at first, but now we just can't let it go. Neither of the current major-party candidates for US president opposed the takeover of Fannie or Freddie, and I suspect that both will promote economic policies that continue to let the Federal Government and the Federal Reserve meddle in the US economy. Many economists have looked back at the Great Depression, and even many of the more liberal members of this group have conceded that the policies of FDR did wonders to expand the length and breadth of the Depression. The New Deal took real money out of the economy and put fake money to work on projects that really did nothing to solve any problems that faced the country. The US economy has a firm foundation, but it is what we do next that will determine how this plays out. Will we have a bumpy ride for a couple of years, or will we bear-hug the economy with rules so restrictive that we love it right down into another depression? As they say, hope for the best, but prepare for the worst.
OK. I realized after some lengthy comments on my last post that I wasn't exactly clear in getting my point across. There were definitely some holes in my statements as I read them again. So this is the advanced version of why centralized healthcare in the US will fail:

1. Cost shifting- As was rightly pointed out to me, cost shifting doesn't bankrupt the economy in and of itself. It really is always less efficient, but there is much cost shifting in the modern US, and its mere existence hasn't destroyed the economy. However, what it doesn't do is lower the cost. People occasionally bring up administrative costs or duplicate tests. Compared to a government system (which largely shifts the costs to physicians as opposed to eliminating them), there is no reason to believe that these will change that dramatically. The problem with healthcare is simply that it costs too much. As healthcare approaches 20% of all US dollars, shifting the cost to "the rich" or subsidizing "the poor" will do little to stop it from running the rest of the economy over like a bulldozer. Giving the government the money first will simply mean that the money will spend more time out of the economy and then be spent inefficiently.

2. The US is not Europe- Europe is actually a conglomeration of numerous different types of universal healthcare systems. Here is the truth: most European countries ration care. Some services aren't provided, they don't have enough equipment, or people who meet certain criteria are excluded. This is how their systems stay afloat. This rationing is somehow seen as more moral because it is more "equitable," but I doubt that people who can't get lifesaving cancer drugs or wait long periods for imaging agree. More importantly, Americans wouldn't stand for it. We can't let 95-year-old, ventilator-dependent, advanced-dementia granny die after she becomes septic from one of a million bedsores. We're a long way from rationing.

3. Without rationing, prices escalate.
That's already what happens now. If we could spend $1 billion to keep someone alive for 1 extra minute, would it be worth it? Most would say no. The use of finite resources involves the constant weighing of cost and benefit. Our current system doesn't do that. We essentially expect everyone to be entitled to everything. Using other people's money, everyone wants everything done. This is actually bad for society. Using the example from number 2, a family that wants everything done for granny may have second thoughts if they were presented with the $5000/day bill. As an example, many people in my area are being priced out of housing by the cost of property taxes. Keeping granny on the ventilator is the equivalent of taking an entire year's worth of those taxes from a family every single day. Without rationing, prices will continue to climb. If someone else is always buying dinner, everyone's always at the steakhouse. This puts Subway out of business, and the low-cost options disappear in a sea of ever-rising steak prices, driven by the unlimited demand of people whose only personal stake in the rising prices is to get as much steak as possible before the system collapses. Every double-read film or "just in case" CT scan ordered to avoid a lawsuit adds to this misery.

4. Americans will not accept government rationing, and it will not be politically feasible for the government to ration. Americans can't afford healthcare now only because they expect everything. They will still expect it in any kind of universal system. We will thus see a system in which the government cuts payments, trying to spread the money over an ever-increasing sea of people. With cut payments will come a reduced supply of hospitals, doctors, technology, etc... There will be no "rationing," but the waits will grow. Physicians, who will largely be the victims in this system, will be blamed for caring about money (i.e., keeping the business open and making an actual profit).
The government will point the finger. Meanwhile, costs will continue to rise as everyone tries to become more (not less) expensive in order to claim a larger percentage of the money. This is sort of the same concept that always drives bureaucracies to grow.

5. If we are going to ration anyway, we might as well use the market, as the market at least promotes efficient resource utilization. Might some people be excluded? Yes. Will some people also be excluded in the universal system? Yes. People whose treatments are not covered exist all over the place. The government simply adds inefficiency. It will either ration less efficiently than the market or go bankrupt for lack of rationing. Our resources are finite. We can't give everyone everything. It doesn't matter whether we have a single-payer system, a socialized system, a subsidized system, a mixed system, etc... Without changing the entitlements and spending less money, we will go bankrupt. Centralizing the system will not fix the problem.
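The granny-on-the-ventilator arithmetic from point 3 can be made concrete. A minimal sketch, using the $5000/day figure from above and assuming a hypothetical $5000/year family property tax bill:

```python
# Worked version of the ventilator vs. property-tax comparison.
daily_icu_bill = 5000        # $/day, the figure used in the example above
annual_property_tax = 5000   # hypothetical yearly property tax for one family

annual_cost = daily_icu_bill * 365
print(f"One year of this care: ${annual_cost:,}")
print(f"...or the annual property taxes of {annual_cost // annual_property_tax} families")
```

At those numbers, a single day on the ventilator consumes one family's property taxes for an entire year, and a full year of care runs $1,825,000. The exact figures are illustrative; the point is that nobody weighing that bill personally would call the spending unlimited.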
This sounds like it should be a long topic, but in fact, it's very short. Every central system will fail because any sort of involuntary insurance is effectively a forced redistribution of wealth. Without eliminating the entitlements, it doesn't matter if there is government insurance, a choice of private plans, a hybrid system, a system that involves the Easter Bunny, etc... The system will eventually bankrupt itself, because it will progressively demand more and more money from the most productive members of society in order to pay for things that are largely the result of either poor life choices or inevitable old age. Insurance only works if everyone takes the risk voluntarily. Throwing oneself into a risk pool with people who are much higher risk than you is stupid. Forcing this stupid decision is the only purpose of central insurance.
"Malpractice Reform!" "Malpractice Reform!" If you listened to the AMA (something that I do my best not to do), you'd think that this was the holy grail of physician protection. As though all we had to do was implement this reform (along with shoring up those Medicare payments that they've supposedly been fixing for the last decade or so) and this country would become a medical utopia. We could practice without fear, pay for our malpractice protection at a reasonable price, and provide medical care to the patient at a reasonable price. Here are the facts. In states that have enacted this reform, there has been anywhere from a flatlining in insurance prices to real, but small, drops in price. While these reforms, which are primarily aimed at caps on non-economic damages, have some impact, they certainly do not change anything significantly. They don't stop defensive medicine or let physicians sleep at night. Part of the reason is that an extra $250k to $500k on top of already calculated economic damages is still a heck of a lot of money. Part of it is that reform doesn't change the underlying culture that produced the mess. You see, medical malpractice is part of a larger concept, which is professional malpractice. Professional malpractice is part of an even larger concept, which is consumer protection. The concept of consumer protection, which is certainly a hot topic now, barely existed a century ago. It is new. While its earliest implementations were of a relatively benign nature, it has become the beast that is poised to destroy the modern world. It was once assumed that the use of products or services came with risks. Many of these risks were inherent, no one really questioned them, and it was common sense that the choice to use a product or service was to take on the obvious risks. If I bought a horse in 1890, it was poorly behaved, and I proceeded to fall off and break my neck, my family had no concept that they should sue the previous owner for the horse's behavior.
Falling off of the horse is an inherent risk in riding a horse. No amount of protection, skill, equipment, etc... will ever make riding a horse 100% fall-proof. Early implementation of consumer protection occurred when licensing went from being a pure tax to being a tax AND a qualifying process. In the early part of the 20th century, a medical license could be had in a number of states for the price of $5, there was no real required or standardized training necessary to get one, and the purpose of the license was really for the state to collect $5. Licensing was one of the earliest implements of consumer protection. Early licensure rarely had anything to do with the state telling people how to do things. Early changes in medicine, law, architecture, engineering, etc... were really supposed to show that the people performing these tasks had actually studied their respective professions, not to tell them how to practice. This really goes back to the concept of a contract in common law and all throughout history. Free and competent adults could make determinations of risk and benefit and agree to essentially anything, as long as neither was coerced. In the past, to end up in court, one would have had to violate the agreement. Period. There were very few rules governing what the agreement could be. The same was largely true within licensed professions. The medical license implied that the doctor had studied medicine, but the contract for treatment thereafter was between the doctor and the patient. If there was an adverse outcome, very few people thought that it was the doctor's fault if he honored the contract. Early malpractice concepts were largely contract disputes. These might include removing a mass that the patient had never agreed to have removed or giving a therapy never agreed upon. The concept of a standard of care came later.
As the century progressed, the concept of consumer protection moved forward to include things that didn't work, then things that had unintended side effects, then things that did work but produced negative outcomes. All the while, the government got more and more involved in the business of telling people how to do things, and violating rules of the government became a secondary source of liability exposure on top of violating the actual contract. Here's a rough scale, broken down into 20-year increments (with some variation from region to region), of how liability impacted physicians over time. This is how one avoided the liability implemented for consumer protection:

1900: Physician is a person who enters an agreement to provide medical treatment and must provide the treatment within the agreement.

1920: Physician is a person who graduated from a Hopkins-style medical school in order to get a license and then enters an agreement to provide medical treatment and must provide the treatment within the agreement.

1940: Physician is a person who graduated from a Hopkins-style medical school and completed at least a year of medical internship and then enters an agreement to provide medical treatment and must provide the treatment within the agreement.

1960: Essentially the same as 1940, though early concepts of negligence due to failure to follow standards of care periodically impact physicians.

Late 1960s: MEDICARE

1980: Physician is a person who graduated from a Hopkins-style medical school, completed a medical internship, probably completed a residency, might have completed a fellowship, and is then obligated to provide care both in keeping with an agreement with the patient AND in concordance with the concept of "standard of care," which is not explicitly stated anywhere, varies between region and specialty, and is often proposed by someone making a lot of money from the side that brought the suit.
Non-economic damages are in full swing, so courts and lay juries attempt to attach dollar amounts to the value of having tea on the porch with one's now-deceased grandmother, or to the pain and suffering at the loss.

2000: Same as 1980, PLUS consumer protection now ALSO applies to government and third-party payers. Improper coding, documentation, use of procedural etiquette, etc... can result in civil liability as well as possible criminal liability.

One might say that this has run up the price a bit. It has, but it really mirrors what happened in other industries. Why do you think there are all of those ridiculous warnings on products? If one spilled coffee on himself in 1900, he was a klutz. Today, he is a millionaire. In 1900, no one thought that they needed a "hot when heated" warning. One can apply this concept similarly to the use of sleds as weapons, placing small objects in the mouths of infants, etc... The other major change is that in the past, the consumer would have been largely responsible for anything that did happen. Today, it's the producer. It goes something like this:

1900: Consumer buys product after inspecting. It doesn't work. Oh well.

1950: Consumer buys product after inspecting. It doesn't work. He may recover the money he spent plus possible attorneys' fees (if malicious intent is found). If it does work, though, and he breaks it or uses it improperly, he may not recover.

2000: Consumer buys product after inspecting. It doesn't work. He may recover money spent plus possible non-economic damages, plus attorneys' fees. He may also recover if it does work and he uses it improperly, if not warned. If I use my sled as a weapon and hurt someone, I may argue that I didn't know that the sled being used as a weapon instead of a sled might hurt someone. If I use the product correctly, and it works, but someone gets hurt, I may still win.
An example is a firearm manufacturer that produced a perfectly functioning pistol, working exactly as it was supposed to, losing a suit when a victim shot by the pistol sued the manufacturer, as opposed to the guy who SHOT HIM. With examples like this, it's no wonder that everything is out of control. You can't protect yourself when you are responsible for products and services that are made or done correctly but still produce poor outcomes. You can't agree anymore to have someone waive the right to sue for a poor outcome in a situation likely to produce one. This isn't just in medicine. It applies to anyone who produces anything. The current system ALWAYS punishes the producer over the consumer, whereas in the past, the concept was to put them on equal footing. It creates a system in which we progressively discourage production. There's no quicker way to eliminate all of the technological gains that are producing the very things that consumers are now "entitled" to in 100% perfect working order, all the time, with no errors or less-than-optimal endings. By punishing producers long enough, society will simply begin to implode. In this case, the physician is just another producer, and malpractice is simply another symptom of a culture in which the consumer expects perfect outcomes from every producer, with every product and service, 100% of the time. Change the culture and you fix malpractice. Reform does little.
While completing an outpatient clerkship, I recently had a strange realization about all that is wrong with medicine at my preceptor's office. A patient came in with a relatively simple abscess vs. cyst on the medial thigh. It was relatively superficial, not located near anything major, and my preceptor had extensive urgent care experience dealing with things just like this. In fact, I've done the I&D on similar lesions in medical school. So I asked, "Are you going to drain that thing?" with, of course, a glimmer of hope that I might be able to do it. He said no. This patient's insurance wouldn't pay him to do it. As he put it, "I don't work for free." He instead spent 20 minutes on the phone referring the patient to a general surgeon, getting approvals, etc. The patient's insurance was willing to pay a PCP bill, send the patient to a surgeon and pay that bill, and then have the patient return to both for more billing. What a bizarre system, in which we send the patient to two extra appointments, pay for both, and waste the time of a highly qualified practitioner on the phone, all for a simple procedure that the med student could have done and the doctor had done 1,000 times before. You wonder why we spend so much money. There must be no real competition in the local insurance industry, because there is no way such a stupid system could survive any real competition.
First of all, I apologize for being absent for so long. Life has a way of keeping you busy, and I've learned firsthand how busy a person can be. Anyway, enough about me. It is absolutely clear that in modern medicine, a great deal of what we do is of marginal utility. We can look at this both in terms of the utility of a treatment with respect to whatever its endpoint may be AND in terms of the expense of a treatment versus how much value that endpoint actually provides. An example of the first might be the use of expensive MRI imaging for every nebulous back, knee, shoulder, or neck pain. An example of the second might be $5,000/day ICU care for a demented 90-year-old with metastatic pancreatic cancer. In the first case, the cost is high, the yield is mostly low, and the data is often hard to interpret. In the second case, we can only hope to provide a limited number of days or weeks to the patient, with very little in the way of benefit even in the best-case scenario. Being mainstream doesn't exclude something from being marginal. History is full of marginal items becoming mainstream. As they improve, the world adapts to them, and they often cease being marginal. Before Henry Ford, the car was an exclusive oddity for wealthy people. Poor people walked, took horses, etc., as they had for millennia. One day, Mr. Ford had a vision of mass automobile production at a price that the workers building the cars could afford. In came the assembly line and the Model T Ford, and over the next couple of decades we went from a nation that walked and rode to a nation that drove. Newer cities sprang up in a world in which people could travel long distances with ease, and all across the Sun Belt, it is now hard to call a car marginal. It's almost impossible to get ahead without one. A good way to enter the mainstream is to cease being of marginal relative value. Television, the personal computer, and most recently the cell phone all fit into this category.
They all have one thing in common. When they were of marginal utility, they were expensive. They became inexpensive first, and THEN they became the standard. The majority of people can afford these things, and that is why they entered the mainstream. Beforehand, their unaffordability prevented them from becoming common, and the people who sold them HAD to find a way to make them affordable in order to sell them. The incentive is to drive the price DOWN. This is in stark contrast to medical care. In the past, medicine followed this model. Pre-1970s (read: pre-Medicare-age) medicine saw numerous technologies, from X-rays to penicillin, go from expensive oddities for the rich and well-connected to common and affordable in a few short years, following the same model that ALL OTHER TECHNOLOGY uses to become cheaper. Since the '70s, however, we've had an explosion of technologies of progressively more marginal utility at increasing expense. We went from X-rays to CTs to MRIs to PET scans and other nuclear scans. It seems quite clear that there is a progression in which each step adds progressively less than the previous step did, at an exponential increase in cost. Going from no imaging to having X-rays is far more important than going from CT to MRI. When Medicare entered the market in the '70s, all incentive to drive down costs began to diminish. As government payment systems took over a progressively larger proportion of payments, incentives were turned on their heads. The incentive now is to create MORE expense in order to claim a greater proportion of the pie. As is commonly understood, the incentive of a salesman is to claim he needs the least money for his product, while the incentive of a bureaucrat is to claim he needs the most. By allowing the rich access to newer technologies first, we allow those technologies to take hold. The existence of higher stores of wealth creates an incentive for people to create marginal technologies.
The natural progression is for these technologies to become progressively cheaper as people find a market in progressively poorer classes. Taking high-technology, expensive goods and making them affordable has been the success model for companies such as Walmart. Almost everything starts as the domain of the rich and in a couple of generations becomes a perceived necessity for the poor. We will use the CT and the personal computer as our examples. Since the early '80s, CTs have gained some resolution and gotten quicker, but they generally do the same thing. Over that time, their price has gone up significantly. In fact, I cannot pay, in inflation-adjusted dollars, the same amount for a CT scan today that I could have in the 1980s. The price is stagnant to significantly increased. The personal computer, on the other hand, was a dreadful box that took a long time to boot from a series of cards, could do little but word processing and simple calculations, and was hard to access. Over the same period, it has come to do what most supercomputers could do in earlier times, and at a much lower price. The $300 Dell cheapie is infinitely more powerful than the $5,000 home PC of 1980. Over that time, the government took over the bulk of healthcare costs, and various hospitals, radiology groups, etc. argued for the need for MORE money for CTs. They lobbied and they won. Meanwhile, the computer was sold in near free-market conditions. Newcomers such as Dell and Gateway entered the market and competed against the old-time Compaqs and IBMs. Prices went down. Over that same period of time, Microsoft and other software giants emerged and provided progressively cheaper options for software that did progressively more. You see, now most people can get a computer, and most people can get a CT scan. The computer is the right way to turn the marginal into the mainstream. A person goes, pays a price he can afford, and takes one. There is no red tape, no roadblocks, no obstacles.
The same person must beg his insurance company for a CT or go to the very expensive ER. He must often wait for days. He must pay progressively MORE for insurance to cover the same CT, and the CT costs more than it did 20 years ago. The legal scenario of the modern world makes the CT mainstream, because we ALL have to order it, but it has gotten no more affordable. This is the way to take something mainstream that will simultaneously BANKRUPT the system for the sake of remaining mainstream, as the treatment is often still marginal. This is not an issue with the PC, which entered the mainstream by no longer having a marginal cost-benefit. In summary, the MRI for nebulous pain symptoms is of marginal utility because of the cost. As costs go down, the cost-benefit improves, and its use becomes less marginal. The ICU stay is marginal because at $5,000/day it provides little benefit. It is the same problem. At $100, an MRI would be of much greater relative utility than an MRI at $1,000. However, no one will find a way to provide a $100 MRI as long as there is a non-market payer providing $1,000. There is NO incentive to drive the price down, and the mainstream imaging will have marginal value. There. It was written fast and rambles a little, but at least I'm posting again.
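To put rough numbers on that cost-benefit point, here's a quick back-of-the-envelope sketch. Only the $100 and $1,000 MRI prices come from the discussion above; the diagnostic yield figure is purely hypothetical, just to show how the same scan becomes less marginal as its price falls even when its clinical usefulness stays fixed:

```python
def cost_per_useful_finding(price_per_scan, diagnostic_yield):
    """Dollars spent per scan that actually changes management.

    diagnostic_yield: fraction of scans producing an actionable finding
    (a made-up number here, not a real clinical statistic).
    """
    return price_per_scan / diagnostic_yield

# Hypothetical: suppose 1 in 20 MRIs for nebulous pain changes management.
yield_fraction = 0.05

expensive = cost_per_useful_finding(1000, yield_fraction)  # $20,000 per useful scan
cheap = cost_per_useful_finding(100, yield_fraction)       # $2,000 per useful scan

print(expensive, cheap)
```

The yield never changed; only the price did, and the cost per actionable result fell tenfold. That is the sense in which a cheaper MRI is "less marginal" without being any better as a test.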
As the presidential primaries come into full swing, I figure that it's about time I relay a prediction. This isn't going to be about who is going to win or lose. It's all about what the winner, whoever he (or she) may be, will do. This is really more of a long-term prediction, as I doubt all but one or two candidates have any intention of doing it. This of course means that it will take time for it to happen. There is right now a raging debate about how much physicians are worth and which jobs should be the purview solely of physicians. The issues of scope and reimbursement seem to have intensified in the midst of a very related debate taking place in the halls of Washington regarding the role of government in the provision of healthcare. These are currently separate issues, but I predict that by the end of the next presidential term, they won't be. You see, when people debate these issues, the one thing that NEVER gets questioned is the fact that government has become ENTIRELY in charge of all of these points. Physicians, the market, patients, even the nursing unions and the hospitals that probably stand to gain the most from some sort of healthcare socialization still have no independent say in the way that things happen, outside of their influence on Washington. For the government to be able to take over healthcare, a number of things will need to happen. There is already a move (the Kennedy bill to restrict compounding pharmacies; the FDA attempting to standardize all supplements in North America in conjunction with healthcare giants such as Mexico) to standardize medical care. Cookbook medicine is becoming a reality, and it doesn't take a genius to notice that critical thinking is already becoming more rare in a culture where every new idea is a potential lawsuit and strict reimbursement standards make it progressively more difficult to actually get paid for doing anything that Medicare deems unnecessary. This standardization will continue.
The next logical conclusion is that it doesn't require MD training to administer cookbook medicine. In fact, that training is almost a liability in the current environment. People in high places understand that there will come a point where reimbursement cuts make becoming a physician financially unsound (though regardless of what many of us may think, this hasn't happened yet). The process is too arduous and expensive to continue within an environment of declining reimbursement. As it is, physician training in this country is often akin to giving everyone a Porsche to drive through a school zone. Many of us are entirely overtrained for what we do. The distinction of which training is necessary for what, however, can't be decided by a committee. Physicians have a nasty habit of going independent. They even opposed the creation of Medicare and were one of the first groups to adamantly oppose free government money in the 20th century. None of us is naive enough at this point to believe that this remotely approaches majority opinion among physicians today. However, that independent streak occasionally pokes out. In the US today, there are a number of competing groups pushing for independent practice whose history is much more amenable to being told what to do. They have a lot to gain by giving in to any new initiative proposed (most of which they never really opposed at all). Enter the midlevel. The nursing unions already call any member who opposes ANY piece of legislation extending healthcare entitlements immoral. That is a demographic that almost has to continually give power to the politicians in order to survive, thanks to a position held in large part by restricting training competition and keeping organized labor in an environment where physicians are not allowed to do so. Now, I'm going to stop here and point out that I am NOT opposed to midlevels.
I think that people should have the right to hire whomever they want to perform their services. If someone wants a midlevel PCP, that is their prerogative. However, the one thing that seems to be true is that in this current climate, the patient will have NO say. While I generally support midlevels, I generally oppose misrepresentation. I believe that there is a definite attempt by some (not most) to overrepresent the training of midlevels, and I believe that this will be just a little too good to pass up on the part of officials in Washington looking to buy votes without devaluing the currency much more than they already have. You see, the current system is sort of like everyone being forced to buy a BMW or go without driving. The prices are high, and they are rising. The government wants to promise everyone a BMW, but the price is just a little too high. I think that the plan is to eventually buy a bunch of Fords covered with BMW badges. No one is suggesting that we just have an open market in which some people have BMWs, some people have Fords, but almost everyone can afford to drive. So this is my prediction. Obama, Clinton, Huckabee, Romney, or Giuliani: everyone has the same agenda, though that agenda is admittedly presented in a different way by each of them. Healthcare access for all will inevitably become healthcare provided for free by the government, either for all or for those who can't afford it (which is an ever-growing piece of the population in the over-regulated healthcare environment). The hospital systems are becoming more monopolized and well connected. They will continue to get richer, because they don't really care about medicine itself. Physicians will either cave and become progressively more enslaved to the system, or they will be systematically removed from more and more of their responsibilities, regardless of what patients want.
The powerful nursing lobbies will continue to become more powerful, and the ARNP and the PA will take over a progressively larger role in healthcare. The ARNP will remain more powerful than the PA. The pay cuts will continue, and the solution will be cheaper labor, not paying people what they're worth. It will continue to become more difficult to practice independently. It will all take time, but I suspect the agenda (which seems fragmented now) will become a lot clearer over the next four years. Sorry for the negative vibes.