The conversation around rapidly rising healthcare costs took hold in the 1980s, after healthcare spending had grown significantly faster than the economy through the 1960s, 1970s, and 1980s. Even though the growth in healthcare spending has slowed over the past decade, it remains a serious problem for many US citizens.
A study conducted at Harvard University found that medical bills are the largest cause of personal bankruptcy in America. While many believe that medical bankruptcy spares the roughly 55% of Americans who are insured through their employers, this is simply not the case. According to a recent study from the Kaiser Family Foundation, the average premium contribution for workers with families has risen by 81% over the past 10 years, while out-of-pocket expenses have risen by 150%. This is partially explained by the fact that employers are paying more for healthcare coverage due to the rising cost of care.
To investigate why healthcare costs have been rising so rapidly, we interviewed Dr. Neel Shah, Founder and Executive Director of Costs of Care. In the interview, Dr. Shah explains the main factors contributing to the rising cost of care in the US and how education and technology can lead to a future in which patients receive care that is both effective and affordable.
What are the main factors that have contributed to the rising cost of care in the US?
There are a couple of different factors contributing to rising healthcare costs. The conventional wisdom is that about 70% of the overall rise in healthcare costs over the last half century has been due to the deployment of new technologies. For the most part these technologies have made people better off, but in some cases they only increase costs for patients who would likely receive a similar benefit from a less expensive treatment.
Think about the way we used to treat heart attacks, for instance. In 1965, when Dwight Eisenhower had a heart attack, he received the top-of-the-line treatment, which at the time amounted to a month of bed rest. It wasn't very expensive, but it didn't help much either. Over the following 50 years a variety of medications were developed to treat people susceptible to heart attacks, and they have been relatively successful. More recently, though, we've shifted from primarily treating heart attacks with medicine to treating them with surgical procedures. It used to be that we had to crack open the breastbone to perform major open-heart surgery; now, for the most part, we can thread catheters into the blood vessels and place stents instead.
While the advancement in the way we treat heart attacks is great, treatment now costs upwards of $10,000 more per patient than it did even fifteen years ago. On the back end, people are living about one year longer than they did fifteen years ago. Conventional wisdom tells us that on average, even though we are using more technology, people are better off. But this raises the question: do we need to spend that much more to achieve those results?
Take Moore’s Law, which tells us that the number of transistors on a chip doubles every two years, meaning the cost of computing should fall because each chip delivers twice the capability. Pretty much every other technology follows this logic, but healthcare is THE exception to Moore’s Law. Instead of costs falling by half, they’re doubling. In fact, if healthcare did follow Moore’s Law, then for the amount we spend those same heart attack patients should live to be well over 200.
As we deploy new technologies, we spend more on healthcare. Part of the reason is that we deploy new technologies before knowing how well they work. The way modern medicine works is that for any given condition we have a variety of treatment options and very little information about the value of each one. Combine this with a payment system in which, as long as a procedure isn’t complete quackery and isn’t going to hurt somebody, you can usually get it paid for.
An example of this is proton-beam therapy, which is used to treat many forms of cancer; you need a $150 million facility to treat someone with proton beams. Upfront costs aside, proton therapy is two to six times more expensive per treatment than the alternative form of radiation, which is equally effective. And yet the number of proton-beam facilities has doubled in just the last five years.
What I have described only scratches the surface of rising healthcare costs in the US, but I would say these are the main contributing factors.
Continued education is integral to clinician development and ultimately, the quality of patient care. What part does clinical education play in decreasing the cost of care?
I think clinical education plays a huge role in decreasing the cost of care. Part of the problem is that no doctor goes to medical school to treat the GDP, so there is a big disconnect. Doctors are smart: they understand that spending 17% of GDP on healthcare is bad and that $800 billion wasted is not good for anyone, but this doesn’t really inform how you take care of the patient. For the longest time we haven’t done a good job of reframing the ‘cost’ conversation within the doctor-patient relationship. This is starting to change, though, as patients are increasingly on high-deductible health plans, which means it hits their pocket when you make a decision that leads to high and unnecessary medical bills. As a result, patients are starting to come into doctors’ offices questioning how much things cost and whether they really need them. I think for the first time we’re starting to see healthcare costs, which used to be a really abstract thing, in a way that considers the patient in front of us.
Unfortunately, though, medical education has not caught up with the information we need to balance treatment and cost for our patients. Not only are we taught nothing about healthcare costs in medical school, but the way we are taught actually makes us terrible stewards of resources. Every conference and every case report in The New England Journal of Medicine is about conditions that are really rare, which means that as doctors we spend a lot of time hunting for rare things. When we get yelled at in medicine, especially as trainees, it’s always about the things we didn’t do but could have done; it’s never about the things we did do but didn’t have to.
The cost of care is not yet factored into our studies or our training. Medical education desperately needs to weigh cost considerations alongside treatment options, and medical educators need to be deliberate about teaching medical students which things they don’t have to do.
Do you believe that technology can aid in decreasing the cost of care? If so, which technologies do you believe have (or will have) the biggest impact in decreasing costs for patients?
Generally, there is huge room for technology that supports good decision-making. In pretty much every other industry, people have become increasingly reliant on decision support, and the same is true in healthcare. We’ve actually made patients a lot safer over the last 20 years by having computers catch potential errors or oversights in our treatments. For instance, if you try to order a medication that a patient is allergic to, the computer will alert you. There is a similar opportunity to look not just at the quality and patient-safety piece but at the whole value and cost picture as well. Abraham Verghese, the author of Cutting for Stone, has a great quote that ties into this: “if you’re ordering off the menu and it doesn’t have any prices on it, it’s really easy to get the filet mignon every time.”
This largely speaks to how doctors order right now. You can sit down at a computer, and if you want a $3,000 MRI you just click ‘MRI’. There is a huge opportunity here to embed not just explicit pricing information into our workflow, but good decision support as well, to make sure patients are receiving care that is both effective and cost-effective.