The True Cost Of End-Of-Life Medical Care

By Chuck Dinerstein, MD, MBA — Sep 28, 2018
It's a well-known fact that we spend a lot of money on the healthcare of those who are dying, especially during the last 12 months of life. But is that "wasted" money a bad thing?

Here is the dogma: 25% of Medicare’s annual spending is used by the 5% of patients in the last 12 months of their lives. If only I could determine who was going to die, I could, as a physician, offer less expensive alternatives; surely machine learning and AI can help. A study by the National Bureau of Economic Research (NBER), published in Science, reports that the dogma regarding Medicare spending is not entirely correct and that predicting death isn’t, well, very predictable.

The Dogma

The researchers made use of Medicare claims data for a 20% sample of beneficiaries who were alive on January 1, 2008, and had been enrolled in Medicare for at least the prior year. Sure enough, when you take those who died between January 1 and December 31, 2008, and look back at their spending over the last 12 months of life, the original finding is corroborated: 5% died, and they accounted for 21% of all the money – close enough. If instead we look only at what that 5% spent during calendar year 2008, they accounted for about 15% of all spending, significantly less than the classic “fact.”

The observant reader will note that this reduction is not mathematical chicanery; it arises because not all beneficiaries live through the entire year, so part of their last 12 months of spending falls in the prior calendar year. To be accurate, the dogma should read that we spend 15% of Medicare’s annual spending on beneficiaries who will die within that year. And while the discrepancy is significant, let’s not quibble; we spend a great deal on patients who are dying. But surely, with the advances in artificial intelligence, we can predict who is going to die within a year and perhaps offer some less expensive alternatives. Machine learning will provide the answer. Well, as it turns out, not really.
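The calendar-window effect can be seen with a toy calculation (the dollar figures and death date below are invented for illustration, not from the study):

```python
# Toy illustration of why "spending in the last 12 months of life" exceeds
# "calendar-year spending by those who die that year": the look-back window
# crosses into the prior calendar year.

MONTHLY_SPEND = 5_000  # assume uniform spending over the last 12 months of life

def last_12_months_spend():
    """Total spending attributed by the look-back method (full 12-month window)."""
    return 12 * MONTHLY_SPEND

def calendar_year_spend(months_alive_in_year):
    """Spending that actually falls inside the calendar year of death."""
    return months_alive_in_year * MONTHLY_SPEND

# A beneficiary who dies March 1, 2008 was alive for 2 months of 2008;
# 10 of their last 12 months of spending landed in 2007.
print(last_12_months_spend())   # 60000 counted by the look-back method
print(calendar_year_spend(2))   # 10000 counted within calendar year 2008
```

Summed across everyone who dies mid-year, the look-back method inevitably attributes more spending to decedents than a single calendar year does.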

Predicting Death

Predicting when a person might die is, of course, a function of when in the patient’s lifespan you opine. In the presence of agonal breathing, with the family gathered together (picture a Dickensian death), the accuracy of prediction is outstanding. But when we move that date further back, to a point where it might serve some practical value to a physician and patient, the accuracy in predicting death is pretty weak.

The researchers then used standard algorithms that calculate mortality based upon a patient’s other medical problems, their co-morbidities. Perhaps knowing the calculated chance of death would identify those we should single out for special care. But Medicare beneficiaries are a relatively healthy lot: 95% had a 1 in 4 or lower chance of dying. Even among patients with a 1 in 2 chance of dying, only 10% actually died; the other 90% beat the odds. To capture the 5% who die using our current predictors, I would have to offer special care to anyone with at least a 40% chance of dying.
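The base-rate problem lurking here can be sketched with standard Bayes arithmetic (my illustration of the general phenomenon, not the study's method; the sensitivity and specificity figures are hypothetical):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: of those flagged 'will die', the fraction who do."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Assume, hypothetically, a classifier with 90% sensitivity and 90% specificity
# applied to a population with 5% annual mortality (roughly Medicare's rate).
p = ppv(0.90, 0.90, 0.05)
print(round(p, 2))  # 0.32: about two-thirds of the patients flagged would survive the year
```

When the outcome is rare, even a seemingly accurate predictor flags far more survivors than decedents, which is exactly why any workable threshold has to dip so low.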

I can see the office visit now, “We calculate that there is a greater than 60% chance that you will survive the year, but new regulations require us to offer to reduce your care to keep Medicare from wasting money. Don’t forget your co-pay is due before you leave the office today.”

The researchers did ask the “natural question…whether these results would change if we had better predictions…” presumably from talking to physicians or using BIG DATA, the flawed electronic health records. Instead, they created an “’artificial’ oracle,” combining the mortality predictions based on patients’ co-morbidities with who actually died. This artificially increased the predictive quality of the measure to 96%, far better than any test or algorithm reported in the literature. The results were unchanged: patients with a 47% or higher chance of death accounted for the 5%; it was still a crapshoot.
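One way to picture such an “artificial oracle” is as a score that blends a model’s prediction with the realized outcome. The sketch below uses simulated data; the 5% mortality rate matches the article, but the blend weight and score distributions are my assumptions, not the paper’s construction:

```python
import random

def auc(scores, outcomes):
    """Rank-based AUC: probability a random decedent outscores a random
    survivor, counting ties as one half."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

random.seed(0)
n = 2000
died = [1 if random.random() < 0.05 else 0 for _ in range(n)]        # ~5% mortality
model = [random.betavariate(2 + 3 * y, 8) for y in died]             # modestly informative risk score
alpha = 0.3                                                          # hypothetical weight on the true outcome
oracle = [alpha * y + (1 - alpha) * s for y, s in zip(died, model)]  # blend in who actually died

# Peeking at the outcome makes the oracle far more predictive than the model alone.
print(auc(model, died) < auc(oracle, died))  # True
```

Even a score this artificially good leaves the study’s conclusion intact: high predicted risk still does not cleanly separate the very sick from the soon-to-be-dead.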

Where is the money going?

It turns out that end-of-life spending is so high because we spend a lot of money on the very sick, and the very sick are more likely to die – but we can’t tell the very sick from the soon-to-be-dead. The big spenders fall into three groups: those who are going to die, those with a significant medical event (e.g., diagnosed and treated for cancer), and those with a chronic illness that costs us every year.

How you pose a question changes the answer. Given our inability to predict who will live and who will die, talking about 25% of spending by that 5% is an interesting but useless fact – my kids would call it a factoid. Mahatma Gandhi said, “The true measure of any society can be found in how it treats its most vulnerable members.” We should be judged by how we care for our frail and dying, not by some average.


Source: Predictive Modeling of US Health Care Spending in Late Life, Science, DOI: 10.1126/science.aar5045


Chuck Dinerstein, MD, MBA

Director of Medicine

Dr. Charles Dinerstein, MD, MBA, FACS, is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
