Obamacare, aka the Affordable Care Act, became law six years ago. The intention was to ensure that nearly all Americans have health insurance, while controlling costs. How did that work out?
When the law was enacted, about 16 percent of Americans were uninsured. That has dropped to 10 percent. So instead of 50 million uninsured Americans, there are now about 30 million without insurance. That’s better, but hardly universal.
Health cost inflation slowed for a few years, probably because of the recession, but it’s now resuming its rapid growth. Six years ago, the United States spent $8,400 per person on health care, for a total of $2.6 trillion. Last year we spent $10,000 per person, or $3.2 trillion.
Obamacare has no doubt provided health insurance to people who would otherwise have been without it, mainly through an extension of Medicaid in most states, and the creation of shopping exchanges to enable individuals to buy insurance, often with government subsidies. But insurance no longer means what it once did. Businesses are capping their contributions to employees’ health benefits, premiums are increasing, and deductibles and copayments are soaring. There are reports that people who have gained insurance can’t use it because of high out-of-pocket costs. Insurance is becoming hollowed out, and we are learning that health insurance is not the same as health care.
The reason Obamacare is unable to expand access and coverage while containing costs is that it made only marginal changes to the underlying factors that make the American health system the most expensive in the world. There are two: First is the spectacularly inefficient private insurance industry, which thrives by refusing coverage for expensive medical conditions and generally denying claims. These companies’ profits, marketing, and other overhead expenditures are so high that when Obamacare restricted them to 20 percent of premiums, the industry regarded even that limit as draconian. Compare their costs to Medicare’s overhead of about 2 percent.
The second, and perhaps greater, underlying problem is the perverse incentives of providers to perform as many highly reimbursed tests and procedures as possible. These providers include hospitals, whether technically nonprofit or not, for-profit outpatient facilities, such as imaging and dialysis centers, and even specialists whose income is proportional to the high-tech procedures they perform. Other advanced countries spend on average less than half as much per capita on health care as we do, provide truly universal care, and get generally better results, because they have either a single-payer financing system or tightly regulated multiple payers, plus a largely nonprofit provider system.
When Bernie Sanders called for “Medicare for All” to replace Obamacare, he was met with objections that it would be too expensive. But that is because of a confusion between government expenditures for health care and total expenditures, which also include employer contributions and individuals’ out-of-pocket costs. Government officials and political candidates usually focus on government costs, particularly Medicare and Medicaid. It would be possible to increase government expenditures for health care, but offset that by eliminating premiums, reducing out-of-pocket costs, and freeing employers from the burden of providing health benefits.
The government now pays roughly 65 percent of health costs (including Medicare, Medicaid, government employees, and employer tax deductions). Medicare for All, according to an analysis just published in the American Journal of Public Health, would require that figure to rise to about 80 percent. But these costs would be almost totally repaid by the savings in premiums, deductibles, and other out-of-pocket costs. Moreover, with time, the greater efficiency of Medicare for All would slow health cost inflation. We could gradually adopt Medicare for All by lowering the qualifying age one decade at a time to reduce the disruption.
Hillary Clinton recently called for a public option in Obamacare (an idea scuttled in 2009) that would permit people in their 50s and early 60s to choose either Medicare or private insurance. The problem with that proposal is that private insurance companies would woo the healthiest people in that age group, and leave the sickest to Medicare. Medicare would then be subsidizing the for-profit insurance industry, and there would be little or no savings. It is much more efficient for everyone in an age group to be enrolled in Medicare, so there couldn’t be that kind of “cherry picking.”
But Medicare as it now stands is not perfect. Although it is a single-payer system within our larger market-based system, it uses the same profit-seeking providers, and its out-of-pocket costs are also growing. And it doesn’t cover everything – for example, long-term care. If the United States extended Medicare to the entire population, it would make sense also to convert to a largely nonprofit provider system. Reducing the perverse role of profit-seeking among providers, with its propensity for over-diagnosis and over-treatment, would yield much greater savings. We could then expand the Medicare benefit package and get rid of out-of-pocket costs altogether.
By eliminating the two drivers of health cost inflation in the United States – private insurers and a profit-oriented provider system – we would bring the United States into line with the rest of the advanced world. It will be argued that this idea is “politically unrealistic,” but that hardly justifies not even trying, or imagining that anything else will work. The first step is to tell it like it is.