Sunday, July 19, 2020

MFA




       Over the next several months we’re going to see a renewal of the health care debate. It will undoubtedly be waged between folks who genuinely care that too many of us have neither insurance nor reasonable access to preventive medicine, on one hand, and, on the other, those who have coverage but want to make the rest of us believe it’s too expensive to provide for all Americans. The second group are liars. I have written elsewhere, several times, about the real math related to cost. For simplicity’s sake, let’s agree to call a universal health care system Medicare For All (MFA).

        To briefly recap the cost issue associated with MFA:

       Those who oppose it rely on two grossly incorrect assumptions. The first is the claim that MFA would cost $10,000 per individual annually. That figure is not accurate, and it ain't real close!




A reasonable person examining the underlying spending data will note that only about 1/8 of the population spends more than the $10,000 figure, and the rest spend far less. The other, very persuasive factor here is that those figures include the current abusive drug pricing which, apparently, only we in the USA are “privileged” to pay. Calculating the weighted average, the actual national average spending per patient in the USA is $4,299. Put another way, the $10 grand figure is a lie.
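
Where does a weighted average like that come from? Here is a minimal sketch of the arithmetic in Python. The brackets and population shares below are hypothetical placeholders for illustration only, not the actual figures from the spending data, so the output will not match the $4,299 result exactly:

    # Weighted-average spending: sum over brackets of
    # (share of population) * (average spending in that bracket).
    # NOTE: these shares and dollar amounts are hypothetical placeholders,
    # NOT the actual distribution discussed above.
    brackets = [
        (0.500, 1_000),   # placeholder: half the population, low spenders
        (0.375, 5_000),   # placeholder: middle tier
        (0.125, 20_000),  # roughly 1/8 spend well over $10,000
    ]

    average = sum(share * spending for share, spending in brackets)
    print(f"Weighted average: ${average:,.0f}")  # $4,875 with these placeholders

Plug in the real distribution and the same one-liner produces the $4,299 national average.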

The second lie is even more egregious. 

Building on the $10K figure (remember, it’s really about $4,300), the assertion is made that all of it would have to be paid with additional taxes. At first blush, there is the temptation to nod sagely, as if that statement were not blatantly false, which it is. I know, I know: why?
        Well Jethro, that claim discounts or ignores all the health care spending that already exists. On average (2017, the latest data available; it’s surely a bit higher in 2019), the US employer contribution is $5,655 per person (the expenditure is more for family policies), the employee contribution for single coverage is $1,241 (family coverage is usually MUCH higher), and the employee also pays an average of $1,537 in Medicare deductions from salary.

        The crux of all this is that, for the average US worker, a total of $8,433 annually, never seen in a pay envelope except as a deduction, already goes to healthcare spending! Using the best estimate of the number of employees under those circumstances (155,761,000 in 2018), over $1.3 trillion is already going to health care spending. Just as a math exercise, that means that right now (and these numbers exclude some high earners in businesses not considered industrial), between employer and employee contributions and payroll deductions, about $4,000 for every individual in the country is sent to either private insurers or Medicare annually. The anti-MFA folks discount all this to scare the rest of us.

       Let me put it another way. Assuming annual employer and employee health care contributions simply remained as they are, we are only about $300 per person short of the real annual cost of universal MFA. Add to that some considerations I’m going to list shortly, and it’s not even that.
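
For anyone who wants to check the arithmetic, here is a minimal sketch using the per-worker figures quoted above. The 2018 US population of roughly 327 million is my assumption, since the post doesn’t state it:

    # Verify the "already being spent" arithmetic from the figures above.
    employer_contribution = 5655   # avg employer contribution per person (2017)
    employee_contribution = 1241   # avg employee contribution, single coverage
    medicare_payroll_tax  = 1537   # avg employee Medicare deduction

    per_worker = employer_contribution + employee_contribution + medicare_payroll_tax
    print(f"Per worker:  ${per_worker:,}")                    # $8,433

    workers = 155_761_000                                     # employees, 2018
    total = per_worker * workers
    print(f"Total:       ${total / 1e12:.2f} trillion")       # ~$1.31 trillion

    population = 327_000_000   # ASSUMPTION: approximate 2018 US population
    per_capita = total / population
    print(f"Per capita:  ${per_capita:,.0f}")                 # ~$4,000

    actual_cost = 4299  # the weighted-average per-patient figure from earlier
    print(f"Shortfall:   ${actual_cost - round(per_capita):,}")  # ~$300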

The second part of this requires some historical perspective:

        Once upon a time, health care was a cash (or, in more urban areas, at times a charity) affair, and frequently being poor equated to dying an early death from what today is easily treatable (diabetes, heart disease, etc.). The first health care coverage resembling the modern model was Texan in origin and provided, primarily, coverage for hospitalization only. This was, to a great extent, because as technology (anesthesia, X-rays, etc.) blossomed, so did the costs of procuring and using it. The originator of recognizable group health insurance was a Baptist charity hospital, but within a decade the model had spread across the country. Three million people had signed up by 1939, and the concept was by then known as Blue Cross Plans.

       The goal, as incepted, was not to make money, but to protect patient savings and keep hospitals — and the charitable religious groups that funded them — afloat. Blue Cross Plans were then not-for-profit!

        Even so, prior to World War II (when most treatments were still relatively unsophisticated and cheap), relatively few Americans had health insurance. The invention of effective ventilators enabled a widespread expansion of surgery suites and intensive care units, as wartime “in extremis” medical procedures had broken new ground in corrective and lifesaving surgery.

       Massachusetts General Hospital started the first anesthesia department in the United States in 1936. The first intensive care unit equipped with ventilators opened during a polio epidemic in Copenhagen in 1952. That meant more people could be saved, including soldiers injured during the war and victims of polio outbreaks. Five dollars a day and a 21-day maximum stay (a Blue Cross original provision) were not remotely close to adequate. Insurance with a capital I was increasingly needed.

         A private industry selling direct to customers could have filled the need — as it has for auto and life insurance. But a quirk of WWII history and well-intentioned federal policy helped firm up the concept of employer-based health insurance in the United States. When the National War Labor Board froze salaries during and immediately after World War II, companies facing severe labor shortages discovered that they could attract workers by offering health insurance instead. To encourage this trend, the federal government ruled that money paid for employees’ health benefits would not be taxed. This strategy was a win-win in the short term, but in the long term has had some very losing implications.

        Essentially all employer-provided policies covered only major medical, meaning they paid for extensive care but not routine doctor visits and the like. The original purpose of health insurance was to ease the financial disasters brought about by a serious illness, such as loss of home or job, not to make health care cheap. Our current concepts of what insurance should do have grown.

       Between 1940 and 1955 (pre-Medicare/Medicaid), the number of Americans with health insurance skyrocketed from 10 percent to over 60 percent! Blue Cross/Blue Shield became seen as a force for good across America. Non-profit according to their charters, the “Blues” accepted everyone who sought to sign up; all members were charged the same rates, no matter how old or how sick. Blue Cross Blue Shield had become one of the most trusted brands in postwar America. It was not to last.

        At a time when medicine had more of value to offer, tens of millions of people were interested in gaining access and expected their employers to provide insurance so they could do so. For-profit insurance companies moved in, unhindered by the Blues’ stated charitable (non-profit) mission. They accepted only younger, healthier patients on whom, at least statistically, they could make a buck. They charged different rates, depending on factors like age, as they did with life insurance. Many simply refused people with certain preexisting conditions. And they produced different types of policies, for different amounts of money, which provided different levels of protection. Think deductibles, copays, drug plans, etc.

       Watching these “new guys” pass them by woke up the “Blues,” still seen (by brand name at least) as the good guys in health insurance. Many of the Blues seized the business opportunity, with Blue Cross and Blue Shield of California particularly aggressive, encircling and absorbing fellow Blues in a dozen other states. Renamed WellPoint, it is now the biggest of the for-profit companies descended from the original nonprofit Blue Cross Blue Shield Association; today it is the second-largest insurer in the United States. Most of its plans still operate under the name Anthem BlueCross BlueShield, but “non-profit” is merely a distant memory. I know this has been long, but it’s important to know how this paradigm shift occurred, even if no one else was in the room where it happened. (Sorry, Lin-Manuel!)

        Like essentially every entity in the health care industry, especially Big Pharma, insurers spend a lot of money on lobbying and supporting business-friendly candidates, attempting to ensure that their cash cow doesn’t go dry by government fiat. Today, when any elected official orates against MFA, it’s a safe bet that he’s using data provided by the for-profit insurance industry. Of course, high executive salaries, lobbying expenses, and media ads are expensive, and premiums reflect it.

        A statistic which drives the health insurance industry is the “medical loss ratio”: the share of each premium dollar spent on actual medical care. In 1993 (just 27 years ago), before the Blues ditched client welfare in favor of profit, insurers spent 95 cents out of every dollar of premiums on medical care, a medical loss ratio of 95 percent. To increase profits, all insurers, regardless of their tax status, have been spending less on care in recent years and more on activities like marketing, lobbying, administration, and the paying out of dividends. The average medical loss ratio is now closer to 80 percent.

        Some of the Blues were spending far less than that even in 2010. The medical loss ratio at the Texas Blues, where the whole concept of health insurance started, was just 64.4(!!) percent in 2010. Understand that number: 35.6% of all premium income for what was once a non-profit now goes to costs unrelated to client welfare (you know, profit!).
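
To make the ratio concrete, here is the arithmetic on a single premium dollar, using the two figures quoted above:

    # Medical loss ratio (MLR): dollars paid out for care per premium dollar.
    # Whatever isn't paid out for care goes to overhead and profit.
    for label, care in [("Industry, 1993", 0.95), ("Texas Blues, 2010", 0.644)]:
        overhead = 1.0 - care
        print(f"{label}: MLR {care:.1%}, overhead/profit {overhead:.1%}")
    # Industry, 1993: MLR 95.0%, overhead/profit 5.0%
    # Texas Blues, 2010: MLR 64.4%, overhead/profit 35.6%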

       Now for some comparisons: as the numbers above suggest, administrative cost across the private health insurance industry (profit plus the actual cost of managing the program) runs at roughly 20 to 35 percent of premiums; Medicare’s admin cost is about 2%. The figure used when scaremongering about “MFA” is the private admin cost figure.

       Now for apples to apples. Consider the cost of providing health care as a percentage of Gross Domestic Product (GDP):
    
The US figure of 16.9% is the portion of GDP spent on healthcare, ergo not available for other things like debt reduction. The UK, however, spends just 9.8%, and even Switzerland, which mandates private insurance for every citizen, spends just 12.2%. I know: so what does this mean to me?

       In simple math, it means that if the US had a health care system which ran as efficiently as the UK National Health Service, in 2019 we would have saved about $1.5 trillion. To state it in the simplest terms, we would have had an almost $500 billion surplus instead of a $1.06 trillion deficit! For comparison, that’s roughly twice the Defense budget for the year, and about $200 billion more than current Medicare/Medicaid spending.
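
Here is the back-of-the-envelope version of that claim. The 2019 US GDP figure of roughly $21.4 trillion is my assumption (the post doesn’t state it); the percentages are the ones quoted above:

    # Savings if US health spending fell from 16.9% of GDP to the UK's 9.8%.
    gdp_2019 = 21.4e12        # ASSUMPTION: approximate 2019 US GDP in dollars

    us_share, uk_share = 0.169, 0.098
    savings = (us_share - uk_share) * gdp_2019
    print(f"Annual savings:  ${savings / 1e12:.2f} trillion")  # ~$1.52 trillion

    deficit_2019 = 1.06e12    # the federal deficit cited above
    balance = savings - deficit_2019
    print(f"Implied balance: ${balance / 1e9:,.0f} billion surplus")  # ~$460B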

        And before you even think of telling me a UK “horror story,” look at OECD or Kaiser health care consumer satisfaction surveys. (Yeah, we’re close to last there, too.) In many US cities, wait times for elective procedures are as long as or longer than in the UK, and don’t even think about specialists. Data from other nations show that universal coverage does not necessarily result in substantially longer wait times; in fact, there are a variety of circumstances in which the United States’ peer nations have shorter waits. The longest waits were in Boston, where patients wait an average of 72 days to see a dermatologist and 66 days to see a family doctor!

        Does anyone in government have the guts to incept universal health care? I doubt it, or if so, only over a long time. But just negotiating drug costs for Medicare/Medicaid would save, conservatively speaking, about $125 billion. Again, just for comparison: negotiating Medicare/Medicaid drug costs down to levels comparable to what private plans now pay would save slightly more than the current Veterans Affairs and federal welfare spending budgets combined!

And I do believe that’s all I have to say about that!
