Major Malpractice

Until the 1970s, America had a stoic, if not lackadaisical, attitude toward health. During the 1960s, half of those called up for military service were rejected because of physical, mental, or moral unfitness. Heart disease was the leading cause of death; cancer was the second, yet half the country smoked. President Kennedy believed that America’s flabbiness was a national security threat. To make matters worse, America spent very little on healthcare — a Spartan-like 5% of GDP in 1960.

This sorry state of unfitness set the stage for a 60-year spending frenzy to make America healthy. The War on Cancer began in 1971. The National Institutes of Health (NIH) saw its budget soar (in inflation-adjusted dollars) from less than $5 billion in 1962 to more than $40 billion today. The government’s anti-tobacco campaign — relentlessly increasing taxes, smoking bans, and anti-smoking propaganda (especially the 1993 EPA report claiming that environmental tobacco smoke killed as many as 3,000 nonsmokers annually) — drove smoking down to less than 15% of the population, and exposure to secondhand smoke to near zero. Medicare spending, a meager $7.5 billion in 1970, grew to a staggering $839.3 billion in 2021. Total US healthcare spending has grown to an obsessive 20% of GDP today.

The result: America’s health has deteriorated, as has its vaunted medical system. Instead of smoking, Americans have simply switched to other bad habits, most notably overeating, drug use, and narcissism. Obesity has soared from 13.4% of the population in 1960 to 42.5% today — despite health-conscious Americans following an endless stream of new and improved diets, purchasing an endless stream of new and improved fitness equipment, and joining gyms and health clubs, where they follow an endless stream of new and improved exercise routines.

According to a recent CDC report, half of the country is now on prescription drugs, with 1.2 billion pills prescribed annually — mostly painkillers, cholesterol drugs, and antidepressants. In 1960, we spent $2.7 billion on prescription drugs; in 2020, we spent $348.4 billion. For emotional health, blockbuster drugs such as Librium, Valium (“Mother’s Little Helper”), and Prozac were developed to treat excessive worry, anxiety, tension, irritability, restlessness, sleep disturbance, and melancholy — psychoneurotic disorders known to previous generations as the vicissitudes of life. For pain, opioids (licit and illicit) became the new aspirin. For recreation, non-health-conscious Americans began taking an endless stream of illegal drugs (such as marijuana, cocaine, heroin, methamphetamine, and fentanyl). Today, annual overdose deaths, which ranged from 5,000 to 6,000 in the 1960s, have reached almost 108,000.

The era of abundance and leisure that fell upon the baby boomers of post-war America was met not with gratitude but with self-admiration. By 1976, when Tom Wolfe wrote “The ‘Me’ Decade,” narcissism was already well underway as a cultural force. In his 1979 book, “The Culture of Narcissism,” Christopher Lasch noted the psychological patterns of pathological narcissism that originate in the American family: “the fascination with fame and celebrity, the fear of competition, the inability to suspend disbelief, the shallowness and transitory quality of personal relations, the horror of death.” A study of American teenagers “found that while only 12 percent of those aged 14–16 in the early 1950s agreed with the statement ‘I am an important person,’ 77 percent of boys and more than 80 percent of girls of the same cohort by 1989 agreed with it.”

There is no question that America’s gluttons, drug addicts, and selfhood seekers constitute an extraordinary driver of healthcare demand. But most of the problem lies on the supply side. The American medical system has fallen precipitously from its 1960s perch as the world’s best. It is home to the world’s best doctors and medical researchers, but also to the world’s highest medical costs. This is primarily because our medical system is devoid of competitive forces that reward performance, efficiency, and innovation. Such forces have been meticulously removed from the healthcare industry by the AMA, hospitals, insurance companies, and politicians. They have been replaced by an immense, stumbling juggernaut of rules and regulations, organized around a third-party payment system in which the buyer (i.e., the patient) has no influence on price. Price is set, behind closed doors, by a cabal of hospital administrators and insurance company executives.

The Affordable Care Act (aka Obamacare), a 906-page legislative monstrosity that was supposed to bend the healthcare cost curve down, has done the opposite. The cost of health insurance, for example, rose 28.2% in the past year. And doctors, once guided by their Hippocratic Oath, are now coerced by its 19,368 pages (as of 2016) of subsidiary rules and regulations — for the most part written by and for lawyers and accountants. Today, one-third of the cost of healthcare goes to administration. With his bill paid by an insurance company or the government, the patient doesn’t care about the exorbitant cost. Nor, apparently, does he care about the shoddy performance of medical professionals and researchers.

In celebrating the War on Cancer’s 50th anniversary, the journal Nature revealed that “The ‘war on cancer’ isn’t yet won.” Despite the advances cited, “Today, scientists rarely talk of a broad cure for cancer.” That is, medical researchers have found no cures and cancer remains the second leading cause of death. They just need more time and money: “The combination of technological advances with continued collaboration between basic and clinical researchers could sustain the momentum generated by the act into the next 50 years.” This is science theater, not science.

Cancer is caused by mutations. Two-thirds of the time it is caused by random mutations that occur during DNA replication — aka bad luck. Only 29% of the time is it induced by environmental factors, and a mere 5% by genetic inheritance. The longer you live, the more likely you are to encounter a cancer-causing mutation. Further, without a cure, measures to prevent cancer will only postpone the inevitable. Unless you are taken by some other cause of death, chances are you will eventually be taken by DNA replication errors. Ironically, the most likely other cause is a preventable medical error.

It has been estimated that as many as 400,000 people now die annually from preventable medical errors committed in hospitals. The existence of such a large number of errors has been known for decades, as has their relentless, unobstructed increase (e.g., from 98,000 in 1999 to 180,000 in 2013). Today, if it were listed in national vital statistics reports, inpatient medical error would be the third leading cause of death — to say nothing of nonlethal, but serious, iatrogenic harm. Heart disease kills almost 700,000 per year; cancer kills about 600,000. However, examination of malpractice lawsuits shows that the number of cases involving death and major injury is about the same for outpatient care as for inpatient care. Thus, when inpatient and outpatient cases are combined, the annual death count could be as high as 800,000, which would make medical error the leading cause of death in the US.

The American Medical Association (AMA) appears to be as oblivious to medical errors as it is indifferent to obesity, drug addiction, and narcissism. Judging by the topics addressed at its 2022 Interim Meeting, it worries more about health sector decarbonization, equity, gender identity, and daylight saving time. Disturbed, perhaps, by the record-breaking 19,384 gun murders of 2020, it plans to “work with key stakeholders” to find “common ground, nonpartisan measures to mitigate the effects of firearms in our firearm injury public health crisis.” The AMA’s time would be better spent working with the stakeholders of the 800,000 patients whom its members accidentally kill each year, to mitigate the effects of the medical error public health crisis. A common-ground, nonpartisan measure to reduce medical error deaths to a number less than, say, 19,384 would be a good start.

And the NIH appears to be in no rush to cure cancer. The outfit is run by academic elites who have risen to power through political posturing and the control of scientific research grants. They are the type of individual President Dwight Eisenhower warned against in his farewell address, when he admonished, “The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present — and is gravely to be regarded.” Under this arrangement, said Eisenhower, “a government contract becomes virtually a substitute for intellectual curiosity.” With government agencies such as the National Cancer Institute (NCI) funding cancer research, the paycheck seems to have trumped curiosity, and the political posturing of its elites has trampled the urgency to discover a cure.

Indeed, in its overview and mission statement, the NCI doesn’t even mention cancer cures, blithely stating that it “leads, conducts, and supports cancer research across the nation to advance scientific knowledge and help all people live longer, healthier lives.” Instead of touting its plan to find cures, it boasts of its commitment “to the core values of equity, diversity, and inclusion that allow all staff to reach their potential and fully contribute to the institute’s cancer mission.” When Dr. Francis Collins retired as NIH director, Nature examined the challenges that would face his replacement in running the world’s largest biomedical research agency. It found three: COVID, racism, and China. It also urged that the new director be female, because “it’s time for a change.” At the NIH, the challenges of racism and feminism appear to be greater than those of cancer, Alzheimer’s, Parkinson’s, ALS, AIDS, and diabetes.

Britannica’s “Medicine in the 20th Century” provides a chronological discussion of the most significant medical advances of that century. It begins with the birth of the chemotherapeutic era in 1910, followed by the numerous advances that occurred between 1928, when Alexander Fleming discovered penicillin, and 1954, when Jonas Salk developed the polio vaccine. These advances led to the conquest of diseases such as typhoid, tetanus, diphtheria, tuberculosis, malaria, leprosy, pernicious anemia, smallpox, and yellow fever. The article concludes with organ transplantation, particularly the first successful heart transplant, performed by Christiaan Barnard in 1967. Evidently, no advances of comparable magnitude have been achieved since then — that is, during the entire period in which the NIH’s annual budget exploded to $40 billion.

In recent years the US medical system has been scurrying away from competence and meritocracy toward mediocrity and wokeness. In her Wall Street Journal commentary “Woke Medical Organizations Are Hazardous to Your Health,” Heather Mac Donald laments the decline in both scientific progress and the quality of medical care that will result from the ongoing shift of funding “from basic science to political projects aimed at ‘dismantling white supremacy’ in medicine.” Says Mac Donald, “Virtually every major medical organization — from the American Medical Association and the American Association of Medical Colleges [AAMC] to the American Association of Pediatrics — has embraced the idea that medicine is shot through with racism and inequity.” As examples: the AMA is hurtling toward the rejection of meritocracy; the AAMC has recently developed standards under which medical students must “master the concepts of oppression, White privilege, social risk factors, ‘race as a social construct,’ colonialism and intersectionality”; and pediatric gender-affirming surgery has become a big moneymaker for hospitals, with top surgeries (the removal of breasts from adolescent girls or the implantation of breasts in adolescent boys) increasing by 389% from 2016 to 2019.

How far can the American healthcare system fall? Its decline during the past 60 years has hardly been noticed. In the absence of accountability, it will only continue. The elites in charge have not been fired or forced to resign; they haven’t even apologized for their incompetence. Instead, they have been rewarded and become fixtures of mediocrity in an ossified bureaucracy. Francis Collins presided over the NIH for 12 years, failing to find a cure for any kind of cancer. Anthony Fauci presided over the National Institute of Allergy and Infectious Diseases (NIAID) for 38 years, failing to find a cure for AIDS or even a cure for misinformation about it, much of it generated by his organization. Both were promoted, becoming the architects of America’s failed COVID-19 pandemic lockdown response, in which they used censorship to conceal their ineptitude. In “The Collins and Fauci Attack on Traditional Public Health,” professors Jayanta Bhattacharya and Martin Kulldorff wrote that “rather than engaging in scientific discourse,” Collins and Fauci authorized “a quick and devastating published takedown” to blacklist their views. Those views — expressed in the Great Barrington Declaration — applied widely accepted scientific principles to criticize the Collins-Fauci plan, in general, for ignoring the traditional methods of public health and, in particular, for the damage inflicted by lockdowns that left patients to languish “with cancer, cardiovascular disease, diabetes, and other infectious diseases, as well as mental health and much else.” Censorship of opposing views is the bane of science. Yet the leaders of the American medical community regard Collins and Fauci as paragons.

America still has the best doctors and researchers in the world; it has the best medical schools; it has the most advanced medical technology. Yet incompetent, unaccountable elites have thoroughly squandered this potential. By any objective measure, the 60-year spending spree to improve American healthcare has been an abysmal failure. Among advanced nations, America’s medical system has become the most costly, and we have become a nation of overfed, drug-addled hypochondriacs, living in fear of diseases that our researchers cannot cure and of doctors who may kill us while we are under their care. American medicine’s recent foray into wokeness will only delay the discovery of cures and increase the occurrence of medical errors. In the 1960s, patients smoked in waiting rooms. Try that today and you will be ushered out of the building posthaste and with scorn. But statistically speaking, you would be less likely to die from smoking than from contact with the doctor you are waiting to see.

What does the American medical system have to say about (1) its excessive costs, (2) its inability to cure diseases, and (3) its refusal to reduce preventable errors? Its tacit reply: Americans must (4) additionally endure its servile attempts to appear woke. That is, unbridled greed, neutered curiosity, reckless patient care, and virtuous posturing are its top priorities. The new Hippocratic Oath has become “Fifth, do no harm.”
