A Brief Political History of Modern Medical Practice

by John L. Wilson, Jr., M.D.

Modern healthcare in the United States has come a long way since its homespun origins in the 18th century. The evolution of our healthcare system has been strongly influenced not only by scientific advances but also by many non-medical historical and political events. Because healthcare is among the most regulated industries in the world, politics and medicine will remain strange bedfellows for the foreseeable future.

The history of the influences that define medical practice involves the usual players: money, power, and politics. Understanding some of the political influences may help shed light on the good, the bad, and the ugly of our present-day healthcare system and the prevailing attitudes that many physicians and consumers have developed toward alternative health practices.

I would like to take a look at the intended and unintended impact that the following events have had on our healthcare system:

  • The formation of the American Medical Association in 1847
  • The release of the Flexner report in 1909
  • The discovery of penicillin in 1928 and sulfa drugs in 1935
  • The regulation of healthcare by the government and professional boards
  • The passing of the Kefauver-Harris Amendment in 1962
  • The establishment of Medicare under the Social Security Act in 1965
  • The increasing dominance of the pharmaceutical industry in healthcare

The present-day outcomes of these events could not have been anticipated; in hindsight, they have brought us full circle, back to the values inherent in our medical roots.

At the start of the 19th century, doctors faced no licensure requirements mandated by the government or professional boards; a person who had a calling to be a doctor received his medical education largely through apprenticeship. Doctors often served their communities in other capacities, such as judges or merchants, stopping to perform surgery or other riskier interventions when the primary healthcare of the day, midwifery and herbalism, failed to solve the problem at hand.

Medical societies formed in the early 19th century, and their dues-paying members eventually proclaimed themselves able to pass judgment on the competence of others in and out of the societies. The American Medical Association (AMA) was founded in 1847. By 1900, most states had licensing boards. These boards were, at that time, generally inclusive and tolerant of all types and methods of medical practice: homeopathy, herbal medicine, midwifery, surgery, and so on.

Then came the Flexner Report. Abraham Flexner was a high school principal hired in 1909 by the Carnegie Foundation to study American medical education. Rumor had it that he had never set foot in a medical school before visiting 69 medical schools in 90 days to gather data. Many felt that the resulting Flexner Report was permeated by bias reflecting the influence of its underwriters: the American Medical Association, the Carnegie Foundation, and the Rockefeller General Education Board.

Regardless, the Flexner Report became a roadmap for the future of American healthcare, and several changes occurred as a direct result. Larger, moneyed institutions became the preferred source of medical education, replacing apprenticeship, at a time when most people lived in rural areas and seldom traveled. The states’ authority grew through licensure and the establishment of medical boards. The centuries-old healing modalities of homeopathy and herbalism were nearly eliminated through funding that favored their allopathic counterpart, a move that implied to the public an “official” disapproval of the former and sanction of the latter. Seven minority (black and women’s) medical colleges of the day closed within three years of the release of the Flexner Report. A cottage-industry style of healthcare became a corporate style dominated by white males.

In recent decades, medical education has been coming full circle, rediscovering value in its rural roots. Apprenticeship, now renamed preceptorship, has helped to attract and keep physicians in rural communities that routinely suffer from physician shortages. Once again it is considered worthwhile to learn the art and science of medicine at the side of an experienced physician. In addition, gender bias in selecting medical school applicants has decreased: in 1999-2000, 46 percent of entrants to US medical schools were women.

Throughout history, bacterial infections have plagued civilization. For millennia, people lacked a scientific understanding of how infectious illnesses arose and spread, and superstitious explanations surfaced to account for why some people got sick and died while others didn’t.

Two scientific discoveries in the early 1900s that yielded treatments for infections greatly affected the future of healthcare. The hope inspired by the discovery of penicillin in 1928 by Alexander Fleming, a British bacteriologist, and of sulfa drugs in 1935 by Gerhard Domagk, a German physician, strongly influenced the practice of medicine. Indeed, many lives previously doomed to end prematurely from bacterial infections were spared.

However, nobody anticipated the ability of bacteria to develop resistance to antibiotics. As the limitations of antibiotics have become apparent in recent decades, healthcare practitioners who are aware of nutrition’s ability to enhance the body’s natural resistance to infections are using strategies reminiscent of the way our farm-based predecessors ate: organic, fresh, seasonal, whole foods. Diet and its modern counterpart, supplemental nutritional therapy, are again being used as strategies to improve people’s ability to resist infections.

During the late 19th and 20th centuries, scientific advances quickly prevailed over the art of medicine. But patients, and some physicians, still know that well-indicated treatments delivered artfully in a context of understanding, hope, and encouragement are more effective than treatments delivered in the detached, technical style of a research scientist. A good bedside manner goes a long way toward creating a context that enhances the likelihood of healing.

State medical boards were formed with the intention of using uniformity in medical practice to ensure quality, without considering that uniformity would lead to conformity, which ultimately risks entrenching the status quo. The regulation of medical practice has important benefits, including helping to ensure consistent medical education and competent practitioners. However, the authority given to regulatory agencies, like any authority, has the potential to be misused. State medical boards are charged with protecting the citizenry from charlatans, hucksters, and various sellers of snake oil. But a narrow, biased, or uninformed interpretation of charlatanism by state medical boards increases the risk that intelligent, proactive citizens will be denied access to the kind of healthcare they desire. Regulatory suppression of ideas that oppose prevailing medical customs has limited access to treatment, a difficult pill to swallow for a person who values the freedom to choose his own healthcare options.

Once again, in recent decades, we seem to be coming full circle. Consumer interest in alternative medicine harks back to practitioner-based healthcare, in contrast to our present-day corporate-based healthcare. Grassroots legislation has mandated that several state medical boards protect their citizens’ access to alternative medical therapies. This interest in alternative therapies is driven, in large part, by increased awareness of the dangers of a predominantly drug-based and increasingly invasive healthcare system.

A medical tragedy in Europe in the 1950s spawned legislation that would forever change the practice of medicine in the United States. The drug thalidomide, used to treat nausea in women during the first trimester of pregnancy, caused severe birth defects, typically absent or malformed limbs, in exposed fetuses. In 1962, in an effort to prevent a similar disaster, the U.S. Congress passed the Kefauver-Harris Amendment, which required manufacturers to prove a drug’s safety and effectiveness before marketing it.

The dominance of drugs as a first-line approach to health problems has increased the possibility of a patient being harmed by healthcare. Medical doctors today are expected to keep up with a rapidly changing and growing body of scientific information. Continuing medical education is largely accomplished through seminars sponsored by pharmaceutical companies in exchange for the opportunity to expose attendees to their latest miracle drug. Professional medical journals are full of pricey, glossy, multi-page advertisements designed to sell drugs. Television advertisements target healthcare consumer groups with carefully choreographed sales pitches that race past the significant side effects of the drug being promoted. Such advertising has the potential to over-expose physicians and healthcare consumers to new drugs whose safety has not yet been established.

In 2000, Prilosec was the world’s top-selling medication, with annual sales of $4.6 billion. Lipitor, a cholesterol-lowering drug, was the second best seller at $4.1 billion. All cholesterol-lowering drugs combined accounted for more than $12 billion in annual sales. Can you say “overdose”?

In January 2003, the prestigious British journal The Lancet characterized pharmaceutical advertising as “misleading.” If you need a drug, you may well be safer with an older one with a proven track record.

In 1965, another event strongly influenced the delivery of healthcare in the United States: the signing of Medicare into law. It was the culmination of 20 years of debate started by President Harry S. Truman, who recognized the need for Social Security recipients to have access to affordable healthcare during their retirement years. Since then, Congress has enacted several Medicare laws, including authorizing Medicare payments to HMOs in 1972, authorizing routine mammogram reimbursement in 1988 (an economic advantage for radiologists), and establishing a fee schedule, implemented in 1992, that limited Medicare reimbursement paid to physicians.

Fee schedules require participating physicians who treat Medicare enrollees to accept federal reimbursement even if it is below their actual cost of providing the service, and to agree not to bill the patient for the difference. As a result, many physicians have had to limit the services they offer to Medicare patients, or work within the system by seeing more patients for shorter and shorter visits to make up for their inability to bill appropriately. Consequently, many patients feel that they are but cows in a system of cattle chutes. Removing free enterprise from doctors (or plumbers, merchants, mechanics, or any trade) removes incentive, and a lack of incentive in a service-based profession translates to consumer and provider dissatisfaction.

The bottom line is that the cumulative effect of these political events has left many physicians and consumers concerned about the medical-pharmaceutical-industrial-insurance complex that is now part and parcel of healthcare in the United States.

How do we come full circle once again and reclaim our legacy of practitioner-based healthcare?

Patients and physicians can become informed about the strengths and liabilities of our present healthcare system: the limitations of drugs in addressing our nation’s chronic health problems, the erosion of healthcare choices through regulation, the inability of the federal government to meet all the healthcare needs of its enrollees, and the impact of stripping free enterprise from healthcare providers. These understandings are philosophical stepping stones along the path of change.

What can a concerned healthcare consumer do? Seek out physicians who can think outside the box (just in case the solution you need isn’t in the box), who are interested in finding the cause of symptoms, who do not use drugs as a first-line approach, who are educated in treating health problems with methods that avoid or minimize drug use, and who are self-empowered to make healthcare decisions, independent of political expedience, that best serve their patients.
