Click here to read Part 1 ~ How the maternity care system (care for healthy childbearing women) in the US originally became illogical
Part 2 ~ Obstetrical management of normal childbirth ~ 1910-1980
By 1910, the ‘standard of care’ for normal childbirth as defined by the new surgical discipline of obstetrics and gynecology was (no surprise!) an intensely medical and surgical model. As the scientific developments of each successive decade brought more and better treatments for the rare but still real complications of pregnancy and childbirth, professional enthusiasm led to a lengthening list of medical and surgical procedures used with decreasing medical justification and increasing frequency.
Interventions originally developed to treat specific complications were instead used routinely, which is to say preemptively, based on the assumption that the prophylactic use of a treatment for a particular complication would prevent that complication from occurring.
The distinction between the traditional childbirth practices of midwifery & the new surgical discipline of obstetrics & gynecology
As a new surgical discipline, obstetrics wanted to distinguish itself from the historical practice of midwifery. Midwives and country doctors provided non-medical support for the normal biology and physiology of childbearing, with medical and surgical interventions used only if a serious complication required them. But by 1910, obstetrically-trained surgeons had developed dramatically different childbirth practices based on routine medical and surgical interventions used prophylactically. The newly developing obstetrical profession believed its ‘modern’ model of care was vastly superior, while the model used by midwives and country doctors was “old-fashioned”, “inadequate” and “dangerous”.
The Great Divide ~
Before and After Common Knowledge of the Germ Theory
of contagion & infectious disease (BC vs AC)
During the pre-antibiotic stage of human history (prior to 1940), there was absolutely no medical treatment for the systemic bacterial infections called “blood poisoning” by the lay public and sepsis or ‘septicemia’ by the medical profession. This included the potentially-fatal septic conditions following surgery (known as ‘post-op’ infections) and following childbirth, known as ‘childbed fever’ or puerperal sepsis, for which hospitalized maternity patients were particularly at risk.
Historically, hospitals have always been recognized as bio-hazardous environments. Before the recognition of microscopic pathogens, no one, including doctors, understood the critical role played by ordinary cleanliness and simple hand-washing. The idea that doctors and hospital staff needed to disinfect their hands by scrubbing with soap and running water, use disinfectants to kill bacteria on floors, bedding, supplies and equipment, and sterilize surgical instruments with steam between patients would have been seen as illogical and a waste of time.
The story of modern biological science, and how individual scientists discovered the microscopic life-forms later identified as pathogenic, is long and fascinating. Research over more than two centuries eventually led to the “Germ Theory” in 1881, with bacteria, protozoa and viruses identified as the causative agents of contagion and infectious disease. This, in turn, led to the discovery of anti-microbial drugs (sulfa, marketed in 1934) and antibiotic drugs (penicillin, in 1945) able to effectively treat potentially fatal infections.
A quick ride in the Way-back Machine
The backstory of modern medicine began in the 17th century, when Van Leeuwenhoek (1632-1723), the Dutch drapery merchant, lens grinder, amateur inventor and budding scientist, first discovered “wee beasties” in a drop of water he was examining under a single-lens microscope that he had made for himself. These one-celled bacteria and protozoa appeared to him as wiggling threads, long strings of undulating rods and beads, and rapidly twirling spirals.
Over the next two centuries, each new scientific investigator slowly advanced our knowledge of the biological sciences by standing on the shoulders of the scientists who had come before. The last and most famous is the French chemist Louis Pasteur, who is best known for developing the simple process we call ‘pasteurization’, still used every day to make certain foods and drinks, especially wine and milk, safe to consume.
Pasteur studied the putrefaction (spoiling) of liquids and infection in humans (topics later known as microbiology and bacteriology) for several decades before publishing the final conclusions of his research in 1881.
This was the progenitor moment that created the conditions necessary to develop modern medical science as we think of it today.
Without a thorough understanding of the Germ Theory of infectious disease, and without anti-microbial drugs, our modern hospital system could not exist. Hospitals would be unable to offer emergency room care to patients with traumatic injuries, or perform brain surgery, organ transplants, or even a simple appendectomy without risking a fatal infection a large proportion of the time. As was said in the 18th and 19th centuries, “the surgery was a success but the patient died anyway”. The reason was usually a post-operative infection.
The contemporary English surgeon Joseph Lister was one of those physician-scientists who built on the work of those who went before him. By happenstance, Lister was fluent in French and thus able to read all of Pasteur’s earlier publications. This information allowed Lister to develop a new understanding of these issues. He particularly addressed the surgeon’s eternal conundrum, “the surgery was a success, but the patient died”, and answered the question of why an operation on an uninfected body cavity (skull, chest, abdomen, scrotum, etc.) produced a surgical wound infection 95% of the time.
In hospitalized patients, post-operative deaths from septicemia ran as high as 50 to 100% of cases, depending on what part of the body was involved. With his new knowledge, Dr. Lister developed the principles of asepsis and sterile technique, which he used during operations with great success. The strict use of aseptic and sterile surgical techniques dropped his post-op mortality rate to just 3 to 5%. These scientific principles still form the absolute bedrock of modern surgical technique today, as the standard of practice for preventing infection in surgical patients. For this extraordinary and game-changing accomplishment, he was knighted and became Queen Victoria’s official surgeon. As Sir Joseph Lister, he is remembered by history as the ‘father of modern surgery’.
General application of Lister’s principles
Lister’s aseptic principles and ideas about sterilization came to be known as “Listerization”. Antiseptic practices originally developed for operating rooms and surgical wards were soon applied to both nurses and doctors, and then expanded to include the entire hospital. Historically, hospitals were often referred to as ‘death houses’, since they were where the homeless and medically-indigent went to die. People who had families or economic resources avoided hospitals at all costs. Because hospitals were so bio-hazardous, doctors normally provided medical care to middle- and upper-class families in their homes. Surgeries were literally performed on kitchen table-tops because that was safer than a hospital, and nurses were hired to care for recovering patients in their own homes.
More than 30 years after the Hungarian obstetrician Ignaz Semmelweis first identified the hands of doctors, medical students and nursing staff as the vector that spread childbed fever, hand-washing finally took its rightful place as the most basic element of the ‘modern’ practice of medicine. Hospitals got rid of all their carpets and upholstered furniture, walls were washed with disinfectants, and floors were regularly swabbed with strong germicides. Scrupulous hand-washing by physicians and the nursing staff became the new standard; the results were nothing short of miraculous.
The human experience with infection was dramatically transformed nearly overnight. This affected personal hygiene, public sanitation, government health policies, medical care in general and hospitals in particular, as knowledgeable professionals were finally equipped to control bio-hazards and drastically reduce contagion and cross-patient contamination.
In addition to other advances in the biological sciences, chemistry and physics (e.g. X-rays and other diagnostic technology), the principles developed and implemented by Lister paved the way for our modern relationship with hospitals as relatively safe places for medical treatments and surgical procedures.
The medical no-man’s land between 1881 and 1945
But in a pre-antibiotic world, the word relatively was still very important. While far fewer patients acquired a nosocomial (hospital-origin) infection, the stark reality for those who did get one of these potentially-fatal infections was unchanged, since there was absolutely NO effective treatment for sepsis until sulfa drugs first became available in the US in 1938 and penicillin in March 1945.
In the 60 or so years between the discovery of the germ theory and the development of effective antibiotic treatments — a time I describe as a ‘no man’s land’ — even the most rigorous attention to aseptic principles was still not enough to eliminate all fatal nosocomial infections. Regrettably, it was not uncommon for virulent pathogens to spread from sick patients to healthy ones via the hands of medical and nursing staff, or via contaminated hospital equipment and supplies.
Without access to antibiotics, the prevention of infection assumed center stage in hospitals, most especially for hospitalized maternity patients. In the early 1900s, the only option available to assure strict asepsis was to “Listerize” childbirth — that is, to adopt the same aseptic principles and sterile techniques developed by Dr. Joseph Lister for performing surgery.
[ref: “The Demon Under the Microscope” by Thomas Hager, 2007]
Other aspects of Listerization included restricted access to the L&D unit: the ONLY people allowed in were hospital personnel dressed in special surgical garb (scrub suits, caps and masks) who, prior to entering, had scrubbed with antiseptic soap using a special scrubbing technique. Unfortunately, these antiseptic protocols also eliminated the presence of husbands and other family members.
Preventative Medicalization of Labor
During the first or ‘dilatation’ stage of labor, newly admitted labor patients found themselves alone in a hospital bed, socially isolated and, no doubt, anxious and afraid. They were soon medicated by L&D nurses who, on doctors’ orders, gave every laboring woman large, frequently repeated doses of narcotics and scopolamine (an amnesic drug regimen referred to as ‘Twilight Sleep’). This assured that mothers-to-be had absolutely no memory of the labor or birth; it was a purposeful strategy to prevent a possible psychotic reaction (or other mental breakdown) by the labor patient due to the stress or pain of labor.
[ref: “Twilight Sleep: New Discoveries in Painless Childbirth”, Williams, 1914]
Preventative care for normal birth ~ an obstetrically-managed surgical procedure
Under Lister’s principles of asepsis and sterile technique, the obstetrical management of the second stage of labor — the birth of the baby — was now referred to as the surgical procedure of “delivery”. This required doctors and nurses to don surgical scrub suits, caps, masks and sterile gloves, and to enforce a strict “no admittance” policy for anyone but L&D staff in appropriate surgical garb.
The heavily drugged mother, still under the amnesic effects of the Twilight Sleep drugs, was then moved by stretcher to the sterile, OR-type delivery room. After being moved over to the operating table, the mother-to-be had her legs put up in stirrups and was given general anesthesia. One of the essential features of Dr. Lister’s original techniques for aseptic surgery was that the surgical patient remain perfectly still at all times to properly preserve the sterile field, which in turn allowed the surgeon to do the best and safest job possible. This naturally required that the patient be rendered unconscious under general anesthesia; lying perfectly still was otherwise nearly impossible for a woman in the throes of pushing-stage labor. Keeping the mother unconscious guaranteed that the doctor could maintain absolute control over the sterile field and every other aspect of surgical sterility.
The actual ‘delivery’ was conducted as a strict surgical procedure that began with the episiotomy. This meant cutting into the mother’s birth canal with sterile surgical scissors, a procedure considered necessary to prevent the mother from possibly suffering a serious perineal tear. Forceps were then used to ‘lift’ the baby out, an action believed to prevent possible neurological damage to the baby if the birth had proven difficult. The use of forceps was also believed to prevent damage to the mother’s pelvic floor, thus preventing ‘female troubles’ (such as incontinence) in the future.
The third and final stage of labor — delivery of the placenta — included the manual removal of the placenta (physically peeling it off the wall of the uterus), which was done to prevent postpartum hemorrhage. The surgical procedure of delivery concluded with the suturing of the episiotomy incision; obstetricians generally believed it was easier to suture an episiotomy incision than a naturally occurring tear.
[ref: obstetrical textbook “Principles and Practice of Obstetrics” by Dr. Joseph DeLee, 1924 edition]
In the decades before the discovery of antibiotics, the highly medicalized style of care introduced in 1910 was a last-ditch attempt to eliminate puerperal sepsis (childbed fever) in hospitalized maternity patients. By sheer happenstance, this ‘perfect storm’ of events resulted in the most profound change in childbirth practices in the history of the human species.
Trying to fool Mother Nature: a smart idea or iatrogenic problem?
Each of these prophylactic interventions was associated with a significant number of well-known risks, many of which carried more serious or longer-term morbidity and mortality than the complications they were thought to prevent. In contemporary times, complications that occur as a result of medical care or medical procedures are called ‘iatrogenic’.
Many obstetrical interventions in normal childbirth are known to have iatrogenic implications. For instance, the use of general anesthesia for normal childbirth (approximately 4 million births a year) as part of the Listerization protocols adopted in 1910 resulted in the complications of general anesthesia being identified in 1960 as the third leading cause of maternal deaths for the previous decade (1950 to 1959).
As for the negative consequences of general anesthesia on unborn and newborn babies, exact figures are hard to come by. However, several historical sources (incl. Levy, 1917) identified a 40% increase in birth injuries and perinatal mortality during the first decade of Listerization. The unborn babies of hospitalized labor patients were also being exposed to repeated doses of narcotics, general anesthesia and forceps deliveries.
A more complex example is the pre-emptive use of the treatment for a retained placenta, i.e. routine manual removal. This surgical procedure requires the obstetrician to reach a gloved hand up into the mother’s uterus and use the fingertips to separate the placenta from the uterine wall. To the obstetrical profession, it seemed perfectly logical that immediately removing the placenta (instead of waiting for the mother to expel it naturally) would eliminate all possibility of a retained placenta and/or excessive postpartum bleeding. Surely this apparently ‘simple’ procedure would greatly improve obstetrical care.
But unfortunately, routinely reaching a gloved hand up into the uterus risks traumatically damaging the uterine lining. This can and does produce the very complication — massive hemorrhage — that doctors were seeking to prevent by manually removing the placenta.
It also dramatically increased the likelihood that virulent, potentially-fatal bacteria would be carried from the vagina (which is not a sterile body cavity) up into the uterus, which is and needs to remain sterile. During the pre-antibiotic era of human history (before 1945), fatal infections acquired during the childbirth process accounted for one-third of all maternal deaths.
On a case-by-case basis, the individual complications associated with the pre-emptive use of these invasive interventions were not perceived to be iatrogenic, but rather as a “proof of theory”, i.e. evidence confirming that childbirth was indeed pathologically dangerous.
Such “irrefutable” proof of danger seemed to justify, for safety’s sake, any level of intervention in normal birth. Complications associated with the preemptive use of interventions were obviously regrettable, but were interpreted as an unpreventable kind of ‘collateral damage’ — the price that a few very unlucky women paid so that many luckier women could be saved from the disasters of normal childbirth.
Because the obstetrical profession did not (and still does not) recognize this iatrogenic blowback as the dangerous side effect of medically unnecessary interventions, there has never been any reason for it to re-evaluate the safety and appropriateness of the idea that highly medicalized routine childbirth practices are better and safer than the non-interventive, supportive model of physiologic care.
Obstetrical Management, circa 2015: more, better and newer interventions, but the same shotgun approach
Beginning in the late 1970s and early 1980s, the new sub-specialty of obstetrical anesthesiology made routine access to epidural anesthesia available in the L&D units of larger hospitals. As a result, the use of narcotics during labor and general anesthesia for delivery quickly fell out of fashion, replaced by epidural anesthesia given early in labor, which could easily be “topped off” during the pushing stage, or its level raised to surgical anesthesia if a Cesarean was decided upon.
The historical use of forceps was likewise replaced by the ease of vacuum extraction and the popularity of Cesarean surgery. Now that obstetrical anesthesiologists were immediately on hand (since they were required to be present in the L&D unit to monitor epidurals), the transition to a surgical delivery was wonderfully seamless.
With notable (and much appreciated!) exceptions, obstetricians generally don’t believe that the right use of gravity is of much value, that is, physiologically-based care that lets mothers remain mobile, upright, out of bed, and able to squat to push. As members of a surgical specialty, obstetricians don’t see physiological management as part of their job description. Even if it were, the ubiquitous use of epidurals in today’s obstetrical departments makes the practical use of gravity virtually impossible.
In any event, the liberal use of induction, Cesarean section, electronic fetal monitoring machines that go ‘ping’, and many other new, but no less invasive, forms of intervention has replaced most of the historical forms of medicalization.
But at the most fundamental level, the modern model of obstetrical medicalization still represents the most profound change in childbirth practices in the history of the human species. It remains a late-19th-century model, based on the same erroneous assumption that drove the newly emerging surgical specialty of obstetrics to medicalize, and subsequently Listerize, normal childbirth in 1910.
Continued in Part 3 of the Preamble to the CCM’s VBAC statement