The Wax Meta-analysis was first published in 2010. It cites 37 references to support the statements in its analysis of the relative safety of planned, professionally-attended home births. However, the meta-analysis is based on only 12 specific studies, and not all 12 were used when extrapolating data for its various conclusions, including its formal “Conclusion” statement that PHB was associated with a doubling or tripling of neonatal deaths.
It should be noted that the anti-PHB conclusions arrived at by Wax et al would logically apply to free-standing birth centers as well (i.e. those not physically within a hospital), since the “low medical-resources” setting of an independent birth center applies equally to planned home birth.
Of the 37 references cited by the meta-analysis, 12 were used to compare safety outcomes between hospital and OOH births. The dozen studies contributed a total of 342,056 {planned} home and 207,551 {planned} hospital deliveries for analysis, or data on a total of 549,607 births.
Irrespective of such details, 549,607 births is over half a million, which is an impressive number by any measure. At the time it was published in 2010, the Wax meta-analysis had the largest database of any study of PHB.
{Editor’s Note: Unfortunately, all studies based on “intention” — in this case the planned or ‘intended’ place of birth, whether home or hospital — suffer from the fact that whatever the parents or the caregiver planned, the baby may actually have been born in some other, unplanned location.
The only way around this problem is for researchers to have direct information from the parents or practitioners, or access to official records in the case of a death. No large studies on OOH birth have had the ability to do this. The only exception was a study by Burnett & Rooke published in JAMA in 1980 on home births (planned and unplanned) in North Carolina; they actually connected all the dots for every neonatal death recorded in their data.
However, the future of electronic record keeping may allow researchers to better target their data to the situation they are actually studying.}
The Wonders of Modern Media – spreading the Word through every Middlesex, village, and farm!
Virtually every modern media outlet — TV networks, cable news, the NY Times, the Wall Street Journal, NPR, and professional publications and websites that report on science — uniformly treated the conclusions of a study of such incredible size (i.e. over half a million births) as absolutely definitive. Many opined that the Wax Meta-analysis would become the final word on the safety of planned home birth.
The definitive word generated by the Wax meta-analysis was “dangerous” — just planning to have a home birth was enough to double the overall number of neonatal deaths, even when the mother actually went to the hospital and had her baby delivered by an obstetrician. When the meta-analysis excluded data on babies with lethal congenital anomalies, its authors reported a neonatal mortality (NNM) rate that was three times higher than that of hospital births.
Indeed, if these assumptions had been founded on demonstrable fact, that would be the end of the story. But as is said of war, truth is the first victim. The most useful set of facts, the Dutch national data that supplied the largest study among the original 12 chosen by Wax et al, was left on the cutting room floor. This is quite extraordinary, since those same Dutch perinatal databases have since yielded perinatal mortality rates for a total cohort of 743,070 women, with 466,112 planned home births and 276,958 planned hospital births.
The Dutch cohort study of perinatal mortality in low-risk women cared for by midwives (see reference below) was based on three merged national perinatal databases. According to that data, the main conclusion of the Wax meta-analysis, an NNM rate 2 to 3 times higher than for hospital birth, was wrong.
Perinatal mortality is an umbrella category that puts all fetal deaths during pregnancy and labor, all stillbirths AND all neonatal deaths together in a single category. In this case, the PNM data from the Netherlands, which included over 321,000 PHBs, showed no statistically significant difference between the two groups. In other words, planning an OOH birth is either just as safe or just as dangerous as planning a hospital birth. For reasons known only to its authors, this rather startling fact was not mentioned in the group’s press releases or the headlines they generated.
Ref: de Jonge A, Geerts CC, van der Goes BY, Mol BW, Buitendijk SE, Nijhuis JG. “Perinatal mortality and morbidity up to 28 days after birth among 743,070 low-risk planned home and hospital births: a cohort study based on three merged national perinatal databases.” BJOG 2014; DOI: 10.1111/1471-0528.13084
Correspondence: Dr. A de Jonge, Department of Midwifery Science, AVAG and the EMGO Institute of Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT, Amsterdam, The Netherlands. Email ank.dejonge@vumc.nl
A trick with mirrors in which data instead of smoke disappeared
The other and even bigger slip “twixt cup and lip” was big enough to sail an ocean liner through. While the study’s authors repeatedly pointed to its very large numbers — 12 studies with 549,607 births — they didn’t mention that only 6 of the original 12 studies were used to calculate the study’s central theme and its most important conclusion — that the neonatal mortality rate for women planning a home birth was two to three times higher than for a hospital birth. The biggest study (over 321,000 PHB) was NOT included.
The six studies chosen by Wax and his co-authors to calculate the NNM had only 49,803 births: just 33,302 hospital births and 16,500 PHBs. When babies born with lethal birth defects were excluded, there was a total of only 47,632. This drastically reduced number is less than one-eleventh of the much-touted half a million.
The big study by de Jonge contributed a whopping 321,307 of the 342,056 planned home births in the meta-analysis. The other 11 studies combined had data for only 20,749 PHBs. Without the study from the Netherlands and the 5 other excluded studies, the original set of 342,000 PHBs was whittled down to a tiny subset of only 16,500 births. Why would a researcher on the safety of PHB leave that mountain of data on the cutting room floor?
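To make the shrinkage of the data pool concrete, here is a quick back-of-the-envelope check (a minimal sketch in Python, using only the counts quoted in this article):

```python
# Back-of-the-envelope check of the data-pool shrinkage described above,
# using only the counts quoted in this article.

total_births   = 549_607   # all births in the 12 studies (home + hospital)
total_phb      = 342_056   # planned home births in the 12 studies
dejonge_phb    = 321_307   # PHBs contributed by the de Jonge (Netherlands) study
nnm_subset_phb = 16_500    # PHBs left in the 6-study neonatal-mortality subset
nnm_subset_all = 47_632    # all births in that subset after excluding lethal anomalies

print(f"de Jonge share of all PHBs:        {dejonge_phb / total_phb:.1%}")      # ~93.9%
print(f"PHBs left for the NNM conclusion:  {nnm_subset_phb / total_phb:.1%}")   # ~4.8%
print(f"NNM subset vs. the half a million: {nnm_subset_all / total_births:.1%}"
      " (i.e. less than 1/11th)")                                               # ~8.7%
```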
But even with only half of the original studies, it would be statistically odd for one single study to account for one-third of the entire NNM subset. That study was done on PHB in Washington State, on data from 1989 to 1996 (published in 2002 by Pang et al), and accounted for 6,133 of the 16,500 PHBs in the NNM subset. In addition, the Pang study contributed an astonishing two-thirds of the total neonatal mortality — 20 out of the total of 32 newborn deaths. However, its dubious methodology could not determine with certainty whether these newborn deaths occurred in planned vs. unplanned OOH births, or in attended vs. unattended ones.
Pang and co-authors undertook a study of “planned home birth” using birth certificate data in a state that did NOT collect information on the planned place of birth during its birth-registration process.
Due to these difficulties, the authors could not be certain that the births in their study actually were planned OOH births (vs. unplanned), attended (vs. precipitous or unattended), or that the babies were born at term (rather than a pre-term birth in which the baby arrived before its mother could be transferred to the hospital). As a result, the study could not say for sure that the newborn deaths were directly related to planned home birth.
The Pang study also was not able to separate out bad outcomes that resulted from parental decisions, such as exercising the right to decline routine prenatal testing or a level II ultrasound that might have caught a serious anomaly, such as a congenital heart condition, in time to prevent the loss of life.
While no person can ever be sure of the motives of another, leaving the single largest study on professionally-attended PHB in the meta-analysis — de Jonge from the Netherlands — out of the calculation elevates the question of bias to the level of a “bait and switch.” As this meta-analysis was configured, the public is led to believe that it is getting one thing, and then, at the last minute, something entirely different is substituted. The public, the medical profession, and the scientific community believed that what made the Wax Meta-analysis unique and ‘definitive’ on the topic of professionally-attended PHB was the size of its data pool, or in statistical-speak its “power”: more than 1/2 million births, with the PHB arm of the study containing far more than 1/4 million births.
But the conclusion of the authors, and the statement most quoted and discussed — that PHB is associated with a doubling or tripling of neonatal deaths — reflects a ‘sleight of hand’ that began by leaving out 6 of the 12 studies. Of particular importance, they didn’t use any of the data from the extremely large Netherlands study (de Jonge), while at the same time including the Pang study (Outcomes of Planned Home Births in Washington State, 2002). These decisions reduced the data pool to just 16,500 PHBs, less than a twentieth of the original 342,000.
The “RESULTS” section of the meta-analysis was very positive in its findings for PHB. It reported a 2- to 10-fold reduction in medical and surgical interventions and a statistically significant reduction in several complications that are relatively frequent in hospital births (such as episiotomies, perineal lacerations and serious PPH). So why did its authors focus so narrowly on the “dangers” of OOH birth while ignoring its obvious benefits, and also ignoring the many problems with our obstetricalized maternity care system?
It is both illogical and unproductively expensive to routinely use obstetrical intervention on healthy women with normal pregnancies, but over the course of the 20th century that has become the standard of care in the US.
I can’t help but wonder if this sleight of hand is simply a desperate attempt to distract us from asking the obvious questions:
- Is interventionist hospital care safe for healthy women with normal pregnancies?
- Is the medicalization of normal labor and birth scientific and evidence-based?
- Is it cost-effective?
- Does it meet the practical needs (emotional and developmental) as well as protect the immediate, delayed, and long-term safety of mothers and babies?
- 70% of our childbearing women are healthy and give birth to term babies: Does it make sense to spend 25% of our entire healthcare budget on maternity care at a time when retirement-age baby boomers are requiring an ever-larger and more expensive piece of the health care pie?
I can see why they want to avoid this awkward conversation.
A critical look at the studies used in Wax’s meta-analysis of NNM ~ taking out those that don’t belong and putting back in those that do
The Wax Meta-analysis has three major flaws or errors. Its greatest flaw was in not acknowledging that the dramatic reduction in medical and surgical interventions associated with midwife-attended OOH birth is itself a critical measure of safety, because it prevents maternal morbidity and mortality.
Additional studies even indicate that planning a vaginal birth reduces NNM compared to elective Cesarean in healthy women with no identified maternal or neonatal risk factors.
The bottom line is simple: When you reduce the CS rate you reduce maternal mortality and neonatal mortality.
By far the most important methodological error was the decision to exclude several large and methodologically well-done studies. This includes 2 large studies from Canada and a huge study from the Netherlands.
These are:
# 11. Hutton EK, Reitsma AH, Kaufman K. Outcomes associated with planned home and planned hospital births in low-risk women attended by midwives in Ontario, Canada, 2003-2006: a retrospective cohort study. Birth 2009;36:180-9.
# 12. Janssen PA, Saxell L, Page LA, Klein MC, Liston RM, Lee SK. Outcomes of planned home birth with registered midwife versus planned hospital birth with midwife or physician. CMAJ 2009;181:377-83.
# 16. de Jonge A, van der Goes BY, Ravelli ACJ, et al. Perinatal mortality and morbidity in a nationwide cohort of 529,688 low-risk planned home and hospital births {Netherlands}. BJOG 2009;116:1177-84.
On the other side of that same coin, this meta-analysis inappropriately included studies with old data (as long ago as the mid-1970s), very small numbers, or otherwise inferior methodology, specifically #4, #10, #13 and #15.
The last in that list — #15 — is the Pang study from Washington State. By its own account, Pang contained many serious methodological flaws by:
- Substituting “educated guesses” for factual data
- Using “soft” data to arrive at “hard” conclusions
- Coming to global conclusions based on extremely narrow criteria that included (by the authors’ own admission) many missing and misclassified pieces of data
- Failing to include complications from Cesarean surgery in the hospital birth cohort. In an email to me, one of its authors noted that operative and post-op complications associated with Cesarean surgery were not included in the study’s data because “as far as I know, midwives don’t do C-sections at home”. Since the ‘average’ blood loss during a Cesarean is 1,000 ml (twice the amount that defines a postpartum hemorrhage), this substantially lowered the hospital cohort’s rates for PPH, and also affected its rates of low Apgar scores and of newborns that required resuscitation.
- Ignoring the economic and human cost associated with the typically high rate of medical interventions in the hospital cohort (Pitocin-accelerated labors, narcotic use, epidurals, episiotomies and admission of babies to neonatal intensive care)
- Ignoring all the delayed and downstream complications associated with the high operative rate in planned hospital births, especially those affecting post-cesarean women in subsequent pregnancies, which include an increased rate of fetal demise from placental abruption and emergency hysterectomies due to placenta percreta. There is a 7% maternal mortality rate associated with this well-known complication of Cesarean surgery in subsequent pregnancies.
Baby deaths: the conundrum created by the Wax Meta-analysis
Perinatal death is a category that includes intrapartum deaths and live-born babies who died within 7 days. This data is often stated twice – the rate when lethal birth defects are included and a lower rate that excludes “unpreventable” deaths associated with congenital anomalies.
Note: the 7 studies on perinatal mortality do include the large study from the Netherlands by de Jonge et al, which contributed 331,666 midwife-attended PHBs.
All perinatal deaths, including those with lethal birth defects
PHB –> 229 PN deaths out of a total of 331,666 PHBs (0.07%)
Hospital –> 140 PN deaths out of a total of 175,443 hospital births (0.08%)
Perinatal deaths EXCLUDING birth defects
PHB ~ 225 deaths out of 330,324 home births (a rate of 0.07%)
Hospital ~ 134 deaths out of 173,266 hospital births (a rate of 0.08%)
Neonatal deaths ~ only 6 studies used –> de Jonge and others excluded, while Pang and others were included
All newborn deaths, including those with lethal birth defects
PHB ~ 32 out of 16,500 births (1:515 or 1.9 per 1,000 live births)
Hospital ~ 32 out of 33,302 births (1:1,040 or 0.96 per 1,000 live births)
Neonatal deaths EXCLUDING birth defects
PHB ~ 23 neonatal deaths out of 15,633 births (incl. hosp. transfers) 1:680 or 1.5 per 1,000 live births
Hospital ~ 14 neonatal deaths out of 31,999 hospital births (excluding PHB transfers) 1:2,286 or 0.43 per 1,000 live births
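For readers who want to check the arithmetic, the rates and ratios above can be reproduced (up to rounding) directly from the raw counts listed in this section. The crude odds ratio at the end is my own unadjusted calculation, shown only to indicate where the “roughly double” headline figure comes from; it is not necessarily the exact statistical model used by Wax et al:

```python
# Re-derive the mortality rates quoted above from the raw counts in this section.

def per_1000(deaths, births):
    return deaths / births * 1000

# Perinatal deaths (including lethal birth defects)
print(f"PHB PNM rate:      {229 / 331_666:.2%}")     # ~0.07%
print(f"Hospital PNM rate: {140 / 175_443:.2%}")     # ~0.08%

# Neonatal deaths, all, in the 6-study subset
print(f"PHB NNM:      {per_1000(32, 16_500):.1f} per 1,000 (1 in {16_500 / 32:.0f})")
print(f"Hospital NNM: {per_1000(32, 33_302):.2f} per 1,000 (1 in {33_302 / 32:.0f})")

# Crude (unadjusted) odds ratio behind the "roughly double" headline
odds_home     = 32 / (16_500 - 32)     # deaths vs. survivors, planned home birth
odds_hospital = 32 / (33_302 - 32)     # deaths vs. survivors, planned hospital birth
print(f"Crude odds ratio, home vs. hospital: {odds_home / odds_hospital:.2f}")  # ~2.0
```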
Wax Meta-analysis studies — what’s in, what’s out, and why such strange and counter-intuitive choices for the NNM rate
A meta-analysis is a relatively new form of study that rehashes the research of others by putting a lot of smaller numbers into one very big pot and seeing what conclusions can be drawn from the pooled data. What this often means — for better or worse — is that studies that did not independently have enough statistical “power” to make any definitive statements can now be added together; this gives the meta-analyst (the person, not the product) the ability to make definitive statements ‘second-hand’.
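As a rough illustration of how pooling creates that second-hand “power”, here is a minimal sketch in Python. The numbers are entirely hypothetical (not the Wax data), and the simple fixed-effect Mantel-Haenszel calculation shown is just one common way of combining 2x2 tables, not necessarily the model Wax et al used:

```python
# Minimal sketch of how a meta-analysis pools small studies for statistical "power":
# a simple fixed-effect Mantel-Haenszel odds ratio computed over 2x2 tables.
# The numbers below are entirely hypothetical, NOT the Wax et al data.

# Each entry: (deaths_home, total_home, deaths_hospital, total_hospital)
hypothetical_studies = [
    (3, 2_000, 2, 4_000),   # small study A: too few deaths to say anything on its own
    (5, 4_500, 4, 9_000),   # small study B
    (2, 1_200, 1, 2_500),   # small study C
]

numerator = denominator = 0.0
for a, n1, c, n0 in hypothetical_studies:
    b, d = n1 - a, n0 - c          # survivors in each arm
    total = n1 + n0
    numerator   += a * d / total   # Mantel-Haenszel term: home deaths x hospital survivors
    denominator += b * c / total   # Mantel-Haenszel term: home survivors x hospital deaths

print(f"Pooled Mantel-Haenszel odds ratio: {numerator / denominator:.2f}")  # ~2.9 here
```

Each of these made-up studies has only a handful of deaths, far too few to support any conclusion on its own, yet the pooled figure comes out looking crisp and authoritative.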
However, the definitions used by each independent study — done at different times, on different continents, on different demographics of childbearing women, within entirely different healthcare systems — routinely differ, sometimes quite dramatically, so that we are no longer comparing ‘like with like’. And yet, when these numbers are combined into this fancy piece of research known as a meta-analysis, each of these disparate contributions compounds like interest; this makes some of us famous while leaving many others out in the cold.
In regard to this particular meta-analysis, the picture that comes to mind is someone with the power to draw up two lists, assigning the great majority of benefits to themselves and most of the detriments to the other group. At the end of that process the feat is legally complete, and we are simply stuck with the problems it has created.
A list of the studies on which Wax et al based its conclusions.
KEY: The studies marked with a black X were excluded from the calculation of neonatal mortality.
Those in a purple font identify the 6 studies used to figure NNM.
BLUE text indicates best qualities of studies:
- large and/or prospective
- data collected since the year 2000
I have identified those qualities with blue ampersand symbols – &&&
Red text identifies data that for various reasons is not the best choice — for example, established to have flawed methodology, data more than 20 years old, or a study so small that it could not contribute its share of data, such as a randomized controlled trial that had only 11 participants.
The studies that should be removed from the NNM subset are marked with a double “at” symbol {@@} and are tabbed to the right 5 spaces.