Wax Meta-analysis references by category — those used and NOT used in calculating NNMR

by faithgibson on August 15, 2016

The Wax Meta-analysis, first published in 2010, cites 37 references to support statements in its analysis of the relative safety of planned, professionally-attended home births. However, the meta-analysis is based on only 12 specific studies, and not all 12 were used when extrapolating data for its various conclusions, including its formal “Conclusion” statement that PHB was associated with a doubling or tripling of neonatal deaths.

It should be noted that the anti-PHB conclusions arrived at by Wax et al would logically apply to free-standing birth centers (i.e. those not physically within a hospital), since the “low medical-resources” situation of independent birth centers applies equally to planned home birth.

Of the 37 references cited by the meta-analysis, 12 were used to compare safety outcomes between hospital and OOH birth. The dozen studies contributed a total of 342,056 {planned} home and 207,551 {planned} hospital deliveries for analysis, or data based on a total of 549,607 births.

Irrespective of such details, 549,607 births is over half a million, which is an impressive number by any measure. At the time it was published in 2010, the Wax meta-analysis had the largest database for PHB.

{Editor’s Note: Unfortunately, all studies based on “intention” — in this case the planned or ‘intended’ place of birth, whether home or hospital — suffer from the fact that whatever the parents or the caregiver planned, the baby may have actually been born in some other, unplanned location.

The only way around this problem is for researchers to have direct information from the parents or practitioners, or access to official records in the case of a death. No large studies on OOH birth have had the ability to do this. The only exception was a study by Burnett & Rooke published in JAMA in 1980 on home births (planned and unplanned) in North Carolina; they actually connected all the dots for every neonatal death recorded in their data.

However, the future of electronic record keeping may allow researchers to better target their data to the situation they are actually studying.}

The Wonders of Modern Media – spreading the Word thru every Middlesex, village, and farm!

Virtually every modern media outlet — TV networks, cable news, the NY Times, the Wall Street Journal, NPR, and professional publications and websites that report on science — uniformly noted that the conclusions from a study of such an incredible size (i.e. over half a million) must be seen as absolutely definitive. Many opined that the Wax Meta-analysis would become the final word on the safety of planned home birth.

The definitive word generated by the Wax meta-analysis was “dangerous” — just planning to have a home birth was enough to double the overall number of neonatal deaths, even when the mother actually went to the hospital and had her baby delivered by an obstetrician. When the meta-analysis excluded data on babies with lethal congenital anomalies, its authors reported a neonatal mortality (NNM) rate three times higher than that for hospital births.

Indeed, if these assumptions had been founded on demonstrable fact, that would be the end of the story. But as is said of war, truth is its first victim. The most useful set of facts, which came from the largest study among the original 12 chosen by Wax et al, was left on the cutting room floor. This was quite extraordinary, as it included perinatal mortality rates in a total cohort of 743,070 women, with 466,112 planned home births and 276,958 planned hospital births.

The Dutch meta-analysis (see reference below) of perinatal mortality in low-risk women cared for by midwives was based on three merged national perinatal databases. According to that data, the main conclusion of the Wax meta-analysis, an NNM rate two to three times higher than for hospital birth, was wrong.

Perinatal mortality is an umbrella category that puts all fetal deaths during pregnancy and labor, all stillbirths AND all neonatal deaths together in a single category. In this case, the PNM data from the Netherlands, which included over 321,000 PHB, found no statistically-significant difference. In other words, planning an OOH birth is either just as safe or just as dangerous as planning a hospital birth. For reasons known only to its authors, this rather startling fact was not mentioned in the group’s press releases or the headlines they generated.

Ref: “Perinatal mortality and morbidity up to 28 days after birth among 743,070 low-risk planned home and hospital births: a cohort study based on three merged national perinatal databases” 2014, by A de Jonge, CC Geerts, BY van der Goes, BW Mol, SE Buitendijk, JG Nijhuis

Royal College of Obstetricians and Gynaecologists; BJOG 2014; DOI: 10.1111/1471-0528.13084

Correspondence: Dr. A de Jonge, Department of Midwifery Science, AVAG and the EMGO Institute of Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT, Amsterdam, The Netherlands. Email ank.dejonge@vumc.nl

A trick with mirrors in which data, instead of smoke, disappeared

The other and even bigger slip “twixt cup and lip” was big enough to sail an ocean liner through. While the study’s authors repeatedly pointed to its really, really big numbers — 12 studies with 549,607 births — they didn’t mention that only 6 of the original 12 studies were used to calculate the study’s central theme and its most important conclusion — that the neonatal mortality rate for women planning a home birth was two to three times higher than for a hospital birth. The biggest study (over 321,000 PHB) was NOT included.

The six studies chosen by Wax and its other authors to calculate the NNM had only 49,803 births: just 33,302 hospital births and 16,500 PHB. When babies born with lethal birth defects were excluded, the total was only 47,632. This drastically reduced number is less than one-eleventh of the much-touted half a million.

The big study by de Jonge contributed a whopping 321,307 of the 342,056 planned home births in the study; the other 11 studies combined had data for only 20,749 PHB. Without the study from the Netherlands and the other 5 excluded studies, the original set of 342,000 PHB was whittled down to a tiny subset of only 16,500 births. Why would a researcher on the safety of PHB leave that mountain of data on the cutting room floor?
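The whittling-down described above is easy to verify with a little arithmetic. This sketch simply re-does the division on the counts quoted in this article; nothing beyond those figures is assumed:

```python
# Arithmetic check on the counts quoted in the text above.
total_births = 549_607   # all 12 studies combined
total_phb    = 342_056   # planned home births across the 12 studies
dejonge_phb  = 321_307   # PHB contributed by the Netherlands (de Jonge) study
nnm_phb      = 16_500    # PHB remaining in the 6-study NNM subset
nnm_total    = 47_632    # total births in the NNM subset, birth defects excluded

# PHB left in the other 11 studies once de Jonge is set aside
print(total_phb - dejonge_phb)             # 20749 -- matches the text

# How small the NNM subset is relative to the full data pool
print(round(total_births / nnm_total, 1))  # 11.5, i.e. less than 1/11 of all births
print(round(total_phb / nnm_phb, 1))       # 20.7, i.e. roughly 1/21 of the PHB data
```

In other words, removing de Jonge alone takes out about 94% of the planned-home-birth data before the NNM calculation even begins.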


But even with only half of the original studies, it would be statistically odd for one single study to account for one-third of the entire NNM subset of research. This was a study on PHB done in Washington State on data from 1989 to 1996 (published 2002, Pang et al), and it accounted for 6,133 of the 16,500 PHB in the NNM subset. In addition, the Pang study contributed an astonishing two-thirds of the total neonatal mortality — 20 out of the total of 32 newborn deaths. However, its dubious methodology could not determine with certainty whether these newborn deaths occurred in planned vs. unplanned OOH births, or in attended vs. unattended ones.

Pang and its other authors undertook a study of “planned home birth” using birth certificate data in a state that did NOT collect data on planned place of birth during its birth-registration process.

Due to these difficulties, the authors could not be certain that births in their study actually were planned OOH births (vs. unplanned), attended (vs. precipitous or unattended), or that babies were born at term (rather than pre-term, with the baby born before its mother could be transferred to the hospital). As a result, the study could not say for sure that newborn deaths were directly related to planned home birth.

The Pang study also was not able to separate out bad outcomes that resulted from parental decisions, such as exercising their right to decline routine prenatal testing or a level II ultrasound that would have caught a serious anomaly, such as a congenital heart condition, in time to prevent the loss of life.


While no person can ever be sure of the motives of another, leaving out the single largest study on professionally-attended PHB in the meta-analysis — de Jonge from the Netherlands — elevates the question of bias to the level of “bait and switch”. As this meta-analysis was configured, the public is led to believe that it is getting one thing; then, at the last minute, something entirely different is substituted. The public, medical profession, and scientific community believed that what made the Wax Meta-analysis unique and ‘definitive’ on the topic of professionally-attended PHB was the size of its data pool — in statistical-speak, its “power” — i.e. more than half a million births, with the PHB arm of the study containing far more than a quarter-million births.

But the conclusion of the authors, and the statement most quoted and discussed — that PHB is associated with a doubling or tripling of neonatal deaths — reflects a ‘sleight of hand’ that began by leaving out 6 of the 12 studies. Of particular importance, they didn’t use any of the data from the extremely large Netherlands study (de Jonge), while at the same time including the Pang study (Outcomes of Planned Home Births in Washington State, 2002). These decisions reduced the data pool to just 16,500 PHB, a small fraction of the original 342,056.

The “RESULTS” section of the meta-analysis was very positive in its findings for PHB. It reported a 2- to 10-fold reduction in medical and surgical interventions and a statistically-significant reduction in several complications that are relatively frequent in hospital births (such as episiotomies, perineal lacerations and serious PPH). So why the narrow focus on the “dangers” of OOH births, while ignoring their obvious benefits and also ignoring the many problems with our obstetricalized maternity care system?

It is both illogical and unproductively expensive to routinely use obstetrical interventions on healthy women with normal pregnancies, but over the course of the 20th century, that has become the standard of care in the US.

I can’t help but wonder if this sleight-of-hand is simply a desperate attempt to distract us from asking the obvious questions:

  • Is interventionist hospital care safe for healthy women with normal pregnancies?
  • Is the medicalization of normal labor and birth scientific and evidence-based?
  • Is it cost-effective?
  • Does it meet the practical needs (emotional and developmental) as well as the immediate, delayed and long-term safety of mothers and babies?
  • 70% of all childbearing women are healthy and give birth to term babies: Does it make sense to spend 25% of our entire healthcare budget on maternity care at a time when retirement-age baby boomers are requiring an ever-larger and more expensive piece of the health care pie?

I can see why they want to avoid this awkward conversation.

A critical look at the studies used in Wax’s meta-analysis of NNM ~ taking out those that don’t belong and putting back those that do

The Wax Meta-analysis has three major flaws or errors. Its greatest flaw was in not acknowledging that a critical measure of safety is the dramatic reduction in medical and surgical interventions associated with midwife-attended OOH birth, a reduction that prevents maternal morbidity and mortality.

Additional studies have even found that planning a vaginal birth reduces NNM compared to elective Cesarean in healthy women with no identified maternal or neonatal risk factors.

The bottom line is simple: When you reduce the CS rate you reduce maternal mortality and neonatal mortality.

By far the most important methodological error was the decision to exclude several large and methodologically well-done studies. This includes 2 large studies from Canada and a huge study from the Netherlands.

These are:

11. Hutton EK, Reitsma AH, Kaufman K. Outcomes associated with planned home and planned hospital births in low-risk women attended by midwives in Ontario, Canada, 2003- 2006: a retrospective cohort study. Birth 2009;36:180-9.

12. Janssen PA, Saxell L, Page LA, Klein MC, Liston RM, Lee SK. Outcomes of planned home birth with registered midwife versus planned hospital birth with midwife or physician. CMAJ 2009;181:377-83.

# 16. de Jonge A, van der Goes BY, Ravelli ACJ, et al. Perinatal mortality and morbidity in a nation-wide cohort of 529,688 low-risk planned home and hospital births {Netherlands}. BJOG 2009;116:1177-84.

The other side of that same coin: this meta-analysis inappropriately included studies with old data (from as long ago as the mid-1970s), studies with a very small number of participants, and otherwise inferior studies, specifically #4, #10, #13 and #15.

The last in that list — #15 — is the Pang study from Washington State. By its own account, Pang contained many serious methodological flaws by:

  • Substituting “educated guesses” for factual data
  • Using “soft” data to arrive at “hard” conclusions
  • Coming to global conclusions based on extremely narrow criteria that included (by the authors’ own admission) many missing and misclassified pieces of data
  • Failing to include complications from Cesarean surgery in the hospital birth cohort. In an email to me, one of its authors noted that operative and post-op complications associated with Cesarean surgery were not included in the study’s data because: “as far as I know, midwives don’t do C-sections at home”. Since the ‘average’ blood loss during a Cesarean is 1,000 ml (twice the amount that is considered a PPH), this substantially lowered hospital rates for PPH, and also affected low Apgar scores and the rate of newborns that required resuscitation in the hospital group.
  • Ignoring the economic and human cost associated with the typically high rate of medical interventions in the hospital cohort (Pitocin-accelerated labors, narcotic use, epidurals, episiotomies and admission of babies to neonatal intensive care)
  • Ignoring all the delayed and downstream complications associated with the high operative rate in planned hospital births, especially those in post-cesarean women in subsequent pregnancies, which include an increased rate of fetal demise from placental abruption and emergency hysterectomies due to placenta percreta. There is a 7% maternal mortality rate associated with this well-known complication of Cesarean surgery in subsequent pregnancies.

Baby deaths: the conundrum created by the Wax Meta-analysis

Perinatal death is a category that includes intrapartum and live-born babies who died within 7 days. This data is often stated twice – the rate when lethal birth defects are included and a lower rate that excludes “unpreventable” deaths associated with congenital anomalies.

Note: the 7 studies on perinatal mortality DO include the large study from the Netherlands by de Jonge et al, which contributed 331,666 midwife-attended PHB.

PHB –> 229 PN deaths out of a total of 331,666 PHBs (a rate of 0.07%)

Hospital –> 140 PN deaths out of a total of 175,443 hospital births (a rate of 0.08%)

Perinatal deaths EXCLUDING birth defects

PHB ~ 225 deaths out of 330,324 home births (a rate of 0.07%)

Hospital ~ 134 deaths out of 173,266 hospital births (a rate of 0.08%)

Neonatal deaths ~ only 6 studies used –> de Jonge and others excluded, while Pang and others were included

All newborn deaths, including those with lethal birth defects

PHB ~ 32 out of 16,500 births (1:515, or 1.9 per 1,000 live births)

Hospital ~ 32 out of 33,302 births (1:1,040, or 0.96 per 1,000 live births)

Neonatal deaths EXCLUDING birth defects

PHB ~ 23 neonatal deaths out of 15,633 births (incl. hosp. transfers): 1:670, or 1.5 per 1,000 live births

Hospital ~ 14 neonatal deaths out of 31,999 hospital births (excluding PHB transfers): 1:2,286, or 0.43 per 1,000 live births
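For readers who want to check the arithmetic, the death counts above can be turned into per-1,000 rates and crude risk ratios with a few lines of Python. Only the counts come from the text; the rounding choices are mine:

```python
# Recompute the neonatal mortality (NNM) rates from the raw counts above.
def per_1000(deaths, births):
    """Deaths per 1,000 live births, rounded to 2 decimals."""
    return round(deaths / births * 1000, 2)

# All newborn deaths, lethal birth defects included
print(per_1000(32, 16_500))   # PHB:      1.94 per 1,000
print(per_1000(32, 33_302))   # hospital: 0.96 per 1,000

# Newborn deaths, lethal birth defects excluded
print(per_1000(23, 15_633))   # PHB:      1.47 per 1,000
print(per_1000(14, 31_999))   # hospital: 0.44 per 1,000

# Crude risk ratios (PHB rate divided by hospital rate), computed unrounded
print(round((32 / 16_500) / (32 / 33_302), 2))   # 2.02 -- the "doubling"
print(round((23 / 15_633) / (14 / 31_999), 2))   # 3.36 -- the "tripling"
```

(Note that 14/31,999 works out to 0.4375 per 1,000, which the figures above truncate to 0.43.) The headline "2 to 3 times higher" falls straight out of these six numbers; the question raised in this article is whether those six numbers were the right ones to use.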


Wax Meta-analysis studies — what’s in, what’s out, and why such strange and counter-intuitive choices for the NNM rate

Here is the list of the 12 studies used by Wax et al to calculate the ‘results’ of its meta-analysis. As noted above, only 6 of the 12 studies were used to calculate NNM. A number of the best studies — the largest, the most recent (data since the year 2000), and those with the best methodology (prospective studies, and those with access to direct data on planning status and, in the case of NNM, medical records that could identify the cause of death relative to the planned place of birth) — were left on the cutting room floor.
It must be noted that the authors of such studies are free to include or exclude studies, or any other sources of data, as they see fit. Currently, there are no agreed-upon rules from the scientific community, so the persons doing the work get to do what they want. This is quite the opposite of traditional studies, which require prior approval by a review committee to assure that they are not harmful, exploitive or methodologically defective.

The meta-analysis is a relatively new form of study that rehashes the research of others by putting a lot of smaller numbers in a very big pot and seeing what conclusions can be drawn from such meta-data. What this often means — for better or worse — is that studies that did not independently have enough statistical “power” to make any definitive statements can now be added together; this gives the meta-analyst (the person, not the product) the ability to make definitive statements ‘second-hand’.
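To make the pooling idea concrete, here is a minimal sketch of one standard fixed-effect pooling method, the Mantel-Haenszel risk ratio. The three small studies are invented purely for illustration, and there is no claim that this is the exact method Wax et al used:

```python
# Minimal Mantel-Haenszel fixed-effect pooling of a risk ratio across studies.
# Each study is (events_a, total_a, events_b, total_b), e.g. deaths and
# births in a home-birth group vs. a hospital group. All numbers invented.

def mh_risk_ratio(studies):
    """Pooled risk ratio: sum of weighted group-A risks over group-B risks."""
    num = sum(ea * tb / (ta + tb) for ea, ta, eb, tb in studies)
    den = sum(eb * ta / (ta + tb) for ea, ta, eb, tb in studies)
    return num / den

# Three hypothetical small studies, none conclusive on its own:
studies = [
    (2, 1_000, 1, 1_200),
    (1,   800, 1,   900),
    (3, 2_500, 2, 2_600),
]
print(round(mh_risk_ratio(studies), 2))  # 1.65 -- one pooled summary estimate
```

Each invented study is far too small to say anything on its own; pooled, they yield a single summary risk ratio, which is exactly why the choice of which studies go into the pot matters so much.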

However, the definitions used by each independent study — done at different times, on different continents, on different demographics of childbearing women, in entirely different healthcare systems — routinely differ, sometimes quite dramatically, so that we are no longer comparing ‘like with like’. And yet, when these numbers are combined into this fancy piece of research known as a meta-analysis, each of these disparate contributions comes together like compound interest; this makes some of us famous while leaving many others out in the cold.

In regard to this particular meta-analysis, the picture that comes to mind is someone with the power to make up two lists. In doing so, they assign the great majority of benefits to themselves and most of the detriments to the other group. At the end of this process, it is legally a fait accompli, and we are just stuck with the problems this has created.

From my standpoint, that is what the authors of the Wax meta-analysis did when they excluded data that, if included, would have reversed its negative conclusions, and substituted data that was in general worthless. If useful data is your goal, two of the 12 studies stand out as particularly egregious.

The first is a randomized control trial that included only 11 subjects — 5 in the PHB group and 6 in the hospital cohort. Think of this when you see a study of 321,307 PHBs excluded from the NNM subset.

The second is likewise inexplicable — a study conducted in the late 1970s by a lay midwife and her obstetrician husband. She attended normal home births and transferred laboring women who risked out for home birth to her obstetrician-husband. Since very few midwives are married to obstetricians, there is nothing “representative” about this situation. Equally troubling is how small and how OLD this data set is — a total of only 521 home and hospital births that occurred 40 years ago. Nonetheless, the Koehler study was “in”, while the 2000-2006 Netherlands study was “out”. Go figure!

A list of the studies on which Wax et al based its conclusions.

KEY: The studies marked with a black X were excluded from the calculation of neonatal mortality.

Those in a purple font identify the 6 studies used to figure NNM.

BLUE text indicates best qualities of studies:

  • large and/or prospective
  • data collected since the year 2000

I have identified those qualities with blue ampersand symbols – &&& 

Red text identifies studies that for various reasons are not the best choices — for example, established to have flawed methodology, data more than 20 years old, or so small that they could not contribute their share of data, such as the randomized control trial that had only 11 participants.

The studies that should be removed from NNM subset are marked with a double “at” symbol {@@} and are tabbed to the right 5 spaces 

@remove@ ~ NNM ~ 4. Ackermann-Liebrich U, Voegeli T, Günter-Witt K, et al. Prospective study in Switzerland – Home versus hospital deliveries: follow-up study of matched pairs for procedure and outcomes. BMJ 1996;313:1313-8. {1989-1992, 489 PHB, 385 matched hospital cohort, total 874}
X ~ 5. Shearer JML. Five-year prospective survey of risk of booking for a home birth in Essex. BMJ 1985;219:1478-80. {1978 to 1983, 202 multip home births & 185 hospital births; total 387 }
X ~ 6. Wiegers TA, Keirse MJNC, van der Zee J, Berghs GAH. Outcome of planned home and planned hospital births in low-risk pregnancies: prospective study in midwifery practices in the Netherlands. BMJ 1996;313:1309-13. {1990-93, 1,140 PHB, 696 hospital cohort, total 1,836}
NNM ~ 7. Lindgren HE, Radestad IJ, Christensson K, Hildingsson IM. Outcomes of planned home births compared to hospital births in Sweden between 1992 and 2004: a population-based register study. Acta Obstet Gynecol 2008;87:751-9. {1992-2004, 896 PHB, 11,391 hospital, total of 12,287}
@remove@ ~ NNM ~ 10. Woodcock HC, Read AW, Bower C, Stanley FJ, Moore DJ. A matched cohort study of planned home and hospital births in Western Australia 1981-1987. Midwifery 1994;10:125-35. {976 PHB, 2,928 matched hospital cohort or total of 3,904}
&&& ~ X ~ 11. Hutton EK, Reitsma AH, Kaufman K. Outcomes associated with planned home and planned hospital births in low-risk women attended by midwives in Ontario, Canada, 2003-2006: a retrospective cohort study. Birth 2009;36:180-9. {years 2003 to 2006; 2,899 PHB w/ 10,083 hospital cohort, total 12,983}
&&& ~ X ~ 12. Janssen PA, Saxell L, Page LA, Klein MC, Liston RM, Lee SK. Outcomes of planned home birth with registered midwife versus planned hospital birth with midwife or physician. CMAJ 2009;181:377-83. {data from 2000-2004; PHB 2,899; hosp. cohort 10,083; total 12,982}
@remove@ ~ NNM ~ 13. Koehler NU, Solomon DA, Murphy M. Outcomes of a rural Sonoma county home birth practice: 1976-1982. Birth 1984;11:165-9. {retrospective study of the home birth practice of a solo lay midwife & her obstetrician-husband, who attended her hospital transfer patients; 454 home births & 67 hosp births, total of 521}
X ~ 14. Dowswell T, Thornton JG, Hewison J, Lilford RJL. Should there be a {randomised} trial of home versus hospital delivery in the United Kingdom? BMJ 1996;312:753-7. {data for 1994; 5 PHB, 6 hosp cohort, total of 11}
@remove@ ~ NNM ~ 15. Pang JWY, Heffelfinger JD, Huang GJ, Benedetti TJ, Weiss NJ. Outcomes of planned home births in Washington State: 1989-1996. Obstet Gynecol 2002;100:253-9. {1989 to 1996; 6,133 PHBs; 10,593 hosp cohort; total of 16,726}
&&& ~ X ~ 16. de Jonge A, van der Goes BY, Ravelli ACJ, et al. Perinatal mortality and morbidity in a nation-wide cohort of 529,688 low-risk planned home and hospital births {Netherlands}. BJOG 2009;116:1177-84. {data 2000-2006; PHB 321,307; 163,261 hosp birth cohort; total 484,568}
NNM ~ 17. Janssen PA, Lee SK, Ryan EM, et al. Outcomes of planned home births versus planned hospital births after regulation of midwifery in British Columbia. CMAJ 2002;166:315-23. {1998-1999, PHB 862, hosp cohort 1,314, total 2,176}