Original Research

Quality Assurance Practices in Obstetric Care

A Survey of Hospitals in California

Lundsberg, Lisbet S. PhD; Lee, Henry C. MD; Dueñas, Grace Villarin MPH; Gregory, Kimberly D. MD; Grossetta Nardini, Holly K. MLS; Pettker, Christian M. MD; Illuzzi, Jessica L. MD; Xu, Xiao PhD

doi: 10.1097/AOG.0000000000002437

Quality and safety assurance in obstetric care are of vital importance given the large annual volume of births (3.98 million births in the United States in 2015).1 Moreover, obstetrics is unique because it involves two patients (mother and fetus), and unexpected adverse maternal and newborn outcomes, such as postpartum hemorrhage and neurologic damage, can have significant and often long-term health consequences for previously healthy individuals.2,3 Nevertheless, 1.0% of mothers and 2.3% of newborns experience severe morbidities during birth,3,4 suggesting potential room for improvement and a need for continued attention to perinatal care quality.2,5,6

Hospitals play a pivotal role in ensuring quality of obstetric care because 98.5% of U.S. births occur within hospitals.1 However, research evaluating hospitals' quality management activities in obstetrics is sparse and has been limited to selected areas of care (eg, postpartum hemorrhage,7 shoulder dystocia,8 or obstetric emergencies9) rather than assessing the broad spectrum of obstetric care. A more comprehensive assessment of hospital quality assurance activities would help identify areas of current practice needing improvement.

Moreover, research has shown wide variability in intrapartum practices across hospitals (eg, elective induction of labor and cesarean delivery).10,11 This suggests potential differences in hospitals' approaches to and perceptions of risk, as well as possible variation in their quality management activities. Identifying institutional characteristics associated with different quality assurance practices can help determine which types of hospitals may benefit from enhanced efforts. To address these issues, we evaluated quality management activities among obstetric hospitals in California and examined variation in adoption of evidence-supported practices.

MATERIALS AND METHODS

Data for this study came from a larger project (the California Hospital Survey of Maternity and Newborn Care) in which a statewide cross-sectional survey of hospitals in California was conducted to understand their practices of perinatal care. Guided by the overarching hypothesis that hospitals vary in their practice of obstetric care, which could result in differences in resource utilization and birth outcomes, we developed the questionnaire for the California Hospital Survey of Maternity and Newborn Care based on a thorough review of the relevant literature. In particular, we reviewed published articles on hospital variation in practice, resource utilization, and outcomes of obstetric care, as well as factors that might contribute to such variation. A medical librarian conducted a comprehensive search of three bibliographic databases (Ovid MEDLINE, PubMed, and EMBASE) using combinations of controlled vocabulary and text words (Appendix 1, available online at http://links.lww.com/AOG/B52). Our research team (G.V.D., H.C.L., J.L.I., L.S.L., X.X.) reviewed the 2,270 identified articles, and the results guided selection of topic areas and questions to be included in the survey. We also reviewed the design and content of questionnaires used in other relevant surveys examining hospital efforts to improve quality of perinatal care and culture of patient safety12–14 and adopted and adapted some of those questions for our survey instrument.

Based on these assessments, we developed a draft survey instrument. We then convened 11 clinicians and one patient representative for an open-ended, focus group-style discussion to review and refine survey questions and ensure comprehensiveness of survey content. The clinicians had expertise in obstetrics, perinatology, anesthesiology, midwifery, family medicine, and obstetric nursing. We pilot-tested the refined survey instrument at two hospitals affiliated with the authors' institutions and five hospitals in California. These pilot test hospitals were diverse in size, teaching status, and level of care, and pilot test respondents had different roles and backgrounds in their obstetric units. We revised, added, and deleted survey questions based on pilot testing feedback. The final instrument included 148 questions covering a wide range of topics, assessing obstetric unit infrastructure and staffing, organization and delivery of intrapartum care, cost containment initiatives, and quality assurance and improvement efforts. The final questionnaire was offered in both hard-copy and web-based formats using the Qualtrics software platform.

The survey was launched in September 2015. Nonmilitary hospitals throughout California that provided obstetric delivery services at the time of the survey were eligible to participate. We identified these hospitals using a comprehensive list of maternity hospitals maintained by the California Maternal Quality Care Collaborative, supplemented by our review of delivery volume (ie, having one or more deliveries) in the annual financial reports that all California hospitals submit to the Office of Statewide Health Planning and Development.15 Working in collaboration with the California Maternal Quality Care Collaborative and the California Perinatal Quality Care Collaborative, we identified labor and delivery nurse managers, directors of maternity services, or other individuals knowledgeable about obstetric unit practices as potential survey respondents. An initial introductory email explained the purpose and process of the survey, followed by a separate email inviting participation with an electronic link to the web-based survey. Recognizing that a single person might not be familiar with all topics assessed in the questionnaire, respondents were encouraged to obtain input from other knowledgeable staff as needed. Email reminders were sent to enhance the response rate, and nonrespondents were followed up with a phone call and hard-copy mailings of the survey instrument. If there was still no response after these reminders, we collaborated with the California Maternal Quality Care Collaborative and California Perinatal Quality Care Collaborative or contacted hospital labor and delivery units to identify an alternative individual to respond to the survey. A $25 gift card was provided to respondents in appreciation of their time.
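
To make the hospital-identification step concrete, the following is a minimal sketch (not the authors' actual procedure; file and column names are hypothetical) of combining the California Maternal Quality Care Collaborative maternity hospital list with hospitals reporting one or more deliveries in the Office of Statewide Health Planning and Development annual financial reports:

```python
# Sketch only: assemble a candidate list of obstetric hospitals by taking the
# union of the CMQCC maternity hospital list and OSHPD-reporting hospitals with
# one or more deliveries. File and column names are hypothetical.
import pandas as pd

cmqcc = pd.read_csv("cmqcc_maternity_hospitals.csv")      # hospital_id, hospital_name
oshpd = pd.read_csv("oshpd_annual_financial_report.csv")  # hospital_id, hospital_name, annual_deliveries

# Hospitals with any reported delivery volume in the financial reports
delivering = oshpd.loc[oshpd["annual_deliveries"] >= 1, ["hospital_id", "hospital_name"]]

# Union of the two sources, de-duplicated on the hospital identifier
eligible = (
    pd.concat([cmqcc[["hospital_id", "hospital_name"]], delivering])
      .drop_duplicates(subset="hospital_id")
      .reset_index(drop=True)
)
print(f"{len(eligible)} candidate hospitals before excluding military and closed facilities")
```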

We downloaded responses from the survey website electronically to a study database. Completed hard-copy surveys were reviewed and manually entered into the database (entered by one research staff member and quality-checked by a second). The Yale University Human Investigation Committee and the Stanford University Administrative Panel for the Protection of Human Subjects reviewed the study protocol and determined the California Hospital Survey of Maternity and Newborn Care to be exempt from institutional review board review because it involved a survey of public behavior. Survey participants were provided study information at the beginning of the questionnaire, and their survey response indicated consent to the study. Additionally, the Stanford University Administrative Panel for the Protection of Human Subjects approved the focus group portion of the study. All focus group members provided verbal consent.
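
As an illustration of the double-entry quality check described above, one way to flag discrepancies between the primary entry and the second reviewer's entry of the hard-copy surveys is sketched below; this is not the authors' workflow, and the file and column names are hypothetical:

```python
# Sketch only: compare two independent entries of the hard-copy surveys and flag
# discrepant cells for manual resolution. File and column names are hypothetical.
import pandas as pd

primary = pd.read_csv("hardcopy_entry_primary.csv", index_col="hospital_id")
check = pd.read_csv("hardcopy_entry_check.csv", index_col="hospital_id")

# Align the two entries on hospital ID and question columns
primary, check = primary.align(check, join="inner")

# A cell is discrepant if the values differ and are not both missing
mismatch = (primary != check) & ~(primary.isna() & check.isna())

# List (hospital_id, question) pairs that need review
to_resolve = mismatch.stack()
to_resolve = to_resolve[to_resolve].index.tolist()
print(f"{len(to_resolve)} discrepant cells to resolve")
```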

The current study focused on hospital quality management practices and analyzed data from the relevant sections of the survey, including use of clinical protocols, tracking of quality indicators, credentialing requirements, training and interprofessional simulation, review of cases with significant morbidity or mortality, infrastructure and systems for gathering feedback, and other quality improvement efforts (see Appendix 2, available online at http://links.lww.com/AOG/B52, for these survey questions). We paid particular attention to several practices that prior research has reported to be effective in improving birth outcomes or patient safety or that the American College of Obstetricians and Gynecologists and the Society for Maternal-Fetal Medicine have recommended as useful steps to safely reduce primary cesarean delivery.16 These evidence-supported quality assurance or quality improvement practices included: multidisciplinary team training and simulations for shoulder dystocia,8,17,18 postpartum hemorrhage,19 and eclampsia9; tracking and review of severe maternal morbidity cases20; assessment of indications for cesarean delivery16; and clarification and standardization of the diagnosis of labor dystocia and of fetal heart rate interpretation.16

We tabulated hospital quality assurance practices as well as respondent and institutional characteristics using frequencies and percentages for categorical variables and medians and interquartile ranges for continuous measures. We also estimated 95% CIs for the proportion of hospitals practicing each specific activity (with finite population correction, because our sample included most obstetric hospitals in California). For each variable, the reported estimate reflects the corresponding statistic among hospitals with nonmissing data on that variable. To assess potential nonresponse bias, we compared characteristics of hospitals that completed the survey sections on quality assurance with those of hospitals that did not complete these sections or did not respond to the survey, using the χ2 or Fisher exact test as appropriate.
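
As a worked illustration of the finite population correction (a standard formulation, not necessarily the exact one used in the analysis), the 95% CI for the proportion of hospitals practicing a given activity can be written as

\[
\hat{p} \;\pm\; 1.96\,\sqrt{\frac{\hat{p}\,(1-\hat{p})}{n}\cdot\frac{N-n}{N-1}},
\]

where \(\hat{p}\) is the observed proportion among the \(n\) hospitals with nonmissing data and \(N\) is the number of eligible obstetric hospitals in California (245). For example, with hypothetical values \(\hat{p}=0.70\) and \(n=185\), the half-width of the CI shrinks from approximately \(1.96\sqrt{0.70\times 0.30/185}\approx 0.066\) without correction to \(0.066\times\sqrt{(245-185)/244}\approx 0.033\) with correction.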

Additionally, we performed bivariate analyses to examine the association between institutional characteristics and a hospital's likelihood of adopting evidence-supported practices. The Cochran-Armitage test and Spearman correlation were used to assess trends in the frequency of interprofessional simulations across institutional characteristics, and the χ2, Fisher exact, and Wilcoxon rank-sum tests were used to evaluate the relationship of hospital characteristics with other evidence-based practices. Institutional characteristics of teaching (compared with nonteaching) status, urban (compared with rural) location, and type of ownership (government nonfederal, private for-profit, and private nonprofit) were obtained from the 2015 American Hospital Association annual survey.21 Hospital annual delivery volume was ascertained from survey responses. Statistical analysis was performed using SAS 9.4.
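
The following is a minimal sketch of the bivariate tests named above, applied to synthetic hospital-level data (this is illustrative Python, not the authors' SAS code; all variable names and values are hypothetical, and the Cochran-Armitage trend test, which has no direct SciPy implementation, is omitted):

```python
# Sketch only: bivariate tests analogous to those described in the Methods,
# run on synthetic hospital-level data. Not the study's analysis code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 185  # hospitals that completed the quality assurance sections

teaching = rng.integers(0, 2, size=n)           # 1 = teaching, 0 = nonteaching
sim_frequency = rng.integers(0, 4, size=n)      # ordinal: 0 = never ... 3 = at least annually
delivery_volume = rng.integers(200, 6000, n)    # annual delivery volume
has_fhr_protocol = rng.integers(0, 2, size=n)   # written protocol for abnormal FHR tracings

# Spearman correlation: delivery volume vs. ordinal simulation frequency
rho, p_spearman = stats.spearmanr(delivery_volume, sim_frequency)

# Chi-square test of independence: teaching status vs. protocol adoption (2x2 table)
table = np.array([[np.sum((teaching == t) & (has_fhr_protocol == h)) for h in (0, 1)]
                  for t in (0, 1)])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Fisher exact test on the same 2x2 table (preferred when expected cell counts are small)
odds_ratio, p_fisher = stats.fisher_exact(table)

# Wilcoxon rank-sum (Mann-Whitney U) test: delivery volume by protocol adoption
p_wilcoxon = stats.mannwhitneyu(delivery_volume[has_fhr_protocol == 1],
                                delivery_volume[has_fhr_protocol == 0]).pvalue

print(f"Spearman rho={rho:.2f} (P={p_spearman:.3f}); chi-square P={p_chi2:.3f}; "
      f"Fisher P={p_fisher:.3f}; Wilcoxon rank-sum P={p_wilcoxon:.3f}")
```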

RESULTS

The California Hospital Survey of Maternity and Newborn Care initially contacted 253 hospitals (Appendix 3, available online at http://links.lww.com/AOG/B52). Seven hospitals did not provide or no longer provided labor and delivery services, and one had closed. Of the remaining 245 hospitals actively providing labor and delivery services, 191 had responded to our survey as of August 2016, yielding an overall response rate of 78.0%. The present study focused on the subset of 185 hospitals that completed the survey sections on quality assurance practices. These hospitals were similar in teaching status, volume, and type of ownership to nonresponding hospitals and hospitals that did not complete the quality assurance sections, but included a higher proportion of rural institutions (Appendix 4, available online at http://links.lww.com/AOG/B52). Among the 185 hospitals, some quality practice responses were missing; reported statistics for each practice reflect estimates among those providing data on the corresponding variable.
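
For clarity, both response rates cited in this article follow directly from these counts:

\[
\frac{191}{245}\approx 78.0\%\ \text{(overall response rate)},\qquad
\frac{185}{245}\approx 75.5\%\ \text{(hospitals completing the quality assurance sections, analyzed here)}.
\]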

Survey respondents at the 185 included hospitals were primarily directors of maternity, women's, or perinatal services (42.4%) or labor and delivery or perinatal nurse managers (29.4%) (Table 1). Median respondent tenure in the current position was 4 years (interquartile range 1.5–10), and median tenure in the current obstetric unit was 12.5 years (interquartile range 5–22). These hospitals had a median annual birth volume of 1,600 (interquartile range 700–2,885) and were diverse in their characteristics.

Table 1. Characteristics of Survey Respondents and Responding Hospitals

Figure 1A and Appendix 5, available online at http://links.lww.com/AOG/B52, summarize hospital practices related to audit and review of written clinical protocols. Most hospitals (77.7%) reported not having written guidelines for diagnosis of labor arrest, and another 4.9% reported having such guidelines but not regularly auditing adherence. For management of abnormal (nonreassuring) fetal heart rate, 16.8% of hospitals did not have written protocols and 20.7% had protocols but did not regularly audit them. Additionally, for protocols related to diagnosis of anal sphincter injury and perineal repair, 77.1% of hospitals reported not having such protocols and 3.3% reported not regularly auditing them. For other areas of care, although most hospitals regularly audited adherence to protocols, a substantial proportion did not have standard protocols or did not regularly audit them (Fig. 1A). Peer review or practice audit was the most frequently used remedy when a health care provider showed poor compliance with written protocols (reported by 86.5% of hospitals; data not shown).

Fig. 1. Hospital practices regarding protocol auditing and interprofessional simulations. A. Hospital audit of adherence to written clinical protocols. B. Hospital conduct of interprofessional simulations for clinical scenarios. Percentages may not add to 100 as a result of rounding.

Most hospitals regularly tracked common quality indicators (Table 2), ranging from 70.9% for the rate of vaginal birth after cesarean delivery to 99.5% for overall and primary cesarean delivery rates and 100% for patient satisfaction. However, fewer hospitals tracked indications for cesarean delivery (69.0%) and induction of labor (62.0%) or the rate of episiotomy (60.3%). When asked about other quality improvement efforts, 77.4% of hospitals reported initiatives to reduce the primary cesarean delivery rate and 92.3% reported initiatives to limit elective labor induction before 39 weeks of gestation, whereas 91.8%, 74.6%, and 71.8% reported adoption of California Maternal Quality Care Collaborative tool kits for obstetric hemorrhage, preeclampsia, and early elective deliveries, respectively.22 However, fewer hospitals reported enhanced patient education regarding cesarean delivery without medical indications (58.8%).

Table 2. Hospital Tracking of Quality Indicators and Engagement in Quality Improvement Efforts

Formal credentialing or certification in fetal monitoring was required for nurses at most (79.3%) responding hospitals, whereas only 28.8% of hospitals required attending physicians or midwives to be formally credentialed or certified, and 20.1% reported formal credentialing or certification was not required of providers (Table 3). Likewise, regular training in patient communication and team communication skills was conducted primarily among nurses (60.9% and 61.4%, respectively) rather than attending physicians or midwives (19.0% and 29.9%, respectively). Importantly, one in 10 hospitals (9.7%) reported not regularly reviewing cases with significant morbidity or mortality, and 14.6% regularly reviewed these cases but did not regularly implement remedies.

Table 3. Hospital Organization of Quality Assurance Activities

Most hospitals conducted interprofessional simulations at least annually for eclampsia (53.1%), emergency cesarean delivery (64.6%), shoulder dystocia (68.7%), neonatal resuscitation (74.3%), and postpartum hemorrhage (77.7%) (Fig. 1B; Appendix 6, available online at http://links.lww.com/AOG/B52). Nevertheless, only 34.4% performed such simulations for anesthesia-related obstetric emergencies at least annually. Moreover, 26.3%, 14.3%, and 8.7% of the hospitals reported never performing interprofessional simulations for eclampsia, shoulder dystocia, or postpartum hemorrhage, respectively.

Teaching status was significantly associated with more frequent conduct of interprofessional simulations for eclampsia (P=.003), shoulder dystocia (P=.02), and postpartum hemorrhage (P=.04) (Table 4), and larger delivery volume was associated with more frequent simulations for eclampsia (correlation coefficient=0.16, P=.04). Hospital ownership type was significantly associated with the likelihood of having written protocols for abnormal fetal heart rate tracings (P=.01); closer examination showed that hospitals with private nonprofit ownership were more likely than other hospitals to have such a protocol (89.4% vs 71.7%, P=.002; data not shown in tables). There were no other statistically significant differences in practices by hospital characteristics.

Table 4. Association Between Institutional Characteristics and Evidence-Supported Obstetric Quality Assurance Practices

DISCUSSION

Obstetric units at 185 hospitals in California engaged in a broad range of quality assurance activities, but their practices varied. Teaching hospitals and hospitals with larger delivery volume or private nonprofit ownership made greater use of evidence-supported practices.

This study extends the scant but emerging literature on hospital obstetric quality assurance practices and patient safety initiatives. Despite growing interest in understanding hospital activities to improve quality of care,23,24 prior studies have rarely focused on obstetrics, although childbirth remains the leading reason for hospitalization in the United States.25 The few studies that assessed obstetric practices were limited to procedures or guidelines in selected areas of care.7–9,26 In contrast, our analysis provides a comprehensive description of a broad spectrum of obstetric quality assurance activities, enabling improved understanding of overall practice patterns among hospitals.

We found important gaps in the use of evidence-supported practices known to be beneficial for improving patient outcomes and safety. For instance, a concerning proportion of hospitals reported never or infrequently conducting interprofessional simulations for obstetric emergencies, and one tenth of obstetric units did not regularly review cases with significant morbidity or mortality, highlighting opportunities for improvement. Additionally, the lack of protocols, or of regular audit of protocols, for diagnosis of labor arrest and management of abnormal fetal status was notable, and the majority of hospitals required only nurses, but not physicians or midwives, to be formally credentialed or certified in fetal monitoring. The low proportion of hospitals that required formal credentialing in fetal monitoring for residents may reflect our survey's inquiring about “formal” credentialing; some hospitals might use educational or more informal approaches to ensure that residents practice within standards. Likewise, some hospitals might not require “formal” credentialing for residents because residents are still in a training position and do not have hospital privileges. Because labor dystocia or arrest and abnormal fetal heart rate status are the leading indications contributing to the increased primary cesarean delivery rate,16,27 enhancing protocols for management of these conditions may be particularly useful in safely reducing primary cesarean deliveries and associated morbidities. Further research is needed to validate our findings and to assess the exact nature of the protocols for management of abnormal fetal heart rate at hospitals that reported having them.

We also identified significant proportions of hospitals lagging in other quality management activities (eg, tracking quality indicators, regular team communication training). The utility and cost-effectiveness of these practices in promoting better outcomes or quality of care require further investigation. Our exploratory analysis found no significant difference in the publicly reported rate of cesarean delivery among nulliparous term singleton vertex births and the rate of vaginal birth after cesarean delivery28 between hospitals that regularly tracked these rates compared with those that did not. However, such cross-sectional assessments are confounded by potential reverse causality and do not measure ultimate health outcomes. Further research prospectively monitoring maternal and neonatal outcomes and assessing cost implications29 of these initiatives will help identify beneficial practices. Additionally, some hospital practices (eg, tracking quality indicators or gathering feedback from patients but not using them to stimulate changes or remedies) may suggest missed opportunities and the need for a more proactive approach to quality assurance and more attention to the “quality” or “adequacy” of these efforts.

Moreover, there is important variation in quality assurance practices across hospitals. Smaller-volume, nonteaching, and private for-profit hospitals may particularly benefit from enhanced efforts. Active involvement of these hospitals in future perinatal collaborative models may allow them to benefit from pooled resources. Nevertheless, some important differences in quality management practices among hospitals (eg, severe morbidity and mortality review) could not be explained by the conventional institutional characteristics assessed in our analysis. Additional research is needed to understand the role of other factors (eg, insurer mandates of quality metrics) in explaining such variability in practice.

Findings of this study should be interpreted with several limitations in mind. First, our data came from California, where there are robust quality efforts led by the California Maternal Quality Care Collaborative and California Perinatal Quality Care Collaborative.30 Hence, our findings may not be generalizable to practices elsewhere in the country. Second, as with all survey research, there is potential for nonresponse bias. Nevertheless, with an overall response rate of 78.0% (75.5% for this analysis) and largely similar characteristics between responding and nonresponding hospitals, the effect of nonresponse bias should be minimal. Third, hospital data were collected from a single informant and hence may be subject to reporting or interpretation bias. However, survey respondents demonstrated extensive familiarity with their institution's obstetric practice (long tenure in their current position and obstetric unit, and most identified as directors or managers of perinatal services), and 37.4% of them reported obtaining input from other colleagues when completing the survey. These factors should help reduce bias. Finally, the modest sample size and low proportion of hospitals with certain characteristics or practices limited our statistical power for discerning differences among hospitals. Although additional research is warranted to validate our findings, these data help facilitate hypothesis generation and inform future studies.

Overall, hospitals in California actively engaged in efforts to ensure quality of obstetric care. However, there was large variation in their practices. Further research is needed to elucidate causes of such variation and its effect on patient safety and maternal and neonatal outcomes.

REFERENCES

1. Martin JA, Hamilton BE, Osterman MJK, Driscoll AK, Mathews TJ. Births: final data for 2015. Natl Vital Stat Rep 2017;66:1.
2. Callaghan WM, Creanga AA, Kuklina EV. Severe maternal morbidity among delivery and postpartum hospitalizations in the United States. Obstet Gynecol 2012;120:1029–36.
3. Centers for Disease Control and Prevention. Severe maternal morbidity in the United States. Available at: https://www.cdc.gov/reproductivehealth/maternalinfanthealth/severematernalmorbidity.html. Retrieved August 19, 2016.
4. California Maternal Quality Care Collaborative. Quality measures, algorithm flowcharts. Available at: https://www.cmqcc.org/focus-areas/quality-measures/unexpected-complications-term-newborns/algorithm-flowcharts. Retrieved December 11, 2017.
5. D'Alton ME, Main EK, Menard MK, Levy BS. The National Partnership for Maternal Safety. Obstet Gynecol 2014;123:973–7.
6. Lyndon A, Johnson MC, Bingham D, Napolitano PG, Joseph G, Maxfield DG, et al. Transforming communication and safety culture in intrapartum care: a multi-organization blueprint. Obstet Gynecol 2015;125:1049–55.
7. Kacmar RM, Mhyre JM, Scavone BM, Fuller AJ, Toledo P. The use of postpartum hemorrhage protocols in United States academic obstetric anesthesia units. Anesth Analg 2014;119:906–10.
8. Shaddeau AK, Deering S. Simulation and shoulder dystocia. Clin Obstet Gynecol 2016;59:853–8.
9. Merién AE, van de Ven J, Mol BW, Houterman S, Oei SG. Multidisciplinary team training in a simulation setting for acute obstetric emergencies: a systematic review. Obstet Gynecol 2010;115:1021–31.
10. Barger MK, Dunn JT, Bearman S, DeLain M, Gates E. A survey of access to trial of labor in California hospitals in 2012. BMC Pregnancy Childbirth 2013;13:83.
11. Glantz JC. Labor induction rate variation in upstate New York: what is the difference? Birth 2003;30:168–74.
12. Korst LM, Feldman DS, Bollman DL, Fridman M, El Haj Ibrahim S, Fink A, et al. Variation in childbirth services in California: a cross-sectional survey of childbirth hospitals. Am J Obstet Gynecol 2015;213:523.e1–8.
13. Agency for Healthcare Research and Quality (AHRQ). Hospital survey on patient safety culture. Available at: http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/hospital/index.html. Retrieved October 27, 2017.
14. King V, Slaughter-Mason S, King A, Frew P, Thompson J, Evans R, et al. Improving maternal & neonatal outcomes: toolkit for reducing cesarean deliveries. Portland (OR): Oregon Health & Sciences University; 2013.
15. Office of Statewide Health Planning and Development (OSHPD). Annual financial report. Sacramento (CA): Office of Statewide Health Planning and Development; 2017.
16. Safe prevention of the primary cesarean delivery. Obstetric Care Consensus No. 1. American College of Obstetricians and Gynecologists. Obstet Gynecol 2014;123:693–711.
17. Draycott TJ, Crofts JF, Ash JP, Wilson LV, Yard E, Sibanda T, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol 2008;112:14–20.
18. Grobman WA, Miller D, Burke C, Hornbogen A, Tam K, Costello R. Outcomes associated with introduction of a shoulder dystocia protocol. Am J Obstet Gynecol 2011;205:513–7.
19. Shields LE, Wiesner S, Fulton J, Pelletreau B. Comprehensive maternal hemorrhage protocols reduce the use of blood products and improve patient safety. Am J Obstet Gynecol 2015;212:272–80.
20. Obstetric Care Consensus No. 5: severe maternal morbidity: screening and review. Obstet Gynecol 2016;128:e54–60.
21. American Hospital Association (AHA). AHA annual survey database. Available at: https://www.ahadataviewer.com/additional-data-products/aha-survey/. Retrieved December 11, 2017.
22. California Maternal Quality Care Collaborative. Maternal quality improvement toolkits. Available at: https://www.cmqcc.org/resources-tool-kits/toolkits. Retrieved December 11, 2017.
23. Cohen AB, Restuccia JD, Shwartz M, Drake JE, Kang R, Kralovec P, et al. A survey of hospital quality improvement activities. Med Care Res Rev 2008;65:571–95.
24. Weiner BJ, Alexander JA, Baker LC, Shortell SM, Becker M. Quality improvement implementation and hospital performance on patient safety indicators. Med Care Res Rev 2006;63:29–57.
25. Podulka J, Stranges E, Steiner C. Hospitalizations related to childbirth, 2008. Rockville (MD): Agency for Healthcare Research and Quality; 2011.
26. Rasmussen OB, Yding A, Anh OJ, Sander Andersen C, Boris J. Reducing the incidence of obstetric sphincter injuries using a hands-on technique: an interventional quality improvement project. BMJ Qual Improv Rep 2016;5. pii: u217936.w7106.
27. Barber EL, Lundsberg LS, Belanger K, Pettker CM, Funai EF, Illuzzi JL. Indications contributing to the increasing cesarean delivery rate. Obstet Gynecol 2011;118:29–38.
28. Cal Hospital Compare. Available at: http://calhospitalcompare.org. Retrieved October 4, 2017.
29. Schuster MA, Onorato SE, Meltzer DO. Measuring the cost of quality measurement: a missing link in quality strategy. JAMA 2017;318:1219–20.
30. Centers for Disease Control and Prevention. Perinatal quality collaboratives. Atlanta (GA). Available at: https://www.cdc.gov/reproductivehealth/maternalinfanthealth/pqc.htm. Retrieved October 16, 2017.


© 2018 by The American College of Obstetricians and Gynecologists. Published by Wolters Kluwer Health, Inc. All rights reserved.