Electronic fetal monitoring is most commonly taught through clinical experience (Table 3), a method used by 219 programs (92%). Five programs (2%) use this as their only approach. Structured lecture, seminar, or in-service training is provided by 208 programs (87%). Case studies with FHR tracing review during a regularly scheduled conference or as part of a morbidity and mortality review are used by 204 programs (85%). More than half of programs use written materials. Less commonly used methods are computer-assisted tutorials, real-time interactive simulation, and instructional videotapes. One fellowship reported that it has no formal program for teaching EFM. Significantly more residencies than fellowships use lectures and written materials to teach EFM (P = .03).
To maintain EFM skills, clinical experience was again reported to be the most commonly used method, with 219 programs (92%) using this approach (Table 4). Case studies with FHR tracing review are also frequently used at forums such as morning rounds, department grand rounds, conferences, and morbidity and mortality reviews. Structured lectures, seminars, and in-services are used less frequently to maintain EFM skills; a similar trend is seen for written materials. Real-time interactive simulation, computer-assisted tutorials, and instructional videotapes are rarely used. Significantly more residencies use written materials (P = .01) to maintain EFM skills than do fellowships.
One hundred two programs (43%) reported that their residents and fellows participate in continuing education activities pertaining to EFM at least monthly (Table 5). An additional 55 programs (23%), the majority of which are residency programs (P = .02), provide training every 2–6 months. Twenty-three programs (10%) reported that they do not require continuing EFM education or did not respond to the question.
Competency in EFM is most commonly assessed by faculty or peer review (Table 6). Various types of quality assurance programs (eg, morbidity and mortality review) are also used frequently. Written or oral examinations (P = .001), skills checklists (P = .01), and logbooks are used exclusively by residencies. One third of fellowships (14 of 43) and 12% (24 of 196) of residencies (P = .001) reported that they do not formally assess competency in EFM.
Competency assessment is described as an ongoing process by many program directors. Seventy-six programs (32%) have monthly assessments, whereas an additional 82 programs (34%) assess their trainees every 2–6 months (Table 7).
There have been repeated calls for improved educational programs and competency validation in EFM, 15,16 but according to our survey, most US training programs still use supervised clinical experience as both their primary (and, in some cases, only) method of teaching EFM and their principal competency assessment tool.
The ideal format for teaching EFM is unknown. The traditional model of medical education places a trainee in a supervised environment with the expectation that guided experience will lead to acquisition of skills adequate for independent, competent practice in the community. This “learn by doing” approach assumes that FHR tracings encountered during training will be varied enough to illustrate all of the important principles and patterns of EFM and that experienced individuals will be on hand to review the tracings and teach from them. Often neither is the case. 17 Didactic teaching sessions are particularly helpful for novices learning EFM but are extremely time intensive and, hence (as reflected in our survey), not practical for skill maintenance. Physicians are motivated to learn in response to specific problems posed by particular patients. 18 Perhaps it is on this basis that the program directors we surveyed reported that they frequently use case studies with strip review as a tool for teaching EFM. Computer programs, when compared with an equivalent lecture, have been shown to produce similar gains in EFM knowledge in nearly half the time 19 but, according to our survey, are not widely used for training physicians in the United States.
The optimal frequency for providing continuing education in EFM is, likewise, unknown. Beckley et al 20 found that knowledge pertaining to EFM interpretation learned from a computer program was retained up to 7 months. Trepanier et al 21 determined that participants in a 6-hour-long EFM workshop lost little of their theoretical knowledge after a 6-month period, but that their practical clinical skills dropped by almost 10% in the same time frame. Two thirds of the programs we surveyed reported that their constituents participate in EFM training activities at least every 6 months. Maternal–fetal medicine fellows are more focused in their training and, hence, participate in these activities more frequently than obstetrics and gynecology residents. One residency program director reported that their residents “may rotate on Oncology and go for 5 months without reading a [FHR] strip.” Other factors cited by respondents as affecting the frequency of EFM instruction include volume of births and difficulty of cases.
Clinical competence is defined as the capability to perform acceptably those duties directly related to patient care. 22 Since 1994, the Joint Commission on Accreditation of Healthcare Organizations has mandated yearly validation of core competencies for nurses, 23 and formalized assessment in EFM has already been incorporated into nursing practice. 24 Physicians are not usually employed by hospitals, so they less commonly have similar processes or standards. 15 The typical expectation for physician competence has been graduation from an accredited medical school, completion of a residency program, successful completion of board examinations, and maintenance of a state license to practice medicine. But standards are changing. The Outcome Project of the Accreditation Council for Graduate Medical Education is a long-term initiative designed to increase emphasis on competency-based assessment of residents, thereby improving patient care. 25 Beginning in July 2002, subspecialty programs (including obstetrics and gynecology) were required to adopt new competency program requirements. As a consequence, we found in our survey that residency programs are more likely to formally assess their constituents than are fellowship programs.
We recognize the limitations of this survey. The findings are drawn exclusively from the subjective experience and perspectives of residency and fellowship program directors. Input from other participants in the educational process (such as residents, fellows, and faculty) was not solicited. Despite the overall candor of responses and comments, we acknowledge that individual respondents may have chosen to emphasize the strengths rather than the weaknesses of their respective programs. However, because this group of physicians is frequently surveyed in efforts to gauge training activities, we believe their answers accurately represent the status of their programs. Another limitation is that only 76% of program directors responded. Our sample, however, was balanced both geographically and in program size and affiliation. It is unlikely, though unprovable, that the nonresponders have more detailed formal programs for EFM education and assessment. The largest obstacle in devising a survey to accurately characterize training and assessment was establishing specific definitions for each training and assessment method. We selected the terms in an attempt to minimize variation in interpretation of survey questions, but despite our best efforts, this variation will undoubtedly exist and influence program directors' responses. “Clinical experience” was meant to encompass skills gained while taking care of real patients in labor. This type of training is customarily supervised by an attending physician, fellow, or senior-level resident, though the degree of supervision can vary. “Written materials” was intended to include textbooks, manuscripts, pamphlets, and similar types of textual instructional resources. We meant for “written examination” to refer to paper-based testing materials. “Formal” EFM training and assessment was intended to refer to a program's prescribed educational agenda and trainee evaluation.
A 1989 survey of Canadian hospitals found that EFM training programs for physicians varied widely and called for standardized educational programs and practice protocols for teaching hospitals in that country. 26 This survey documents a lack of formalized education in US physician training programs for this common obstetric skill and will, hopefully, serve as a starting point for critically evaluating and improving EFM teaching and assessment methods.
1. Leveno KJ, Cunningham FG, Nelson S, Roark M, Williams ML, Guzick D, et al. A prospective comparison of selective and universal electronic fetal monitoring in 34,995 pregnancies. N Engl J Med 1986;315:615–9.
2. MacDonald D, Grant A, Sheridan-Pereira M, Boylan P, Chalmers I. The Dublin randomized controlled trial of intrapartum fetal heart rate monitoring. Am J Obstet Gynecol 1985;152:524–39.
3. Haverkamp AD, Orleans M, Langendoerfer S, McFee J, Murphy J, Thompson HE. A controlled trial of the differential effects of intrapartum fetal monitoring. Am J Obstet Gynecol 1979;134:399–412.
4. Neldam S, Osler M, Hansen PK, Nim J, Smith SF, Hertel J. Intrapartum fetal heart rate monitoring in a combined low and high-risk population: A controlled clinical trial. Eur J Obstet Gynecol Reprod Biol 1986;23:1–11.
5. Kelso IM, Parsons RJ, Lawrence GF, Arora SS, Edmonds DK, Cooke ID. An assessment of continuous fetal heart rate monitoring in labor. A randomized trial. Am J Obstet Gynecol 1978;131:526–32.
6. Luthy DA, Shy KK, van Belle G, Larson EB, Hughes JP, Benedetti TJ, et al. A randomized trial of electronic fetal monitoring in preterm labor. Obstet Gynecol 1987;69:687–95.
7. Curtin SC, Mathews TJ. U.S. obstetric procedures, 1998. Birth 2000;27:136–8.
8. Thacker SB, Stroup D, Chang M. Continuous electronic heart rate monitoring for fetal assessment during labor. Cochrane Database Syst Rev 2001;2:CD000063.
9. American College of Obstetricians and Gynecologists. 1999 survey of professional liability: Claims data tabulations. Washington: American College of Obstetricians and Gynecologists, 1999.
10. Accreditation Council for Graduate Medical Education. Program requirements for residency education in obstetrics and gynecology. Available at: http://www.acgme.org/req/220pr701.asp. Accessed 2002 Mar 4.
11. Council on Resident Education in Obstetrics and Gynecology. Educational objectives: Core curriculum in obstetrics and gynecology. Washington: Council on Resident Education in Obstetrics and Gynecology, 2002.
12. American Board of Obstetrics and Gynecology. Guide to learning in maternal-fetal medicine. Dallas: American Board of Obstetrics and Gynecology, 1996.
15. Vogler JH. National standardization of fetal monitoring terminology and competency validation: A call for interdisciplinary collaboration. J Perinatol 1997;17:228–32.
16. Cibils LA. On intrapartum fetal monitoring. Am J Obstet Gynecol 1996;174:1382–9.
17. Catanzarite VA. FMTUTOR: A computer-aided instructional system for teaching fetal monitor interpretation. Am J Obstet Gynecol 1987;156:1045–8.
18. Slotnick HB. How doctors learn: Physicians’ self-directed learning episodes. Acad Med 1999;74:1106–17.
19. Murray ML, Higgins P. Computer versus lecture: Strategies for teaching fetal monitoring. J Perinatol 1996;16:15–9.
20. Beckley S, Stenhouse E, Greene K. The development and evaluation of a computer-assisted teaching programme for intrapartum fetal monitoring. BJOG 2000;107:1138–44.
21. Trepanier MJ, Niday P, Davies B, Sprague A, Nimrod C, Dulberg C, et al. Evaluation of a fetal monitoring education program. J Obstet Gynecol Neonatal Nurs 1996;25:137–44.
23. Afriat CI, Simpson KR, Chez BF, Miller LA. Electronic fetal monitoring competency—to validate or not to validate: The opinions of experts. J Perinat Neonatal Nurs 1994;8:1–16.
24. Schmidt JV. The development of AWHONN's fetal heart monitoring principles and practices workshop. Association of Women's Health, Obstetric and Neonatal Nurses. J Obstet Gynecol Neonatal Nurs 2000;29:509–15.
26. Davies BL, Niday PA, Nimrod CA, Drake ER, Sprague AE, Trepanier MJ. Electronic fetal monitoring: A Canadian survey. CMAJ 1993;148:1737–42.
© 2003 The American College of Obstetricians and Gynecologists