
Featured Articles: Original Clinical Research Report

A Consensus-Driven Approach to Redesigning Graduate Medical Education: The Pediatric Anesthesiology Delphi Study

Ambardekar, Aditee P. MD, MSEd*; Eriksen, Whitney PhD, RN; Ferschl, Marla B. MD; McNaull, Peggy P. MD§; Cohen, Ira T. MD, MEd; Greeley, William J. MD, MBA; Lockman, Justin L. MD, MSEd#,**

Author Information
Anesthesia & Analgesia 136(3):p 437-445, March 2023. | DOI: 10.1213/ANE.0000000000006128

Abstract

BACKGROUND: 

Pediatric anesthesiology fellowship education has necessarily evolved since Accreditation Council for Graduate Medical Education (ACGME) accreditation in 1997. Advancements in perioperative and surgical practices, emerging roles in leadership, increasing mandates by accreditation and certification bodies, and progression toward competency-based education—among other things—have created pressure to enrich the current pediatric anesthesiology training system. The Society for Pediatric Anesthesia (SPA) formed a Task Force for Pediatric Anesthesiology Graduate Medical Education that included key leaders and subject matter experts from the society. A key element of the Task Force’s charge was to identify curricular and evaluative enhancements for the fellowship program of the future.

METHODS: 

The Task Force executed a nationally representative, stakeholder-based Delphi process centered around a fundamental theme: “What makes a pediatric anesthesiologist?” to build consensus among a demographically varied and broad group of anesthesiologists within the pediatric anesthesiology community. A total of 37 demographically and geographically varied pediatric anesthesiologists participated in iterative rounds of open- and close-ended survey work between August 2020 and July 2021 to build consensus on the current state, known deficiencies, anticipated needs, and strategies for enhancing national educational offerings and program requirements.

RESULTS: 

Participation was robust, and consensus was almost completely achieved by round 2. This work generated a compelling Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis that suggests more strengths and opportunities in the current Pediatric Anesthesiology Graduate Medical Education program than weaknesses or threats. Stakeholders agreed that while fellows matriculate with some clinical knowledge and procedural gaps, few clinical gaps exist upon graduation. Stakeholders agreed on 8 nonclinical domains and specific fundamental and foundational knowledge or skills that should be taught to all pediatric anesthesiology fellows regardless of career plans. These domains include (1) patient safety, (2) quality improvement, (3) communication skills, (4) supervision skills, (5) leadership, (6) medical education, (7) research basics, and (8) practice management. They also agreed that a new case log system should be created to better reflect modern pediatric anesthesia practice. Stakeholders further identified the need for standardized and validated formative and summative assessment tools as part of a competency-based system. Finally, stakeholders noted that significant departmental, institutional, and national organizational support will be necessary to implement the specific recommendations.

CONCLUSIONS: 

A Delphi process achieved robust consensus in assessing current training and recommending future directions for pediatric anesthesiology graduate medical education.

KEY POINTS

Question: How should pediatric anesthesiology graduate medical education adapt to adequately prepare trainees for their future?

Findings: Stakeholders engaged in a 3-round Delphi process to achieve consensus on a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis, clinical and nonclinical domains that should be included in the curriculum, modifications for the case log system, and suggestions for standardized curricular and assessment tools.

Meaning: Despite the diverse environment in which pediatric anesthesia training occurs across the country, stakeholders achieved robust consensus on the current state of, and recommendations for future directions in, pediatric anesthesia education; this instructive and impactful exercise will shape pediatric anesthesiology education for the future and can serve as a model for other subspecialties.

The first pediatric anesthesiology fellowships began after World War II in Boston, Philadelphia, and San Francisco.1 The number of programs increased in the 1980s; most were clinically focused and varied in length, clinical exposure, and nonclinical requirements. There was no standardization or oversight until 1997, when the Accreditation Council for Graduate Medical Education (ACGME) formalized a 12-month fellowship with standardized clinical and nonclinical experiences.2 Since then, the education of trainees in pediatric anesthesiology has evolved and continues to evolve.

Minimally invasive procedures, fetal interventions, advances in interventional and diagnostic radiology in children, and improved understanding of chronic pain and opioid use disorders are examples of how clinical experiences have evolved. Outside of clinical care, pediatric anesthesiologists are commonly found leading hospitalwide safety initiatives and quality improvement (QI) projects, as local or national leaders in medical education, as practice leaders in private and academic models, and as research pioneers.

Additional accreditation requirements for education in nonclinical domains of QI, scholarly activity, and medical education, as well as considerations for trainee wellness and medical leave now exist.3,4 These changes threaten to decrease clinical exposure and challenge fellow preparation for independent practice. Evolution toward competency-based medical education (CBME) through the ACGME Core Competencies5 and Milestones Project6 has changed the way medical educators evaluate and assess anesthesiology trainees.7 Consequently, there has been discussion among both education and clinical leaders in pediatric anesthesiology about whether the current fellowship goals are adequate to meet the needs of graduates.

In 2018, the Society for Pediatric Anesthesia (SPA) Board of Directors assembled a Task Force on Pediatric Anesthesia Graduate Medical Education comprised of leaders, stakeholders, and subject-matter experts from SPA membership, Pediatric Anesthesiology Leadership Council (PALC), and the Pediatric Anesthesiology Program Directors’ Association (PAPDA). The charge was to develop recommendations to ensure an appropriate supply of pediatric anesthesiology fellows who are prepared to provide high-quality perioperative care to children in all settings, lead health care into the future, and generate new knowledge to the benefit of children, families, and the specialty. A curricular Task Force subcommittee was also formed to assess the current strengths and weaknesses of fellowship training and identify curricular and evaluative enhancements for the future.

The subcommittee proposed a nationally representative, stakeholder-based Delphi process whose primary aim was to generate consensus on (1) Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis of the current state of education, (2) knowledge gaps for matriculating and graduating fellows, (3) technical and procedural gaps in current fellowship education, (4) nonclinical experiences that should be mandated for all trainees, and (5) strategies to operationalize CBME practices.

METHODS

The purpose of this survey-based qualitative study was to generate consensus on the current state of, and future directions for, pediatric anesthesiology fellowship education. Institutional review board approval with waiver of written consent was obtained as survey completion was voluntary and implied consent. The overarching theme of the survey was “What makes a pediatric anesthesiologist?” Questions centered on how a pediatric anesthesiology fellowship should be structured to prepare trainees for their future. The subcommittee determined a priori that the focus of the process should not be to make the fellowship more desirable to residents, change the number of graduates, or modify the length of training even if the results had downstream impacts on recruitment into the field. Standards for Reporting Qualitative Research (SRQR), as recommended by the Enhancing the Quality of Transparency of Health Research (EQUATOR) network, were followed from conception to completion of this study.8

Delphi methodology was utilized with guidance from the Mixed Methods Research Laboratory (MMRL) at the University of Pennsylvania. Delphi methodology is an iterative process that aims to acquire consensus on a specific topic from subject-matter experts in areas where higher levels of evidence cannot be achieved.9 Task Force members are former (A.P.A., J.L.L., and I.T.C.) and current (M.B.F.) program directors and leaders (P.P.M. and W.J.G.) in pediatric anesthesiology who work in large, academic institutions. MMRL was commissioned as a third-party firm by SPA leadership to minimize bias in the process, to facilitate the exercise, and to collate and analyze results using proper qualitative and quantitative statistical analyses. The development of the initial, open-ended survey was guided by the overarching theme noted above. The investigators considered several clinical and nonclinical curricular domains mandated by the ACGME and asked stakeholders for others they thought should be included; this was the only departure from traditional grounded theory. A preliminary tool was subsequently reviewed and edited by Task Force leaders (P.P.M. and W.J.G.) and a nonphysician specialist in qualitative and mixed-methods research (W.E.). The final survey instrument for round 1 can be found in Supplemental Digital Content 1, Appendix 1, https://links.lww.com/AA/D990.

The Task Force identified groups of stakeholders from which to recruit participants in the Delphi process. These groups included (1) pediatric anesthesia service chiefs, (2) pediatric anesthesiology fellowship program directors, (3) private-practice physician leaders who hire pediatric anesthesiologists, and (4) pediatric anesthesiologists who recently graduated from fellowship training. Membership lists from PALC, PAPDA, and SPA were used to compile complete lists of potential stakeholder participants; PALC includes private-practice leaders who hire pediatric anesthesiologists. MMRL sorted these lists by relevant demographic information to allow stratification before random participant selection.

Fellowship programs and pediatric anesthesia divisions differ in ways that could impact participant bias. For example, there is a perception that resources, philosophy, and practice may differ between larger or more academic programs and smaller or less academic pediatric anesthesia divisions. The Delphi process therefore benefited from stratification across geographic region and fellowship program size. Accreditation data demonstrated that fellowship programs range in size from 1 to 19 positions, with a mode of 2, median of 3, and mean of 4.5. Using these data, we defined small programs as those with 4 or fewer positions and large programs as those with 5 or more positions.

MMRL used a random number generator to assign a number to each stakeholder on each of the 4 stakeholder lists. The lists were then sorted by geographic region and program size. Within each category, MMRL then chose participants from lowest number to highest number with a target of 30 total participants to ensure adequate representation of each subgroup within each of the 4 stakeholder lists.
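The selection procedure described above can be sketched in code. The following is an illustrative reconstruction in Python, not the actual MMRL implementation; the stakeholder fields (`region`, `size`) and per-stratum target are assumptions made for the example.

```python
import random

def select_participants(stakeholders, target_per_stratum):
    """Illustrative sketch of stratified random selection: assign each
    stakeholder a random number, group by stratum, sort within each
    stratum, and choose from lowest to highest number."""
    # Assign a random number to every stakeholder on the list
    numbered = [(random.random(), s) for s in stakeholders]

    # Group by stratum (here, geographic region and program size)
    strata = {}
    for num, s in numbered:
        strata.setdefault((s["region"], s["size"]), []).append((num, s))

    # Within each stratum, select from lowest to highest assigned number
    selected = []
    for members in strata.values():
        members.sort(key=lambda pair: pair[0])
        selected.extend(s for _, s in members[:target_per_stratum])
    return selected
```

In practice, a declining invitee would be replaced by the next-numbered stakeholder in the same stratum, preserving the representation targets.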

Before initiation of the Delphi process, the instrument was piloted by knowledgeable individuals from the stakeholder groups. This piloting determined that participation would require a considerable time commitment to obtain high-quality data. Task Force members (A.P.A., J.L.L., P.P.M., and W.J.G.) personally invited the selected stakeholders to participate by phone to enhance the recruitment rate. On these calls, members ensured that each potential participant understood the reason for their inclusion, the importance of their time and effort, and the plan to rely on response data to make specific recommendations for the future of pediatric anesthesiology fellowship training. These discussions also allowed potential participants to ask questions about the consensus-building process. The SPA committed to gifting a future meeting registration fee as an incentive to complete the entire process. Those who declined to participate were replaced by the next numbered participant on the respective stakeholder list, as described above, to ensure continued and adequate representation of stakeholders.

The Research Electronic Data Capture (REDCap) database was utilized to distribute, execute, and collate survey data.10 Round 1 of survey work spanned August to October 2020. Results of the first, open-ended survey were reviewed; identical and similar responses were consolidated and provided the basis of questioning for subsequent rounds. In round 2 (Supplemental Digital Content 2, Appendix 2, https://links.lww.com/AA/D991), most questions were closed-ended and asked participants to endorse or not endorse a theme. Round 2 of survey work was conducted from February to April 2021. Themes that did not meet consensus were carried into round 3 (Supplemental Digital Content 3, Appendix 3, https://links.lww.com/AA/D992), executed between May and July 2021, in which participants were asked to score each theme on a 5-point Likert scale. Responses of "strongly agree"/"agree" and "strongly disagree"/"disagree" were collapsed to assess consensus. In addition, some open-ended questions were included to allow participants to clarify responses. The study team provided guidance to the MMRL for each iteration of the survey to ensure content-specific accuracy and fidelity to the research question.

Consensus for affirmation of elements in fellowship education was defined as >2/3 (66.7%) approval by participants. Conversely, approval by ≤1/3 of participants (ie, 2/3 not approving) defined elements that should not be mandated. Items with between 50% and 66.7% agreement were reported as areas for potential future study. Qualitative analyses and consolidation of items for the round 2 survey were supported by NVivo 12.0, a qualitative data analysis software program, with independent coding of responses by 2 trained qualitative coders; quantitative analyses to assess consensus were conducted using Microsoft Excel.
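The thresholds above amount to a simple classification rule for each surveyed item. The sketch below is illustrative only; note that the text does not specify how items between 1/3 and 50% approval were labeled, so the final branch is an assumption.

```python
def classify_consensus(approval_fraction):
    """Classify a Delphi item by approval fraction, per the stated
    thresholds: >2/3 affirms an element; <=1/3 (ie, 2/3 not approving)
    marks it as not to be mandated; 50%-66.7% flags future study."""
    if approval_fraction > 2 / 3:
        return "affirmed"
    if approval_fraction <= 1 / 3:
        return "not mandated"
    if approval_fraction >= 0.5:
        return "future study"
    # Not specified in the text; assumed label for the remaining band
    return "no consensus"
```

For example, an item endorsed by 60% of participants would be flagged for potential future study rather than affirmed or rejected.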

RESULTS

A total of 53 stakeholders were sequentially invited to participate in the Delphi process. One recent graduate declined participation; 2 program directors, 2 service chiefs, 4 recent graduates, and 5 private practitioners did not respond to our invitation. Thirty-nine stakeholders agreed to participate; 1 service chief subsequently withdrew on request, and 1 private practitioner never provided data after agreeing to participate. Table 1 shows the composition of Delphi participants for each survey round. Table 2 provides additional granularity on the distribution of program size and geographic region among the program directors and service chiefs who completed round 1. Three rounds of survey work were executed with 37, 27, and 30 participants, respectively, and consensus was achieved in most domains by round 2. Domains that did not achieve consensus are provided as Supplemental Digital Content 4, Appendix 4, https://links.lww.com/AA/D993.

Table 1. - Demographics of Survey Participants
Survey 1 Survey 2 Survey 3
Total no. 37 27 30
Geographic region
 West 7 6 7
 Midwest 6 4 6
 Southeast 5 5 5
 Northeast 8 6 6
Size of program
 Smalla 14 11 14
 Large 12 10 11
Role
 Program director 12 11 12
 Service chief 14 10 13
 Private practice 4 2 2
 Recent graduate 7 4 4
aSmall program defined as ≤4 fellows.

Table 2. - Composition of Program Directors and Service Chiefs
Program director Service chief Small Large West Midwest Northeast Southeast
Program director 12 X 7 5 3 3 3 3
Service chief X 14 7 7 3 3 5 3
Small X X 14 X 2 3 4 5
Large X X X 12 4 3 4 1
X indicates not applicable.

The Figure summarizes the SWOT analysis that was conducted using Delphi methodology. Two rounds were needed to achieve consensus on all but 2 statements in the SWOT analysis; these required a third round. By round 3, participants agreed that insufficient exposure to the pediatric intensive care unit (PICU) environment and critical care physiology during fellowship was not a weakness of the training paradigm (72.41%) and that certified registered nurse anesthetists (CRNAs) taking learning opportunities from fellows was not a threat (72.41%). Participants agreed that a 1-year fellowship was a strength (82.8%), that the prospect of a 2-year fellowship was a threat (81.5%), and that the length of training should not be increased despite increasing requirements. There was variability in responses about a mismatch between fellowship candidates and positions, particularly across regions, but there was strong agreement that too many fellowship positions exist nationally. The requirement that fellows engage in research, QI, and scholarly projects emerged as a strength (and insufficient engagement in these endeavors also emerged as a weakness); these statements achieved consensus with 69.0% and 66.7% agreement, respectively.

F1
Figure.:
SWOT analysis for pediatric anesthesiology fellowship training. ACGME indicates Accreditation Council for Graduate Medical Education; ERAS, Electronic Residency Application Service; PAPDA, Pediatric Anesthesia Program Director’s Association; QI, quality improvement; SPA, Society for Pediatric Anesthesia; SWOT, Strengths, Weaknesses, Opportunities, and Threats.

Table 3 summarizes consensus among stakeholders about knowledge and procedural gaps noted in matriculating fellows. Consensus emerged in round 2 for all items. Participants agreed that there were no specific knowledge or procedural skill gaps among graduating fellows.

Table 3. - Knowledge and Procedural Skill Gaps for Matriculating Fellows
Item Agreement (%)
Knowledge gaps
 Pediatric knowledge of mitochondrial diseases 88.89
 Pediatric knowledge of genetic medicine 85.19
 Lack of skills related to research/QI/scholarly work 85.19
 Pediatric knowledge of congenital heart disease 81.48
 Pediatric knowledge of muscular dystrophies 81.48
 Business of medicine 81.48
 Pediatric knowledge of ventilation strategies 77.78
 Neonatal physiology 74.07
 Premature infant physiology 74.07
 Pediatric knowledge of cardiac physiology 66.67
 Lack of leadership skills 66.67
Procedural gaps
 Disclosing medical errors 81.48
 Neonatal care 77.78
 POCUS 74.07
 Explaining procedures to children 66.67
Abbreviations: POCUS, point-of-care ultrasound; QI, quality improvement.

The participants reached consensus around assessment of fellows in pediatric anesthesiology. They agreed that fellowships would benefit from standardized, validated, nationally shared assessment tools (88.0%), including entrustable professional activities (69.6%) and other observation-based assessments (83.3%), simulation-based assessments (69.6%), assessment of coursework and engagement in didactics (75.0%), and standardized examinations such as the current board certification system (75.0%). They also identified the need for a mechanism for protected time for mentors/faculty to provide feedback to fellows (88.0%) and for minimum requirements to assess and provide feedback (75.0%). There was consensus that penalties for programs whose fellows are unsuccessful in meeting training requirements should be eliminated (72.0%). Finally, participants agreed that neither the use of a grading system (83.3%) nor the elimination of case log minimums (100.0%) would be helpful.

The Delphi process concurrently addressed the ACGME case log (minimum clinical experience) system for pediatric anesthesiology training. Participants agreed (75.9%) that current ACGME requirements broadly target the correct types of cases but that case requirements are currently too low (85.7%). A pediatric anesthesiology fellowship case log system revision is underway; that process and outcomes are discussed in an accompanying article (Ambardekar et al11).

Nonclinical domains, some of which are already mandated by ACGME program requirements,3 were included in the Delphi process to understand how fellowships should prioritize curricular enhancements, which subcompetencies to focus on, and how best to teach and assess progress. Table 4 summarizes how the stakeholders prioritized these domains with respect to clinical pediatric anesthesiology education and which methods participants thought might be used to assess growth and progress. After clinical pediatric anesthesia (the top priority), stakeholders believed that patient safety and communication skills should be the next prioritized nonclinical domains for which time during fellowship should be allotted. Supervision, leadership skills, medical education, QI, research methods, and practice management were identified as essential, in this order of priority, to be taught during the fellowship as well. Within each domain, the participants identified specific competencies and curricular items that should, and should not, be required or taught. Specific details about these subcompetencies are included in Supplemental Digital Content 5, Appendix 5, https://links.lww.com/AA/D994, and Supplemental Digital Content 6, Appendix 6, https://links.lww.com/AA/D995. Participants supported creation of a national, shared, virtual curriculum for each of these nonclinical domains to provide uniform training across programs.

Table 4. - Pediatric Anesthesiology Areas of Focus and Ranking
Domain Mode rank Suggested assessment modality
Clinical pediatric anesthesia 1 Clinical observations, evaluations, and updated case logs
Patient safety 2 Evaluations, adherence to best practice
Communication skills 2 Multisource/360 evaluations, clinical observation, and simulation
Supervision 4 Observation during supervisory roles and evaluation
Leadership skills 5 Observation
Medical education 6 Observation, feedback, and evaluation
Quality improvement 7 Participation in and completion of QI project, didactic lectures
Research methods 8 Participation in scholarly project
Practice management 9 Does not warrant formal assessment
Abbreviation: QI, quality improvement.

The addition or expansion of nonclinical curricula adds to an already clinically busy fellowship year. Stakeholders agreed that, given the large amount of clinical and nonclinical content proposed for the year, both trainees and future employers should expect some degree of on-the-job training after fellowship in clinical domains relevant to the job. Stakeholders identified barriers to including education in the 8 nonclinical domains during fellowship: faculty time, staffing resources, existing mentorship structures, and expertise. Stakeholders also emphasized that institutions and health care organizations must support and encourage activities that are typically non-revenue generating but are required to advance educational, academic, and leadership opportunities for trainees and junior faculty.

DISCUSSION

The primary aim of this nationally representative, stakeholder-based Delphi process was to achieve an understanding of the current state of pediatric anesthesiology fellowship training and to develop consensus around future improvements for fellowship curricula. These proposed changes can be used to design national initiatives as the specialty evolves that will withstand the internal and external pressures on graduate medical education, and to prepare and support fellows for their future careers. Participation among varied stakeholders was robust and allowed for consensus development in many areas that previously seemed to be points of national controversy.

The composition of stakeholders was a strength. Pediatric anesthesiologists train in a variety of settings and practice in even more varied clinical and academic settings across the United States. Committed to broad representation of the varied practice settings in which pediatric anesthesiologists provide care in this country, the authors felt it was important to generate consensus in an unbiased, methodical manner. The MMRL, with the support of SPA, provided the backbone for the integrity of this process.

While 37 participants engaged in the first survey, only 27 and 30 engaged in the subsequent rounds. Representation of geographic region, program size, and stakeholder role nonetheless remained even. Participants were not replaced if they did not respond after a maximum of 5 reminders. Rather than potentially skewing data toward a certain demographic by adding willing and engaged alternates, we kept our recruited group stable over time and let participants engage when and where they were able.

A universal threshold percentage for consensus in Delphi work does not exist, and agreement between 51% and 80% has been suggested in the literature.12 As Delphi panels go, ours was well distributed but small; as such, we opted for a somewhat lower threshold of 66.7% agreement. Even so, our data remained quite stable across rounds 2 and 3, which some suggest is more important than a set percentage.13

SWOT analysis revealed a preponderance of strengths and opportunities over weaknesses and threats. The generally positive impression of clinical and didactic experiences in the fellowship, supported by the infrastructure provided by SPA, PAPDA, PALC, ACGME, and ERAS, validates the current paradigm’s strength. Themes identified and further honed by stakeholders as opportunities for improvement should inform leaders in anesthesiology education and provide a framework for the future. These themes include development of a shared national curriculum inclusive of specific nonclinical domains, leveraging of technology to enable this curriculum, reconsideration of case characterizations and case log minimums to better suit the specialty, and development of a standardized and validated assessment tool kit that all programs could use to provide both formative feedback and summative evaluations to its trainees.

Not unexpectedly, several themes emerged when stakeholders were asked about knowledge and procedural gaps among matriculating fellows (Table 3). Variable residency experiences across the United States in pediatric anesthesiology likely contribute to these gaps. These gaps should be communicated to residency program directors. Additionally, these topics should be included in the regular pediatric anesthesiology fellowship curriculum, though they may already be taught as suggested by stakeholder consensus on lack of knowledge or procedural gaps among graduating fellows.

Length of training in pediatric anesthesiology has been a point of discussion since Andropoulos et al14 described the development of subspecialty pediatric anesthesiologists. It has been suggested that lengthening pediatric anesthesiology fellowship training to 2 years might help incorporate the added clinical and nonclinical curricular components that are required of all ACGME-accredited programs, improve perceived knowledge and procedural gaps, and promote academic interests outside of clinical medicine to advance the specialty. A counterargument has also been made that many fellows go on to take clinically focused, nonacademic jobs and that academic domains should not be forced on them. This Delphi process clarified that, regardless of career plans, our community broadly believes that trainees should have basic knowledge in the 8 identified nonclinical domains. Many of the subcompetencies and topics are already incorporated into the pediatric anesthesiology fellowship program requirements or milestones rubrics, affirming accreditation and competency requirements. Additionally, the process confirmed that stakeholders agree that a strength of the current fellowship structure is its 1-year duration. To that end, fellowship should include the foundational knowledge needed for success for all future pediatric anesthesiologists regardless of career plans, and employers should be prepared to supplement this foundational knowledge based on individual career choices and local practice.

It is reassuring that there is stakeholder buy-in for the certification pathway supported by the American Board of Anesthesiology (ABA) and for the continued use of a case-log system. It is no surprise, however, that most agree the case log system requires recharacterization and revision to evolve with the complexity of our current practice. Specific findings of the Delphi process as they relate to current case log system and proposed changes are discussed in detail in the accompanying article by Ambardekar et al.11

Apprenticeship during clinical experiences alone is not sufficient for programs to succeed in a competency-based education system. CBME necessitates processes for regular formative feedback and validated summative evaluations,15 and our stakeholders agreed (96.3%) that this is an opportunity for improvement. Feedback in clinical medicine has historically been challenging for many reasons.16 A landmark article by Ende16 suggests that the learner and the faculty must work together as allies and that feedback should be based on observable behaviors and actions, delivered in a timely fashion, and permit engagement in the conversation without distraction. The manner in which feedback is delivered requires intention, care, and faculty skill. All of this necessitates dedicated time, affirming stakeholder recommendations to prioritize this important resource at departmental, institutional, and national levels.

Summative evaluation is also important. Aside from high-stakes board examinations, pediatric anesthesiology lacks validated tools to assess learners. There are some checklist-based assessments reported in the simulation literature relating to common and uncommon scenarios for which our community could continue to develop content validity.17–19 Similarly, simulation-based assessments for nontechnical skills in the domains of teamwork, situational awareness, and communication do exist and may be transferrable to pediatric anesthesiology practice.20,21 Other mechanisms for assessment such as performance in objective structured clinical examinations and accumulation of entrustable professional activities may exist locally but none have been formally reported in the pediatric anesthesia literature.

The paucity of assessment and evaluative tools apart from those mentioned above supports the consensus recommendation to develop a shared, standardized, and, eventually, validated set of tools for assessment and evaluation that spans the competencies mentioned here in pediatric anesthesiology. A systematic approach should be taken to understand what currently exists within programs and in the literature that may serve as foundational work for this important recommendation.

The execution of the resulting recommendations requires the significant commitment of time, expertise, and money, and this is further complicated by the variability of program size, clinical milieus, and academic interest. The perceived barriers to creating curricula in the 8 domains, developing formative and summative assessments to support CBME, and developing faculty to engage with them are real. Without the commitment of a national effort, shared resources, and leveraging of technology, these processes may be untenable.

At the time the Task Force was commissioned, there was a concerning trend in unfilled pediatric anesthesiology fellowship positions. Cladis et al22 highlighted how this mismatch between the total number of fellowship positions and matched candidates has progressively worsened since 2015 and impairs selectivity in the application process, especially if programs and their directors are encouraged to fill positions. Early in the Task Force work, there was discussion about whether recommendations should include changes to the number of pediatric anesthesiology fellowship programs or positions. As the work progressed and the Delphi process evolved, it became clear that there was no role for SPA, PALC, PAPDA, or ACGME to mandate or limit either the number or the size of pediatric anesthesiology fellowships or positions, in part because the currently available data on pediatric anesthesia supply and demand are neither comprehensive nor well understood.23 Rather, the charge of the Task Force, and ultimately 1 of the enormous, fundamental responsibilities of SPA, is to ensure that our programs graduate fellows who are prepared to provide high-quality perioperative care to children in all settings, lead health care into the future, and support dissemination of new knowledge for the benefit of children, their families, health care systems, and the specialty.

The most significant limitation of this study is its reliance on a nonvalidated survey tool; no existing tool captures the nuanced issues relevant to pediatric anesthesiology graduate medical education studied herein. Second, participant bias is possible because respondents volunteered their time and thoughts out of intrinsic interest in the topic. Relatedly, sampling error is possible: participants could represent views not shared by others. We feel these limitations were mitigated by the inclusion of experts in mixed-methods research (W.E., MMRL), who ensured that sound methodological processes were used to build consensus with a broad participant base. Importantly, following this process, a list of specific recommendations was generated by the Task Force, and those recommendations were approved without modification by separate national membership votes of PAPDA, PALC, and the SPA Board of Directors.

CONCLUSIONS

This study provided a novel methodology to understand the current state of, gaps in, and optimal future directions for pediatric anesthesiology fellowships in the United States. We present a process that may be helpful to other specialties within and outside of anesthesiology to evaluate their own training paradigms. A diverse group of stakeholders provided robust participation and achieved considerable consensus in most areas by round 2. Our current paradigm provides high-quality education but requires enhancement. The development of a standardized, shared, national curriculum that addresses nonclinical domains will provide consistent education and mitigate barriers of time, resources, and expertise. Efforts to update the case log system are needed to stay current with today’s practice and increase the rigor of clinical education. A standardized toolkit for formative and summative assessments across all programs would equip program directors to ensure regular and accurate assessment of trainees. Continual, dedicated efforts to earmark time and money at the local and institutional levels, along with high-level buy-in and commitment at the national level, will facilitate the success of these efforts.

ENDORSEMENT

This manuscript was reviewed and endorsed by the Society for Pediatric Anesthesia executive leadership before submission.

ACKNOWLEDGMENTS

The authors acknowledge the Society for Pediatric Anesthesia for its financial support of the Delphi process, the broader Task Force members for their engagement and support, and the participants who generously agreed to volunteer their time, ideas, and energy to this project. The authors also thank, in advance, the many education and clinical leaders who will be asked to do the hard work of following the recommendations that emerged herein.

DISCLOSURES

Name: Aditee P. Ambardekar, MD, MSEd.

Contribution: This author helped with the conception and development of the project, the data analysis, and the writing and editing of the manuscript.

Conflicts of Interest: A. P. Ambardekar is currently volunteering with the Pediatric Anesthesiology Milestones 2.0 Writing Committee at the Accreditation Council for Graduate Medical Education (ACGME). She is the chair of the Review Committee for Anesthesiology at the ACGME.

Name: Whitney Eriksen, PhD, RN.

Contribution: This author helped with the development of the project, the survey execution, data analysis, and the writing and editing of the manuscript.

Conflicts of Interest: None.

Name: Marla B. Ferschl, MD.

Contribution: This author helped with the conception and development of the project and the writing and editing of the manuscript.

Conflicts of Interest: None.

Name: Peggy P. McNaull, MD.

Contribution: This author helped with the development of the project, the data analysis, and the writing and editing of the manuscript.

Conflicts of Interest: None.

Name: Ira T. Cohen, MD, MEd.

Contribution: This author helped with the conception and development of the project and the writing and editing of the manuscript.

Conflicts of Interest: None.

Name: William J. Greeley, MD, MBA.

Contribution: This author helped with the development of the project, the data analysis, and the writing and editing of the manuscript.

Conflicts of Interest: None.

Name: Justin L. Lockman, MD, MSEd.

Contribution: This author helped with the conception and development of the project, the data analysis, and the writing and editing of the manuscript.

Conflicts of Interest: J. L. Lockman is currently volunteering with the Pediatric Anesthesiology Milestones 2.0 Writing Committee at the Accreditation Council for Graduate Medical Education (ACGME).

This manuscript was handled by: Edward C. Nemergut, MD.

GLOSSARY

ABA
American Board of Anesthesiology
ACGME
Accreditation Council for Graduate Medical Education
CBME
competency-based medical education
EQUATOR
Enhancing the Quality and Transparency of Health Research
ERAS
Electronic Residency Application Service
MMRL
Mixed Methods Research Laboratory
PALC
Pediatric Anesthesia Leadership Council
PAPDA
Pediatric Anesthesiology Program Directors’ Association
PICU
pediatric intensive care unit
QI
quality improvement
REDCap
Research Electronic Data Capture
SPA
Society for Pediatric Anesthesia
SRQR
Standards for Reporting Qualitative Research
SWOT
Strengths, Weaknesses, Opportunities, and Threats

REFERENCES

1. Costarino AT Jr, Downes JJ. Pediatric anesthesia historical perspective. Anesthesiol Clin North Am. 2005;23:573–595, vii.
2. Rockoff MA, Hall SC. Subspecialty training in pediatric anesthesiology: what does it mean? Anesth Analg. 1997;85:1185–1190.
3. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Pediatric Anesthesiology. Accessed December 9, 2021. https://www.acgme.org/globalassets/pfassets/programrequirements/042_pediatricanesthesiology_2021.pdf.
4. Liaison Committee on Medical Education. Standards: Functions and Structure of a Medical School. Accessed December 9, 2021. https://lcme.org/publications/.
5. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29:648–654.
6. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system–rationale and benefits. N Engl J Med. 2012;366:1051–1056.
7. Ambardekar AP, Walker KK, McKenzie-Brown AM, et al. The anesthesiology milestones 2.0: an improved competency-based assessment for residency training. Anesth Analg. 2021;133:353–361.
8. O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89:1245–1251.
9. Niederberger M, Spranger J. Delphi technique in health sciences: a map. Front Public Health. 2020;8:457.
10. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377–381.
11. Ambardekar AP, Furukawa L, Eriksen W, et al. A consensus-driven revision of the Accreditation Council for Graduate Medical Education case log system: pediatric anesthesiology fellowship education. Anesth Analg. 2023;136:446–454.
12. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000;32:1008–1015.
13. Crisp J, Pelletier D, Duffield C, Adams A, Nagy S. The Delphi method? Nurs Res. 1997;46:116–118.
14. Andropoulos DB, Walker SG, Kurth CD, Clark RM, Henry DB. Advanced second year fellowship training in pediatric anesthesiology in the United States. Anesth Analg. 2014;118:800–808.
15. Lockyer J, Carraccio C, Chan MK, et al.; ICBME Collaborators. Core principles of assessment in competency-based medical education. Med Teach. 2017;39:609–616.
16. Ende J. Feedback in clinical medical education. JAMA. 1983;250:777–781.
17. Watkins SC, Nietert PJ, Hughes E, Stickles ET, Wester TE, McEvoy MD. Assessment tools for use during anesthesia-centric pediatric advanced life support training and evaluation. Am J Med Sci. 2017;353:516–522.
18. Fehr JJ, Boulet JR, Waldrop WB, Snider R, Brockel M, Murray DJ. Simulation-based assessment of pediatric anesthesia skills. Anesthesiology. 2011;115:1308–1315.
19. Ambardekar AP, Black S, Singh D, et al. The impact of simulation-based medical education on resident management of emergencies in pediatric anesthesiology. Paediatr Anaesth. 2019;29:753–759.
20. Fletcher G, Flin R, McGeorge P, Glavin R, Maran N, Patey R. Anaesthetists’ non-technical skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth. 2003;90:580–588.
21. Clapper TC, Ching K, Mauer E, et al. A saturated approach to the four-phase, brain-based simulation framework for TeamSTEPPS in a Pediatric Medicine Unit. Pediatr Qual Saf. 2018;3:e086.
22. Cladis FP, Lockman JL, Lupa MC, et al. Pediatric anesthesiology fellowship positions: is there a mismatch? Anesth Analg. 2019;129:1784–1786.
23. Muffly MK, Singleton M, Agarwal R, et al. The pediatric anesthesiology workforce: projecting supply and trends 2015-2035. Anesth Analg. 2018;126:568–578.

Supplemental Digital Content

Copyright © 2022 International Anesthesia Research Society