Letter to the Editor

In Response

Warner, David O. MD; Lien, Cynthia A. MD; Wang, Ting PhD; Zhou, Yan PhD; Isaak, Robert S. DO; Peterson-Layne, Cathleen PhD, MD; Harman, Ann E. PhD; Macario, Alex MD, MBA; Gaiser, Robert R. MD; Suresh, Santhanam MD, MBA; Rathmell, James P. MD; Keegan, Mark T. MB, BCh; Cole, Daniel J. MD; Fahy, Brenda G. MD, MCCM; Dainer, Rupa J. MD; Sun, Huaping PhD

Anesthesia & Analgesia 133(1):e5–e7, July 2021. DOI: 10.1213/ANE.0000000000005556

We appreciate the interest of Dr Goudra1 in our study describing the first-year results of administering the Objective Structured Clinical Examination (OSCE) component of the American Board of Anesthesiology’s (ABA, Raleigh, NC) APPLIED Examination for initial certification.2 The following provides responses to the points raised.

Dr Goudra1 states that the introduction of the OSCE “…was to satisfy the 2015 Accreditation Council for Graduate Medical Education’s (ACGME) safety and quality improvement requirements for all residency programs with particular reference to professionalism and communication.” This is nowhere stated by the ABA and is not true; indeed, the ACGME does not set requirements for board certification. We have published a detailed rationale for the introduction of the OSCE in a prior publication3 and, in the interest of space, will not restate it here.

Dr Goudra1 states that “…to pass these skills, the ABA expects most candidates to perform these ‘communication tests’ by abiding to definite criteria and ‘check’ certain boxes which are detailed in their published OSCE scoring guidance.” As described in our previous article,3 this is not the approach taken by the ABA. After careful consideration, the ABA explicitly chose not to use a “check box” system, but rather a global score rating how often the candidate demonstrated the attributes of an ABA Diplomate. The OSCE content outline does provide a description of each station to help candidates prepare, but this is not a “checklist.” We agree that there is no single “standardized” approach to patient encounters and that, in clinical practice, tailoring to individual circumstances is necessary to meet the needs of each patient. However, we also believe there are guiding principles applicable to a given clinical scenario that examiners can use to provide a global rating of candidate performance (eg, delivering all elements of informed consent, respecting patient autonomy). We also note that, in addition to assessing communication and professionalism, 2 of the 7 OSCE stations assess technical skills such as the interpretation of monitors and the application of ultrasound.

Dr Goudra1 discusses evidence regarding whether certification is associated with measures such as patient outcomes or disciplinary proceedings against anesthesiologists. There is now a considerable literature on this topic among American Board of Medical Specialties (ABMS) Member Boards, generally supporting an association between certification and such outcomes.4–6 Specific to anesthesiologists, as the author notes, we found in a previous study that the ability to pass an oral examination (the Structured Oral Examination [SOE] component of the APPLIED Examination, the only component administered at the time of the study) was correlated with the risk of later disciplinary license actions, whereas passing the written examination alone was not.7 The author criticizes this finding because the outcome includes actions based on conditions, such as substance use disorder, that are perhaps not relevant to the certification process. Note, however, that we also performed a sensitivity analysis excluding actions related to substance use (Supplemental Table 4 of ref 7), which had little effect on the results. We have not yet performed a similar analysis for the OSCE, as it has not been administered for long enough to do so; we agree that such an analysis will be important when it becomes practicable. As we state,2 “further study will be necessary to determine if there is a relationship between OSCE scores and measures of physician performance in clinical practice…”

Dr Goudra1 has questions related to the Rasch model used to score the OSCE based on the ratings of individual stations, which accounts for factors including station difficulty and examiner severity. The application of this model to the OSCE is also described in our previous article.3 Regarding Table 2,2 the second column presents station difficulty for all candidates (higher values indicate greater difficulty); the third column presents the station performance of those candidates who failed the OSCE, to describe which stations these failing candidates found most difficult. The order of difficulty differed somewhat between the two groups, but there is no reason to think that it would necessarily be the same. He also states that we did “… not explain how the pass/fail cutoff was determined for the OSCE exam” in the article under discussion.2 This is true, because we had already done so in our previous article.3 To quote this earlier study, “The standard for passing the OSCE was set by the ABA Board of Directors based on the expectation that minimally competent candidates would, on average, ‘often’ demonstrate the qualities expected of an ABA Diplomate. The many-faceted Rasch model calculates the lowest logit measure that corresponds to the fair average of ‘often’ based on the ABA’s 4-point global rating scale, which accounts for examiner severity and station difficulty. To reduce the chance of ‘false negatives’ (ie, candidates who should have passed), the standard is adjusted down by a fraction of the standard error of measurement determined by the ABA. This adjustment gives the candidate the ‘benefit of the doubt’ due to measurement error.”3 A similar procedure is followed in separately determining the passing score for the SOE.8 In 2018, the calculated passing scaled scores happened to be the same for both examinations.
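
For readers less familiar with this class of models, a brief illustration may help. In the standard many-facet formulation (shown here for orientation only; the operational details of the ABA model are described in ref 3), the log-odds of candidate n receiving rating category k rather than k−1 from examiner j at station i is

\[
\ln\frac{P_{nijk}}{P_{nij(k-1)}} = B_n - D_i - C_j - F_k,
\]

where B_n is candidate ability, D_i is station difficulty, C_j is examiner severity, and F_k is the rating-scale step calibration. The “benefit of the doubt” adjustment quoted above can then be sketched in a few lines of Python; all numeric values below are hypothetical placeholders, not ABA parameters.

```python
# Minimal sketch of the standard-setting adjustment described above.
# All numeric values are hypothetical placeholders, not ABA parameters.

def passing_standard(often_threshold: float, sem: float,
                     adjustment_fraction: float) -> float:
    """Return the passing standard in logits.

    often_threshold: lowest candidate measure (in logits) whose fair
        average corresponds to "often" on the 4-point rating scale.
    sem: standard error of measurement (in logits).
    adjustment_fraction: fraction of the SEM subtracted to reduce
        false negatives (the "benefit of the doubt").
    """
    return often_threshold - adjustment_fraction * sem

# Example with made-up numbers: a 0.80-logit threshold, a 0.30-logit
# SEM, and a one-half SEM adjustment yield a 0.65-logit cutoff.
print(passing_standard(0.80, 0.30, 0.5))
```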

Dr Goudra1 questions the use of the correlation between the OSCE and SOE scores to infer that these 2 examinations measure different constructs, in particular our statement that “…observed variations from a linear relationship suggest that the examinations measure different abilities, that there is measurement error associated with each examination (eg, individual scores do not perfectly reflect the ability of a given candidate), or a combination of both factors.” This is not a “bold” statement; it is simply true. We do not agree with the author’s contention that the relationships shown in Figures 2 and 3 “…deviate from a linear relationship slightly”; these figures depict scattergrams of weakly correlated data that do not even approach linear relationships. We have subsequently evaluated the measurement constructs of the SOE and the OSCE in a separate publication, finding that they indeed measure distinct constructs.9 We believe this result provides further rationale for administering both the SOE and the OSCE as components of the APPLIED Examination, recognizing, as stated above, that future studies will still need to assess how the constructs measured by the OSCE correlate with physician performance in clinical practice.
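
To illustrate why a weak correlation is exactly what one expects when 2 examinations share only part of what they measure, consider a minimal simulation (hypothetical data, not ABA results) in which each score reflects a shared ability plus a distinct component and measurement error:

```python
# Hypothetical simulation (not ABA data): two examination scores that
# share a common ability component but also reflect distinct skills
# and measurement error, producing a weakly correlated scattergram.
import random

random.seed(0)
n = 1000
pairs = []
for _ in range(n):
    ability = random.gauss(0, 1)         # shared construct
    soe = ability + random.gauss(0, 2)   # distinct skill + error
    osce = ability + random.gauss(0, 2)  # distinct skill + error
    pairs.append((soe, osce))

# Pearson correlation, computed by hand to stay dependency-free.
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
cov = sum((x - mx) * (y - my) for x, y in pairs) / n
sx = (sum((x - mx) ** 2 for x, _ in pairs) / n) ** 0.5
sy = (sum((y - my) ** 2 for _, y in pairs) / n) ** 0.5
print(f"r = {cov / (sx * sy):.2f}")  # about 0.2: weak, far from linear
```

With a shared-ability variance of 1 and an independent variance of 4 in each score, the expected correlation is 1/5 = 0.20, even though both examinations partly measure the same construct.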

Dr Goudra1 raises concerns that the OSCE seems “…very subjective and opens doors for discrimination.” We disagree that the OSCE is “very subjective,” given how OSCE scenarios are constructed and how extensively examiners are trained, as described in our previous work.3,8 This training includes discussions of possible bias and strategies to mitigate it, as the ABA and other assessment organizations recognize the potential for bias in any assessment (including written examinations). Furthermore, as described previously,3 each OSCE station is scored independently by a separate examiner, so that each candidate receives scores from at least 7 examiners (1 for each OSCE station). For candidates with marginal performance after initial scoring, their 5 Communication and Professionalism stations are rated again by separate examiners who are unaware of prior ratings and of whether they are rating a performance previously judged marginal or excellent. Thus, these candidates receive a total of 12 ratings from 12 separate examiners, making it less likely that bias on the part of individual examiners could affect the results.
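
A structural sketch of this re-rating workflow may clarify (the marginal-performance flag and the raw rating lists are hypothetical simplifications; operational scoring uses the Rasch model discussed above):

```python
# Hypothetical sketch of the independent, blinded re-rating workflow;
# the inputs shown are simplifications of operational Rasch scoring.
N_STATIONS = 7      # each station scored by a different examiner
N_CP_STATIONS = 5   # Communication and Professionalism stations

def final_ratings(initial: list[int], is_marginal: bool,
                  blinded_reratings: list[int]) -> list[int]:
    """Combine initial ratings with blinded re-ratings when marginal."""
    assert len(initial) == N_STATIONS
    if not is_marginal:
        return initial
    # Five new examiners re-rate the CP stations without seeing any
    # prior ratings, yielding 12 ratings from 12 examiners in total.
    assert len(blinded_reratings) == N_CP_STATIONS
    return initial + blinded_reratings
```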

Finally, we too noted the recent decision to discontinue the Step 2 Clinical Skills component of the United States Medical Licensing Examination (USMLE). Dr Goudra1 seems to imply that this decision in some way diminishes the value of the ABA OSCE. We do not have insight into why this decision was made, but note that, other than involving standardized patient interactions, this clinical skills examination had little in common with the ABA OSCE, differing in purpose (licensing versus specialty certification), format, content, and scoring procedures.

The conclusion of our initial article describing the OSCE stated that “The ABA is committed to rigorous and transparent evaluations of its systems and processes,” and outlined several questions of interest to continue the process of OSCE validation.3 This commitment continues. We thank the author for his interest, and hope these clarifications are helpful.

David O. Warner, MD
Department of Anesthesiology and Perioperative Medicine
Mayo Clinic
Rochester, Minnesota
[email protected]

Cynthia A. Lien, MD
Department of Anesthesiology
Medical College of Wisconsin
Milwaukee, Wisconsin

Ting Wang, PhD
Yan Zhou, PhD
The American Board of Anesthesiology
Raleigh, North Carolina

Robert S. Isaak, DO
Department of Anesthesiology
The University of North Carolina at Chapel Hill
Chapel Hill, North Carolina

Cathleen Peterson-Layne, PhD, MD
Department of Anesthesiology
Emory University
Atlanta, Georgia

Ann E. Harman, PhD
The American Board of Anesthesiology
Raleigh, North Carolina

Alex Macario, MD, MBA
Department of Anesthesiology, Perioperative and Pain Medicine
Stanford University
Stanford, California

Robert R. Gaiser, MD
Department of Anesthesiology
University of Kentucky
Lexington, Kentucky

Santhanam Suresh, MD, MBA
Department of Pediatric Anesthesiology
Ann & Robert H. Lurie Children’s Hospital of Chicago
Northwestern University
Chicago, Illinois

James P. Rathmell, MD
Department of Anesthesiology, Perioperative and Pain Medicine
Brigham and Women’s Hospital
Harvard Medical School
Boston, Massachusetts

Mark T. Keegan, MB, BCh
Department of Anesthesiology and Perioperative Medicine
Mayo Clinic
Rochester, Minnesota

Daniel J. Cole, MD
Department of Anesthesiology and Perioperative Medicine
University of California, Los Angeles
Los Angeles, California

Brenda G. Fahy, MD, MCCM
Department of Anesthesiology
University of Florida
Gainesville, Florida

Rupa J. Dainer, MD
Division of Anesthesiology
Pediatric Specialists of Virginia
Fairfax, Virginia

Huaping Sun, PhD
The American Board of Anesthesiology
Raleigh, North Carolina

REFERENCES

1. Goudra B. Objective structured clinical examination — are they truly objective? Anesth Analg. 2021; 133:e3–e5.
2. Warner DO, Lien CA, Wang T, et al. First-year results of the American Board of Anesthesiology’s objective structured clinical examination for initial certification. Anesth Analg. 2020; 131:1412–1418.
3. Warner DO, Isaak RS, Peterson-Layne C, et al. Development of an objective structured clinical examination as a component of assessment for initial board certification in anesthesiology. Anesth Analg. 2020; 130:258–264.
4. Kinney CL, Raddatz MM, Sliwa JA, Clark GS, Robinson LR. Does performance on the American Board of physical medicine and rehabilitation initial certification examinations predict future physician disciplinary actions? Am J Phys Med Rehabil. 2019; 98:1079–1083.
5. Peabody MR, Young A, Peterson LE, et al. The relationship between board certification and disciplinary actions against board-eligible family physicians. Acad Med. 2019; 94:847–852.
6. Kopp JP, Ibáñez B, Jones AT, et al. Association between American Board of Surgery initial certification and risk of receiving severe disciplinary actions against medical licenses. JAMA Surg. 2020; 155:e200093.
7. Zhou Y, Sun H, Culley DJ, Young A, Harman AE, Warner DO. Effectiveness of written and oral specialty certification examinations to predict actions against the medical licenses of anesthesiologists. Anesthesiology. 2017; 126:1171–1179.
8. Sun H, Warner DO, Patterson AJ, et al. The American Board of Anesthesiology’s standardized oral examination for initial board certification. Anesth Analg. 2019; 129:1394–1400.
9. Wang T, Sun H, Zhou Y, et al. Construct validation of the American Board of Anesthesiology’s APPLIED Examination for initial certification. Anesth Analg. 2021; 133:226–233.
Copyright © 2021 International Anesthesia Research Society