Academic Medicine: October 2007 - Volume 82 - Issue 10
doi: 10.1097/ACM.0b013e31813ffd3a
Predicting Clinical Skills Performance

Demystifying “Learning” in Clinical Rotations: Do Immersive Patient Encounters Predict Achievement on the Clinical Performance Examination (CPX)?

Fung, Cha-Chi; Relan, Anju; Wilkerson, LuAnn

Section Editor(s): Ruiz, Jorge MD; Kovach, Regina MD

Author Information

Correspondence: Cha-Chi Fung, PhD, UCLA David Geffen School of Medicine, Dean’s Office/Ed&R, 60-051 CHS, Los Angeles, CA 90095-1722; e-mail: ccfung@mednet.ucla.edu.

Abstract

Background: Although the Liaison Committee on Medical Education required U.S. medical schools to quantify students’ clinical encounters, the optimum patient exposure needed to predict performance has remained elusive. This study explored the relationship between comprehensive patient encounters logged on personal digital assistants (PDAs) during three medicine clerkships and performance on a clinical performance examination (CPX).

Method: PDA log data for 166 medical students were used to identify educationally “rich” patient encounters, in which students were assigned full responsibility for patient care.

Results: Univariate regression analyses predicting the effect of immersive patient encounters on CPX case scores did not show statistical significance.

Conclusions: The amount of patient exposure, defined by the richness of student–patient interaction, did not predict performance on the six selected CPX cases. Further research should examine qualitatively different learning experiences that occur during patient encounters, as well as higher volumes of exposure, as predictors of performance outcomes.

Clinical experiences are the highlight of medical students’ learning on their journey towards expertise in medicine. In recognition of this, the Liaison Committee on Medical Education (LCME) officially stated in Standard ED-2 that “the objectives of clinical education must include quantified criteria for the types of patients (real or simulated), the level of student responsibility, and the appropriate clinical settings needed for the objectives to be met.”1 Implicit in this requirement was the assumption that medical students should be exposed to breadth and depth in patient encounters for a complete clinical experience, and that this experiential learning would be invaluable in fulfilling the performance objectives of clinical education.*

Prompted by the LCME requirement, medical schools have implemented logbooks or electronic logs using personal digital assistants (PDAs) to capture students’ patient encounters.2–4 The data generated have been used successfully to identify site differences, compare clerkships, and monitor students’ exposure to patients. However, studies on the relationship between the process measures recorded by students and outcome-assessment measures have been sparse and inconclusive. In a review of 50 studies of logbooks in undergraduate medical education, Denton et al5 (p162) observe that “overall, the most pressing research need is to expand the field connecting educational structure and process to outcome measures … the lack of sufficiently scientifically rigorous published articles connecting process to outcome measures in undergraduate medical education is concerning and does not appear to support the LCME’s recent insistence on extensive documentation of students’ patient encounters.”

Medical schools have grappled with identifying the range of patients that students must see for a meaningful clinical learning experience, one indication of which is the relationship between supervised patient encounters and performance measures. Such supervision can be assumed, at a minimum, to entail feedback on a student’s patient encounter as it pertains to patient outcome, including history taking, physical examination, or patient education. Against this backdrop, medical schools are inclined to believe that “the more patients medical students see, the better they are at patient care.” Although rooted in conventional wisdom, this belief also has theoretical merit, as revealed by Ericsson’s6 theory of deliberate practice, which systematically demonstrates the effects of well-defined practice characteristics on higher levels of performance in attaining expertise and maintaining competence. In reviewing research in clinical education, however, it is vexing that studies predicting performance from patient exposure have failed to show the expected outcomes.7–9 The lack of significant findings in these studies could be a function of high levels of variability in students’ clerkship experiences, or it could be attributable to variance among predictor variables that are extraneous to the measured skills. In ambulatory settings, students may see patients with multiple problems, may be only partially involved in patient care, or may see a large number of patients with a limited range of problems. Any of these conditions may create a learning experience that does not transfer to the cases used in assessments.9 In an attempt to quantify variables contributing to performance, recent research examining the multiple variables at play in clinical education has highlighted the importance of supervision and feedback.10 An examination of the patient care experience from multiple perspectives is needed to uncover the variables associated with transferable learning during patient encounters.

The purpose of this study was to examine the relationship between the amount of comprehensive patient exposure experienced in three core clerkships in primary care and performance on a clinical performance examination (CPX). The selected clerkships were ambulatory medicine (AM), inpatient internal medicine (IM), and family medicine (FM), which captured the clinical problems tested in the CPX: abdominal pain, hypertension, diabetes, back pain, cough, and chest pain. The CPX is an annual Objective Structured Clinical Examination (OSCE) administered at the end of the core clerkships and used to assess students’ history taking, physical examination, and patient–physician interaction (communication) skills. We examined the effect of patient encounters on performance according to three inclusion criteria that defined the immersive learning experience associated with the encounter: (1) patients with problems resembling cases on the CPX, (2) patients who presented with fewer than two complaints, and (3) patients for whom students had full responsibility for care. This level of alignment between the patients seen and the patients assessed has not been achieved in previous studies. Patient encounters meeting these criteria would be problem specific, with high levels of patient interaction, including taking a history, conducting a physical examination, writing a plan, writing orders, and communicating with the patient. We hypothesized that a full, immersive learning experience with patient encounters during the core clerkships would predict performance on similar cases assessed in an OSCE setting.

Method

A retrospective study design based on PDA patient log data and CPX performance scores gathered in 2004–2005 was employed to examine performance outcomes. The study, conducted at the David Geffen School of Medicine at the University of California–Los Angeles, was approved by the institutional review board. A total of 166 student records containing PDA log data and CPX scores were analyzed.

From the raw dataset, we included patients for whom students had documented “full” responsibility for patient care and who presented with no more than two complaints related to the six clinical problems included in the CPX. This subset of patient data ensured that the learning experience would most closely match the CPX assessment criteria.
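
As a rough illustration of this subsetting step, the sketch below assumes the PDA log has been exported to a table with hypothetical column names (problem, responsibility, n_complaints) and a hypothetical DataFrame called pda_log; the original analysis was conducted in SPSS, so this code is illustrative only, not the authors’ method.

# Hypothetical sketch of the log-subsetting step described above; column names
# and the DataFrame `pda_log` are assumptions, not part of the original study.
import pandas as pd

CPX_PROBLEMS = {"chronic cough", "hypertension", "abdominal pain",
                "back pain", "chest pain", "diabetes"}

def build_analysis_subset(log: pd.DataFrame) -> pd.DataFrame:
    """Keep encounters with full responsibility, no more than two complaints,
    and a presenting problem that matches one of the six CPX cases."""
    matches_cpx = log["problem"].str.lower().isin(CPX_PROBLEMS)
    full_resp = log["responsibility"].str.lower() == "full"
    few_complaints = log["n_complaints"] <= 2
    return log[matches_cpx & full_resp & few_complaints]

# Example: count qualifying encounters per student and problem.
# subset = build_analysis_subset(pda_log)
# counts = subset.groupby(["student_id", "problem"]).size().unstack(fill_value=0)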

Students reported patient encounters using an institutionally developed PDA patient log, which is a required exercise for all third-year medical students rotating through the seven core clerkships. Students were required to document a predetermined number of patients, which was unique for each clerkship (a minimum of 50 patients each for AM and FM, and 12 for IM). The complaints and diagnoses considered most important for adequate clinical experience by each clerkship were included on the log using pull-down menus that allowed multiple selections. Students entered the level of responsibility assigned for each patient from the following choices: observed, minimal, moderate, and full. The entire process of patient entry was explained (and validated) by using six mock cases at the beginning of the clerkships. Full responsibility was defined as having taken a patient’s history, conducted a physical exam, written orders, and developed a plan under supervision.

The CPX is an eight-station OSCE, from which we selected the following six cases for the study: chronic cough, hypertension, abdominal pain, back pain, chest pain, and diabetes. The internal consistency of the CPX component scores was assessed using Cronbach’s alpha. On the basis of the reliability of the component scores and their relevance to the study, we selected history taking (α = 0.63), physical examination (α = 0.70), and patient–physician interaction (α = 0.90) to generate the CPX case scores (Cronbach’s α = 0.34–0.65) used as the outcome variable in all analyses. Univariate regression analyses were performed for each case, regressing the CPX case score on the number of comprehensive patient encounters (patients seen with full responsibility), using SPSS version 14.0. Independent-sample t tests were performed to examine whether there were group differences in performance between the lowest and the highest quartiles according to the number of patients seen.
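
For readers who wish to reproduce this style of analysis, the sketch below re-creates the three statistical steps (Cronbach’s alpha, univariate regression of case score on encounter count, and the lowest-versus-highest quartile t test) in Python. The original study used SPSS 14.0; all variable and column names here are assumed for illustration.

# Illustrative re-creation of the analyses described above; the data frame `df`
# and its column names are assumptions, not the authors' actual code or data.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item scores (rows = students, columns = items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def univariate_regression(n_encounters: pd.Series, cpx_score: pd.Series) -> dict:
    """Regress a CPX case score on the number of full-responsibility encounters."""
    slope, intercept, r, p, se = stats.linregress(n_encounters, cpx_score)
    return {"slope": slope, "r_squared": r ** 2, "p_value": p}

def quartile_t_test(n_encounters: pd.Series, cpx_score: pd.Series) -> dict:
    """Compare CPX scores between the lowest and highest quartiles of patient volume."""
    q1, q3 = n_encounters.quantile([0.25, 0.75])
    low = cpx_score[n_encounters <= q1]
    high = cpx_score[n_encounters >= q3]
    t, p = stats.ttest_ind(low, high)
    return {"t": t, "p_value": p}

# Example (hypothetical frame `df` with one row per student for a given case):
# univariate_regression(df["n_full_responsibility"], df["cpx_case_score"])
# quartile_t_test(df["n_full_responsibility"], df["cpx_case_score"])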

Results

There was wide variability in patient encounters by problem seen (Table 1). Cough had the fewest entries, whereas hypertension was reported in considerably more cases (mean = 5.82 and 21.44, respectively). The subset of patients identified with full responsibility represented roughly two thirds of the total reported for each problem (Table 1).

Table 1

Regression analyses using the CPX case scores as the dependent variable and the number of patients seen with full responsibility in each of the clinical problems as the independent variable showed that immersive patient encounters did not predict students’ performance on the CPX. Residual plots showed that assumptions of distribution normality were met.

Independent-sample t tests, performed to examine differences in CPX case scores between the lowest (first) and the highest (fourth) quartiles according to number of patients seen, were not statistically significant for any of the six cases (Table 2).

Table 2

Discussion

In this study we explored whether the amount of exposure to immersive, comprehensive patient encounters across three primary care clerkships would predict performance on analogous skills in similar cases in a clinical performance examination using standardized patients. Despite the wide range of reported exposure, the number of patients seen did not predict performance on the CPX. Students who reported few encounters with a specific patient problem were as likely to score high as low on the matching case in the CPX. These findings suggest that clinical performance cannot be predicted by the number of patients for whom students assume full responsibility in real clinical settings. Given the failure to find any significant differences between the lowest and the highest quartiles of patient volume, and given that the patient distributions in the two groups were almost identical, the results seem to imply that the patient volume in our study may not have been sufficient to predict performance, indicating a possible ceiling effect. The time allotted for students to complete clerkships may not be sufficient to expose them to the number of patients needed to generate a significant effect on clinical performance. If clinical education could be restructured to allow students sufficient practice to attain mastery in clinical skills, we might see a positive effect of patient volume on performance outcomes. However, given that time and resources are finite, we need to make students’ limited clinical experiences meaningful by packing them with opportunities for deep learning. Even with its focus on patient encounters that potentially promoted deeper learning, our study leaves open the question implicit in the LCME requirement: what is the optimum number of patients to which students must be exposed to generate a difference in clinical performance?

In research examining the role of deliberate practice in continuing professional growth and expertise, Ericsson6 found that effective practice is characterized by clearly focused goals, time commitment, immediate feedback, and reflection; thus, it is the quality of practice that determines success. Learning experiences in the current clinical setting may not support a full implementation of Ericsson’s theory. Time to practice is restricted by the limited face time available with patients in the actual clinic. Students may lack clear goals even though they are motivated to improve their clinical skills. Other key features of the theory of deliberate practice that have been difficult to capture in a clinical learning environment are the quality and quantity of feedback, as well as reflection. Feedback was implied in the current study by ascertaining from log data that all student–patient encounters were, at a general level, supervised. However, the specific nature of that supervision, and whether it entailed the feedback we assumed, could not be verified from the log. In light of our findings, implementing Ericsson’s6 theoretical framework in its entirety would be a useful starting point for structuring clerkship experiences that lead to deeper learning. Studies that incorporate multiple dimensions of the learning experience, including explicit goals, patient volume, level of responsibility, and the extent and type of faculty supervision and feedback, are needed to explore the effectiveness of clinical education in fostering clinical skills and to account for variance in performance.

Our study had several limitations, some of which are commonly reported with data generated from patient logbooks.5 All patient data were self-reported, with no monitoring system in place other than the honor code and supervision of these data by site directors. The reliability of such data can be questioned on the basis of the erratic nature of clerkship experiences. Although students were trained on when to enter full responsibility for patients, the term could still be interpreted differently by individuals and could vary across settings. Finally, the requirement to enter a predetermined number of patients in each clerkship may have limited our ability to capture the full extent of students’ patient exposure.

In conclusion, our findings replicate the results of previous studies, which have consistently failed to show learning effects from patient exposure alone.2–4 This adds to the body of evidence questioning the quantification of patient encounters previously required by LCME accreditation standard ED-2. We propose greater efforts to ensure the accuracy and reliability of data gathered during clinical education, and further research into the quality of learning experiences associated with patient encounters that result in favorable performance outcomes. The complexity of “learning” from clinical encounters needs to be investigated from multiple perspectives to help develop a generalizable, performance-driven model of clinical education.

References

1. Liaison Committee on Medical Education (LCME). LCME accreditation standards. Available at: http://www.lcme.org/standard.htm. Accessed July 20, 2007.

2. Pipas CF, Carney PA, Eliassen MS, Mengshol SC, Fall LH, Olson AL. Development of a handheld computer documentation system to enhance an integrated primary care clerkship. Acad Med. 2002;77:600–609.

3. Markham FW, Rattner S, Hojat M, Louis DZ, Rabinowitz C, Gonnella JS. Evaluations of medical students’ clinical experiences in a family medicine clerkship: differences in patient encounters by disease severity in different clerkship sites. Fam Med. 2002;34:451–454.

4. Rattner SL, Louis DZ, Rabinowitz C, et al. Documenting and comparing medical students’ clinical experiences. JAMA. 2001;286:1035–1040.

5. Denton GD, DeMott C, Pangaro LN, Hemmer PA. Narrative review: use of student-generated logbooks in undergraduate medical education. Teach Learn Med. 2006;18:153–164.

6. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.

7. Chatenay M, Maguire T, Skakun E, Chang G, Cook D, Warnock GL. Does volume of clinical experience affect performance of clinical clerks on surgery exit examinations? Am J Surg. 1996;172:366–372.

8. Neumayer L, McNamara RM, Dayton M, Kim B. Does volume of patients seen in an outpatient setting impact test scores? Am J Surg. 1998;175:511–514.

9. Gruppen LD, Wisdom K, Anderson DS, Woolliscroft JO. Assessing the consistency and educational benefits of students’ clinical experiences during an ambulatory care internal medicine rotation. Acad Med. 1993;68:674–680.

10. Wimmers PF, Schmidt HG, Splinter TA. Influence of clerkship experiences on clinical competence. Med Educ. 2006;40:450–458.

*The LCME has since revised accreditation standard ED-2 to eliminate the requirement of “quantifying the types of patients or clinical conditions that students are expected to encounter in order to meet clinical learning objectives,” owing to numerous reports of difficulties in carrying out this task as well as the lack of empirical evidence supporting the requirement. The revised standard can be found on the LCME Web site, http://www.lcme.org/ed2change.htm.

© 2007 Association of American Medical Colleges
