Measuring Medical Students' Orientation Toward Lifelong Learning: A Psychometric Evaluation

Wetzel, Angela P.; Mazmanian, Paul E.; Hojat, Mohammadreza; Kreutzer, Kathleen O.; Carrico, Robert J.; Carr, Caroline; Veloski, Jon; Rafiq, Azhar

Section Editor(s): Artino, Anthony MD; Bordage, Georges MD, PhD

Academic Medicine. doi: 10.1097/ACM.0b013e3181ed1ae9
Promoting Improvement in Student Performance

Background: The principle of lifelong learning is pervasive in regulations governing medical education and medical practice; yet, tools to measure lifelong learning are lagging in development. This study evaluates the Jefferson Scale of Physician Lifelong Learning (JeffSPLL) adapted for administration to medical students.

Method: The Jefferson Scale of Physician Lifelong Learning–Medical Students (JeffSPLL-MS) was administered to 732 medical students in four classes. Factor analysis and t tests were performed to investigate its construct validity.

Results: Maximum likelihood factor analysis identified a three-factor solution explaining 46% of total variance. Mean scores of clinical and preclinical students were compared; clinical students scored significantly higher in orientation toward lifelong learning (P < .001).

Conclusions: The JeffSPLL-MS presents findings consistent with key concepts of lifelong learning. Results from use of the JeffSPLL-MS may reliably inform curriculum design and education policy decisions that shape the careers of physicians.

Author Information

Correspondence: Angela P. Wetzel, MEd, Office of Evaluation Studies, 730 East Broad Street, Box 980466, Richmond, VA 23298-0466; e-mail:

The principle of lifelong learning was included in medical oaths taken over 2,000 years ago,1 and it continues to guide curricular and other policy decisions that shape the careers of physicians today.2–4 In its Maintenance of Certification program, the American Board of Medical Specialties requires that practicing physicians meet standards for lifelong learning and self-assessment.2 The Accreditation Council for Graduate Medical Education specifies that “residents must demonstrate the ability to investigate and evaluate their care of patients … appraise and assimilate scientific evidence … [and] … continuously improve patient care based on constant self-evaluation and life-long learning.”3 Programs of undergraduate medical education accredited by the Liaison Committee on Medical Education4 are required to “include instructional opportunities for active learning … [and to] … provide students with the skills to support lifelong learning.”

Despite all the recommendations for individual competency and directives to uphold standards of education for lifelong learning, there is no single theory of lifelong learning that predominates in medical education and no commonly accepted set of tools unique to the exploration of learning across stages of a physician's career.5 Several instruments6–14 have been tested for reliability and validity, each with more or less potential to enable the execution of prospective studies that allow the assessment of lifelong learning—or self-directed learning, which is often seen as an aspect of lifelong learning15—and better-informed standards to guide the development of lifelong learning in practice. Among the tested instruments, the Jefferson Scale of Physician Lifelong Learning (JeffSPLL) succeeded most consistently in measuring the lifelong learning orientation of medical practitioners and academic physicians.8–10 Accordingly, it seems best suited for measuring the orientation of medical students toward lifelong learning, but the JeffSPLL offers no companion instrument for measuring lifelong learning in interns, residents, or undergraduate medical students.10 The present study helps to fill that gap by evaluating the construct validity and other psychometric characteristics of an adapted version of the JeffSPLL—a JeffSPLL for medical students (JeffSPLL-MS).

Method

All medical students in four classes (N = 732) were invited to participate in spring 2009. Six hundred fifty-two usable questionnaires in the final sample represented an 89% response rate overall. For each class, the response rate was M1 (156 of 196 = 80%); M2 (146 of 178 = 82%); M3 (182 of 190 = 96%); and M4 (168 of 168 = 100%). For test–retest analysis using the Pearson correlation coefficient, 40 questionnaires were solicited; 22 questionnaires were returned (55% response rate), yielding a sample size with power of 0.66.16

With 14 items, the scale for JeffSPLL scores ranges from 14 to 56, where higher scores indicate a stronger orientation toward lifelong learning. A recent study demonstrated that internal consistency measured by Cronbach alpha was large (0.86), as was test–retest reliability (r = 0.75).10
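The internal consistency statistic reported here can be reproduced on any item-score matrix. Below is a minimal sketch of Cronbach's alpha in Python; the 4-point responses are invented for illustration and are not study data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses on a 4-point scale: 6 respondents x 3 items.
demo = np.array([
    [4, 4, 3],
    [3, 3, 3],
    [2, 2, 1],
    [4, 3, 4],
    [1, 2, 2],
    [3, 4, 3],
])
alpha = cronbach_alpha(demo)
```

With 14 items scored 1 to 4, the same function applies directly to a (respondents × 14) matrix, and total scores span the 14-to-56 range described above.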

To create a version of the JeffSPLL with face and content validity for medical students, the current investigators from Virginia Commonwealth University (VCU) and two investigators from the original studies of the JeffSPLL (M.H. and J.V.) reviewed and modified 8 of 14 JeffSPLL items. The first draft was presented to 30 medical students at VCU and 30 VCU faculty involved in undergraduate medical education who reviewed each item for face and content validity, respectively. The construct of lifelong learning was defined as “a concept that involves a set of self-initiated activities and information seeking skills that are activated in individuals with a sustained motivation to learn and the ability to recognize their own learning needs.”8 Using a four-point Likert scale, respondents rated the relevance of each item as an indicator of a medical student's orientation toward lifelong learning. They also were invited to suggest alternate wording for ambiguous items or to recommend additions or deletions to strengthen content validity. One item was revised based on respondent feedback (60% response rate) and consensus among investigators (A.W., P.M., M.H., K.K., J.V., A.R.).

With approval of the VCU institutional review board, hard copies of the JeffSPLL-MS were distributed to all first-, second-, third-, and fourth-year medical students. Unique identifiers were assigned to each questionnaire to allow for longitudinal tracking and test–retest reliability analysis. For the initial distribution, verbal and written instructions explained the purpose of the study and informed students about confidentiality of the results. Participation was optional. Approximately eight weeks later, 10 students from each class were selected randomly and asked to complete an electronic version of the 14-item scale to estimate test–retest reliability.
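The test–retest reliability estimated from the second administration is simply the Pearson correlation between the two sets of total scores. A sketch with invented scores (the values below are illustrative only, not the 22 returned questionnaires):

```python
import numpy as np
from scipy import stats

# Hypothetical total scores (14-56 range) for the same eight students
# at two administrations about eight weeks apart; values are invented.
first = np.array([44, 41, 47, 39, 50, 43, 46, 38])
second = np.array([45, 40, 46, 41, 49, 42, 44, 40])

# Pearson correlation between administrations = test-retest reliability.
r, p = stats.pearsonr(first, second)
```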

Maximum likelihood factor analysis was performed on data from the sample of 652 medical students to extract the underlying components of the JeffSPLL-MS scale and to test model adequacy. Orthogonal and oblique rotations were applied to the solution, with an orthogonal rotation selected for interpretation. Cronbach alpha was calculated for the total scale and for each factor as a measure of internal consistency.
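As an illustration of this analytic step, the sketch below fits a maximum likelihood factor model with a varimax rotation to simulated data. The two-factor loading structure and six-item design are invented, and scikit-learn's FactorAnalysis (which estimates by ML) stands in for whatever software the authors used.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500

# Simulate 6 items driven by two latent factors; this loading structure
# is invented for illustration and is not the JeffSPLL-MS data.
true_loadings = np.array([
    [0.80, 0.00], [0.70, 0.10], [0.75, 0.05],   # items meant for factor 1
    [0.10, 0.80], [0.00, 0.70], [0.05, 0.75],   # items meant for factor 2
])
factor_scores = rng.normal(size=(n, 2))
x = factor_scores @ true_loadings.T + 0.4 * rng.normal(size=(n, 6))

# Maximum likelihood factor analysis with an orthogonal (varimax) rotation.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(x)
loadings = fa.components_.T     # (n_items, n_factors) rotated loadings

# Salience threshold of 0.32 on the rotated loadings, as in the Results.
salient = np.abs(loadings) >= 0.32
```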

Results

The overall mean score for the sample of 652 medical students was 43.52 (SD = 4.65) (see Table 1). Cronbach alpha was 0.77. Test–retest reliability over a two-month period yielded a value of 0.65. Given the potential for social desirability bias in student responses, frequencies for each item were examined for variability. Students responded using the full range of choices, and item mean scores ranged from 2.17 (SD = 0.77) for “I routinely attend meetings of student study groups” to 3.90 (SD = 0.30) for “Lifelong learning is a professional responsibility for all physicians.”

Adapting the JeffSPLL for use with medical students required examination of the factor structure of medical student data to determine whether components of lifelong learning were consistent across physician and medical student samples. Maximum likelihood factor analysis was performed on the correlation matrix to extract underlying components of the scale. At the 0.01 level, the χ2 goodness-of-fit statistic tested the null hypothesis of no discrepancy between the observed and predicted covariance matrices and indicated that a three-factor solution was not sufficient (P = .001), but a four-factor solution was adequate (P = .032). However, the fourth factor, which accounted for 6.7% of the variance, comprised only question 1, “Searching for the answer to a question is, in and by itself rewarding,” which did not load on the first three factors. The first three factors had eigenvalues greater than one,17 a three-factor model was supported by the scree test,18 and the fourth factor made little theoretical sense.19 On the basis of these criteria, the fourth factor was rejected and a three-factor solution retained, accounting for 46.3% of the total variance in the data (see Table 2). This percentage is moderate for attitudinal questionnaires.20
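The eigenvalue-greater-than-one (Kaiser) criterion invoked above can be checked directly from the inter-item correlation matrix. A sketch on simulated single-factor data (the 8-item design and loadings are invented, not the study data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Simulated responses with one common factor across 8 items; a stand-in
# for the 14-item inter-item correlation matrix, not real data.
g = rng.normal(size=(n, 1))
x = g @ np.full((1, 8), 0.6) + rng.normal(size=(n, 8))
corr = np.corrcoef(x, rowvar=False)

# Eigenvalues of the correlation matrix, sorted largest first.
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Kaiser criterion: retain factors whose eigenvalues exceed one; a scree
# plot of `eigvals` would show the corresponding elbow.
n_retained = int((eigvals > 1.0).sum())
```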

Orthogonal (varimax) and oblique (promax) rotations were applied to the solution, and both yielded similar structures. Therefore, a varimax rotation was used because of its ease of interpretability and reduced sensitivity to sampling error, promoting replicability of analysis.21

A 0.32 factor loading threshold was applied.22 Factor 1 (α = 0.70), named “learning beliefs and motivation,” was driven mainly by items 9, 8, and 11. It accounted for 26.4% of the total variance. The second factor (α = 0.61), labeled “skills in seeking information,” was driven mostly by items 6 and 5. It accounted for 12% of the variance. Finally, factor 3 (α = 0.59), a construct related to “attention to learning opportunities,” included items 13 and 12; it accounted for 7.9% of the total variance.

Because a similar factor structure emerged in both the physician and student samples, development across the continuum of training was explored. Physicians' mean JeffSPLL scores were higher than medical students' mean JeffSPLL-MS scores (see Table 1). At an institution with a 2 × 2 curriculum, first- and second-year students were grouped for analysis as preclinical medical students, and third- and fourth-year students were grouped as clinical medical students, to allow for comparisons. An independent samples t test confirmed that clinical medical students (X̄ = 44.16, SD = 4.58) scored significantly higher than preclinical medical students (X̄ = 42.79, SD = 4.63) (P < .001), illustrating a pattern in these samples of increasing total mean scores from preclinical medical students to clinical medical students to physicians.
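The group comparison reported here is a standard independent samples t test. The sketch below simulates total scores matching the reported group means and standard deviations, with group sizes taken from the class response counts in the Method (302 preclinical, 350 clinical); the individual scores are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated total scores mirroring the reported group statistics:
# clinical M = 44.16 (SD 4.58), preclinical M = 42.79 (SD 4.63).
clinical = rng.normal(loc=44.16, scale=4.58, size=350)
preclinical = rng.normal(loc=42.79, scale=4.63, size=302)

# Two-sided independent samples t test on the group means.
t_stat, p_value = stats.ttest_ind(clinical, preclinical)
```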

Discussion

Medical schools are required to teach the skills necessary for lifelong learning and to assess students' progress toward accomplishment of that goal. Yet, few rigorously tested tools are available to assess students in this area, to inform curriculum design, or to guide accreditation policy across the continuum of medical education. Results in the present study support use of the JeffSPLL-MS as a measure of lifelong learning for medical students. The three-factor solution complements the three-factor solution found in prior studies of academic physicians and practicing physicians.10 A consistent underlying structure emerged for both medical students and physicians, with more clinically involved medical students scoring significantly higher than preclinical medical students. Across groups, the construct stability of the three factors—learning beliefs and motivation, skills in seeking information, and attention to learning opportunities—suggests important implications. First, it implies that competencies may be identified to describe habits of the most successful lifelong learners. Second, there may be instructional strategies available that will enhance lifelong learning skills. Finally, the JeffSPLL could be adapted for use with interns and residents, thereby generating additional data for analysis in curricular and policy decisions that involve medical education and physicians.

Based on the orthogonal rotation factor solution with a 0.32 factor loading criterion, one question did not load on any of the three factors in the medical student population of VCU. With the discretion afforded investigators in selecting an appropriate number of factors,19 theoretical fit of the three factors was meaningful without the item. Subsequent research with larger samples, potentially more diverse demographics, and varied curricula may suggest other methods of factor analysis to determine whether the item is to be revised or dropped from the JeffSPLL-MS.

Building on the foundations of the JeffSPLL, this study of the JeffSPLL-MS offers strong support for its reliability and validity for use with medical students and new evidence to support the three-factor structure of lifelong learning. As medical education endeavors to understand how to teach and assess skills associated with lifelong learning in a field that lacks an established theory to explain the phenomenon, an inductive approach to research and analysis allows elements of lifelong learning to be identified and eventually tested for correlations with the principle of lifelong learning and its measurement in practice.23

The JeffSPLL measures orientation or attitudes toward lifelong learning, and orientation often leads to relevant behaviors.23 Empirical evidence shows a significant link between scores on the scale and behavioral manifestations such as publications or presentations in public media and professional accomplishments including awards and honors.10,23 Subsequent research should include further examination of behavioral outcomes, elaboration of a theory of lifelong learning, and testing of hypotheses for predictive validity throughout the career span of physicians. In that way, policy making, curriculum planning, and evaluation may be informed with science to improve teaching and learning.

Other disclosures:



Ethical approval:

The Virginia Commonwealth University institutional review board approved this study.

References

1 Rancich AM, Perez ML, Morales C, Gelpi RJ. Beneficence, justice, lifelong learning expressed in medical oaths. J Contin Educ Health Prof. 2005;25:211–220.
2 American Board of Medical Specialties. Standards for ABMS MOC (Parts 1–4) Program. Available at: Accessed February 17, 2010.
3 Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: Accessed February 17, 2010.
4 Liaison Committee on Medical Education. Accreditation Standards. Available at: Accessed February 17, 2010.
5 Institute of Medicine. Health Professions Education: A Bridge to Quality. Washington, DC: The National Academies Press; 2003.
6 Fisher MJ, King J. The Self-Directed Learning Readiness Scale for nursing education revisited: A confirmatory factor analysis. Nurse Educ Today. 2010;30:44–48.
7 Huynh D, Haines ST, Plaza CM, et al. The impact of advanced pharmacy practice experiences on students' readiness for self-directed learning. Am J Pharm Educ. 2009;73(4):Article 65.
8 Hojat M, Nasca TJ, Erdmann JB, Frisby AJ, Veloski JJ, Gonnella JS. An operational measure of physician lifelong learning: Its development, components, and preliminary psychometric data. Med Teach. 2003;25:433–437.
9 Hojat M, Veloski J, Nasca TJ, Erdmann JB, Gonnella JS. Assessing physicians' orientation toward lifelong learning. J Gen Intern Med. 2006;21:931–936.
10 Hojat M, Veloski J, Gonnella JS. Measurement and correlates of physicians' lifelong learning. Acad Med. 2009;84:1066–1074.
11 Harvey BJ, Rothman AI, Frecker RC. Effect of an undergraduate medical curriculum on students' self-directed learning. Acad Med. 2003;78:1259–1265.
12 Hoban JD, Lawson SR, Mazmanian PE, Best AM, Seibel HR. The Self-Directed Learning Readiness Scale: A factor analysis study. Med Educ. 2005;39:370–379.
13 Shokar GS, Shokar NK, Romero CM, Bulik RJ. Self-directed learning: Looking at outcomes with medical students. Fam Med. 2004;34:197–200.
14 Hendry GD, Ginns P. Readiness for self-directed learning: Validation of a new scale with medical students. Med Teach. 2009;31:918–920.
15 Candy PC. Self-Direction for Lifelong Learning: A Comprehensive Guide to Theory and Practice. San Francisco, Calif: Jossey-Bass; 1991.
16 Dixon WJ, Massey FJ. Introduction to Statistical Analysis. 4th ed. New York, NY: McGraw-Hill Book Co; 1983.
17 Kaiser HF. The application of electronic computers to factor analysis. Educ Psychol Meas. 1960;20:141–151.
18 Cattell RB. The scree test for the number of factors. Multivariate Behav Res. 1966;1:245–276.
19 Johnson RA, Wichern DW. Applied Multivariate Statistical Analysis. 5th ed. Upper Saddle River, NJ: Prentice Hall; 2002.
20 Pett MA, Lackey NR, Sullivan JJ. Making Sense of Factor Analysis: The Use of Factor Analysis for Instrument Development in Health Care Research. Thousand Oaks, Calif: Sage Publications, Inc.; 2003.
21 Pedhazur EJ, Schmelkin LP. Measurement, Design, and Analysis: An Integrated Approach. Mahwah, NJ: Lawrence Erlbaum Associates; 1991.
22 Tabachnick BG, Fidell LS. Using Multivariate Statistics. 5th ed. Boston, Mass: Allyn & Bacon; 2007.
23 Hojat M, Veloski JJ, Gonnella JS. Physician lifelong learning: Conceptualization, measurement, and correlates in full-time clinicians and academic clinicians. In: Caltone MP, ed. Handbook of Lifelong Learning Developments. Hauppauge, NY: Nova Science Publishers; 2009:37–78.
© 2010 Association of American Medical Colleges