Sustaining knowledge and skills through continuing medical education (CME) is a characteristic of medical professionalism.1 To ensure that CME is relevant and effective, the Accreditation Council for Continuing Medical Education (ACCME) now mandates that CME be built around “the educational needs (knowledge, competence, or performance) that underlie the professional practice gaps of their own learners.”2 The ACCME defines the professional practice gap as “the difference between health care processes and outcomes observed in practice, and those potentially achievable on the basis of current professional knowledge.”2
A Logical Approach for Defining Gap Statements
Writing gap statements is challenging for CME course directors because there is no universally accepted format for these statements. We thus created a logical approach to defining gap statements for CME accreditation; it involves four steps indicated by the mnemonic LASO (learner, assessment, standard, outcomes): (1) Define the learner population’s characteristics, (2) create a learning needs assessment, (3) determine if the standard is met, and (4) state educational outcomes for the CME activity. (These four steps are explained in Table 1.) Based on this model, the difference between the practice standard and current practice represents the gap in practice. In this article, we discuss each component of LASO in terms of its rationale and the supporting literature. Finally, we provide practical examples of how to develop CME around practice gaps using the LASO approach.
Step 1: Define the learner population’s characteristics
Step 1 in creating a practice gap involves defining the learner’s stage of professional development and his or her practice context.3–6 Learners could be at various stages of professional development, including junior practicing physicians, midcareer physicians preparing to recertify in their specialties, or experienced physicians who have been practicing for decades. Defining the learner’s professional stage will shape the content and scope of the CME activity. For example, midcareer physicians may benefit from a comprehensive review of the field of medicine, whereas more experienced physicians may need updates only on specific topics that are commonly encountered in practice. It has also been observed that there may be an inverse relationship between a physician’s clinical experience and the quality of care that he or she provides.7
Additionally, CME providers must consider the learner’s practice context, which broadly includes specialty (e.g., internal medicine, general surgery), patient care environment (e.g., outpatient, emergency room, hospital), and scope (e.g., local, national, international). For instance, a pulmonary diseases update for family medicine physicians might discuss asthma management, community-acquired pneumonia, and the diagnostic approach to cough, whereas the same update for critical care specialists might discuss acute respiratory distress syndrome (ARDS), health-care-associated pneumonia, and acute pulmonary embolism.
Step 2: Create a learning needs assessment
Step 2 involves performing a needs assessment by determining to what extent a standard is being met, analyzing why the standard may not be met, and identifying the learning competency to be assessed. CME activities that are founded on learning-needs assessments have the greatest impact on physician practice behavior.6,8 There are multiple types of learning needs that can be assessed.9 Specifically, normative needs are decided by professional organizations and constitute the standards for what a learner should know. Comparative needs are determined by educators and compare two groups of learners. Perceived needs are what learners think they should know; unfortunately, physicians’ self-assessments of their learning needs may be unreliable.10 Unperceived needs are determined by academic institutions and are those needs that learners do not realize they need to know.9,11
Standards of practice found in guidelines, such as the Glycemic Control and Type 2 Diabetes Mellitus Guideline of the American College of Physicians,12 can serve as a gold standard that should be known and met by all practitioners. Certain authorities, like the Agency for Healthcare Research and Quality (AHRQ) National Guideline Clearinghouse,13 provide lists of guidelines. Additionally, standards of care may be based on recommendations from groups like the AHRQ, which suggest evidence-based benchmarks for quality.14,15
There are various methods for determining educational needs and whether standards are being met.8,9,16 Normative needs can be established based on experts’ awareness of new guidelines, recent publications, and problems typically encountered in subspecialty practice. Learners’ perceived needs can be generated from clinicians’ personal journals,17 reflection on sentinel events,8 surveys,18 standardized patients, chart audits, and focus groups.19
Learners’ needs can also be obtained from published literature, public health data, and expert consensus about what practicing physicians should know. For example, searching the literature with the key words “guideline” and “adherence” reveals useful information regarding physician practice patterns, which can support CME needs statements. Our use of these search terms identified a national study showing that only 14% of eligible nursing home patients were on antihypertensive medications when indicated and not contraindicated.20
Needs assessments can also be derived from public health data. For instance, the Minnesota Department of Human Services reports health care disparities for unique populations with respect to breast cancer screening, cervical cancer screening, colorectal cancer screening, and childhood immunizations.21 Similar information is provided by the federal government, private organizations, and some states.22,23
Once the CME participants’ needs are identified, then the topic should be aligned with one or more of the competencies mandated by the Accreditation Council for Graduate Medical Education (ACGME). The ACGME24 and the American Board of Medical Specialties25 have established six physician competencies: medical knowledge, professionalism, communication, patient care, practice-based learning and improvement, and systems-based practice. Identifying the appropriate ACGME competencies for the CME topic can help to clarify the CME objectives and ensure that the CME activity is aligned with the desired educational outcomes.
Step 3: Determine if the standard is met
Step 3 is determining the gap, which is defined as the difference between current practice and the standard that should be met. As an example, the AHRQ’s quality measures state that patients with pneumonia should receive an antibiotic within six hours of presentation to the hospital.26 A survey of patient records may show that only 40% of hospitalized patients with pneumonia receive antibiotics within six hours of admission to a regional hospital. In this example, the gap is that antibiotics are not being given according to guidelines. The needs assessment may determine the root cause of the problem, such as no standardized orders, failure of the pharmacy to respond, or a failure in communication. Because the hospital’s antibiotic delivery does not achieve the standard, a gap in practice exists that may benefit from a CME intervention.
Step 4: State the educational outcome
Step 4 is to state how the CME activity will fill the practice gap: what the educational outcome is and how that outcome will be measured. CME education interventions could include workshops, lectures, articles, rounds, and electronic modules.27 Regardless of the intervention type, research suggests that the most effective CME activities are interactive, use small groups, and focus on a narrow topic.28,29
Kirkpatrick’s hierarchy of outcomes,30,31 widely used in medical education, provides a useful means for organizing CME outcomes and has been adapted as a model by the ACCME. According to Kirkpatrick’s hierarchy, outcomes are categorized into the following levels: (1) reaction (learner satisfaction with the CME activity), (2) learning (acquisition of knowledge or skills), (3) behavior (transferring CME learning to the workplace), and (4) results (improved patient outcomes as a result of CME learning).30 Progressing up this hierarchy from reaction to results yields outcomes that are increasingly relevant, yet more difficult to measure. Notably, the ACCME recognizes that filling a practice gap in knowledge is insufficient without demonstrating some ability of the CME participant to apply that knowledge.2 Therefore, CME providers must measure outcomes beyond Level 1. Additional outcomes frameworks for assessing CME exist, including those by Miller32 and Moore et al.33
Various assessment strategies for each of the outcome levels can be employed. Reaction is typically assessed using satisfaction surveys completed by the course participants. Learning can be determined using pre- and posttests to assess knowledge acquisition. Participants’ behaviors can be measured using self-reports, objective structured clinical examinations, chart reviews, or provider performance information from patient databases. Results can be assessed by examining specific patient outcomes as documented in the medical record, or as quality measures reported by oversight agencies.
LASO Gap Analysis Examples
Below, we provide two examples of how the LASO mnemonic has been successfully used at the Mayo Clinic to frame educational gaps for CME activities.
Medical grand rounds example
* Learner—Faculty physicians at the Mayo Clinic who attend medical grand rounds for CME credit. Attendees at medical grand rounds also include allied health professionals, fellows, residents, and medical students.
* Assessment—Dabigatran is a new medication approved for prophylaxis against thromboembolism in patients with atrial fibrillation. The medical grand rounds advisory committee has received several requests for information about dabigatran based on perceived needs of physicians. The committee recognized that all physicians in the Department of Medicine will likely care for patients on dabigatran and, therefore, believed that all physicians need information on this new medication.
* Standard—Because dabigatran is a new medication unfamiliar to many internists in the Mayo Clinic Department of Medicine, an educational gap exists.
* Outcome—A thrombophilia expert with experience with dabigatran gave a lecture at medical grand rounds to educate faculty members about dabigatran. The specific objectives were (1) to list the approved indications for dabigatran, (2) to compare and contrast dabigatran with warfarin, and (3) to apply initiation of dabigatran to a clinical case. The CME intervention was assessed with a multiple-choice, pre- and posttest of knowledge administered using an audience response system, thus measuring Kirkpatrick’s second level of educational outcomes (i.e., learning). This assessment showed that physicians better understood the indications, dosing, and contraindications of dabigatran and correctly applied it to a clinical scenario after attending the medical grand rounds lecture.
Annual CME course example
* Learner—The Mayo Clinic Selected Topics in Internal Medicine CME course is offered annually to all practicing internists in the United States, Canada, and Australia. Learners who attend this course typically include general practitioners in academic and private practice and subspecialists interested in maintaining their general internal medicine knowledge.
* Assessment—The American Board of Internal Medicine mandates that internists demonstrate basic competency at quality improvement (QI), including the concept of systems-based practice.34 Additionally, the Institute for Healthcare Improvement states that root cause analysis (RCA) is a fundamental concept for QI.35 Learners’ needs were assessed by administering a questionnaire to all past participants of the CME course to determine perceived and unperceived needs. Most learners responded that instruction on QI and RCA would be a beneficial addition to the course because application of QI methodologies to actual practice was identified as an area of weakness.
* Standard—Because skill at QI is mandated and physicians are uncomfortable with its techniques, a learning gap exists.
* Outcome—The course directors will add a QI workshop to the annual course next year. The workshop will provide didactic instruction on RCA and will reinforce this knowledge by having participants apply RCA to realistic cases. Learners will begin a QI project that can be taken from the workshop and implemented in their home practice. Specific objectives are to list the components of an RCA and to apply the principles of an RCA in case-based scenarios. The CME intervention will be assessed by having the participants send a report of their QI project and RCA three months after completion of the workshop. The reports will be scored using validated measures of QI projects, thus measuring Kirkpatrick’s third level of educational outcomes (i.e., behaviors).
LASO Approach for CME Gap Statements
Successful CME courses require that CME providers identify the intended audience, assess learners’ needs, identify gaps in knowledge, and evaluate CME activities according to meaningful levels of outcomes.2,30,31 Although professional practice gaps are the cornerstone of CME activities, the process of defining a professional gap has been challenging for CME providers. As described above, we propose a stepwise approach, based on the LASO mnemonic, for creating CME gap statements: (1) define the learner population’s characteristics, (2) create a learning needs assessment, (3) determine if the standard is met, and (4) state the educational outcomes. We have used the LASO method to define professional practice gaps for Mayo Clinic CME activities. In our experience, the LASO approach has made CME content learner centered, relevant, and measurable. We believe that LASO can provide all CME course directors with a practical approach to defining educational gaps for CME accreditation.
Other disclosures: None.
Ethical approval: Not applicable.
1. American Board of Internal Medicine; ACP-ASIM Foundation; European Federation of Internal Medicine. Medical professionalism in the new millennium: A physician charter. Ann Intern Med. 2002;136:243–246
2. Accreditation Council for Continuing Medical Education Web site. http://www.accme.org/. Accessed January 16, 2012
3. Beckman TJ, Lee MC. Proposal for a collaborative approach to clinical teaching. Mayo Clin Proc. 2009;84:339–344
4. Wolpaw TM, Wolpaw DR, Papp KK. SNAPPS: A learner-centered model for outpatient education. Acad Med. 2003;78:893–898
5. Palmer PJ. The Courage to Teach: Exploring the Inner Landscape of a Teacher’s Life. San Francisco, Calif: Jossey-Bass; 2007
6. Fox RD, Bennett NL. Learning and change: Implications for continuing medical education. BMJ. 1998;316:466–468
7. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: The relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260–273
8. Norman GR, Shannon SI, Marrin ML. The need for needs assessment in continuing medical education. BMJ. 2004;328:999–1001
10. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: A systematic review. JAMA. 2006;296:1094–1102
11. Gordon MJ. A review of the validity and accuracy of self-assessments in health professions training. Acad Med. 1991;66:762–769
12. Qaseem A, Vijan S, Snow V, Cross JT, Weiss KB, Owens DK. Glycemic control and type 2 diabetes mellitus: The optimal hemoglobin A1C targets—A guideline statement from the American College of Physicians. Ann Intern Med. 2007;147:417–422
13. Agency for Healthcare Research and Quality. National Guideline Clearinghouse. http://www.guideline.gov/. Accessed January 16, 2012
16. Grant J. Learning needs assessment: Assessing the need. BMJ. 2002;324:156–159
17. Perol D, Boissel J, Broussolle C, Cetre J, Stagnara J, Chauvin F. A simple tool to evoke physicians’ real training needs. Acad Med. 2002;77:407–410
18. Knoll M, Olivieri JJ. Audience-specific needs assessment: Using a gap analysis survey of CME conference registrants to assess presentation content. J Contin Educ Health Prof. 2008;28:284–285
19. Cohen R, Amiel GE, Tann M, Shechter A, Weingarten M, Reis S. Performance assessment of community-based physicians: Evaluating the reliability and validity of a tool for determining CME needs. Acad Med. 2002;77:1247–1254
20. Drawz PE, Bocirnea C, Greer KB, Kim J, Rader F, Murray P. Hypertension guideline adherence among nursing home patients. J Gen Intern Med. 2009;24:499–503
22. Agency for Healthcare Research and Quality Web site. http://www.ahrq.gov/. Accessed January 16, 2012
27. Mazmanian PE, Davis DA. Continuing medical education and the physician as learner. JAMA. 2002;288:1057–1060
28. Mansouri M, Lockyer J. A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007;27:6–15
29. Davis D, O’Brien MAT, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education—Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867–874
30. Kirkpatrick D. Revisiting Kirkpatrick’s four-level model. Train Dev J. 1996;50:54–59
31. Beckman TJ, Cook DA. Developing scholarly projects in education: A primer for medical teachers. Med Teach. 2007;29:210–218
32. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63–S67
33. Moore DE Jr, Green JS, Gallis HA. Achieving desired results and improved outcomes: Integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29:1–15