Professional development (PD), including both formal for-credit continuing medical education (CME)1–3 and informal learning pursued in response to patient-oriented clinical questions or other self-identified learning needs,4–7 is vital to every physician’s professional success.1,2,6,8 Yet experts have raised concerns regarding how CME is delivered, financed, regulated, and evaluated.1,8 Evidence suggests that common CME approaches, such as educational meetings,9 printed materials,10 and audit and feedback,11 have only limited impact, while interactive, multimodal activities reinforced over time have a greater likelihood of changing practice.12,13 Additionally, physicians incompletely understand their own PD needs or how they might effectively identify and address these needs.7,14–17 Moreover, physicians tend to focus on their clinical knowledge and skills, rather than on other important competencies such as communication, teamwork, practice improvement, and lifelong learning. In response, leaders have called for system-wide transformations in PD delivery1–3,6,8 and the fostering of physicians’ ability to identify and meet their own learning needs.6,7,16
Professional organizations, certification boards, nonprofit foundations, for-profit businesses, state medical boards, and other government organizations have all contributed to discussions around the optimization of PD for physicians. By contrast, research documenting the beliefs, attitudes, and desires of physicians regarding their PD is relatively sparse. Recent surveys have focused on a narrow subset of PD issues, including information-seeking behaviors,18 online CME,19–21 assessment and feedback,22 and industry support.23 A survey conducted in 2000 among physicians in one U.S. health care organization addressed several important issues, but these findings are outdated.24 More recent surveys in Canada, Australia, the United Kingdom, and Germany have examined modality preferences and usage (e.g., face-to-face, journal, or online) and barriers to PD,25–28 but only one quantified perceived PD needs,25 and none looked at how physicians identify their learning needs. Most surveys are also limited by a narrow sampling frame (e.g., a single specialty, institution, or geographic region), and recent information on U.S. physicians is notably lacking. A broadly representative physician survey addressing a range of current issues in PD and CME would inform efforts to create a PD system that effectively targets the greatest needs and creates meaningful support rather than unnecessary barriers.
We conducted a nationwide, cross-specialty survey of U.S. physicians to answer the following questions:
- What do physicians perceive as their highest-priority PD needs?
- What do physicians believe regarding how PD needs are and should be identified?
- What barriers do physicians encounter in their PD?
- How do these beliefs vary by specialty and practice type?
We formulated specific hypotheses for each question and list these together with a short summary of our actual findings in Supplemental Digital Appendix 1 (available at http://links.lww.com/ACADMED/A434).
We surveyed licensed U.S. physicians using a self-administered Internet and paper questionnaire, from September 2015 through April 2016. Survey items addressed beliefs about PD and maintenance of certification; the latter findings have been published separately.29
Sampling and human subjects
We identified a random sample of 4,648 licensed U.S. physicians from the LexisNexis Provider Data Management and Services database (LexisNexis Risk Solutions, Alpharetta, Georgia). The contact dataset included information on age, gender, location, and specialty. We tracked Internet survey completion, but all survey responses were anonymized. We offered all participants a nominal gift (a book costing less than $12) for participation. This study was approved by the Mayo Clinic institutional review board.
The authors, all experienced educators with backgrounds working in academic medical centers, integrated care delivery systems, and medical specialty boards, collaborated to create the survey questionnaire. We reviewed expert panel reports1,8 and prior surveys18,24,26,30 and consulted with colleagues to identify key current PD issues. These issues included how physicians prioritize professional competencies, identify learning needs, meet those needs, learn in the workplace, learn with other professionals, accumulate CME credits, pay for PD, and anticipate using new educational technologies and other resources. We used these reports and surveys to generate survey items addressing each issue. We based our list of core competencies on the Accreditation Council for Graduate Medical Education competencies31 and on the CanMEDS 2015 framework,32 with an additional competency related to physician well-being.33 Most items consisted of either seven-point bipolar Likert-type items (1 = strongly disagree; 7 = strongly agree) or five-point unipolar response options (1 = nothing or not at all; 5 = a very large amount or extremely). To keep the survey length manageable, we divided the questionnaire into two sections of approximately equal length and allowed participants to submit the survey after completing the first section (“primary items”); those willing to continue could respond to the additional “secondary items.” We also included items about demographic information and burnout.34
Four CME experts reviewed the draft items to identify omissions and redundancies. Mayo Clinic Survey Research Center staff with expertise in questionnaire development verified item structure and wording. We piloted the questionnaire among 17 physicians representing anesthesiology, dermatology, emergency medicine, family medicine, internal medicine, neurology, pathology, psychiatry, and surgery, and we revised items based on their feedback.
Through iterative discussion, we identified five key items: competency priority scores for knowledge/skills and for practice/systems improvement, the barriers of time and of cost, and the desire for help identifying PD gaps.
The survey instructions defined PD as:
All activities intended to improve your professional knowledge, skills, or performance. This includes a variety of activities such as studying journal articles, reading UpToDate, participating in a live or online course, or doing a practice audit-and-improvement. It also includes learning for both clinical and non-clinical responsibilities such as teaching, research, and leadership.
It further defined CME as “a subset of professional development that awards formal credit for completing professional development activities.”
We first contacted physicians via e-mail. Initial and reminder e-mails contained an individualized link to an Internet-based questionnaire administered using Qualtrics (www.qualtrics.com). We sent up to 10 reminder e-mails. A subset of physicians received one paper mail reminder as part of a randomized subexperiment.35
We sent a paper questionnaire to those who did not respond to the Internet survey within three months. The paper questionnaire contained no identifying information, so responses could not be tracked. Mayo Clinic Survey Research Center staff processed all Internet and paper responses without investigator involvement to preserve respondent anonymity.
To characterize the sample, we used respondent-reported demographic information when available; we used information from LexisNexis to fill in missing data. We created an overall priority score for each professional competency by multiplying the respondent’s ratings of importance and learning need.
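The priority score described above is simply the product of two ratings. As a minimal sketch (the ratings below are hypothetical, not drawn from the survey data):

```python
# Hypothetical sketch of the priority-score calculation: each respondent's
# importance rating for a competency is multiplied by their learning-need
# rating for that same competency.
def priority_score(importance: int, learning_need: int) -> int:
    """Overall priority score for one competency (importance x learning need)."""
    return importance * learning_need

# Hypothetical ratings from one respondent for two competencies:
knowledge_score = priority_score(5, 4)        # high importance, high need -> 20
professionalism_score = priority_score(5, 1)  # high importance, low need  -> 5
```

This multiplicative form means a competency rated highly important but with little perceived learning need (such as professionalism in our results) receives a low overall priority.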
We explored the possibility that nonrespondents were systematically different from respondents in two ways. First, we compared specialty, practice location, and gender (i.e., information from the LexisNexis dataset) among respondents and nonrespondents using the chi-square test. Second, because evidence suggests that the beliefs of late respondents closely resemble the beliefs of those who never respond,36 we compared responses from those who responded near the end of the survey period (the last 15% of responses) with those who responded earlier, using the five key items defined above. We also compared the distribution of respondents’ specialties against the national distribution published in the Association of American Medical Colleges 2014 Physician Specialty Data Book.37
We planned a priori analyses exploring variations in responses by specialty (generalist [nonsubspecialist family medicine, internal medicine, and pediatrics], surgical specialist [surgery, anesthesiology, and obstetrics–gynecology], and nonsurgical specialist [all others]); practice type (self-employed, group [including government], and academic); and burnout. We conducted additional exploratory analyses, which we identify as such in the Results.
We used Spearman rho to evaluate correlations among item responses. We used both the paired t test and the Wilcoxon signed rank test to compare within-subject differences in responses and obtained essentially identical results; we report the signed rank results. We used general linear models to test associations between survey responses and respondent characteristics and to compare responses on primary survey items among those who did versus did not complete the secondary items. Because of the large sample size and multiple comparisons, we used a two-tailed alpha of 0.01 as the threshold of statistical significance in all analyses. We used SAS version 9.4 (SAS Institute Inc., Cary, North Carolina) to conduct these tests.
Survey response and sample characteristics
We sent 4,648 survey invitations, of which 646 e-mails and 223 paper questionnaires were undeliverable and 65 were undeliverable by both e-mail and paper. We received 988 responses (631 Internet, 357 paper). Using the conservative denominator of 4,583 potential respondents (excluding only the 65 undeliverable by both methods), our response rate was 21.6%. A less conservative estimate excluding all 934 undeliverable invitations leaves 3,714 potential respondents, yielding a response rate of 26.6%.
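The two response-rate figures follow from the counts reported above; a brief sketch of the arithmetic:

```python
# Response-rate arithmetic using the counts reported in the text.
invited = 4648
responses = 631 + 357            # Internet + paper responses = 988

undeliverable_email = 646        # undeliverable by e-mail only
undeliverable_paper = 223        # undeliverable by paper only
undeliverable_both = 65          # unreachable by both e-mail and paper

# Conservative denominator: exclude only those unreachable by every method.
conservative_denominator = invited - undeliverable_both        # 4,583
conservative_rate = responses / conservative_denominator       # ~0.216 (21.6%)

# Less conservative: exclude every undeliverable invitation.
total_undeliverable = undeliverable_email + undeliverable_paper + undeliverable_both
liberal_denominator = invited - total_undeliverable            # 3,714
liberal_rate = responses / liberal_denominator                 # ~0.266 (26.6%)
```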
Demographic characteristics of invitees and respondents are reported in Table 1. About half the respondents completed the second half of the questionnaire. Those who completed the second half reported a greater desire for help identifying learning gaps (mean 4.2 vs. 3.9; 95% confidence interval for difference 0.1–0.5; P = .002), whereas we found no statistically significant differences in responses for the other four key items (listed above) among those who did versus did not complete the second half (data not shown).
Respondents and nonrespondents were comparable across all available characteristics, except that the proportion of responding pediatric subspecialists was greater than that of nonresponding pediatric subspecialists (see Table 1). The distribution of specialties among respondents closely mirrored the published data for all U.S. physicians37 (P > .06), except that our sample included slightly fewer general internal medicine and family medicine physicians (absolute difference about 4% for both, P < .001). We compared the responses of those responding early versus late in the survey period and found no statistically significant differences for the five key items.
Of respondents, 38% met the criteria for being burned out, defined as experiencing either emotional exhaustion (34%) or depersonalization (18%) on at least a weekly basis.34
Attitudes about PD and CME
From respondent-reported attitudes about various PD issues, we identified five key points or messages. Table 2 contains verbatim wording of the survey items and response summary data; Supplemental Digital Appendix 2 (available at http://links.lww.com/ACADMED/A434) reports responses using the full 1–7 scale. For simplicity, we report only means in the paragraphs below, but note that the standard deviations varied from 1.1 (suggesting relatively uniform or homogenous attitudes) to 2.0 (reflecting less uniform attitudes) (see Table 2).
First, responding physicians strongly and uniformly agreed that they already know what they need to learn (mean rating 5.8 [1 = strongly disagree; 7 = strongly agree]), and they did not strongly desire help identifying learning gaps (4.0). When asked about specific sources to identify gaps, information on patient outcomes was rated higher (4.8) than support from “someone I trust” (4.4) or objective tests (4.4) (95% confidence interval for difference 0.3–0.6; P < .001 for both comparisons). However, few reported currently using objective performance data to identify learning needs (3.8). Respondents did not agree that CME credit is a dominant influence in choosing learning activities (3.3).
Second, physicians would like more credit for the things they learn while addressing the needs of specific patients (5.1). They strongly and uniformly agreed that they can quickly find answers to patient-specific questions using already-available resources (5.9) but agreed less strongly that their practice provides adequate point-of-care knowledge resources at no direct cost (4.3). They strongly and uniformly agreed that point-of-care online learning is vital to effective patient care (5.3) and that CME should ideally occur in the context of their clinical practice (5.9).
Third, physicians did not indicate much difficulty in finding (3.3) or accumulating (3.1) needed CME credits. Although most PD is done in their personal time (5.5), they did not strongly endorse concerns about feeling behind in their PD activities (3.7). About half perceived that they can meet their CME requirements using activities already available on-site in their workplace (4.3). Responses were neutral regarding both the value of money spent on CME (4.1) and the financial burden of PD activities (4.1). Other items about the benefits of accredited CME received favorable responses (4.5–5.2). Respondents indicated that they would not stop doing accredited CME even if it were not required (2.8). They did express strong interest in centralized tools that would list CME opportunities and track CME completion (5.7).
Fourth, physicians strongly and uniformly endorsed the value of discussions with peers when learning, with slightly greater perceived value when learning controversial topics (5.9) than clearly defined practices (5.6). Opportunities to learn with peers were more highly valued (5.2) than opportunities to learn with nonphysician health care professionals (4.3). Only a minority desired more education on working in interprofessional teams (3.7).
Finally, physicians expressed the desire for more online learning (4.6) but somewhat less interest in more simulation-based learning (4.2).
We conducted preplanned subgroup analyses based on specialty and practice type for selected items (see Supplemental Digital Appendix 3, Table 1, available at http://links.lww.com/ACADMED/A434, for details). Self-employed physicians reported doing more PD in their personal time (5.9) than those in group practice (5.5), and both groups reported higher ratings than those in academic practice (4.9; P < .001 for each pairwise analysis). Those in group practice perceived less burden from the total cost of PD activities (3.9) than self-employed physicians (4.4; P = .003). We found no statistically significant differences across practice type (P ≥ .05) for “I know what I need to learn to do my job well,” desire for help identifying gaps, or finding answers using available resources. We found no statistically significant differences across specialty (P ≥ .05) for any items.
Prioritizing physicians’ PD needs
We asked physicians to prioritize professional competencies in three ways (see Table 3 and Supplemental Digital Appendix 4, available at http://links.lww.com/ACADMED/A434)—the importance to their professional practice, their perceived need for learning, and the difficulty in finding learning activities in that domain. We found substantial differences across these three measures. For example, medical knowledge/skills was ranked highest for importance but lowest on difficulty to find and midrange for perceived learning need. Professionalism was ranked second highest in importance but lowest in perceived learning need. Research was judged to be of lowest importance and was also rated middle or low for difficulty to find and learning need.
We calculated a priority score for each competency (importance × learning need). Medical knowledge/skills, wellness, informatics, and practice/systems improvement had the highest priority scores, while research, teaching, and professionalism had the lowest.
In analyzing priority scores across practice type, we found a consistent pattern—namely, those in academic practice had the highest priority scores for all competencies, and self-employed physicians had the lowest (see Table 3). These differences were statistically significant for medical knowledge/skills, communication, practice/systems improvement, teaching, research, and management. We found no significant differences in priority scores across specialties (P ≥ .05; see Supplemental Digital Appendix 3, Table 2, available at http://links.lww.com/ACADMED/A434). Medical knowledge/skills had the highest priority score in all practice types and in all specialties.
How physicians identify PD needs
We sought to further understand how physicians identify the gaps in their knowledge and skills that lead them to pursue PD. We asked them to rate the importance of eight information sources that might prompt them to study a specific topic (see Table 4 and Supplemental Digital Appendix 5, available at http://links.lww.com/ACADMED/A434), both in their current practice and in an ideal setting (as it “ought to be”). Immediate patient care needs, personal awareness (self-assessment), and new practice updates were rated highest (means for current importance: 4.1, 3.8, and 3.7, respectively); the lay press, objective tests of knowledge or skill, and topic listings for a CME course were rated lowest (2.3, 2.7, and 2.8, respectively). Most sources were rated as slightly more important in the ideal setting, with actual practice data and objective tests showing the largest positive difference (although ratings were still low compared with other sources). Two sources—the lay press and the topic listings for CME courses—were rated slightly lower in an ideal setting.
We selected in advance three current information sources for subgroup analysis: personal awareness, objective measurement, and actual practice data. Physicians in group practice reported greater importance for objective measurement (mean 2.8) than self-employed physicians (2.5; P < .001), and surgeons reported greater importance for actual practice data (3.2) than nonsurgical specialists (3.0; P = .003). We found no other statistically significant differences across practice type or specialty (P ≥ .015).
Barriers to PD
Respondents did not strongly endorse any of the barriers to pursuing PD (see Table 5 and Supplemental Digital Appendix 6, available at http://links.lww.com/ACADMED/A434). About half identified time as a “very” or “extremely” important barrier, and about one-third identified cost as such. Fewer than 20% identified as important the other anticipated barriers, which included finding activities, getting CME credit, accessing information, and selecting topics.
We found statistically significant differences across specialties for “I’m not sure what is most important to learn about” (generalists highest, nonsurgical specialists lowest) and for “There isn’t much new to learn” (surgeons highest) (see Table 5). Across practice type, respondents in academic practice reported time as a greater barrier (mean 3.8) than those in group practice (3.5) or self-employed practice (3.4) (P = .009; see Supplemental Digital Appendix 3, Table 3, available at http://links.lww.com/ACADMED/A434). We found no other statistically significant differences in perceived barriers across practice type (P ≥ .13).
Associations with burnout
We found statistically significant associations (P ≤ .007) in all planned analyses exploring associations with burnout (see Supplemental Digital Appendix 7, available at http://links.lww.com/ACADMED/A434). Compared with those not meeting the criteria for burnout, respondents who were burned out reported greater barriers arising from time (mean 4.0 vs. 3.3), difficulty finding relevant activities (2.5 vs. 2.3), and cost (3.1 vs. 2.8). They also indicated a stronger desire for help identifying gaps (4.3 vs. 3.9), a greater burden from cost (4.5 vs. 3.8), and a higher burnout/wellness competency priority score (13.3 vs. 8.9).
Subgroup analyses for key items
We performed subgroup analyses across all demographics for the five key items (see Table 6). Years since training showed statistically significant associations in all analyses, with longer-practicing respondents reporting lower priority scores for medical knowledge/skills and practice/systems improvement, less of a barrier for time and cost, and less desire for help identifying gaps. We found no statistically significant differences across specialty, geographic region, community size, or practice size.
Other exploratory analyses
We hypothesized that younger physicians would express stronger desires for interprofessional training. We found a statistically significant association (P < .001), with those in practice for 1 to 10 years (mean 3.9) and 11 to 20 years (4.0) rating this desire higher than those in practice for 21 to 30 years (3.4) or more than 30 years (3.4).
We also found a statistically significant inverse relationship between perceived ability to self-assess learning needs (“I know what I need to learn to do my job well”) and a desire for help identifying learning needs (rho = −0.30, P < .001).
We conducted a national cross-specialty survey of U.S. physicians to identify their PD priorities, practices, beliefs, and needs. We found that physicians generally believe that they already know what they need to learn and do not desire help identifying learning gaps, that finding and accumulating formal CME credits is not a significant concern, and that they would like more credit for the learning they accomplish while caring for patients. Medical knowledge/skills was rated as the most important and highest-priority professional competency, and PD activities for this competency were viewed as the easiest to find. Skills for wellness, practice/systems improvement, and informatics also received high priority rankings, and related activities were considered moderately hard to find. Physicians rated immediate patient care needs, personal awareness (self-assessment), and new practice updates as the most important means of identifying learning gaps, while objective tests of knowledge or skills were rated as the second-to-least important. Time was the only barrier rated as important by more than half of respondents. Physicians expressed little desire for additional interprofessional education activities.
In general, our findings remained consistent across physician subgroups. Self-employed physicians reported doing more PD in their personal time and perceived a greater burden from the cost of PD activities, while those in academic practice provided higher priority scores for all PD competencies. Surgeons rated actual practice data as more important in identifying learning needs than did generalists or nonsurgical specialists. Longer-practicing physicians reported lower priority scores for medical knowledge/skills and practice/systems improvement, less of a barrier related to time and cost, and less desire for help identifying gaps. Most of the subgroup differences we hypothesized were not statistically significant, and the higher priority scores provided by academic physicians ran counter to expectations (see Supplemental Digital Appendix 1, available at http://links.lww.com/ACADMED/A434).
Limitations and strengths
We do not know the attitudes of nonresponding physicians, and if those with strong beliefs preferentially responded, it could have biased our results. However, respondents and nonrespondents were similar across available demographic features, and the distribution of respondents’ specialties aligned with that of U.S. physicians generally. Moreover, late respondents had attitudes similar to those who responded early. Insofar as those who responded late (i.e., after several reminders) have attitudes similar to those who never responded,36 our findings do not underrepresent nonrespondents. Although this study provides useful information about physicians’ beliefs regarding their PD needs and practices, it does not provide direct guidance on actual learning needs or potential systems-level solutions.
The large number of statistical analyses we conducted raises concerns about spuriously significant P values.38,39 We view all of the analyses as exploratory and consider statistical significance as an indicator of potentially interesting relationships rather than as a reflection of certainty. For many subgroup analyses, we also outlined in advance our expected findings (see Supplemental Digital Appendix 1, available at http://links.lww.com/ACADMED/A434).
Strengths of our study include our adherence to best practices in survey development, implementation, and delivery, including pilot testing, expert review, and the use of a dedicated survey research center; a nationwide, cross-specialty sample that closely reflected U.S. demographics37; exploration of responses by specialty, practice type, and other subgroups; and ample power for these analyses.
Integration with prior research
Our findings align with the results of previous surveys indicating that time and/or schedule are the chief barriers to PD engagement,24,26,27,40 that cost is a moderate barrier,24,26 and that physicians would continue CME even if they were not required to do so.26 The findings from one survey agreed with our own regarding less desire among physicians for peer input or objective practice data,24 while the findings from another conversely suggested that physicians want more feedback on their knowledge and clinical performance.22 In an analysis reported separately, we found that physicians who responded to the present survey perceived low relevance and value in maintenance of certification (an important program for physician PD) as it is currently operationalized.29
Evidence indicates that physicians cannot reliably self-assess their own learning needs in general14,17 and that they resist information that differs from their self-perceptions of competence.7,41 However, individuals can recognize knowledge gaps when faced with a specific question,42 suggesting that, in the moment of patient care, physicians might be able to recognize such gaps.
Our findings intersect with findings from previous research on clinical questions,4,43 information seeking,44,45 and point-of-care learning.18,46,47 Physicians seek answers to only a minority of the questions that arise in clinical practice.4,43 Our findings and those of prior research43,48 suggest the need to better support physicians in quickly finding relevant information. Information technologies will likely play a key role in identifying gaps in knowledge and performance and in synthesizing, selecting, and delivering timely, relevant, accurate, and up-to-date information.49–52
Implications for practice and future research
Physicians’ perceptions must be taken seriously. Even if responses like those we received reflect erroneous beliefs about barriers, self-assessment, professional priorities, or learning in general, they represent a starting point for discussions and activities that acknowledge—and if necessary correct—these beliefs.
Physicians reported that time is the greatest barrier to PD, followed by cost and difficulty finding relevant activities. Topic selection and accrual of CME credit were not viewed as problematic, although physicians would like credit for what they are already doing (i.e., workplace learning). Stating these barriers in positive terms, physicians desire time-efficient, low-cost, practice-relevant learning on topics of their choosing. Receiving CME credits may not be a strong incentive to participate in a given PD activity, especially one that appears to magnify other barriers (e.g., time, personal effort, or financial cost) or constrain physicians’ choices of topic or approach.
Physicians generally did not want external help identifying their learning needs, preferring instead to rely on point-of-care questions, personal awareness, and practice updates. This attitude conflicts with what we know about self-assessment—namely, that people (physicians or otherwise) cannot accurately identify the things they do not know.14,41 Actual practice data, guidance from others (an educational “coach”), and objective tests can more accurately identify true needs,2,6,16 yet respondents rated these approaches as less important. Reconciling this disconnect remains an area of active research.
Physicians’ prioritization of competencies in this study highlights two interrelated themes. First, CME providers need to understand the desires of potential participants (i.e., the topics physicians are interested in pursuing). Physicians’ ratings of importance, difficulty in finding activities, and need to learn for specific competencies can inform such programmatic prioritizations. Second, identifying discrepancies between the competency prioritizations of experts and those of frontline physicians is important. Changes in both medical practice and the practice environment have dramatically affected all aspects of patient care, but perhaps these changes have not affected physicians’ understanding of their own PD needs and preferences.53,54 Our respondents rated medical knowledge/skills as the most important professional competency and of the overall highest priority. By contrast, professionalism and communication/teamwork skills were of high importance but low learning need, suggesting that physicians believe that “these skills are important, but we already possess them and don’t need to learn more.” Overall, these findings highlight the differences between what physicians need for competencies, what they can access, and what they pursue.
Multidimensional needs demand multifaceted solutions. Identifying and addressing physicians’ PD needs will require the integrated efforts of professional societies, certification boards, academic institutions, hospitals and health care delivery systems, specialists in point-of-care information delivery, and local communities of practice, among others.2,6,8 Each entity has characteristics that make it better suited to address some needs and contexts and less suited to address others. Because physicians often encounter barriers when implementing change locally after learning something new,55 the potential to support local practice change following a learning activity should be considered in addition to efficient content delivery and the effective promotion of learning.6 Future research might explore how to objectively determine physicians’ PD needs, share this information with physicians in a manner that engages rather than alienates them, and address gaps at individual, peer, interprofessional, and systems levels.
Our findings suggest that frontline physicians feel pushed to engage in activities that others believe are good for them, such as accepting guidance in determining their learning needs, training to work in interprofessional teams, and developing nonknowledge competencies. Although we agree with the need for PD/CME reform,1–3,6,8,53,56 we suspect that a solely top-down approach will continue to meet resistance. Thus, we suggest that reform include efforts to transform the lifelong learning beliefs, attitudes, and skills of frontline physicians such that they recognize the need for help identifying learning gaps, pursue activities that remedy these gaps, and accept that effective learning requires engagement and effort. The resulting demand-driven market will better support physicians in maintaining professional competence and delivering high-quality care.
Acknowledgments: The authors thank Graham McMahon (Accreditation Council for Continuing Medical Education), Paul Mazmanian (Virginia Commonwealth University School of Medicine), the late Alex Djuricich (Indiana University School of Medicine), and one anonymous reviewer for providing external expert review of the survey questionnaire. They also thank Ann Harris and Wendlyn Daniels (Mayo Clinic Survey Research Center) for their help in planning, testing, and implementing the survey.
1. Hager M, Russell S, Fletcher SW. Continuing Education in the Health Professions: Improving Healthcare Through Lifelong Learning. 2007. New York, NY: Josiah Macy, Jr. Foundation.
2. McMahon GT. What do I need to learn today?—The evolution of CME. N Engl J Med. 2016;374:1403–1406.
3. Nissen SE. Reforming the continuing medical education system. JAMA. 2015;313:1813–1814.
4. Ely JW, Osheroff JA, Ebell MH, et al. Analysis of questions asked by family doctors regarding patient care. BMJ. 1999;319:358–361.
5. Del Fiol G, Haug PJ. Use of classification models based on usage data for the selection of infobutton resources. AMIA Annu Symp Proc. October 11, 2007:171–175.
6. Davis DA, Prescott J, Fordis CM Jr, et al. Rethinking CME: An imperative for academic medicine and faculty development. Acad Med. 2011;86:468–473.
7. Eva KW, Bordage G, Campbell C, et al. Towards a program of assessment for health professionals: From training into practice. Adv Health Sci Educ Theory Pract. 2016;21:897–913.
8. Institute of Medicine. Redesigning Continuing Education in the Health Professions. 2010. Washington, DC: National Academies Press.
9. Forsetlund L, Bjørndal A, Rashidian A, et al. Continuing education meetings and workshops: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev. April 15, 2009;(2):CD003030.
10. Grudniewicz A, Kealy R, Rodseth RN, Hamid J, Rudoler D, Straus SE. What is the effectiveness of printed educational materials on primary care physician knowledge, behaviour, and patient outcomes: A systematic review and meta-analyses. Implement Sci. 2015;10:164.
11. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: Effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. June 13, 2012;(6):CD000259.
12. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: Guide to the evidence. JAMA. 2002;288:1057–1060.
13. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007;149:1–69.
14. Eva KW, Regehr G. Self-assessment in the health professions: A reformulation and research agenda. Acad Med. 2005;80(10 suppl):S46–S54.
15. Norcini JJ, Lipner RS, Grosso LJ. Assessment in the context of licensure and certification. Teach Learn Med. 2013;25(suppl 1):S62–S67.
16. Sargeant J, Bruce D, Campbell CM. Practicing physicians’ needs for assessment and feedback as part of professional development. J Contin Educ Health Prof. 2013;33(suppl 1):S54–S62.
17. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: A systematic review. JAMA. 2006;296:1094–1102.
18. Salinas GD. Trends in physician preferences for and use of sources of medical information in response to questions arising at the point of care: 2009–2013. J Contin Educ Health Prof. 2014;34(suppl 1):S11–S16.
19. Nicolaou M, Armstrong R, Hassell AB, Walker D, Birrell F. Musculoskeletal health professional use of Internet resources for personal and patient education: Results from an online national survey. Open Rheumatol J. 2012;6:190–198.
20. Olivieri JJ, Knoll MB, Arn PH. Education format and resource preferences among registrants of a pediatric-focused CME website. Med Teach. 2009;31:e333–e337.
21. Wang AT, Sandhu NP, Wittich CM, Mandrekar JN, Beckman TJ. Using social media to improve continuing medical education: A survey of course participants. Mayo Clin Proc. 2012;87:1162–1170.
22. Gallagher TH, Prouty CD, Brock DM, Liao JM, Weissman A, Holmboe ES. Internists’ attitudes about assessing and maintaining clinical competence. J Gen Intern Med. 2014;29:608–614.
23. Tabas JA, Boscardin C, Jacobsen DM, Steinman MA, Volberding PA, Baron RB. Clinician attitudes about commercial support of continuing medical education: Results of a detailed survey. Arch Intern Med. 2011;171:840–846.
24. Price DW, Overton CC, Duncan JP, et al. Results of the first national Kaiser Permanente continuing medical education needs assessment survey. Perm J. 2002;6:76–84.
25. Curran VR, Keegan D, Parsons W, et al. A comparative analysis of the perceived continuing medical education needs of a cohort of rural and urban Canadian family physicians. Can J Rural Med. 2007;12:161–166.
26. Stewart GD, Khadra MH. The continuing medical education activities and attitudes of Australian doctors working in different clinical specialties and practice locations. Aust Health Rev. 2009;33:47–56.
27. Stewart GD, Teoh KH, Pitts D, Garden OJ, Rowley DI. Continuing professional development for surgeons. Surgeon. 2008;6:288–292.
28. Vollmar HC, Rieger MA, Butzlaff ME, Ostermann T. General practitioners’ preferences and use of educational media: A German perspective. BMC Health Serv Res. 2009;9:31.
29. Cook DA, Blachman MJ, West CP, Wittich CM. Physician attitudes about maintenance of certification: A cross-specialty national survey. Mayo Clin Proc. 2016;91:1336–1345.
30. Kempkens D, Dieterle WE, Butzlaff M, et al. German ambulatory care physicians’ perspectives on continuing medical education—A national survey. J Contin Educ Health Prof. 2009;29:259–268.
31. Green ML, Holmboe E. Perspective: The ACGME toolbox: Half empty or half full? Acad Med. 2010;85:787–790.
32. Frank JR, Snell L, Sherbino J. CanMEDS 2015 Physician Competency Framework. 2015. Ottawa, Ontario, Canada: Royal College of Physicians and Surgeons of Canada.
33. Shanafelt TD, Hasan O, Dyrbye LN, et al. Changes in burnout and satisfaction with work–life balance in physicians and the general US working population between 2011 and 2014. Mayo Clin Proc. 2015;90:1600–1613.
34. West CP, Dyrbye LN, Sloan JA, Shanafelt TD. Single item measures of emotional exhaustion and depersonalization are useful for assessing burnout in medical professionals. J Gen Intern Med. 2009;24:1318–1321.
35. Cook DA, Wittich CM, Daniels WL, West CP, Harris AM, Beebe TJ. Incentive and reminder strategies to improve response rate for Internet-based physician surveys: A randomized experiment. J Med Internet Res. 2016;18:e244.
36. Miller LE, Smith KL. Handling nonresponse issues. J Ext. 1983;21:45–50.
37. Association of American Medical Colleges. 2014 Physician Specialty Data Book. 2014. Washington, DC: Association of American Medical Colleges Center for Workforce Studies.
38. Nuzzo R. Scientific method: Statistical errors. Nature. 2014;506:150–152.
39. Chavalarias D, Wallach JD, Li AH, Ioannidis JP. Evolution of reporting P values in the biomedical literature, 1990–2015. JAMA. 2016;315:1141–1148.
40. Goodyear-Smith F, Whitehorn M, McCormick R. Experiences and preferences of general practitioners regarding continuing medical education: A qualitative study. N Z Med J. 2003;116:U399.
41. Eva KW, Regehr G. “I’ll never play professional football” and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28:14–19.
42. Eva KW, Regehr G. Knowing when to look it up: A new conception of self-assessment ability. Acad Med. 2007;82(10 suppl):S81–S84.
43. Del Fiol G, Workman TE, Gorman PN. Clinical questions raised by clinicians at the point of care: A systematic review. JAMA Intern Med. 2014;174:710–718.
44. Ebell MH, Shaughnessy A. Information mastery: Integrating continuing medical education with the information needs of clinicians. J Contin Educ Health Prof. 2003;23(suppl 1):S53–S62.
45. Pluye P, Grad RM, Johnson-Lafleur J, et al. Number needed to benefit from information (NNBI): Proposal from a mixed methods research study with practicing family physicians. Ann Fam Med. 2013;11:559–567.
46. Cook DA, Sorensen KJ, Wilkinson JM, Berger RA. Barriers and decisions when answering clinical questions at the point of care: A grounded theory study. JAMA Intern Med. 2013;173:1962–1969.
47. Bright TJ, Wong A, Dhurjati R, et al. Effect of clinical decision-support systems: A systematic review. Ann Intern Med. 2012;157:29–43.
48. Del Fiol G, Haug PJ, Cimino JJ, Narus SP, Norlin C, Mitchell JA. Effectiveness of topic-specific infobuttons: A randomized controlled trial. J Am Med Inform Assoc. 2008;15:752–759.
49. Davis D, Evans M, Jadad A, et al. The case for knowledge translation: Shortening the journey from evidence to effect. BMJ. 2003;327:33–35.
50. Cook DA, Sorensen KJ, Hersh W, Berger RA, Wilkinson JM. Features of effective medical knowledge resources to support point of care learning: A focus group study. PLoS One. 2013;8:e80318.
51. Cook DA, Sorensen KJ, Nishimura RA, Ommen SR, Lloyd FJ. A comprehensive information technology system to support physician learning at the point of care. Acad Med. 2015;90:33–39.
52. Lobach D, Sanders GD, Bright TJ, et al. Enabling Health Care Decisionmaking Through Clinical Decision Support and Knowledge Management. 2012. Rockville, MD: Agency for Healthcare Research and Quality; Evidence report no. 203.
53. Lucey CR. Medical education: Part of the problem and part of the solution. JAMA Intern Med. 2013;173:1639–1643.
54. Irby DM, Cooke M, O’Brien BC. Calls for reform of medical education by the Carnegie Foundation for the Advancement of Teaching: 1910 and 2010. Acad Med. 2010;85:220–227.
55. Price DW, Miller EK, Rahm AK, Brace NE, Larson RS. Assessment of barriers to changing practice as CME outcomes. J Contin Educ Health Prof. 2010;30:237–245.
56. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: Transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376:1923–1958.