Educational Technologies for Physician Continuous Professional Development: A National Survey

Cook, David A., MD, MHPE; Blachman, Morris J., PhD; Price, David W., MD; West, Colin P., MD, PhD; Baasch Thomas, Barbara L., BSN, MA; Berger, Richard A., MD, PhD; Wittich, Christopher M., MD, PharmD

doi: 10.1097/ACM.0000000000001817
Research Reports

Purpose To determine the past experiences with, current use of, and anticipated use of online learning and simulation-based education among practicing U.S. physicians, and how findings vary by age.

Method The authors surveyed 4,648 randomly sampled board-certified U.S. physicians, September 2015 to April 2016, using Internet-based and paper questionnaires. Survey items (some optional) addressed past and current technology usage, perceived technology effectiveness, and anticipated future use of specific technology innovations.

Results Of 988 respondents, 444 completed optional items. Of these, 429/442 (97.1%) had used online learning and 372/442 (84.2%) had used simulation-based education in the past five years. Desire for more online learning was modest (mean [standard deviation], 4.6 [1.5]; 1 = strongly disagree, 7 = strongly agree), as was desire for more simulation-based education (4.2 [1.7]). Both online learning and simulation-based education were perceived as effective (5.2 [1.4]; 5.0 [1.4]). Physicians believed they possess adequate skills for online learning (5.8 [1.2]) and that point-of-care learning is vital to effective patient care (5.3 [1.3]). Only 39.0% used objective performance data to guide their learning choices, although 64.6% agreed that such information would be useful. The highest-rated innovations included a central repository for listing educational opportunities and tracking continuing education credits, an app to award credit for answering patient-focused questions, 5-minute and 20-minute clinical updates, and an e-mailed “question of the week.” Responses to most survey items were similar across age groups.

Conclusions Practicing physicians generally seem receptive and prepared to use a variety of educational technologies, regardless of age.

D.A. Cook is professor of medicine and professor of medical education; associate director, Mayo Clinic Online Learning; director of research, Office of Applied Scholarship and Education Science; and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota.

M.J. Blachman is clinical professor, Department of Neuropsychiatry and Behavioral Science, and associate dean, Continuous Professional Development & Strategic Affairs, University of South Carolina School of Medicine, Columbia, South Carolina.

D.W. Price is senior vice president, American Board of Medical Specialties (ABMS) Research & Education Foundation, and executive director, ABMS Multispecialty Portfolio Program, Chicago, Illinois; and professor of family medicine, University of Colorado School of Medicine, Aurora, Colorado.

C.P. West is professor of medicine, professor of biostatistics, and professor of medical education; associate program director, Internal Medicine Residency Program; and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota.

B.L. Baasch Thomas is administrator, Mayo School of Continuous Professional Development, Mayo Clinic College of Medicine, Rochester, Minnesota.

R.A. Berger is professor of orthopedics; dean, Mayo School of Continuous Professional Development; medical director, Mayo Clinic Online Learning; and consultant, Department of Orthopedic Surgery and Department of Anatomy, Mayo Clinic College of Medicine, Rochester, Minnesota.

C.M. Wittich is associate professor of medicine; associate program director, Internal Medicine Residency Program; and practice chair, Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota.

Funding/Support: None reported.

Other disclosures: The authors are not aware of any conflicts of interest. Author D.A.C. had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. All of the authors jointly defined the study objectives and hypotheses, created the survey instrument, interpreted the study results, drafted and revised the manuscript, and approved the final manuscript.

Ethical approval: The study was approved by the Mayo Clinic Institutional Review Board (#15-003437).

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A464.

Correspondence should be addressed to David A. Cook, Division of General Internal Medicine, Mayo Clinic College of Medicine, Mayo 17-W, 200 First Street SW, Rochester, MN 55905; telephone: (507) 284-2269; e-mail: cook.david33@mayo.edu.

Contemporary educational technologies such as online learning and technology-enhanced simulation offer tools that might help practicing physicians in their continuous professional development (CPD). Research across a wide spectrum of health professions learners confirms that both approaches offer consistent and significant benefits when compared with no intervention.1,2 Online learning (i.e., use of the Internet to support and mediate educational activities1,3) is, on average, neither more nor less effective than other educational approaches in promoting knowledge, skills, and behaviors,1,4 whereas learning with technology-enhanced simulation (“an educational tool or device with which the learner physically interacts to mimic an aspect of clinical care for the purpose of teaching or assessment”5) is associated with modest and statistically significant increases in these outcomes.5 These approaches differ substantially in their efficiency, flexibility, cost, and complexity, and support complementary learning strategies. Online and simulation-based tools can also be used to assess learning and thus identify gaps in knowledge, skill, and practice performance.6,7 Information technologies, including online learning and clinical decision support tools embedded in electronic health records, have been further proposed as potentially helpful in managing the exponential growth of medical information.8–11

Although these technologies are widely used in undergraduate and postgraduate physician training,1,2,12,13 it is less clear how well practicing physicians accept and use these technologies to guide the identification and remediation of CPD gaps.14 Few studies have explored the attitudes and beliefs of practicing physicians regarding their use of educational technologies, and those studies have incompletely addressed relevant issues. A 2013 survey of practicing U.S. physicians found that respondents were moderately likely to participate in an online course and that time spent seeking online information had increased from 2009 to 2013.15 In a 2008 survey of users of an online pediatric continuing medical education (CME) Web site, physicians indicated that the most important features of an online course were that it be free of charge and address an important topic.16 A survey in 2000 of physicians in a large U.S. health network found that 30% believed online learning was useful,17 and a survey of German physicians in 2003 found that technology was not a barrier to online learning.18 A review of online CME courses in 2008 found that a small minority of Web sites accounted for the majority of offerings, most were free or charged very little, and few required learner interactivity.19 This paucity of evidence leaves a significant gap in our understanding of the beliefs and expectations of practicing physicians regarding educational technologies.

To address this gap, we conducted a nationwide, cross-specialty survey of U.S. physicians to determine their current use of, past experiences with, and anticipated future use of online learning and simulation-based education, and how these experiences and beliefs vary by age, practice type, and specialty.

We hypothesized that:

  • Older physicians would have less favorable attitudes about educational technologies;
  • Physicians in academic practice would have had more, and more favorable, experiences with both online learning and simulation-based education than physicians in group or self-employed practice;
  • Surgeons would perceive a greater role for simulation-based education than generalists and nonsurgical specialists;
  • Self-employed physicians would report less support for online learning and lower integration with their practice environment, but would anticipate a greater role for online learning in the future;
  • Physicians would prefer technology-mediated activities that are short, case based, and can be done without leaving the office or buying special equipment;
  • Prior favorable experiences with online learning and simulation, and current integration/support, would be associated with interest in future offerings.

Method

From September 2015 through April 2016, we surveyed licensed physicians in the United States about their CPD beliefs and experiences. Survey results regarding broad issues in CPD have been published separately20,21; this report focuses on the unpublished subset of items dedicated to educational technologies.


Sampling and participants

We identified a random sample of 4,648 licensed U.S. physicians from the LexisNexis Provider Data Management and Services database (LexisNexis Risk Solutions, Alpharetta, Georgia). We obtained the name, contact information, specialty, age, and gender for each physician. Internet survey completion was tracked, but responses were anonymized upon receipt. Paper surveys were entirely anonymous. All participants were offered a small gift (a book valued at about $12). The Mayo Clinic Institutional Review Board approved the study.


Instrument

The authors, representing diverse CME leadership experience within academic medical centers, integrated managed care networks, and medical specialty boards, collaborated to create survey items addressing three domains: prior experiences with online learning and simulation-based education; beliefs about educational technology effectiveness, personal preparedness for technology use, and workplace supports; and anticipated future use of specific, diverse educational technology innovations. All items used a seven-point bipolar Likert-type response scale (1 = strongly disagree; 7 = strongly agree). The survey did not provide definitions of “online learning” or “simulation-based education.” The number of items exceeded what we anticipated would be acceptable to many respondents. Thus, to allow advertising of a shorter survey and thereby encourage participation, we divided the questionnaire into two sections and allowed participants to submit the survey after completing the first section (“primary items”). This article reports five primary items; the remaining items come from the second half (optional or “secondary” items), for which the response rate was lower.

Four CME experts at nonaffiliated institutions reviewed the draft survey for content (i.e., important omitted topics or irrelevant items). Mayo Clinic Survey Research Center personnel with expertise in questionnaire development reviewed each item to verify structure and wording. We asked 17 physicians (representing anesthesiology, dermatology, emergency medicine, family medicine, internal medicine, neurology, pathology, psychiatry, and surgery) to pilot test the survey and provide feedback on item relevance and wording. We revised the survey at each stage of testing.


Survey administration

We used Qualtrics (www.qualtrics.com), a survey research tool, to administer the Internet survey. We sent each physician an individually tracked link via e-mail, and sent follow-up e-mail reminders to nonrespondents. Those not responding to the Internet survey within three months were mailed a paper questionnaire that had no identifying information, and a stamped return envelope.


Analyses

In reporting demographic characteristics we used respondent-reported information when available, and filled in missing data using information from LexisNexis. To evaluate the representativeness of the sample, we compared the distribution of respondents’ specialties against the national distribution published in the Association of American Medical Colleges 2014 Physician Specialty Data Book.22 We explored possible differences between respondents and nonrespondents in two ways. First, we used the chi-square test to compare respondents and nonrespondents for demographic features available from the LexisNexis dataset (specialty, practice location, age, and gender). Second, we compared the responses of late responders (the last 15% of responses) with those responding earlier, since research suggests that the beliefs of late responders closely approximate the beliefs of nonrespondents.23 To estimate the representativeness of the secondary item findings (since only about half the respondents completed these items), we compared primary item responses and respondent demographics for those who did versus did not complete the secondary items. We also compared survey responses for Internet and paper modalities.
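
As a rough illustration of the respondent–nonrespondent comparison described above (not the authors' actual analysis code, which was run in SAS), a chi-square test on a single categorical demographic might look like the following Python sketch; the counts shown are hypothetical placeholders, not study data.

    # Illustrative sketch only; the counts below are hypothetical, not study data.
    from scipy.stats import chi2_contingency

    # Rows: respondents vs. nonrespondents; columns: age bands (<45, 45-59, >=60 years).
    counts = [
        [310, 420, 258],     # respondents (hypothetical)
        [1100, 1300, 1195],  # nonrespondents (hypothetical)
    ]
    chi2, p_value, dof, expected = chi2_contingency(counts)
    print(f"chi-square = {chi2:.2f}, df = {dof}, P = {p_value:.3f}")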

We planned a priori analyses exploring variation in responses by age (< 45 years, 45–59 years, or ≥ 60 years); specialty (generalist [nonsubspecialist family medicine, internal medicine, and pediatrics], surgical specialist [surgery, anesthesiology, and obstetrics–gynecology], or nonsurgical specialists [all others]); and practice type (self-employed, group, or academic). We also planned to evaluate potential associations among past experiences and beliefs about effectiveness, usefulness, and expected future use; desired use of online learning and online learning support, skills, and integration with the practice environment; online learning support and online skills; and beliefs about online learning and simulation.

We used general linear models to test associations between opinions (outcomes) and respondent characteristics (predictors), including selected demographics and whether or not they completed the second half of the survey. We evaluated the correlation among survey items using Spearman rho. Because of the large sample size and multiple comparisons, we used a two-tailed alpha of 0.01 as the threshold of statistical significance in all analyses. We used SAS version 9.4 (SAS Institute Inc., Cary, North Carolina) for statistical analyses.
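
For readers who prefer code to prose, the following is a minimal Python sketch of this modeling logic (the study itself used SAS 9.4); the data file and column names are hypothetical assumptions, not the authors' variables.

    # Minimal sketch under assumed column names; not the authors' SAS code.
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import spearmanr

    df = pd.read_csv("survey_responses.csv")  # hypothetical one-row-per-respondent file

    # General linear model: an opinion item predicted by selected demographics and
    # whether the respondent completed the optional second half of the survey.
    model = smf.ols(
        "online_desire ~ C(age_group) + C(practice_type) + C(specialty_group) + C(completed_secondary)",
        data=df,
    ).fit()
    print(model.summary())

    # Spearman correlation between two survey items, judged at a two-tailed alpha of 0.01.
    rho, p = spearmanr(df["online_prior_effective"], df["online_future_vital"], nan_policy="omit")
    print(f"rho = {rho:.2f}, P = {p:.4f}, significant at alpha 0.01: {p < 0.01}")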


Results

Survey response and sample characteristics

Of 4,648 attempted contacts, 646 e-mail invitations and 223 paper questionnaires were undeliverable, and 65 invitations were undeliverable via either e-mail or paper. We received 631 responses via Internet and 357 via paper. After excluding the 65 invitations undeliverable by either method, our response rate was 988/4,583 (21.6%). A less conservative estimate excluding all 934 undeliverable invitations suggests a response rate of 26.6% (988/3,714).
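
Both reported response rates follow directly from the counts above; a small arithmetic check, purely for illustration:

    # Reproducing the reported response-rate arithmetic from the counts in the text.
    invited = 4648
    undeliverable_both_routes = 65         # unreachable by either e-mail or paper
    undeliverable_total = 646 + 223 + 65   # all undeliverable invitations (934)
    responses = 988

    conservative = responses / (invited - undeliverable_both_routes)    # 988/4,583
    less_conservative = responses / (invited - undeliverable_total)     # 988/3,714
    print(f"{conservative:.1%} (conservative), {less_conservative:.1%} (less conservative)")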

The demographic characteristics of respondents and nonrespondents were similar, except that older physicians were less likely to respond and pediatric subspecialists were more likely to respond (see Table 1). The distribution of respondents’ specialties was similar to that of published data for all U.S. physicians22 (P > .06), except that our sample contained relatively fewer family medicine and general internal medicine physicians (absolute difference about 4% for both; P < .001). We compared the responses to the five primary survey items for those responding early versus late in the survey period and found no statistically significant differences. Finally, we compared the five primary items between Internet and paper survey modalities, and again found no statistically significant differences.

Table 1

The secondary items were explicitly labeled “optional.” To determine whether the 444 (44.9%) respondents who completed at least one secondary item were similar to those responding only to the primary items, we compared both item responses and demographics. We found no statistically significant differences in responses to the five primary items (P ≥ .04) among those who did versus did not complete the secondary items. Relatively fewer respondents ≥ 60 years old completed the secondary items (38.5%, compared with 48.1% of those < 45 and 52.2% of those 45–59; P = .01). We found no statistically significant differences in completion of secondary items for other demographics (specialty, practice type, gender, location, practice size, or revenue model).


Prior experience with educational technologies

In the preceding five years, 97.1% of respondents (429/442) had used online learning for their professional development, 92.1% (407/442) had used online learning for personal purposes, and 84.2% (372/442) had used simulation-based education (Table 2). Among those with experience, the mean rating (1 = strongly disagree, 7 = strongly agree) for online learning effectiveness was 5.2 (standard deviation [SD], 1.5) compared with 4.5 (1.7) for simulation-based education. Supplemental Digital Appendix 1 (http://links.lww.com/ACADMED/A464) contains responses on the full 1–7 scale for all survey items.

Table 2

Physicians generally agreed that point-of-care learning is vital to effective patient care (mean 5.3 [SD 1.3]; see Table 3) and that they have adequate workplace resources to answer patient-related questions (5.9 [1.1]). Only 39.0% (165/423) agreed that they currently use objective performance data to guide their CPD choices, although 64.6% (605/936) agreed that such information would be useful.

Table 3


Perceived effectiveness and future role of online learning and simulation for CPD

Desire for more online learning was modest (mean 4.6 [SD 1.5]), as was desire for more simulation-based education (4.2 [1.7]; see Table 3). Physicians’ impressions of the effectiveness of online learning and simulation-based education were similar (5.2 [1.4] vs. 5.0 [1.4]; P = .06), yet they anticipated a more vital role for online learning in CPD compared with simulation (5.7 [1.1] vs. 5.1 [1.4]; P < .0001). They perceived that they currently possess adequate access to (5.4 [1.3]), skills for (5.8 [1.2]), and technical support for (5.5 [1.4]) online learning.


Anticipated use of specific technology innovations

We asked physicians to rate their anticipated use of various specific innovations (Table 4). The most highly rated innovations provided support for formal CME activities: a central repository for listing CME opportunities (mean 5.7 [SD 1.2]), tracking CME completion (5.7 [1.3]), and receiving credit for answering patient-focused questions (5.2 [1.7]). Other highly rated innovations included 5-minute and 20-minute clinical updates and an e-mailed “question of the week.”

Table 4


Variation by age

There were no significant differences across age groups in the number of physicians with prior experience using online learning or simulation (P > .35), or in physicians’ ratings of these prior experiences (P ≥ .03; see Table 2).

Responses regarding effectiveness of, access to, and future role of educational technologies varied minimally across age groups (see Table 3). We found small but statistically significant differences across age groups in self-perceived skill for online learning (older physicians perceived lower skills) and interest in information about patient outcomes (younger physicians indicated stronger interest). Other differences across age groups did not reach statistical significance.

For nearly all of the technology innovations, physicians < 45 years old rated the helpfulness or likelihood of regular use highest, and those ≥ 60 rated these lowest (Table 4). These differences by age group were statistically significant for six innovations: an app with case-based questions, an app with patient-focused questions, a 5-minute clinical update, an app that monitored clinical practice, an educational game, and a central CME tracking repository.


Variation by practice type and specialty

We report subgroup analyses by practice type and specialty in Supplemental Digital Appendix 2 (http://links.lww.com/ACADMED/A464). Ratings of prior experiences with online learning and simulation-based education were similar by practice type (P ≥ .06) and specialty (P ≥ .46). As expected, we found a statistically significant difference across practice types in access to point-of-care knowledge resources (P < .0001), with self-employed physicians reporting less access, and physicians in academic practice reporting greater access, than those in group practices. We found an unanticipated difference in the perceived benefit of patient outcomes information, which was higher for physicians in group practice than for self-employed physicians (P = .002). We did not find any other statistically significant differences by practice type.

We found an unanticipated difference across specialties in the perceived value of point-of-care learning, with generalists and nonsurgical specialists rating this higher than surgeons (P = .0005). Generalists reported better access to point-of-care knowledge resources than surgeons or nonsurgical specialists (P = .0004). We did not confirm the anticipated differential preference of surgeons for simulation-based education, nor did we identify any other significant differences across specialty (P > .06).


Associations with other technology beliefs

We explored associations among selected survey ratings. Ratings for the effectiveness of prior online professional development activities correlated significantly with ratings of the future effectiveness (rho = 0.73), desired use (rho = 0.36), and vital role (rho = 0.60) of online learning (all P < .0001). Likewise, ratings of the effectiveness of prior simulation-based education correlated with ratings of the future effectiveness (rho = 0.69), desired use (rho = 0.45), and vital role (rho = 0.63) of simulation (all P < .0001). We found only weak correlations (explaining ≤ 3.6% of the score variance [R2 ≤ 3.6%]) between desired use for online learning and current online learning integration (rho = 0.19, P < .0001), support (rho = 0.04, P = .37), and personal skills (rho = 0.16, P = .0006). Online learning support was significantly associated with personal skills (rho = 0.54, P < .0001). Finally, we found relatively strong correlations between online learning and simulation in terms of beliefs about past effectiveness (rho = 0.44) and future desired use (rho = 0.44; P < .0001).
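
The bracketed variance-explained bound follows the article’s convention of squaring the correlation coefficient; a quick illustrative check in Python:

    # Variance explained approximated as rho squared, per the figures reported above.
    for label, rho in [("integration", 0.19), ("support", 0.04), ("personal skills", 0.16)]:
        print(f"{label}: rho = {rho:.2f}, variance explained ~ {rho ** 2:.1%}")
    # The largest, 0.19**2, is about 3.6%, matching the "R2 <= 3.6%" bound in the text.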


Discussion

In this national survey, we found that nearly all responding physicians had used online learning, and the vast majority had experience with simulation-based education. Although perceptions of past experience with online learning and simulation were similar, physicians anticipated a greater role for future online learning. Physicians generally perceived adequate personal skills and access to support for online learning. Specific innovations that were highly rated included a central repository for listing CME opportunities and tracking their CME completion, an app awarding credit for answering patient-focused questions, 5-minute and 20-minute clinical updates, and an e-mail “question of the week.” By contrast, responses regarding current and future use of clinical practice data were only moderately favorable. Responses for nearly all beliefs and past experiences were similar across age groups. Differences across practice types and specialties were few, small, and showed no meaningful pattern.


Limitations

The survey response rate was lower than ideal, and those choosing to respond might have been systematically different from those who did not respond. However, the demographic characteristics of those invited and those responding were similar, and respondents largely reflected the national distribution of specialties. Additionally, to the extent that those responding late have opinions similar to those who never respond,23 our finding of similar responses among early and late responders suggests that our findings do not misrepresent nonrespondents. Moreover, the invitation to complete the survey did not specifically mention educational technologies, such that decisions to complete the survey would be unlikely to be based on particular interests in or beliefs about this topic. Most of the data in this report derive from survey items completed by only half the respondents (secondary items), but those who did not respond to these items were similar to those who did in both demographic characteristics and actual responses.

Strengths include the national cross-specialty sample, large sample size, planned subgroup analyses, robust process for questionnaire development (including external expert review and pilot testing by physicians in multiple specialties), and adherence to best practices in survey implementation and delivery (including use of a dedicated Survey Research Center).


Integration with prior work

Our findings corroborate prior research suggesting that practicing physicians are willing to use online learning and simulation in their CPD.15–19,24,25 Two qualitative studies highlighted the importance of being able to trust the quality of content,26,27 and suggested that physicians may be slow to adopt a new technology if their current approach seems to be working.27 Although we did not directly address the issue of cost, other studies indicate that physicians prefer (and believe they can find) free online CME,16 and that free online CME is readily available.19

Our study also highlights interest in technology for learning at the point of care.15,28,29 Technology innovations that integrate learning and patient care include knowledge resources and search tools that facilitate finding information to answer clinical questions,14,30,31 clinical alerts and practice advisories that provide information before it is requested,32,33 and patient data reports that highlight gaps in knowledge or performance.34,35 Most clinical decision support systems are designed to expedite immediate patient-related decisions rather than to promote actual learning (i.e., linking new information with existing knowledge structures to promote retention and transfer to new settings).36 When and how to optimize point-of-care information technologies to promote learning constitutes an important topic for future study.


Implications

Our findings have implications for education practice and future research. First, practicing physicians generally seem receptive to using a variety of educational technologies. They seem especially attracted to short, high-relevance, patient-focused activities, and innovations that automate administrative tasks (e.g., monitoring or awarding CME credit). Favorable past experiences appear to be strong predictors of the anticipated future effectiveness and role of these technologies. Respondents rated simulation-based education slightly lower than online learning, and although we lack data to explain this finding directly, we speculate that it may reflect the perceived high cost and low accessibility of simulation (e.g., the need for specialized equipment or inability to complete tasks on-site). Resources and skills are currently perceived as adequate for online learning.

Second, although physicians believe that patient outcomes information would help them make better CPD choices, they expressed only modest interest in a technology solution to provide such information. We lack data to fully explain this disparity, but propose two potential explanations. First, physicians might have had or heard about poor experiences using such technologies. If true, then improving the technology and highlighting advantages of the new approach could address the concern. Alternatively, this could reflect a general resistance to feedback,37 in which case better technology alone will not solve the problem. We need to better understand this issue, and the related issues of self-assessment versus external guidance.38–40

Third, physicians of all ages seem to have interest, willingness, and capability to use online learning and simulation-based education. Although older physicians generally reported a lower likelihood of regularly using our example innovations than younger physicians, beliefs about effectiveness and future roles were similar across age groups.

Fourth, beliefs about educational technologies vary little across practice types or specialties. Even those in small practices seem interested in and capable of using new technologies, and integration of and local support for online learning are weak predictors of anticipated future use. Nonetheless, there appears to be room for improvement in the integration of online learning activities into most physicians’ practices and in the availability of point-of-care resources (especially for self-employed practitioners).

Finally, building it won't guarantee that they will use it. The findings in Table 4 suggest potential interest in software to support point-of-care learning, clinical updates, or CME tracking. Yet avowed desires may not translate to actual future usage, and physicians might not really know what they do or don't want or need. This is especially important given the up-front infrastructure investment that some technologies entail. We need to focus on true educational needs rather than the hype and glamour of the latest innovation, listen attentively to our potential customers (practicing physicians) to understand the problems they face, and then iteratively test and refine potential technology solutions. We need further research on what to build and how to build it, focusing beyond technical issues to consider matters of usability, implementation, integration, and educational effectiveness.


Acknowledgments:

The authors thank Graham McMahon (Accreditation Council for Continuing Medical Education), Paul Mazmanian (Virginia Commonwealth University School of Medicine), the late Alex Djuricich (Indiana University School of Medicine), and one anonymous reviewer for providing external expert review of the survey questionnaire. Additionally, the authors thank Ann Harris and Wendlyn Daniels (Mayo Clinic Survey Research Center) for their help in planning, testing, and implementing the survey.


References

1. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: A meta-analysis. JAMA. 2008;300:1181–1196.
2. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA. 2011;306:978–988.
3. Cook DA, Ellaway RH. Evaluating technology-enhanced learning: A comprehensive framework. Med Teach. 2015;37:961–970.
4. Wutoh R, Boren SA, Balas EA. eLearning: A review of Internet-based continuing medical education. J Contin Educ Health Prof. 2004;24:20–30.
5. Cook DA, Brydges R, Hamstra SJ, et al. Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simul Healthc. 2012;7:308–320.
6. Ellaway R, Masters K. AMEE guide 32: e-Learning in medical education Part 1: Learning, teaching and assessment. Med Teach. 2008;30:455–473.
7. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: A systematic review of validity evidence, research methods, and reporting quality. Acad Med. 2013;88:872–883.
8. Prorok JC, Iserman EC, Wilczynski NL, Haynes RB. The quality, breadth, and timeliness of content updating vary substantially for 10 online medical texts: An analytic survey. J Clin Epidemiol. 2012;65:1289–1295.
9. Cook DA, Sorensen KJ, Wilkinson JM, Berger RA. Barriers and decisions when answering clinical questions at the point of care: A grounded theory study. JAMA Intern Med. 2013;173:1962–1969.
10. McDonald FS, Zeger SL, Kolars JC. Factors associated with medical knowledge acquisition during internal medicine residency. J Gen Intern Med. 2007;22:962–968.
11. Del Fiol G, Workman TE, Gorman PN. Clinical questions raised by clinicians at the point of care: A systematic review. JAMA Intern Med. 2014;174:710–718.
12. Kamin C, Souza KH, Heestand D, Moses A, O’Sullivan P. Educational technology infrastructure and services in North American medical schools. Acad Med. 2006;81:632–637.
13. Passiment M, Sacks H, Huang G. Medical Simulation in Medical Education: Results of an AAMC Survey. 2011.Washington, DC: Association of American Medical Colleges.
14. Cook DA, Sorensen KJ, Hersh W, Berger RA, Wilkinson JM. Features of effective medical knowledge resources to support point of care learning: A focus group study. PLoS One. 2013;8:e80318.
15. Salinas GD. Trends in physician preferences for and use of sources of medical information in response to questions arising at the point of care: 2009–2013. J Contin Educ Health Prof. 2014;34(suppl 1):S11–S16.
16. Olivieri JJ, Knoll MB, Arn PH. Education format and resource preferences among registrants of a pediatric-focused CME website. Med Teach. 2009;31:e333–e337.
17. Price DW, Overton CC, Duncan JP, et al. Results of the first national Kaiser Permanente continuing medical education needs assessment survey. Perm J. 2002;1:76–84.
18. Kempkens D, Dieterle WE, Butzlaff M, et al. German ambulatory care physicians’ perspectives on continuing medical education—A national survey. J Contin Educ Health Prof. 2009;29:259–268.
19. Harris JM Jr, Sklar BM, Amend RW, Novalis-Marine C. The growth, characteristics, and future of online CME. J Contin Educ Health Prof. 2010;30:3–10.
20. Cook DA, Blachman MJ, Price DW, West CP, Berger RA, Wittich CM. Professional development perceptions and practices among U.S. physicians: A cross-specialty national survey. Acad Med. 2017;92:1335–1345.
21. Cook DA, Blachman MJ, West CP, Wittich CM. Physician attitudes about maintenance of certification: A cross-specialty national survey. Mayo Clin Proc. 2016;91:1336–1345.
22. Association of American Medical Colleges. Physician Specialty Data Book 2014. 2014.Washington, DC: AAMC Center for Workforce Studies.
23. Miller LE, Smith KL. Handling nonresponse issues. J Ext. 1983;21:45–50.
24. Casebeer L, Bennett N, Kristofco R, Carillo A, Centor R. Physician Internet medical information seeking and on-line continuing education use patterns. J Contin Educ Health Prof. 2002;22:33–42.
25. Sinusas K. Internet point of care learning at a community hospital. J Contin Educ Health Prof. 2009;29:39–43.
26. Young KJ, Kim JJ, Yeung G, Sit C, Tobe SW. Physician preferences for accredited online continuing medical education. J Contin Educ Health Prof. 2011;31:241–246.
27. Sargeant J, Curran V, Jarvis-Selinger S, et al. Interactive on-line continuing medical education: Physicians’ perceptions and experiences. J Contin Educ Health Prof. 2004;24:227–236.
28. Lobach D, Sanders GD, Bright TJ, et al. Enabling Health Care Decisionmaking Through Clinical Decision Support and Knowledge Management. 2012. Rockville, MD: Agency for Healthcare Research and Quality; Evidence report no. 203.
29. Jones SS, Rudin RS, Perry T, Shekelle PG. Health information technology: An updated systematic review with a focus on meaningful use. Ann Intern Med. 2014;160:48–54.
30. Chen ES, Bakken S, Currie LM, Patel VL, Cimino JJ. An automated approach to studying health resource and infobutton use. Stud Health Technol Inform. 2006;122:273–278.
31. Del Fiol G, Cimino JJ, Maviglia SM, Strasberg HR, Jackson BR, Hulse NC. A large-scale knowledge management method based on the analysis of the use of online knowledge resources. AMIA Annu Symp Proc. 2010;2010:142–146.
32. Shojania KG, Jennings A, Mayhew A, Ramsay CR, Eccles MP, Grimshaw J. The effects of on-screen, point of care computer reminders on processes and outcomes of care. Cochrane Database Syst Rev. 2009;(3):CD001096.
33. Murphy DR, Reis B, Sittig DF, Singh H. Notifications received by primary care practitioners in electronic health records: A taxonomy and time analysis. Am J Med. 2012;125:209.e1–209.e7.
34. Holmboe ES. Assessment of the practicing physician: Challenges and opportunities. J Contin Educ Health Prof. 2008;28(suppl 1):S4–S10.
35. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: Effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259.
36. Cook DA, Sorensen KJ, Linderbaum JA, Pencille LJ, Rhodes DJ. Information needs of generalists and specialists using online best-practice algorithms to answer clinical questions [published online ahead of print February 19, 2017]. J Am Med Inform Assoc. doi: 10.1093/jamia/ocx002.
37. Eva KW, Bordage G, Campbell C, et al. Towards a program of assessment for health professionals: From training into practice. Adv Health Sci Educ Theory Pract. 2016;21:897–913.
38. Eva KW, Regehr G. Self-assessment in the health professions: A reformulation and research agenda. Acad Med. 2005;80(10 suppl):S46–S54.
39. Eva KW, Regehr G. “I’ll never play professional football” and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28:14–19.
40. Sargeant J, Bruce D, Campbell CM. Practicing physicians’ needs for assessment and feedback as part of professional development. J Contin Educ Health Prof. 2013;33(suppl 1):S54–S62.

© 2018 by the Association of American Medical Colleges