Academic Medicine, July 2006 - Volume 81 - Issue 7
doi: 10.1097/01.ACM.0000232413.43142.8b
IT in Medical Education

Educational Technology Infrastructure and Services in North American Medical Schools

Kamin, Carol MS, EdD; Souza, Kevin H. MS; Heestand, Diane EdD; Moses, Anna MEd; O’Sullivan, Patricia EdD


Author Information

Dr. Kamin is associate professor and director, educational research and development, Department of Pediatrics, University of Colorado School of Medicine, Denver, Colorado. She is also the director of Project LIVE Consortium.

Mr. Souza is director, Office of Educational Technology, and associate director for operations, Office of Medical Education, University of California, San Francisco, San Francisco, California.

Dr. Heestand is a professor and director, Office of Educational Development, University of Arkansas for Medical Sciences, Little Rock, Arkansas.

Ms. Moses is an instructor and instructional development specialist, Office of Educational Development, University of Arkansas for Medical Sciences, Little Rock, Arkansas.

Dr. O’Sullivan is associate director for educational research, Office of Medical Education, University of California, San Francisco, San Francisco, California.

Correspondence should be addressed to Dr. Kamin, 1668 South Rosemary St., Denver, CO 80231; telephone: (303) 860-4639 or (303) 550-2814; fax: (303) 764-8189; e-mail: Carol.Kamin@uchsc.edu.


Abstract

Purpose: To describe the current educational technology infrastructure and services provided by North American allopathic medical schools that are members of the Association of American Medical Colleges (AAMC), and thereby to provide the information needed for institutional benchmarking.

Method: The authors developed a Web-based survey instrument and administered it in the fall of 2004. The survey was sent to representatives of 137 medical schools and completed by representatives of 88, a response rate of 64%. Schools were given scores for the infrastructure and services they provided. Data were analyzed with one-way analyses of variance, chi-square tests, and correlation coefficients.

Results: There was no difference in the number of infrastructure features or services offered by region of the country, public versus private status, or size of the graduating class. Schools had implemented 3.0 (SD = 1.5) of 6 infrastructure items and offered 11.6 (SD = 4.1) of 22 services. Over 90% of schools had wireless access (97%), used online course materials for undergraduate medical education (97%), used a course management system for undergraduate medical education (95%), and used online teaching evaluations (90%). Use of services differed across the undergraduate, graduate, and continuing medical education continuum. Aside from e-portfolios for undergraduates, the least-offered services were those for graduate and continuing medical education.

Conclusions: The results of this survey provide a benchmark for the level of services and infrastructure currently supporting educational technology at AAMC-member allopathic medical schools.

Medical schools spend considerable funds on information technology to improve the teaching and learning of students, residents, faculty, and staff. According to a 2002 survey by the Group on Information Resources (GIR) of the Association of American Medical Colleges (AAMC),1 the median expenditure on information technology at a medical school is $5.5 million, with a median of 50 full-time employees supporting that technology. That questionnaire contained few items addressing educational technology, however, so it is difficult to determine how many of these resources are devoted to it. For the purposes of the present study, educational technology is defined as electronic and other forms of technology used to support teaching and learning. These include, but are not restricted to, computer-based learning programs, computerized mannequins, instructional Web sites, video/audio production, application development, and online course management. Synonyms for educational technology include instructional technology, academic computing, and instructional computing.

Current standards from the Liaison Committee on Medical Education2 (LCME) make only indirect or passing reference to educational technology, with few guidelines on how it should be organized or supported. The standards call for curricula that foster self-directed, independent study; access to educational technology is implied. There should be “appropriate educational infrastructure,” including computers, audiovisual aids, and laboratories.2 Given the ongoing refinement of recommendations to assist medical schools in planning for the best use of educational technology, and given the current LCME standards, it is helpful to know what educational technology infrastructure and services AAMC medical schools currently provide.

With the ever-increasing dependence on information technology and the ever-changing role of educational technology in medical education, there have been several attempts to survey medical schools to determine which technologies are used and how. More than 20 years ago, the AAMC Organization of Student Representatives3 surveyed North American medical schools to assess which courses were using computers and which schools were using computer-assisted instruction. The California Consortium for Informatics in Medical Education and Development4 began surveying its eight members in the early 1990s to compile an inventory of software. Computer Resources in Medical Education (CRIME) of the Western Group on Educational Affairs (WGEA) continued many of the efforts of the California consortium and surveyed the educational technology services of the WGEA medical schools in 2002. That study was replicated with the medical schools of the Southern Group on Educational Affairs (SGEA) in 2003.

The present study is the first attempt to assess the educational technology infrastructure and services provided by AAMC-member allopathic medical schools in terms of what, how, and who. The importance of this kind of information was foretold in several earlier major reports from the AAMC. For example, the “GPEP Report”5 called for medical schools to designate an academic unit to lead the application of information sciences and computer technology to medical education. The ACME-TRI Report6 followed up on medical schools’ efforts to implement the GPEP recommendations and suggested (1) the establishment of a library of reviewed educational software, (2) national and regional workshops for faculty on the use of computers for education, and (3) the establishment of “some organizational structure to promote the use of computers in medical education” at individual medical schools. The Steering Committee on the Evaluation of Medical Information Science in Medical Education7 recommended that the AAMC sponsor seminars on new technologies and provide information on computer applications in medical education. It also recommended that the National Library of Medicine (NLM) coordinate the assessment of medical software, including medical education applications.

More recently, the second report of the Medical School Objectives Project (MSOP II)8 and the current standards for the accreditation of medical education programs include educational technology within the broader area of medical informatics. MSOP II specifies that future physicians should know about, and be able to use, the instructional resources available through the newer information and educational technologies.

In the present report, we describe the development and results of a questionnaire to inventory educational technology infrastructure and services at all North American allopathic medical schools that are members of the AAMC. We used the questionnaire to gain answers to the following questions:

1. What infrastructure and services do allopathic medical schools have for undergraduate, graduate, and continuing medical education?

2. Is there an association between the infrastructure and services offered and selected characteristics of the school?

3. Do services differ across the levels of medical education?

This study received no outside funding. The institutional review boards of all three of our institutions approved the study.


Method

Population studied

In September 2004, we contacted representatives of 137 allopathic AAMC-member schools in the United States, Canada, and Puerto Rico by e-mail. Each contact received an e-mail invitation with a letter informing him or her of data confidentiality and the other information needed for informed consent. Consent was assumed when individuals clicked on the URL that took them to the Web-based survey instrument. Only one respondent completed the questionnaire for each school.

Instrument

Our questionnaire was based largely on the CRIME questionnaire previously distributed to the WGEA and SGEA schools, as described earlier. It addressed two main questions in detail: What educational services does the school offer? What infrastructure is provided to support these services? There were two demographic questions (name of school and position of respondent) and 26 specific questions with checkboxes and pull-down menus forming an inventory of educational technology services and infrastructure. One open-ended question asked for information on educational technology initiatives not mentioned in the questionnaire. We defined infrastructure items as hardware, networking, and software solutions that are offered campus wide but are not restricted to educational purposes. Six of the 26 items pertained to infrastructure, consisting of questions about computer requirements, personal digital assistant (PDA) requirements, wireless Internet access, and Web portals. Educational services were defined as technologies specifically designed to enhance the educational experience; 22 such services were listed in the questionnaire (see Table 1). Some of these items had follow-up questions asking whether software was purchased or developed in-house. From the name of the school we could classify respondents by region of the country and by public or private status.

Procedures

We revised the questionnaire based on our experience with the previous CRIME surveys. We selected a Web-based survey tool (Prezza Technologies’ Ultimate Survey) and tested the application prior to release. We compiled the e-mail list by identifying each school’s primary curriculum contact in CurrMIT. When this approach failed, we searched individual school Web sites for educational technology leaders and consulted the GIR membership database.

We first e-mailed our questionnaire in September 2004 and completed the process in March 2005, although 78 of the questionnaires (88%) were returned between September and November. We initially closed the survey at the end of November but reopened it at the request of a few remaining schools that wished to respond. Three reminders were sent.

Analyses

The analyses were primarily descriptive, usually in terms of frequencies and percentages. An infrastructure score was determined by summing the responses to the six infrastructure items on the questionnaire. A services score was determined by summing the 22 services. Infrastructure and service scores were compared across regions, public versus private schools, and size of school based on number of 2004 graduates. We analyzed the data with one-way analyses of variance, chi-square, and correlation coefficients. Also, we used Friedman tests when comparing responses concerning undergraduate (UGME), graduate (GME), and continuing medical education (CME) use of services. Since we conducted a number of analyses, the level of significance was set at p < .01 for each analysis to reduce the level of Type 1 error.
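
For readers who wish to set up a similar analysis, the sketch below shows, in Python with pandas and scipy, how the scoring and statistical tests described above might be organized. It is a minimal illustration under assumed column names, not the analysis code used in this study.

```python
# A minimal sketch of the scoring and tests described above, under
# hypothetical column names; the study's actual analysis code is not published.
import pandas as pd
from scipy import stats

INFRA_ITEMS = [f"infra_{i}" for i in range(1, 7)]    # 6 infrastructure items (0/1)
SERVICE_ITEMS = [f"svc_{i}" for i in range(1, 23)]   # 22 service items (0/1)

def add_scores(df: pd.DataFrame) -> pd.DataFrame:
    """Sum binary item responses into infrastructure and services scores."""
    out = df.copy()
    out["infra_score"] = out[INFRA_ITEMS].sum(axis=1)
    out["service_score"] = out[SERVICE_ITEMS].sum(axis=1)
    return out

def score_by_group(df: pd.DataFrame, score: str, group: str):
    """One-way ANOVA of a score across the levels of a grouping variable
    (e.g., AAMC region, or public vs. private status)."""
    samples = [g[score].to_numpy() for _, g in df.groupby(group)]
    return stats.f_oneway(*samples)

def service_use_by_level(df: pd.DataFrame, ugme: str, gme: str, cme: str):
    """Friedman test for related samples: per-school indicators of whether
    a service is offered at the UGME, GME, and CME levels."""
    return stats.friedmanchisquare(df[ugme], df[gme], df[cme])

# Example usage (hypothetical data file and columns):
# df = add_scores(pd.read_csv("survey.csv"))
# print(score_by_group(df, "service_score", "region"))
# print(stats.pearsonr(df["n_graduates"], df["service_score"]))
# With many analyses run, significance would be judged at p < .01 per test.
```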


Results

Of the 137 schools contacted, 88 responded to the request, a 64% return rate. Sixty of the respondents (68%) were from public medical schools, while 88 of the 137 schools (64%) in the population were public. Forty-one of the 88 respondents (47%) were directors or managers, and 25 (28%) were deans, with the remainder scattered among faculty, coordinators, vice presidents, and developers. The return rate by AAMC region was Western region, 79% (15/19 schools); Central region, 75% (24/32 schools); Southern region, 66% (29/44 schools); and Northeast region, 48% (20/42 schools). As these figures show, the response rates differed significantly across the regions (χ2 = 18.2, df = 3, p < .01). There was no association between respondents’ AAMC region and whether the school was public (p > .05). Also, infrastructure and service scores did not differ between those who completed the survey after it was reopened and those who completed it within the original time period. Table 1 reports the number and percentage of schools reporting each infrastructure item and educational service.
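
As an illustration of the kind of chi-square comparison reported above, the sketch below runs a test of independence on the responder/nonresponder counts implied by the regional return rates. Because the article does not specify exactly how the original test was constructed, this sketch should not be expected to reproduce the published statistic.

```python
# Chi-square test of independence: responders vs. nonresponders by AAMC
# region, using counts implied by the return rates above. The original test
# specification is not reported, so the statistic computed here need not
# match the published chi-square of 18.2.
from scipy.stats import chi2_contingency

#               Western  Central  Southern  Northeast
responders    = [15,      24,      29,       20]
nonresponders = [ 4,       8,      15,       22]   # school totals: 19, 32, 44, 42

chi2, p, dof, expected = chi2_contingency([responders, nonresponders])
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.4f}")
```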

Infrastructure

On average, the responding schools had implemented 3.0 (SD = 1.5) of the six infrastructure items covered in the survey. There were no significant differences in infrastructure score by region of the country (p = .39) or between public and private institutions (p = .65). There was a small correlation between the number of students graduating and the infrastructure score (r2 = 0.04, p = .04).

Equipment.

A total of 30 (34%) of the schools we surveyed required computers, and an additional 23 (26%) recommended them. Almost a third of the schools required PDAs, and another 22 (25%) recommended them. Only 32 of the schools (36%) reported having no PDA initiatives for undergraduate medical students. The use of PDAs in residency programs was cited most often for primary care specialties.

Wireless Internet access.

Wireless Internet access was provided most commonly in the library (77, 88%), followed by common areas (65, 74%), small-group rooms (62, 70%), lecture halls (58, 66%), and hospitals (28, 32%). Only four (4%) of the respondents reported no wireless Internet access on campus.

Portals.

A Web portal was defined as a Web site or service that offers a broad array of services and resources. Schools reported having portals for medical students (59, 67%), residents (24, 27%), and teaching faculty (45, 51%); the difference across these three groups was significant (p < .001). Over 70% of schools had bought (56, 64%) and/or developed in-house (42, 48%) portal systems. Over 50% of the portals contained administrative information, course materials, access to e-mail accounts, and event calendars. A smaller number offered user customization and personal calendars, as shown in Table 2.

Educational services

On average, schools offered 11.6 (SD = 4.1) of the 22 services covered in the survey. We found no difference in the number of services offered by region of the country (p = .09) or between public and private schools (p = .92). There was a small correlation between the number of services offered and the size of the graduating class (r = .19, p = .045). There was no relationship between the services and infrastructure scores (r = .13, p = .12).

Many of the services were offered across all three levels of medical education; the results of comparing these services are shown in Table 3. Nearly every school (85, 97%) made some use of online course materials for medical students. Slightly over a third (32, 36%) offered online course materials for residents and fellows, and 24 of the schools (27%) did so for continuing medical education courses. As would be expected, there was a significant difference across the three education levels in the use of course management software for online courses: 84 (96%) for undergraduate, 69 (78%) for graduate, and 60 (68%) for continuing medical education (p < .001). Most schools purchased a system (69, 78%), and a fair number did in-house development (37, 42%).


We did further analyses to see whether schools that bought computers or PDAs for students were more likely to offer a greater number of services. Schools requiring a computer purchase (or providing one) had 12.3 (SD = 3.8) services and did not differ significantly from those that did not (11.2, SD = 4.3; p = .26). On average, schools that required PDAs for students had nearly two additional services (12.7, SD = 3.7) compared with schools that did not (10.9, SD = 4.3; p = .05), and schools that bought PDAs for students provided more services (13.0, SD = 4.2) than those that did not (11.0, SD = 4.0; p = .035). We also examined the relationship between the number of services and in-house development of support software. Only for the development of online examinations was there an association: schools that developed their own system offered significantly more services (13.2, SD = 4.0) than schools that did not (10.9, SD = 4.0; p = .01).

Personal digital assistant initiatives.

The PDA initiatives were primarily driven by departments (43, 49%), followed by students (33, 38%), central services (33, 38%), residents (29, 33%), and hospitals (14, 16%). Overall, 37 (42%) of the schools supported PDA initiatives centrally, either through the school or through affiliated hospitals. Currently, 38 (43%) of the schools have either bought (27, 31%) or developed in-house (21, 24%) PDA software. This software includes functions such as synchronizing patient logs and delivering course content.

Streaming video.

We asked schools whether they videotaped lectures and special events and streamed them over the Internet. The level of use reported differed across undergraduate, graduate, and continuing medical education (p = .002). Graduate and continuing medical education did not use streaming video for any required activities.

Patient simulations.

There was a significant difference in the use of computerized mannequins across the levels of education, with use in continuing medical education (10, 11%) significantly lower than in residency education (33, 38%), clinical undergraduate education (50, 57%), and preclinical undergraduate education (30, 34%).

The finding was similar for the use of virtual patients or computer-based simulations (p < .001). In the undergraduate years, 52 (59%) of the schools used these simulations in preclinical education, 54 (61%) used them in clinical education, and 70 (80%) used them somewhere in either preclinical or clinical education. The reported use was lower in residency (29, 33%) and continuing medical education (13, 15%).

E-portfolios.

Some institutions provide electronic portfolios for trainees to maintain a record of evidence about individual proficiencies. Implementation was limited, with 23 (26%) of the schools reporting use of an e-portfolio at the undergraduate level and 31 (35%) at the graduate level (p = .06). Both levels primarily used e-portfolios for some courses or programs rather than as part of an institutional initiative. Schools were equally likely to purchase (12, 14%) or develop (12, 14%) e-portfolio software.

Online teaching evaluations.

Only nine schools (10%) did not provide online teaching evaluations. Schools reported centralized online evaluations for courses (72, 82%), rotations (63, 72%), instructors (61, 69%), attending physicians (33, 38%), and residents (36, 41%). Schools both bought (38, 43%) and developed (51, 58%) evaluation systems.

Schools also supported online examinations of learners. There was an expected, but not statistically significant, difference across the three levels (p = .04): online examinations were used by 61 schools (69%) for undergraduate, 52 (59%) for graduate, and 49 (56%) for continuing medical education. Schools purchased (42, 48%) and/or developed (27, 31%) their own online examination software.


Discussion and Conclusions

We set out to describe the educational technology infrastructure and services offered by AAMC-member allopathic medical schools. Our findings present a snapshot of these ever-changing services and indicate that infrastructure and services vary, but not in a way related to regional or institutional variables. We did not make judgments about quality or about what should or should not be offered; rather, we described what is currently offered, pointing out possible trends. Given the lower response rate from the Northeast region, the results may be more applicable to medical schools in the other AAMC regions. The Northeast has a larger proportion of private schools (52%) than public schools (48%), and its responses reflect this, with 55% from private schools and 45% from public schools. Overall, the proportion of public schools among respondents (60 of 88, or 68%) was similar to the proportion in the population (88 of 137, or 64%).

Technology is spreading across the medical education continuum, although services generally are better developed for undergraduate medical education (UGME). Our data may underrepresent educational services for GME, because many respondents commented that they did not know what was being offered in GME. Across the continuum, it is apparent that technology is increasingly integrated into the education of residents and practicing physicians. Fewer online course materials were offered for residents and practicing physicians than for medical students, but at the majority of schools technology has been used in examinations for these groups. Use of simulation was reported as lower in GME and CME than for medical students, indicating that these levels of the continuum rely on more traditional pedagogical approaches to teaching. This is an area of potential growth for educational technology, especially in responding to the competencies mandated by the Accreditation Council for Graduate Medical Education and to the pressure to improve physicians’ lifelong learning.

We found that computers and/or PDAs were becoming well integrated into the curriculum: fully half of the schools required a computer and/or a PDA, and some went so far as to provide computers to students. However, we did not determine exactly how these devices were being used. One can speculate that requiring computers was related to the extensive use of online course materials reported by our respondents, and that PDA adoption was related to the patient-encounter tracking required by the LCME and by various residency review committees. “Smartphones” (PDA/mobile phone units) may offer even greater learning opportunities.

The least reported infrastructure item was Web portals for GME, and among educational services, relatively few schools reported having e-portfolios. Portals and e-portfolios represent technologies that facilitate the integration of services and learners’ control of their educational environment; as a result, they are more difficult to implement, expand, and maintain. We believe some of these areas are on the rise in medical education: we found a twofold increase in e-portfolio articles from 2004 to 2005, based on a search of the Educational Resources Information Center conducted in early 2006.

Respondents identified several initiatives in their open-ended responses that we had not asked about in the questionnaire. Seven schools had replaced microscopes with digital microscopy, five reported recording lectures in an MPEG audio (MP3) format, four reported a case-authoring tool, three stated they were using Tablet PCs, and one school mentioned the addition of an audience response system. Since we did not specifically ask about these services, these numbers could vastly underrepresent what is happening in these areas across institutions.

These added initiatives point to additional trends in educational technology, which we briefly describe here. The replacement of microscopes with digital microscopy makes it easier to file, store, retrieve, annotate, and archive specimens. Digital specimens are also easier to duplicate and distribute, which has led to the development of large collections such as WebPath9 and MedPix.10

The recording, streaming, and sharing of MP3 recordings of lectures came about as part of the iPod revolution and the phenomenon known as “podcasting.” Podcasting is a method of publishing audio broadcasts via the Internet that allows users to subscribe to a feed of audio files (usually MP3s).11 It became popular in late 2004, about the time our questionnaire was distributed. The University of California system responded quickly to protect faculty by implementing its “Policy on Use of Recordings of Course Presentations.”12 The policy allows students to record lectures and share the recordings with other students in their class for the purpose of studying, and allows faculty to record and use recordings of themselves.
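
To make the subscription mechanism concrete, the sketch below shows how a client might pull MP3 lectures from a podcast feed. The feed URL is a placeholder and the choice of the feedparser library is ours; the article does not describe any particular tooling.

```python
# A minimal sketch of the mechanism behind podcasting: a client subscribes
# to an RSS feed and pulls the MP3 enclosures it advertises. The feed URL
# is hypothetical.
import feedparser

FEED_URL = "https://example.edu/lectures/feed.xml"  # placeholder lecture feed

feed = feedparser.parse(FEED_URL)
print("Subscribed to:", feed.feed.get("title", "unknown feed"))

for entry in feed.entries:
    # Each entry may carry one or more enclosures: the actual audio files.
    for enc in entry.enclosures:
        if enc.get("type") == "audio/mpeg":  # MP3 lecture recordings
            print(entry.title, "->", enc.get("href"))
```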

While we asked about the use of virtual patients, we did not ask about case-authoring tools. Cases and their authoring programs vary significantly both in their educational application and in the media formats in which the patient is presented. The AAMC has conducted an inventory of the virtual patient cases used by its member schools13 and has established a national working group to agree on technical standards, which would make it easier to share cases across schools.

Another tool we failed to ask about was the Tablet PC, which lets learners combine searchable handwritten notes with the capabilities of a laptop. Tablet PCs also let instructors mark up and save notes written over PowerPoint slides. One innovative use of this technology comes from the pediatric emergency medicine specialists at the University of Alberta School of Medicine, who use Tablet PCs side by side with their X-ray viewing box. With a patient’s X-ray on the viewing box, trainees use the digital pen to retrieve images of normal X-rays (which they can choose to display labeled or unlabeled) and to create supplemental images showing variants (e.g., age-specific changes), animations of pathophysiologic mechanisms, and embedded radiology tutorials.14

Audience response systems (ARS) were also not mentioned in our questionnaire but were noted by some respondents. An ARS consists of handheld keypads used during a lecture to create a more interactive environment: instructors can poll their audience and display cumulative results immediately, providing feedback for both the instructor and the learners. Responses can be tracked or kept anonymous, and some software programs can also be used with Tablet PCs or PDAs. Studies of ARS use in CME15,16 and GME17 found increased knowledge and satisfaction among participants; these authors also stated that effective teaching strategies and questioning were critical to the success of the technology.

While integrating educational technology presents a challenge to medical educators, the greater challenge is using educational technology wisely to improve learning and to increase the efficiency and effectiveness of educational programs.18 Our data suggest that medical schools are implementing educational technology solutions despite concerns expressed by many that such changes were unrealistic.19,20 Other studies will be needed to determine just how wisely these solutions are being implemented and how efficient and effective they are.

The results of this survey provide a benchmark for the educational technology services currently provided by medical schools. We hope that this information will assist medical schools as they plan and implement educational technology services at their institutions. We also hope that it provides some guidance in riding the waves of technological change that confront us all.


References

1 The Group on Information Resources. Medical School IT Summary Report. Washington, DC: Association of American Medical Colleges, 2003.

2 Liaison Committee on Medical Education. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the M.D. Degree (http://www.lcme.org/standard.htm). Accessed 27 March 2006.

3 Organization of Student Representatives. OSR Compendium of Computer Activity in Medical Education. Washington, DC: Association of American Medical Colleges, 1985.

4 Heestand DE, Hoffman HM. IMED: the development of a consortium. J Biocommun. 1995;22 (1):2–6.

5 Muller S (chairman). Physicians for the Twenty-First Century: Report of the Project Panel on the General Professional Education of the Physician and College Preparation for Medicine. J Med Educ. 1984;59(11 Pt 2).

6 Educating medical students: assessing change in medical education–the road to implementation (ACME-TRI Report). Acad Med. 1993;68(6 suppl):S1–S48.

7 Association of American Medical Colleges. Medical Education in the Information Age. Proceedings of the Symposium on Medical Informatics, March 7–8, 1985, Washington, DC. Washington, DC: Association of American Medical Colleges, 1986.

8 Informatics Panel, Population Health Perspective Panel. Contemporary issues in medicine: medical informatics and population health: Report II of the Medical School Objectives Project. Acad Med. 1999;74:130–41.

9 The Internet Pathology Laboratory for Medical Education (WebPath). Hosted by Florida State University School of Medicine (http://www-medlib.med.utah.edu/WebPath/webpath.html). Accessed 27 March 2006.

10 MedPix: An Image Library for Medical Education. Hosted by University of California, San Diego (http://medpics.ucsd.edu/index.cfm?curpage=main&course=hist&mode=browse). Accessed 27 March 2006.

11 Brooks D. Podcasting. Wikipedia: The Free Encyclopedia (http://en.wikipedia.org/wiki/Podcasting). Accessed 27 March 2006.

12 University of California, Standing Committee on Copyright. Policy on Use of Recordings of Course Presentations (http://www.ucop.edu/ucophome/coordrev/policy/PP092305policy.pdf). 2005. Accessed 27 March 2006.

13 Huang G, Reynolds R, Candler C. Virtual Patient Reference Center (http://www.aamc.org/meded/mededportal/vp/start.htm). Accessed 27 March 2006.

14 Athwal K, Esani K, Youssif A, Pusic M. Learning x-ray interpretation in context: a Tablet PC resource for the emergency department. Presented at the 2005 Slice of Life Multimedia Conference (http://www.slice.utah.edu/2005/documents/athwal_k_54.pdf). Accessed 27 March 2006.

15 Miller RG, Ashar BH, Getz KJ. Evaluation of an audience response system for the continuing education of health professionals. J Contin Educ Health Prof. 2003;23:109–15.

16 Latessa R, Mouw D. Use of an audience response system to augment interactive learning. Fam Med. 2005;37:12–14.

17 Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: effect on learning in family medicine residents. Fam Med. 2004;36:496–504.

18 Whitcomb ME. The information technology age is dawning for medical education. Acad Med. 2003;78:247–48.

19 Oppenheimer T. The Flickering Mind: The False Promise of Technology in the Classroom, and How Learning Can Be Saved. 1st ed. New York: Random House, 2003.

20 Bates T, Poole G. Effective Teaching with Technology in Higher Education: Foundations for Success. 1st ed. San Francisco: Jossey-Bass, 2003.


© 2006 Association of American Medical Colleges
