Medical schools spend considerable funds on information technology to improve the teaching and learning of students, residents, faculty, and staff. According to a 2002 survey by the Group on Information Resources (GIR) of the Association of American Medical Colleges (AAMC),1 the median expenditure at a medical school on information technology is $5.5 million with a median of 50 full-time employees to support the technology. The questionnaire contained few items addressing educational technology, and so it is difficult to determine how many of these resources are devoted to that technology. For the purpose of the present study, educational technology is defined as electronic and other forms of technology used to support teaching and learning. These include but are not restricted to computer-based learning programs, computerized mannequins, instructional Web sites, video/audio production, application development, and online course management. Synonyms for educational technology include instructional technology, academic computing, and instructional computing.
Current standards from the Liaison Committee on Medical Education2 (LCME) make indirect or passing reference to educational technology, with few guidelines on how it should be organized or supported. The standards call for curricula that foster self-directed, independent study; access to educational technology is implied. There should be “appropriate educational infrastructure” to include computers, audiovisual aids, and laboratories.2 Given the ongoing refinement of recommendations to assist medical schools in planning for the best use of educational technology and the current LCME standards, it is helpful to know what educational technology infrastructure and services are currently provided by AAMC medical schools.
With the ever-increasing dependency on information technology and the ever-changing role of educational technology in medical education, there have been attempts to survey medical schools to glean a view of what technologies are used and how. More than 20 years ago, the AAMC Organization of Student Representatives3 surveyed North American medical schools to assess which courses were using computers and which schools were using computer-assisted instruction. The California Consortium for Informatics in Medical Education and Development4 began surveying its eight members in the early 1990s to compile an inventory of software. Computer Resources in Medical Education (CRIME) of the Western Group on Educational Affairs (WGEA) continued many of the efforts of the California consortium and surveyed educational technology services of the WGEA medical schools in 2002. Their study was replicated with medical schools of the Southern Group on Educational Affairs (SGEA) in 2003.
The present study is the first attempt to assess educational technology infrastructure and services provided by AAMC-member allopathic medical schools in terms of what, how, and who. The importance of this kind of information was foretold in several earlier major reports from the AAMC. For example, the “GPEP Report”5 called for medical schools to designate an academic unit to lead the development of the application of information sciences and computer technology to medical education. The ACME-TRI Report6 followed up on the efforts of medical schools to implement the GPEP recommendations and suggested the (1) establishment of a library of reviewed educational software, (2) national and regional workshops for faculty on the use of computers for education, and (3) establishment of “some organizational structure to promote the use of computers in medical education” at individual medical schools. The Steering Committee on the Evaluation of Medical Information Science in Medical Education7 recommended that the AAMC should sponsor seminars on new technologies and provide information on computer applications in medical education. They also recommended that the National Library of Medicine (NLM) coordinate the assessment of medical software, including the medical education applications.
More recently the second report of the Medical School Objectives Project (MSOP II)8 and the current standards for accreditation of medical education programs include educational technology within the broader area of medical informatics. MSOP II specifies that future physicians should know about and be able to use the instructional resources available through the newer information and educational technologies.
In the present report, we describe the development and results of a questionnaire to inventory educational technology infrastructure and services at all North American allopathic medical schools that are members of the AAMC. We used the questionnaire to gain answers to the following questions:
- What infrastructure and services do allopathic medical schools have for undergraduate, graduate, and continuing medical education?
- Is there an association between the infrastructure and services used and the selected characteristics of the school?
- Do services differ across the levels of medical education?
Outside funding was not provided for this study. All three of our schools received institutional review board approval for this study.
In September of 2004, we contacted representatives from 137 allopathic AAMC-member schools in the United States, Canada, and Puerto Rico by e-mail. Each contact received an e-mail invitation with a letter informing him or her of data confidentiality and other relevant information needed for informed consent. Consent was assumed when the individual clicked on the URL that took him or her to the Web-based survey instrument. Only one respondent completed the questionnaire for each school.
Our questionnaire was based largely on the CRIME questionnaire previously distributed to WGEA and SGEA schools, as described earlier. Our questionnaire addressed two main questions in detail: What educational services does the school offer? What infrastructure is provided to support these services? There were two demographic questions (name of school and position of respondent) and 26 specific questions with checkboxes and pull-down menus forming an inventory of educational technology services and infrastructure. There was one open-ended question that asked for information on educational technology initiatives not mentioned in the questionnaire. We defined infrastructure items as those technologies that include hardware, networking, and software solutions that are offered campus wide but are not restricted to educational purposes. Six of the 26 items pertained to infrastructure, consisting of questions about computer requirements, Personal Digital Assistant (PDA) requirements, wireless Internet access, and Web-portals. Educational services were defined as technologies specifically designed to enhance the educational experience. Twenty-two educational services were listed in the questionnaire (See Table 1). Some of these items had follow-up questions as to whether software was purchased or developed in-house. From the name of the school we could classify the respondents by region of the country and public or private status.
We revised the questionnaire, based on our experience with the previous CRIME surveys. We selected a Web-based survey tool (Prezze Technologies’ Ultimate Survey) and tested the application prior to release. We compiled the e-mail list by reviewing CurrMIT’s primary curriculum contact for each school. If this approach failed, we searched individual school Web sites for educational technology leaders and consulted the GIR membership database.
We first e-mailed our questionnaire in September 2004 and completed that process in March 2005, although 78 of the questionnaires (88%) were returned between September and November. While we initially closed the survey at the end of November, we reopened it after we had requests from a few remaining schools to allow them to respond. Three reminders were sent.
The analyses were primarily descriptive, usually in terms of frequencies and percentages. An infrastructure score was determined by summing the responses to the six infrastructure items on the questionnaire. A services score was determined by summing the 22 services. Infrastructure and service scores were compared across regions, public versus private schools, and size of school based on number of 2004 graduates. We analyzed the data with one-way analyses of variance, chi-square, and correlation coefficients. Also, we used Friedman tests when comparing responses concerning undergraduate (UGME), graduate (GME), and continuing medical education (CME) use of services. Since we conducted a number of analyses, the level of significance was set at p < .01 for each analysis to reduce the level of Type 1 error.
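As an illustration only (this is not the authors' analysis code), the scoring and tests described above can be sketched in Python with SciPy. The per-school item responses below are hypothetical stand-ins; only the regional response counts are taken from the Results section.

```python
# Hedged sketch of the analysis approach described in the Method section.
# All per-school data are simulated; only the regional response counts
# (responded vs. did not respond per AAMC region) come from the article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_schools = 88

# Six binary infrastructure items and 22 binary service items per school;
# scores are simple sums of the items, as in the article.
infrastructure = rng.integers(0, 2, size=(n_schools, 6))
services = rng.integers(0, 2, size=(n_schools, 22))
infra_score = infrastructure.sum(axis=1)    # range 0-6
service_score = services.sum(axis=1)        # range 0-22

# One-way ANOVA of infrastructure score across the four AAMC regions
# (region assignments here are simulated).
region = rng.integers(0, 4, size=n_schools)  # 0=West, 1=Central, 2=South, 3=NE
groups = [infra_score[region == r] for r in range(4)]
f_stat, p_anova = stats.f_oneway(*groups)

# Chi-square test on the 2 x 4 table of responded / did-not-respond
# counts by region, using the counts reported in the Results.
responded = [15, 24, 29, 20]
not_responded = [4, 8, 15, 22]
chi2, p_chi2, df, expected = stats.chi2_contingency([responded, not_responded])

# Friedman test comparing paired UGME/GME/CME use of a service per school
# (hypothetical binary indicators at each level of the continuum).
ugme = rng.integers(0, 2, size=n_schools)
gme = rng.integers(0, 2, size=n_schools)
cme = rng.integers(0, 2, size=n_schools)
fr_stat, p_friedman = stats.friedmanchisquare(ugme, gme, cme)

print(df)  # a 2 x 4 contingency table has (2-1)*(4-1) = 3 degrees of freedom
```

With the p < .01 criterion the article adopts, each p-value above would be compared against .01 rather than the conventional .05.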
Of the 137 schools contacted, 88 responded to the request, a 64% return rate. Sixty of the respondents (68%) were from public medical schools, while 88 of the 137 schools (64%) were public. Forty-one of the 88 respondents (47%) were directors or managers. Twenty-five (28%) of the respondents were deans, with the remainder scattered among faculty, coordinators, vice-presidents, and developers. The return rate by AAMC region was Western region, 79% (15/19 schools); Central region, 75% (24/32 schools); Southern region, 66% (29/44 schools); and Northeast region, 48% (20/42 schools). As these figures show, the response rates were significantly different across the regions (χ2 = 18.2, df = 3, p < .01). There was no association between responders’ AAMC region and whether or not the school was public (p > .05). Also, there were no differences between infrastructure and service scores for those who completed the survey when it was reopened and those who completed it within the original time period. Table 1 reports the number and percentage of schools reporting each item for infrastructure and educational services.
On average, the responding schools had implemented 3.0 (SD = 1.5) of the six infrastructure items covered in the survey. There were no significant differences in infrastructure score by region of the country (p = .39) or public and private institutions (p = .65). There was a small correlation between the number of students graduating and the infrastructure score (r2 = 0.04, p = .04).
A total of 30 of the schools (34%) we surveyed required computers, and an additional 23 (26%) recommended them. Almost a third of the schools required PDAs, and another 22 (25%) recommended them. Only 32 of the schools (36%) reported having no PDA initiatives for undergraduate medical students. The use of PDAs in residency programs was cited most often for primary care specialties.
Wireless Internet access.
Wireless Internet access was provided most commonly in the library (77, 88%), followed by common areas (65, 74%), small-group rooms (62, 70%), lecture halls (58, 66%), and hospitals (28, 32%). Only four (4%) of the respondents reported no wireless Internet access on campus.
Web portals were defined as a Web site or service that offers a broad array of services and resources. Schools reported having portals for medical students (59, 67%), residents (24, 27%), and teaching faculty (45, 51%). There was a significant difference across these three groups (p < .001). Over 70% of schools have bought (56, 64%) and/or developed in-house (42, 48%) portal systems. Over 50% of the portals contained administrative information, course materials, access to e-mail accounts and event calendars. A smaller number had user customization and personal calendars, as shown in Table 2.
On average, schools offered 11.6 (SD = 4.1) of the 22 services covered in the survey. We found no difference in the number of services offered based on region of the country (p = .09) or public versus private status (p = .92). There was a small correlation between the number of services offered and the size of the graduating class (r = .19, p = .045). There was no relationship between the services and the infrastructure scores (r = .13, p = .12).
We did further analyses to see if schools that bought computers or PDAs for students were more likely to have a greater number of services. Schools requiring a computer purchase (or providing one) offered 12.3 (SD = 3.8) services, which did not differ significantly from the 11.2 (SD = 4.3) services at schools without such a requirement (p = .26). On average, schools that required PDAs for students offered nearly two additional services (12.7, SD = 3.7) compared with schools that did not (10.9, SD = 4.3; p = .05), and schools that bought PDAs for students provided more services (13.0, SD = 4.2) than those that did not (11.0, SD = 4.0; p = .035). We also examined the relationship between number of services and in-house development of support software. Only for development of online examinations was there an association: schools that developed their own system offered significantly more services (13.2, SD = 4.0) than those that did not (10.9, SD = 4.0; p = .01).
Personal digital assistant initiatives.
The PDA initiatives were primarily driven by departments (43, 49%), followed by students (33, 38%), central services (33, 38%), residents (29, 33%), and hospitals (14, 16%). Overall, 37 (42%) of the schools supported PDA initiatives centrally, either through the school or affiliated hospitals. Currently, 38 (43%) of the schools have either bought (27, 31%) or developed in-house (21, 24%) PDA software. This software includes functions such as synchronizing patient logs or delivering course content.
We asked schools whether they videotaped lectures and special events and streamed them over the Internet. The level of use reported differed across undergraduate, graduate, and continuing medical education (p = .002). Graduate and continuing medical education did not use streaming video for any required activities.
There was a significant difference in the use of computerized mannequins across levels of education, with use in continuing medical education (10, 11%) being significantly lower than in residency education (33, 38%), clinical undergraduate education (50, 57%), and preclinical undergraduate education (30, 34%).
The finding was similar for the use of virtual patients or computer-based simulations (p < .001). In the undergraduate years, 52 (59%) of the schools used these simulations in preclinical education and 54 (61%) in clinical education; 70 (80%) used them at some point in either the preclinical or clinical years. Reported use was lower in residency (29, 33%) and continuing medical education (13, 15%).
Some institutions provide electronic portfolios for trainees to maintain a record of evidence about individual proficiencies. Implementation was limited, with 23 (26%) of the schools reporting use of an e-portfolio at the undergraduate level and 31 (35%) at the graduate level (p = .06). Both levels primarily use e-portfolios for some courses or programs rather than as part of an institutional initiative. Schools were equally likely to purchase (12, 14%) or develop (12, 14%) software for e-portfolios.
Online teaching evaluations.
Only nine schools (10%) did not provide online teaching evaluations. Schools reported centralized online evaluations for courses (72, 82%), rotations (63, 72%), instructors (61, 69%), attending physicians (33, 38%), and residents (36, 41%). Schools both bought (38, 43%) and developed (51, 58%) evaluation systems.
Schools also supported online examinations of learners. There was an expected difference across the three levels, but it did not reach statistical significance at our criterion (p = .04). Online examinations were used by 61 schools (69%) at the undergraduate level, 52 (59%) at the graduate level, and 49 (56%) in continuing medical education. Schools purchased (42, 48%) and/or developed (27, 31%) their own online examination software.
Discussion and Conclusions
We set out to provide a description of educational technology infrastructure and services offered by AAMC-member allopathic medical schools. Our findings present a snapshot of these ever-changing services and indicate that infrastructure and services vary, but not in a way related to regional or institutional variables. We did not make judgments about quality or what should or should not be offered; rather, we described what is currently offered, pointing out possible trends. Given the lower rate of response from the Northeast region, results may be more applicable to medical schools from other AAMC regions. The Northeast has a larger proportion of private schools (52%) than public schools (48%), and the region’s responses reflect this, with 55% from private schools and 45% from public schools. Overall, the proportion of respondents from public schools (60, 68%) was similar to the proportion of public schools in the population (88 of 137, 64%).
Technology is spreading across the medical education continuum, although services generally are better developed for those in undergraduate medical education (UGME). Our data may reflect an underrepresentation of educational services for GME because many respondents commented that they did not know what was being offered in GME. Across the medical education continuum, it is apparent that technology is increasingly integrated into the education of residents and practicing physicians. Fewer online course materials were offered for residents and practicing physicians than for medical students, but technology has been used in examinations for these groups at the majority of schools. Use of simulation was reported as lower in GME and CME than for medical students, indicating that these levels of the continuum rely on more traditional pedagogical approaches to teaching. This is an area for potential growth in the use of educational technology, especially to respond to challenges in addressing the competencies mandated by the Accreditation Council for Graduate Medical Education and to the pressure to improve physicians’ lifelong learning.
We found that computers and/or PDAs were becoming well integrated into the curriculum: fully half of the schools required a computer and/or a PDA. Some schools go so far as to provide computers to students. However, we did not determine exactly how these devices were being used. One can speculate that requiring computers is related to the extensive use of online course materials reported by our respondents. PDA requirements could well be related to the demand for patient-encounter tracking required by the LCME and by various residency review committees. “Smartphones” (PDA/mobile phone units) may offer even greater learning opportunities.
The least reported item under infrastructure was Web portals for GME. Also, for educational services, relatively few schools reported having e-portfolios. Portals and e-portfolios are technologies that integrate services and give learners control of their educational environment; because they cut across many systems, they are more difficult to implement, expand, and maintain. We believe some of these areas are on the rise in medical education. We found a twofold increase in e-portfolio articles from 2004 to 2005 (as shown by the results of our search conducted in the Educational Resources Information Center in early 2006).
Respondents identified several initiatives in their open-ended responses that we had not asked about on the questionnaire. Seven schools had replaced microscopes with digital microscopy, five reported recording lectures in an MPEG (MP3) format, four reported a case-authoring tool, three stated they are using Tablet PCs, and one school mentioned the addition of an audience response system. Since we did not specifically ask about these services, these numbers could vastly underrepresent what is happening in these areas across institutions.
These added initiatives point to additional trends in educational technology, which we will briefly describe. The replacement of microscopes by digital microscopy makes it easier to file, store, retrieve, annotate, and archive specimens. Digital specimens are also easier to duplicate and distribute, which has led to the development of large collections such as Web Path9 and MedPix.10
The recording, streaming, and sharing of MP3 recordings of lectures came about as part of the iPod revolution and the phenomenon known as “podcasting.” Podcasting is a method of publishing audio broadcasts via the Internet, allowing users to subscribe to a feed of audio files (usually MP3s).11 It became popular in late 2004, about the time our questionnaire was distributed. The University of California system responded quickly to protect faculty with implementation of the “Policy on Use of Recordings of Course Presentations.”12 The policy allows students to record and share recordings with other students in their class for the purpose of studying, and allows faculty to record and use recordings of themselves.
While we asked about the use of virtual patients, we did not ask about case-authoring tools. Cases and their authoring programs vary significantly both in their educational application and the media formats in which the patient is presented. The AAMC has conducted a virtual patient inventory of cases used by their member schools.13 They have also established a national working group to agree upon technical standards, which would make it easier to share cases across different schools.
Another tool we failed to ask about was the Tablet PC, which lets learners combine searchable handwritten notes with the capabilities of a laptop. Tablet PCs also let instructors mark up and save notes written over PowerPoint slides. One example of an innovative use of this technology comes from the pediatric emergency medicine specialists at the University of Alberta School of Medicine, who use Tablet PCs side by side with their X-ray viewing box. With a patient’s X-ray on the viewing box, trainees use the digital pen to retrieve images of normal X-rays (labeled or unlabeled, as they choose) and to create supplemental images with variants (e.g., age-specific changes), animations of pathophysiology mechanisms, and embedded radiology tutorials.14
Audience response systems (ARS) were also not mentioned on our questionnaire but were noted by some respondents. ARS are handheld keypads used during a lecture to create a more interactive environment. Instructors can poll their audience and display cumulative results immediately, providing feedback for both the instructor and the learners. These results can be tracked or kept anonymous. Some software programs can also be used with Tablet PCs or PDAs. Studies on the use of an ARS for CME15,16 and GME17 found increased knowledge and satisfaction among the participants. These authors also stated that effective teaching strategies and questioning were critical to the success of using this technology.
While integrating educational technology presents a challenge to medical educators, the greater challenge is adopting the wise use of educational technology to improve learning and to increase the efficiency and effectiveness of educational programs.18 Our data suggest that medical schools are implementing educational technology solutions despite concerns expressed by many that such changes were unrealistic.19,20 Other studies will be needed to try to determine just how wisely these solutions are being implemented and how efficient and effective they are.
The results of this survey provide a benchmark for the educational technology services currently provided by medical schools. We hope that this information will assist medical schools as they plan and implement educational technology services at their institutions. We also hope that it provides some guidance in riding the waves of technological change that confront us all.