Evaluation of the Use of an Interactive, Online Resource for Competency-Based Curriculum Development

Beach, Patricia S. MD; Bar-on, Miriam MD; Baldwin, Constance PhD; Kittredge, Diane MD; Trimm, R Franklin MD; Henry, Rebecca PhD

doi: 10.1097/ACM.0b013e3181b18b21
Competency-Based Education

Purpose To evaluate pediatric educators’ use of and satisfaction with the Academic Pediatric Association’s Educational Guidelines for Pediatric Residency.

Method The authors used customized programming to document all registered users and downloaded files from the Guidelines Web site for a 30-month period after site completion. An online survey of volunteer users was conducted.

Results Between July 2005 and December 2007, 1,747 individuals registered on the Web site, and 8,754 files were downloaded. Registrants who downloaded files (n = 1,239) represented 97% of the pediatric residency programs approved by the Accreditation Council for Graduate Medical Education in 2008. During the 30 months, the frequency of downloads remained robust, peaking each spring. Curriculum-building tools were downloaded by 97% of programs using the site; the majority chose predesigned formats rather than self-selected lists of goals and objectives. Resident evaluation forms and tutorials were downloaded less frequently. A survey was completed by 111 site users, who indicated that the Guidelines tools were useful in preparing for Residency Review Committee site visits. Most respondents said that the curriculum-building tools were easy to use, adaptable, and helpful in integrating competencies into residency programs. Respondents rated the tutorials highly for educational content and clarity.

Conclusions The data collection methods offer a practical strategy for evaluating access to online curriculum development tools. The majority of U.S. pediatric residency programs have accessed Guidelines’ resources for curriculum development; patterns of use have been sustained over time. Most users preferred the predesigned versions of the materials. Users surveyed found the tools useful for planning rotations and integrating competencies into their programs and reported high satisfaction with the Guidelines.

Dr. Beach is professor, Department of Pediatrics, University of Texas Medical Branch, Galveston, Texas.

Dr. Bar-on is associate dean for graduate medical education and professor of pediatrics, University of Nevada School of Medicine, Reno, Nevada.

Dr. Baldwin is professor, Department of Pediatrics, University of Rochester Medical Center, Rochester, New York.

Dr. Kittredge is professor, Department of Pediatrics, Dartmouth Medical School, Hanover, New Hampshire.

Dr. Trimm is professor, Department of Pediatrics, University of South Alabama College of Medicine, Mobile, Alabama.

Dr. Henry is professor, Office of Medical Education Research and Development, Michigan State University College of Human Medicine, East Lansing, Michigan.

Correspondence should be addressed to Dr. Beach, University of Texas Medical Branch, 301 University Dr., Galveston, TX 77555-1119; telephone: (409) 772-1444; fax: (409) 747-0784; e-mail: psbeach@utmb.edu.

During the past two decades, medical education has undergone a paradigm shift,1,2 requiring program directors to define clear statements of expectations for learning and appropriate assessment of learner and program outcomes.3–5 Residency program directors must now provide learning experiences and evaluation of residents’ performance in six broad competency domains to receive accreditation.6 Pediatric residency programs, like those in other specialties, are in a state of transition as they move toward competency-based education.7 Implementing major changes in pediatric graduate medical education is resource-intensive and poses a significant challenge to the educators responsible for transforming curricula to meet the new standards. In response, the Academic Pediatric Association (APA, formerly the Ambulatory Pediatric Association) launched a new Web-based resource in May 2004, the APA Guidelines for Pediatric Residency,8 to help programs develop customized curricula that meet the new requirements.

The Web site was adapted from an earlier paper and disk document, the 1996 Educational Guidelines for Residency Training in General Pediatrics,9 which provided the first set of nationally recognized goals and objectives for the full three years of pediatric residency. The shift from paper to Web made possible interactive access to resources that could be customized to local needs. The online Guidelines, developed from 2002 to 2005,* offer tools that are designed to be practical and efficient in the real world of residency training. The development process included an assessment of program needs, collaborative content review and updating, and creation of Web functions that enable users to download curricular materials and faculty development tools for local adaptation. Following the model developed in earlier collaborative projects, creation of the Guidelines involved input from multiple sources and extensive beta-testing.10

In this article, we report the findings of our study of registered Web site users and their downloading of Guidelines materials during a 2.5-year period, when residency programs nationwide were moving rapidly toward competency-based curricula. We also report the responses by a subset of users to an online survey that queried them about the utility and quality of the Web site functions they used.


Method

Web site description.

The Guidelines are mounted on a free public Web site (http://www.academicpeds.org/egwebnew). Registration (with demographic information) and log-in are required, with a password chosen by the user. Information on development and use of the Guidelines is reported by Kittredge and colleagues10 in a companion article in this issue of Academic Medicine. At the Web site’s core is a database of 334 goals with objectives that define the knowledge, skills, and attitudes of a competent resident at the end of training. Also included are lists of competency elements for each of the six broad competency domains mandated by the Accreditation Council for Graduate Medical Education (ACGME). The terminology generally parallels that used by the ACGME for pediatrics and is similar to that used for common program requirements for other specialties.
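
The article does not specify how this database is structured internally; as an illustration only, one minimal way to represent the goals and their competency mappings is sketched below (all names are hypothetical, not the project's actual schema):

    from dataclasses import dataclass, field
    from typing import List

    # The six broad ACGME competency domains referenced throughout the Guidelines.
    ACGME_DOMAINS = [
        "Patient Care",
        "Medical Knowledge",
        "Practice-Based Learning and Improvement",
        "Interpersonal and Communication Skills",
        "Professionalism",
        "Systems-Based Practice",
    ]

    @dataclass
    class Goal:
        """One of the 334 goals; objectives define expected knowledge, skills, and attitudes."""
        goal_id: int
        text: str
        objectives: List[str] = field(default_factory=list)
        domains: List[str] = field(default_factory=list)  # subset of ACGME_DOMAINS addressed

    def goals_for_domain(goals: List[Goal], domain: str) -> List[Goal]:
        """Select goals mapped to a given competency domain (akin to the site's search function)."""
        return [g for g in goals if domain in g.domains]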

Other online functions offer tutorials describing steps of competency-based curriculum development, procedure lists, resident evaluation forms, and tools for rotation and program planning. All of these tools can be downloaded for further adaptation. In addition, the site offers online help, resource lists, and additional search functions.


Evaluation overview.

To understand how educators have been using the site, we developed software to track all registrants and all files downloaded over time, by program and by user name and password. In addition, we surveyed volunteer registrants 10 months after the site’s completion to assess how well we had anticipated the needs of these pediatric educators and how they were actually using the material. Results were analyzed using descriptive statistics. Dartmouth Medical School’s institutional review board approved our evaluation of the Guidelines.


Data tracking Web site use.

At registration, users were informed that activity on the Web site would be tracked and that data would be reported anonymously and in aggregate only. They were asked to indicate their willingness to be contacted regarding their use of the site. Customized programming was created to continuously track the number of log-ins and types of downloads for each user name and password. Registrant, log-in, and download data were reviewed at six-month intervals from July 1, 2005 through December 31, 2007. The reported data exclude data from site use during the development phase by beta-testers and project team members.
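
The customized tracking programming itself is not described here; purely as an illustration, the six-month aggregation of download records could be computed along the following lines (the log export, its field names, and the file path are all assumptions, not the project's actual code):

    from collections import Counter
    from csv import DictReader
    from datetime import datetime

    def half_year(timestamp):
        """Map an ISO timestamp to its six-month reporting interval, e.g., '2006-H1'."""
        dt = datetime.fromisoformat(timestamp)
        return f"{dt.year}-H{1 if dt.month <= 6 else 2}"

    downloads_by_interval = Counter()   # e.g., {'2005-H2': 1750, ...}
    downloads_by_program = Counter()    # supports reporting by program as well as by registrant
    downloads_by_type = Counter()       # rotation goals/objectives, tutorials, evaluation forms, ...

    with open("download_log.csv") as fh:            # hypothetical export of the tracking data
        for row in DictReader(fh):                  # assumed columns: timestamp, program, file_type
            downloads_by_interval[half_year(row["timestamp"])] += 1
            downloads_by_program[row["program"]] += 1
            downloads_by_type[row["file_type"]] += 1

    # Descriptive statistics such as mean downloads per six-month interval:
    mean_per_interval = sum(downloads_by_interval.values()) / len(downloads_by_interval)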


User survey.

An online survey of registrants who agreed to be contacted for this purpose was carried out during a six-week period starting on March 1, 2006, using SurveyMonkey (http://www.SurveyMonkey.com; accessed May 21, 2009), a commercial, secure online survey tool. During the survey period, up to four e-mail reminders were sent to nonrespondents.

The survey instrument was pilot tested extensively before its distribution. Respondents were asked to provide demographic information plus their program’s status in the ACGME review cycle. Likert-style, forced-choice, and open-ended questions addressed users’ experiences with the Guidelines as a whole and specific Guidelines functions. For each component of the Guidelines, respondents were asked about clarity of instructions, ease of use, adaptability to program needs, and whether materials were helpful in implementation of competencies into their program. Respondents were urged to make comments about each tool. Unlike conventional mailed surveys, this online survey was constructed with a branching design to facilitate ease of response and interpretation of results. Respondents were first asked if they had used a particular function, and only those who answered positively were asked questions about their experience and satisfaction with that function.
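
SurveyMonkey implemented the branching for us; conceptually, the skip logic resembled the sketch below (the function list and question wording are invented for illustration and do not reproduce the actual instrument):

    # Sketch of the survey's branching (skip) logic; wording is hypothetical.
    FUNCTIONS = ["rotation goals/objectives", "competencies", "evaluation forms", "tutorials"]

    def administer(ask):
        """Run the branching survey; `ask` is any prompt-and-return-answer callable."""
        responses = {}
        for fn in FUNCTIONS:
            used = ask(f"Have you used the {fn} function? (y/n) ") == "y"
            responses[fn] = {"used": used}
            if used:  # follow-up items are shown only to respondents who used the function
                responses[fn]["ease_of_use"] = ask(f"Rate ease of use of {fn} (1-5): ")
                responses[fn]["helped_integration"] = ask(f"Did {fn} help integrate competencies? (y/n) ")
        return responses

    # Example: answers = administer(input)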


Results

Data tracking Web site use

A total of 1,747 individuals registered during the study period (July 2005 through December 2007). Most (1,642; 94%) were from the United States; they represented 100% of the pediatric categorical residency training programs that were ACGME approved in 2008 and came from 47 states and the District of Columbia, all the states with pediatric residencies. The remaining 105 registrants came from 33 countries outside the United States. More than a quarter of registrants (470) were program directors or assistant program directors; 1,045 (60%) were general pediatricians, and 523 (30%) were subspecialists (Table 1).

Table 1

Among the registrants, 71% downloaded files (1,239; 56 non-U.S.), for a total of 8,754 downloaded files (see Table 2). Examination of download data by six-month intervals showed that the number of downloads was sustained, averaging 1,750 per interval, with higher rates of downloads of rotation goals and objectives between January and June (data not shown). Table 2 describes the frequency of downloaded files by type. The most frequently downloaded tools were lists of goals and objectives for rotations and supplemental learning experiences: 5,009 of these files were downloaded (57% of all downloads). Of the standard and subspecialty rotation tools, predesigned lists were selected nearly 11 times more often than the “build-your-own” lists, which take longer to construct. Tutorials were the second most commonly downloaded component of the Guidelines (13% of downloads), followed by pediatric competencies (11%) and resident evaluation forms (8%).

Table 2

Analyzing Web site use by programs, rather than registrants, indicates that nearly all pediatric training programs had downloaded files from the Guidelines: 188 (97%) of the 194 ACGME-approved pediatric residency programs in 2008. Among these 188 programs, 182 (97%) downloaded rotation goals and objectives. As shown in Figure 1, 140 (74%) downloaded competency tools, and 105 (56%) downloaded resident evaluation forms. Other commonly downloaded tools were tutorials (50% of programs) and products of online search functions (48%).

Figure 1

Download frequencies by program ranged from fewer than 6 files to 252: 19 programs (10%) downloaded only 1 to 5 files, whereas 56 programs (28%) downloaded more than 50.


User survey data

Surveys were sent to all 182 registrants who agreed at registration to be contacted, out of a total of 692 Web site users as of March 2006. Of these 182 volunteers, 128 responded with completed surveys, a response rate of 70%. Because we were interested primarily in the response from U.S. pediatric training programs, we excluded 17 international respondents, leaving a total of 111. Table 1 shows that respondents to the survey had demographics comparable with those of the registrant group tallied in December 2007, based on discipline and role in their training program. Table 3 (column 1) shows the number of survey respondents who evaluated each Web site function. The number evaluating each function varies because only those who indicated that they had used a particular function were allowed to respond.

Table 3


Residency Review Committee preparations.

Fifty-seven respondents reported that they had used the Guidelines tools in preparation for program review by the ACGME Residency Review Committee (RRC), and all found the Guidelines either very useful (37; 64%) or somewhat useful (21; 36%) in RRC preparations. A large proportion of these respondents indicated that use of the Guidelines gave them a better understanding of the relationship between ACGME competencies and resident education, helped them design better curricular offerings, provided new educational tools, and enhanced their efficiency in educational tasks (89%–93% across four questions).


Curriculum-building tools.

Curriculum-building tools were used by 57 (51%) respondents; another 20 respondents said they planned to use these tools in the future but had only reviewed them so far. The 57 users of these tools evaluated each separately. Responses are summarized in Table 3. Representative samples of written comments are shown in List 1.

List 1 Samples of Written Evaluation Comments From Respondents to the Guidelines Utilization Survey

Overall, survey respondents rated instructions for use of tools as clear (91%–100% across all tools). Individual tools were rated as “easy to use” by 86% to 100% of users. Although some downloaded tables were very large, most respondents (79%–89%) considered the tables to be manageable. A substantial proportion of respondents (40%–42%) felt that too much information was included in the documents they created with the “build-your-own rotation” function; fewer (19%–24%) considered the predesigned rotation lists too long. A large majority of respondents (84%–92%) indicated that the various tools helped them to integrate competencies into their program.

The Pediatric Competencies-with-Elements were used by 34 respondents, who gave positive ratings for clarity, ease of use, and adaptability (88%–94%). Most (31; 91%) agreed that the tool was helpful in integrating competencies into their program. Responses about an alternative, briefer version of the competency list were equally positive.


Other Web site functions.

Evaluation and Planning Tools, which offer templates for rotation planning, program planning, and resident evaluation, were designed in part as faculty development tools. These were used or reviewed by 58 respondents. Among the users, 74% to 83% agreed that these tools helped conceptualize part of their evaluation or planning processes. Grouping responses to these tools together, 54% to 69% of users found them easy to adapt to program needs, and 61% to 66% found that they offered a useful framework for educational development. These Guidelines tools received less positive reviews than other functions. For example, of 42 respondents (38%) who had used or reviewed the Program Planning Tool, only 31 (74%) agreed that the tool helped with program planning; 5 disagreed and 10 were unsure that the tool was useful as a framework for program infrastructure. The Rotation Planning tool received similarly mixed ratings.

Search Functions were used by 30 (27%) respondents; most agreed that these functions were helpful and manageable.

Tutorials, which went online only four months before the survey, were evaluated by just 22 survey participants (20%). Most used them for their own learning (95%) or to share with others (63%). Users rated the tutorials very favorably for quality of educational content (100% rated 4 or 5 on a 5-point scale), and 95% gave the tutorials similarly high ratings for clarity and ease of use.

Free-text comments/feedback were requested for each component of the Guidelines. Respondents provided 76 comments, the majority of which were very positive about the utility of the Web site, such as “The tool was quick and straightforward to use.” However, other comments were less positive, for example, “Some of the material was redundant, but we managed to extract what we needed.” Representative comments are included in List 1. These were selected by the project evaluator (R.H.), independently of the project team, on the guiding principle of balancing positive and negative comments and including content that expressed a user’s point of view.


Discussion

The APA Guidelines for Pediatric Residency are a technologic advance in the field of residency education—to our knowledge, this is the first interactive Web site for curriculum building developed to assist residency program directors. The Guidelines have been very widely used by our target audience; virtually all U.S. pediatric residency programs include registered users who have downloaded files. In addition, 56 individuals from 27 other countries have downloaded files. Most registrants have been pediatric residency program directors, rotation directors, or other faculty. It seems reasonable that similar interactive, Web-based curriculum-building tools would be useful to other disciplines, to facilitate the efforts of program directors who are typically overworked and challenged by the tasks of major curriculum revision.

Use of a wide variety of functions on the Web site began soon after the site was completed and has been sustained throughout a 30-month period. The frequency of file downloads from the Web site has averaged nearly 300 per month since July 2005. A qualitative survey of volunteer registrants conducted 10 months after the site was completed showed that users have found these curricular tools easy to use and adapt and helpful in implementing competency-based education. By providing an accessible, user-friendly, and customizable resource, we appear to have created tools that meet the needs of our users.

Quillen11 argues that specific goals and objectives, along with evaluation methods that measure learners’ mastery of those objectives, play a central role in implementation of a competency-based curriculum. The Guidelines Web site provides tools for both curriculum and evaluation development, which residency programs need to meet the new ACGME requirements. This study demonstrates that, among programs that downloaded files from the Web site, 182 (97%) downloaded curriculum-building tools (goals/objectives and competencies), whereas only 105 (56%) downloaded resident evaluation forms, results that probably reflect a natural progression in curriculum enhancement: programs must have goals and objectives before they can develop objective-based evaluation methods.

Data on file downloads and survey feedback show that users prefer obtaining simpler predesigned lists of goals and objectives to constructing their own longer and more detailed lists using the Guidelines’ “build-your-own” functions. Because all files downloaded from the Guidelines are customizable, the use of shorter, predesigned lists apparently allows programs sufficient latitude to control their curricular products. The clear preference for predesigned tools likely reflects a desire for efficiency and a recognition that longer lists are more difficult to implement, especially in evaluation forms. Residency program directors may also prefer the predesigned format because they have heavy demands on their time; major and detailed curricular revision may require more time or more skills than program directors are able to invest. This result could be instructive for programs considering development of a curricular resource similar to the Guidelines, because constructing interactive “build-your-own” functions is expensive and labor-intensive.

Data on file downloads document that 50% of programs have accessed the Guidelines’ tutorials, a finding that may reflect programs’ need to satisfy RRC requirements for faculty development. Survey responses indicate that some faculty reviewed the tutorials online, so download data may underestimate actual use of these tools. Survey respondents who used the tutorials were extremely satisfied. The potential power of the Guidelines as a faculty development tool goes well beyond the tutorial system. Instructions throughout the Web site were written to guide users on how to make sound use of each tool. Use of Guidelines tools other than the tutorials for faculty development and resident instruction was noted in several comments by survey respondents.

In contrast to other curriculum planning tools, the program planning and rotation planning templates were downloaded by a small percentage of programs (34; 18% for each tool), and survey respondents showed considerable ambivalence about their utility. On the basis of beta-testers’ comments during Web site development, we redesigned the program and rotation planning tools to make them easier to use, but we may not have streamlined them enough. However, it is also possible that the problem lies in the task, not the tools. Faculty trying to meet new ACGME requirements may not yet have reached the stage of comprehensive evaluation and revision of programs and individual rotations. Faculty attending our workshops on the Guidelines have often expressed frustration and confusion about program-wide planning around broad competency domains. Continuing faculty development on how to conduct rotation and program-wide planning seems warranted, given the pressing need of many programs to meet this challenge.

The main strength of this study is that we used two sources of data to document use of the Guidelines. Online data about user registrations and file downloads are relatively free of response bias; they showed accurately who came to the Web site and what documents they took away when they visited. Registrant and download tracking, however, does not tell us whether users found the tools useful or what they did with the documents after downloading. The survey data were less generalizable but more informative about what experienced registrants did with Guidelines tools. Although the survey respondents represent a small subset (128; 16%) of all registrants at the time of the study, these were authentic users who were capable of reporting on the utility of functions in the real world of residents’ education. Hence, our study included broad and quantitative, as well as small-scale and qualitative, measures of Guidelines use.

There are several limitations to this study. First, use of survey data is inherently dependent on the ability and willingness of respondents to remember and accurately report their opinions. It is likely that some respondents could not recall in detail what components of the Web site they had used or how they used them, when asked months later. On the other hand, a delayed survey offered our best opportunity to obtain reflective responses about real-world applications, not just “first impressions” of the material. Although the volunteer respondents were similar to the whole user group in terms of discipline and program role, they may have been a dedicated subgroup whose experiences do not generalize to the larger user group.

An additional limitation applies to data collected directly from the Web site. Because individuals were not required to use the same user name and password every time they visited the Web site, we may have overestimated the registrant count. We accounted for this potential source of error by collecting program information on registrants and by reporting downloads by program as well as by registrant. This limitation does not affect the array of files downloaded, the count of accredited residency programs using the site, or the types of files downloaded by program.

Finally, this study made no attempt to measure important educational outcomes for programs using the Guidelines, such as improved effectiveness of their educational offerings, development of more accurate outcomes assessments of residents, or enhanced educational confidence and skills of faculty. Studies to address these long-term outcomes are complex but important, and such studies will require a concerted effort by many individuals in the future.


Conclusions

This study demonstrates that the Guidelines have been accessed by nearly all U.S. pediatric residency programs, our primary targeted users. The majority of these programs have used the Guidelines for curriculum development and found the tools useful for obtaining lists of learning goals and objectives and for integrating competencies into their programs. The online survey identified high satisfaction, and data on file downloads demonstrated patterns of use that have been sustained over time. Our data collection methods offer practical strategies for evaluating online curriculum development tools in other disciplines and specialties.

The APA Guidelines for Pediatric Residency may offer a useful model for the delivery of curriculum development tools for disciplines outside of pediatrics. The programming used to develop this interactive Web site is in no way specific to one discipline, and it could be adapted as a platform for other curriculum-building tools.


Acknowledgments

The authors of this article are members of the APA Guidelines project team; one author (R.H.) is the project’s evaluation consultant. The authors wish to thank Kenneth Roberts, MD, who was project director for the Macy Foundation grant; Carol Carraccio, MD, competency-based education consultant; and the 10 members of the national advisory board, who provided guidance during the development of the Guidelines. The authors also wish to recognize the scores of beta-testers who tried out Web site functions during development and offered many suggestions for improvement. David McDonald Jr. at Specialized Software Systems, Inc., Madison, Virginia, served as a patient and knowledgeable software developer. Input from all of these individuals enhanced the accessibility and usefulness of this Web-based educational resource.

Development of the Guidelines was supported primarily by the Josiah Macy Jr. Foundation (2002–2005) and the Academic Pediatric Association. Additional support was provided by the Pfizer Foundation (2000–2001).


References

1 Leach DC. The ACGME competencies: Substance or form? J Am Coll Surg. 2001;192:396–398.
2 Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: From Flexner to competencies. Acad Med. 2002;77:361–367.
3 National Institute of Education. Involvement in Learning: Realizing the Potential of American Higher Education. Washington, DC: Government Printing Office; 1984.
4 McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency based curriculum development in medical education: An introduction. Public Health Pap. 1978;68:11–91.
5 Accreditation Council for Graduate Medical Education. Toolbox of Assessment Methods. Available at: http://www.acgme.org/outcome/assess/toolbox.asp. Accessed May 26, 2009.
6 Carraccio C, Englander R, Wolfsthal S, Martin C, Ferentz K. Educating the pediatrician of the 21st century: Defining and implementing a competency-based system. Pediatrics. 2004;113:252–258.
7 Sectish TC, Zalneraitis EL, Carraccio C, Behrman RE. The state of pediatrics residency training: A period of transformation of graduate medical education. Pediatrics. 2004;114:832–841.
8 Kittredge D, Baldwin CD, Bar-on ME, Beach PS, Trimm RF, eds; Ambulatory Pediatric Association. APA Guidelines for Pediatric Residency. Available at: http://www.academicpeds.org/egwebnew. Accessed May 26, 2009. Approved by MedEdPORTAL, April 2009; available at: http://www.aamc.org/mededportalID=1736.
9 Kittredge D, Baldwin CD, Bar-on ME, Levine HG, Trimm RF, eds. Educational Guidelines for Residency Training in General Pediatrics. McLean, Va: Ambulatory Pediatric Association; 1996.
10 Kittredge D, Baldwin CD, Bar-on ME, Trimm RF, Beach PS. One specialty’s collaborative approach to competency-based curriculum development. Acad Med. 2009;84:1262–1268.
11 Quillen DM. Challenges and pitfalls of developing and applying a competency-based curriculum. Fam Med. 2001;33:652–653.

* The Guidelines were launched in May 2004, but development continued through 2005, to respond to beta-tester input.

© 2009 Association of American Medical Colleges