Academic Medicine: March 2012, Volume 87, Issue 3
doi: 10.1097/ACM.0b013e318244739c
Medical Education

Effecting Curricular Change Through Comprehensive Course Assessment: Using Structure and Process to Change Outcomes

Goldman, Ellen F. EdD; Swayze, Susan S. PhD; Swinehart, Sarah E. MBA; Schroth, W. Scott MD, MPH


Author Information

Dr. Goldman is assistant professor of human and organizational learning, George Washington University Graduate School of Education and Human Development, and director, Master Teacher Leadership Development Program, George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Swayze is assistant professor of education research, George Washington University Graduate School of Education and Human Development, Washington, DC.

Ms. Swinehart is an education consultant. At the time this process was developed, she was a staff member, Office of Medical Education, George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Schroth is associate dean for administration, George Washington University School of Medicine and Health Sciences, Washington, DC.

Correspondence should be addressed to Dr. Goldman, Graduate School of Education and Human Development, George Washington University, 2134 G Street, NW Room 218, Washington, DC 20052; telephone: (202) 994-1531; fax: (202) 994-4928; e-mail: egoldman@gwu.edu.

First published online January 25, 2012


Abstract

Effective curriculum oversight requires periodic assessment and continuous improvement of individual course offerings as well as their overall integration. The literature indicates that most course review processes do not use the breadth of information available or sufficiently encourage faculty feedback and reflection, limiting the value derived. Suggestions for which data to include in the course evaluations are available in the literature; however, there is little guidance on effective course review structures and processes. In this article, the authors discuss a course review process revised as part of a comprehensive reform of the George Washington University School of Medicine and Health Sciences undergraduate medical school curriculum management structure. The process improvements incorporated evaluation practices grounded in the medical and higher education literatures and included changes to the data reviewed as well as the review timing, participants, and structure. The revised process uses a broad array of information, requires significant faculty participation, and uses questioning, writing, and dialogue to encourage faculty reflection and learning. Course directors indicate that the process helps them focus, and the information and the perspectives of others lead to reflection and new ideas. Through the process, course directors have changed course content and teaching methods, improved assessments of learning, and expanded course integration across the curriculum. The procedural and content elements of the process can be easily transferred to other medical schools and are applicable to other curricular reform projects across the continuum of medical education.

Periodic evaluation of courses is a critical component of effective curriculum oversight.1 These evaluations may have many purposes in medical education, including ensuring that students' learning needs are being met, providing feedback and identifying needed improvements to teaching, informing resource allocation decisions, facilitating curriculum development, supporting applications for faculty promotion, and meeting accreditation requirements.2,3 The literature, both medical and higher education, provides some guidance for assessing courses, focusing on the types of data to use and how to consider the data.1–3 In practice, however, course reviews do not make efficient use of the wealth of information available. The most commonly used data—student evaluations of courses—are believed to suffer from issues with instrument validity, and individual faculty feedback and reflection during the process are subordinate to meeting requirements for accreditation, curriculum evaluation, or tenure and promotion decisions.2–5 Regardless of the breadth of information considered, limited information is available regarding how the process of considering the data should be designed and implemented for maximum effectiveness.

In 2009, the George Washington University School of Medicine and Health Sciences developed a systematic course review process to promote curricular quality and improve individual instructional units of the undergraduate medical curriculum. Here, we describe this innovative course review process and discuss its structural aspects, including the information used. Finally, we provide generalizable process elements that may be useful to other medical schools seeking to improve course review practices.


Identifying a Comprehensive, Inclusive Process for Course Review

Before 2009, our course review process relied primarily on student course evaluations, was administered by a small curriculum committee one course at a time with limited peer involvement, and was largely reactive, requiring the course director to defend low course ratings and other concerns raised in a report prepared by one or a few committee members. Limitations of course review processes with some or all of these features have been discussed in the literature.2,6 As part of a comprehensive reform of our curriculum management structure, we developed an improved course review process. A task force of basic science and clinical faculty, administrators, and education experts was charged by the dean with developing a comprehensive, inclusive course review process (E.F.G., S.E.S., and W.S.S. were members of this task force). The task force met approximately biweekly for six months, on a timeline guided by a work plan addressing a number of curricular issues. Two external medical education consultants reviewed the task force's work. The consultants and task force members drew on the education and medical literatures and their own prior experiences to develop the new course review process. Discussion among the task force members resolved any areas of concern. Once the materials were codified to the group's satisfaction, the task force shared them with the course directors and made modifications as appropriate based on feedback and discussion. There was little disagreement during the development process; the overriding goal was a comprehensive, rigorous course review process, and task force members worked efficiently toward it.

After six months of design work, the process was piloted in spring 2009 on three courses and slightly modified based on those experiences. The evaluation process has been used widely since fall 2009, during which time 17 of the 20 required courses offered in the first three years of the curriculum have been reviewed. (We use the term “courses” to refer to both courses and clerkships.)


Central Oversight Ensures Process and Content Consistency

Before we present the details of the revised course review process, it is helpful to understand the flow of the entire activity. Central oversight, supportive resources, and expertise in instructional design had previously been lacking in course evaluation practices and were needed to ensure consistency for the course review activity and the curriculum overall. The course review process is now facilitated by the Office of Medical Education (OME) and supported by a curriculum database that houses information on all courses and allows cross-course mapping. Initially, the OME team included an associate dean (W.S.S.), a faculty member with health care experience and a doctorate in education (E.F.G.), and a staff member with extensive organizational experience (S.E.S.). Although the OME membership continues to evolve in accordance with institutional staff changes, the structure of the team will remain essentially the same.

The revised course review process includes four major activities as diagrammed in Figure 1. The process begins with the preparation of information for the review, including a file of information on the course (the elements of which are detailed below) and a memo from the OME that includes specific questions (among those generally asked) requiring the course director's particular attention. To prepare the memo, the OME reviews the file of information and identifies areas that do not meet target performance levels or that were previously identified for improvement. The file of information and the memo are then forwarded to the course director.

Figure 1. The four major activities of the revised course review process (diagram not reproduced).

The second major step in the process is the course director's preparation of a written assessment of the course with suggested changes. The assessment is guided by a set of questions (Guidelines for Course Review, discussed below) that prompt the course director to analyze the information for areas of success and concern. The course director provides a written response to the questions (some require more detail than others) and forwards the assessment to the department chair and the OME.

The third major step in the process involves a meeting with the course director, department chair, and OME to discuss the written assessment and reach consensus on changes to be made in the course based on the assessment outcomes.

The fourth major process step is a summary of the assessment presented to three review committees in turn: the Year Committee (one of two committees that evaluate courses based on when the course is taught in the curriculum—Year 1/2 or Year 3/4), the Curriculum Management Group (CMG—all course directors), and, finally, the Curriculum Oversight Group (COG—the OME and relevant deans). We describe the specific roles of each group in the review process in a subsequent section.
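For readers who want to adapt this workflow, the four steps can be modeled as a simple ordered pipeline. The following Python sketch is illustrative only: the step descriptions paraphrase the process above, and the data structure, function names, and example course are hypothetical rather than artifacts of the actual GWU system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CourseReview:
    course: str
    completed_steps: List[str] = field(default_factory=list)

# Paraphrases of the four major activities described above.
REVIEW_STEPS = [
    "OME compiles the course information file and sends a focusing memo",
    "Course director writes an assessment with suggested changes",
    "Course director, department chair, and OME meet to agree on changes",
    "Summary is presented to the Year Committee, then the CMG, then the COG",
]

def advance(review: CourseReview) -> bool:
    """Complete the next pending step; return False once the review is done."""
    done = len(review.completed_steps)
    if done >= len(REVIEW_STEPS):
        return False
    review.completed_steps.append(REVIEW_STEPS[done])
    return True

review = CourseReview("Pharmacology")  # hypothetical course
while advance(review):
    print(f"{review.course}, step {len(review.completed_steps)}: {review.completed_steps[-1]}")
```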

We searched the literature but could find no recommendations about the optimal frequency of course reviews. An exemplar, provided by the University of Kansas, indicates that comprehensive course reviews are completed every three years, with a limited annual review.7 We decided to complete a full evaluation of each course every two years. In the off years, the course directors are provided with limited review data, do not prepare a written report, and meet only with the OME. The course is not discussed at the various committees unless there is a pressing need to do so.


A Broad Range of Information Provides Diverse Perspectives

The medical education literature notes that assessments of undergraduate medical courses mostly consist of student ratings of course organization, teaching methods, and placement in the curriculum.2,4 Experts in both higher and medical education, however, suggest adding open-ended questions to student feedback instruments, developing means of providing formative feedback to faculty, and including input in evaluations from a variety of perspectives: students further along in the curriculum, graduates, faculty peers, and those at other institutions doing creative work.1–3,6,8 Some empirical work shows peer reviews in particular to be valuable for enhancing medical courses and transforming curricula.7,9

Our prior course review process was similar to that of most medical schools in its nearly exclusive reliance on student course evaluation data.2,4 Given our goal of a comprehensive, inclusive course review process, we now depend on descriptive information on the course and its contribution to the curriculum, student performance data, student perceptions, and various types of peer reviews. To fully understand the course content and to ensure consistency across course materials as they are prepared for the students, we review the course purpose, objectives, content, assessment methods, and syllabus. The OME checks the course purpose and objectives to verify a learner-oriented approach and ensure that the purpose and objectives are aligned with the instructor's teaching and assessment methods. The syllabus is reviewed to ensure that standard requirements of the university (e.g., office hours, grading, disability assistance, etc.) are included. Reviewers assess the contribution of the course to the curriculum overall by considering what program objectives the course fulfills (the concept of the “crosswalk” of course objectives to program objectives). In addition, curriculum database reports allow those involved in the review process to see where else in the curriculum the course content is taught and to ensure an appropriate level of teaching in the course under review (i.e., not repeating cognitive or skill-based activity, but building on it).
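Schools maintaining a similar curriculum database may find it useful to model the crosswalk explicitly. The sketch below is a minimal illustration in Python; the course names, objective texts, and "PO-n" program objective identifiers are hypothetical, and the actual GWU database is not described at this level of detail.

```python
from collections import defaultdict
from typing import Dict, List, Set, Tuple

# course -> [(course objective, program objectives it supports), ...]
# All course names, objective texts, and "PO-n" identifiers are hypothetical.
crosswalk: Dict[str, List[Tuple[str, List[str]]]] = {
    "Pharmacology": [
        ("Explain mechanisms of common drug classes", ["PO-1", "PO-3"]),
        ("Apply pharmacokinetic principles to dosing", ["PO-3"]),
    ],
    "Pathology": [
        ("Relate cellular injury to disease presentation", ["PO-1", "PO-2"]),
    ],
}

def program_objective_coverage(
    cw: Dict[str, List[Tuple[str, List[str]]]]
) -> Dict[str, Set[str]]:
    """Invert the crosswalk: which courses support each program objective?"""
    coverage: Dict[str, Set[str]] = defaultdict(set)
    for course, objectives in cw.items():
        for _objective, program_objectives in objectives:
            for po in program_objectives:
                coverage[po].add(course)
    return coverage

# A report of where a content area is taught elsewhere in the curriculum
# could be built the same way, keyed on content tags instead of objectives.
for po, courses in sorted(program_objective_coverage(crosswalk).items()):
    print(f"{po}: covered by {sorted(courses)}")
```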

Student performance data used in the course reviews include, as relevant, case log and duty hours summaries, historical examination scores and pass rates (spanning three years, as available), and United States Medical Licensing Examination (USMLE) subsection scores. For these data, we established target performance levels (e.g., 95% course passing rate; a target range of USMLE scores) and provided a framework for analyzing the data. Course reviews also consider student perceptions of the course as provided by course ratings and written comments, the senior student exit interview summary, and three years of Association of American Medical Colleges Graduation Questionnaire data. Again, for these data, we established target performance levels (e.g., >80% “very good” or “excellent” for the students' ratings) and provided reviewers with a framework for analyzing the data.
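The memo preparation step described earlier can be thought of as a simple screen of course metrics against these target performance levels. Below is a minimal sketch, assuming a flat dictionary of metrics per course; the two thresholds come from the text, but the metric names, boundary handling, and example data are illustrative assumptions.

```python
# The two thresholds come from the text (95% course passing rate; >80% of
# student ratings "very good" or "excellent"); the metric names, the flat
# dictionary format, and the example data are illustrative assumptions.
TARGETS = {
    "pass_rate": 0.95,       # minimum acceptable course passing rate
    "rating_top_two": 0.80,  # minimum share of top-two student ratings
}

def flag_concerns(metrics: dict) -> list:
    """Return metrics below target, e.g., for the OME's focusing memo.

    Boundary handling is simplified: a value exactly at target passes.
    """
    concerns = []
    for name, target in TARGETS.items():
        value = metrics.get(name)
        if value is not None and value < target:
            concerns.append(f"{name}: {value:.0%} is below the {target:.0%} target")
    return concerns

example = {"pass_rate": 0.97, "rating_top_two": 0.74}  # hypothetical course
for concern in flag_concerns(example):
    print("Flag for review:", concern)
```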

Various types of peer reviews are also used in the course review process. First, faculty throughout the university are encouraged to obtain assessments of their teaching on a regular basis using a suggested assessment tool. These and any other forms of formative feedback, such as short surveys administered during the course by the faculty themselves, are part of the review material. Second, peer reviews are completed for each course undergoing review. Three to five faculty teaching before, simultaneously, and after the course are asked to comment on the level and emphasis of course content as it relates to their course. This helps ensure scaffolding in learning; that is, that material is built on rather than simply repeated. Finally, external reviews of the course content are completed at the discretion of the OME, usually when a course has experienced a precipitous drop in the various indicators or when there is a change in the standards the course must meet.


Responsibility Promotes Reflection and Learning

Faculty involvement in the course review process was previously limited to reactive responses to a report prepared by the curriculum committee. We recognized that the new course review process was a tool of organizational change and that its features needed to encourage faculty learning and growth to achieve maximum benefits.10,11 Accordingly, we used five critical principles to promote reflection and learning.

First, the responsibility for preparing the course assessment and presenting it rests with the course director. Learning for adults is highly related to the tasks they immediately need to complete12; thus, course directors are more likely to learn from completing the assessment themselves than from having a report prepared for them.

A second critical principle of the process is that data are given to the course directors in raw form. Analysis is a key catalyst to critical thinking and reflection.13,14 The course directors are invited to draw their own conclusions about what is working and what needs improvement in their courses, using the target performance levels and the Guidelines for Course Review (Appendix 1) to stimulate their thinking.

Third, course directors are required to provide written responses to the questions in the Guidelines for Course Review. Questioning is widely recognized as a catalyst to the identification of gaps and opportunities and to the significant reflection required to promote learning, focus effort, instill ownership of insights, set goals for change, and identify help needed.13–19 Codifying reflective thoughts in written form further promotes learning.13,20 The questions in the Guidelines for Course Review are consistent with suggestions in the higher education literature,1 but we adapted them to medical curricula and expanded them to focus on assessment and course integration. At the conclusion of the written report, course directors are required to specify a plan for improvement. The plan includes how the course director will involve all faculty teaching in the course in the proposed changes and otherwise provide feedback to them on their teaching. Although responding to the 50+ questions in the Guidelines for Course Review may seem onerous, in practice most questions can be answered in a few sentences. In addition, as noted previously, the OME provides the course director with a memo identifying areas of concern, providing focus for the report.

The fourth critical principle of the process is that the OME uses a consultative approach13,21 (i.e., questioning versus directing) with the course directors, challenging their assessment of the course in order to foster the type of reflection that leads to learning and change. Given that this was the first change to the course review process in 15 years and that the cooperation of the course directors is essential for the process to achieve its aims, OME's approach needed to be collaborative rather than authoritarian. Consistent with this approach is the fifth critical principle, the use of dialogue to catalyze the reflection and learning of all involved in the process.13–15,18,19 The review process discussions (described below) include all course directors, the OME, department chairs, and deans. The discussions culminate with responses to questions, again promoting the reflection that leads to learning and change.13–19


Full Participation Encourages Collaboration

To facilitate curricular integration, we involved all course directors in the reviews of all courses. Four new committees, each with specific course review tasks designed to avoid duplication of effort, meet monthly to consider course reviews as well as other curriculum matters. Two Year Committees participate in course reviews based on the year in which the course is taught: one comprises the faculty directing courses in Years 1 and 2 of the curriculum, and the other comprises the faculty directing courses in Years 3 and 4. The appropriate committee (Year 1/2 or Year 3/4) reviews each written course assessment in detail, judges its appropriateness according to the Guidelines for Course Review, and considers the proposed changes and their implications for any other Year 1/2 or Year 3/4 course.

The Year Committee forwards its recommendation to the CMG, which is made up of all course directors. CMG members receive a written summary of the course review, assess the course's contribution to the curriculum objectives, and consider the recommendations of the Year Committee.

The CMG then forwards its recommendation to the COG. The COG consists of the members of the OME and relevant deans. The COG reviews the CMG's recommendations and assesses the course's consistency with the medical school's curricular vision and educational strategies. The COG approves the allocation of resources required for implementing course changes.

Providing specific review questions for each committee promotes discussion, avoids duplication of effort, and facilitates the strategic development of the curriculum.3,14


Immediate Effects on the Curriculum

On average, five to seven substantive changes were made to each course following its full review. Table 1 provides examples of the changes, reflecting modifications to course subject matter and materials, adoption of active learning techniques,22 enhanced methods of assessment, and steps toward improved curricular integration.

Table 1. Examples of changes made to courses following the revised review process (table not reproduced).

Changes resulting from our old course review process (using limited data from student feedback) mainly addressed content enhancements (e.g., the order of material presented, emphasis or de-emphasis of material) or teaching issues (e.g., faculty who did not communicate material well). The changes resulting from the revised course review process were much more substantive. Discussions about the level of course learning objectives frequently resulted in changes to more active forms of student engagement. For example, science courses were encouraged to adopt learning objectives that required students to “apply” concepts, challenging course directors to reduce lecture time in favor of discussion and case-based learning. Feedback from peers teaching simultaneously led to the temporal coordination of concepts being taught in many areas, including pharmacology and pathology, neuroanatomy and psychopathology, and all second-year science courses and the simultaneous second-year Introduction to Medicine course. Sharing information among the course directors in meetings and through the curriculum database identified content gaps and excesses (e.g., a need for more neuroscience and less history-taking and physical examination instruction) and led to standardization of grading across clerkships and identification of a core set of clinical skills expected on graduation. These changes were facilitated by the new process, in which course directors were now in the same room at the same time, multiple times a year, focusing on the curriculum in a structured manner.


Course Directors' Responses to the Process

After course directors completed their reviews under the new process, we offered them the opportunity to provide feedback. Fourteen of the 17 participated in an online survey developed by the authors. The survey asked course directors to rate how helpful each aspect of the review process (including each data element) had been in improving their course. A five-point scale ranging from “not helpful at all” to “very helpful” was used, with opportunities to explain ratings and suggest improvements to the process. Participants could complete the survey in 15 minutes.

A researcher not affiliated with the review process (S.S.S.) analyzed the responses. Table 2 summarizes the ratings, and List 1 provides the most specific comments offered by the course directors. Ratings and comments were consistent across course and clerkship directors.

Table 2. Course directors' ratings of the helpfulness of the review process elements (table not reproduced).

List 1. Course directors' comments on the review process (list not reproduced).

All process elements were considered at least “somewhat helpful” to the course directors' assessment of their courses, with the student course evaluations, peer reviews, Guidelines for Course Review questions, and the related process of writing the course assessment report and discussing it with OME rated relatively highly. The course directors appreciated having the “data in one place with easily reviewable summaries” and indicated that the comparison information with other courses “changed [their] assumptions about the effectiveness of [their] teaching.” The Guidelines for Course Review and OME memo were appreciated as “structuring,” “organizing,” and focusing mechanisms. The report-writing process was seen as a catalyst to reflection on “where the course should go.” Most of the dialoguing opportunities were perceived as providing “different perspectives … good suggestions” and “possible opportunities” as well as “helping [the course director] to reflect more deeply about particular aspects of the course.” These comments confirm the importance of the critical principles previously discussed.

The value of discussion at the various committee meetings was among the lowest-rated elements. Our observation is that faculty needed to gain confidence and trust in one another before fully engaging. This is a reasonable reaction to any new process, particularly one that, like this, requires collaborative effort in place of previously siloed activity. Recent meetings have been more interactive.

The design of the course review process is ongoing. Most courses are just now approaching their second comprehensive review. The faculty are now familiar with the particulars of the process and need less guidance. Previously, we collected feedback via survey to provide anonymity, as the parties were not used to working together. Now, anonymity is less of a concern because faculty are more comfortable voicing their opinions openly. We will continue to refine the process based on the verbal feedback of the course directors and the judgment of the OME.


Successful Change in a Challenging Environment

We have described a course review process grounded in educational theory and research, incorporating a broad range of information, and promoting reflection and learning among course directors, department chairs, and deans. Our implementation of this process was not without challenge. We expected that a course review could be completed in three to four months; however, busy faculty schedules (those of the course directors and the peer reviewers) sometimes caused meetings to be delayed, elongating the reviews by a few months. It should be noted that the comprehensive review of a course reported by the University of Kansas took place across six to eight months,7 so our expectation was perhaps overly optimistic. Staff turnover made data acquisition difficult and highlighted the need for backup personnel who can access the various reporting systems. In complying with our standard syllabus requirements, our faculty required an unexpected amount of individualized assistance in writing learner-centered course objectives and linking them to appropriate means of assessment because few had prior training in this area. Also, the focus on outcomes over process was a new perspective for many faculty members. Finally, course directors were sometimes reluctant to publicly discuss their concerns with others' courses.

Despite these challenges, after 18 months of implementing this process, we are encouraged by the changes that have taken place in course delivery. Faculty are experimenting with team-based learning and self-study “flex” lectures to better address course learning objectives. Learning assessment methods have improved, and many faculty now request formative feedback from students on their teaching. More clinical cases and faculty have been integrated into the preclinical courses; clerkship directors have embedded refreshers of preclinical material by basic science faculty. In addition, as course directors gain experience and comfort with open critique, content gaps and the areas where better scaffolding of teaching is needed have been identified. A general benefit has been the identification of several areas in which all course directors require development—writing learning objectives as previously mentioned, understanding students' learning styles and implications for teaching techniques, and formative and summative methods of assessing both learning and teaching.

The evaluation process includes a number of resources (e.g., information lists, the Guidelines for Course Review, committee structure and foci) that may be useful to other medical schools with minimal modifications. Of greater value may be the applicability of the various features of the process to other projects across the continuum of medical education: Central oversight ensures process and content consistency; incorporating a broad range of information provides diverse perspectives on issues; individual responsibility for responding to questions and presenting to others promotes reflection and learning; and collective participation encourages collaboration. These features are not often discussed, but they are critical to implementing change in medical education.

Acknowledgments:

The authors thank the MD curriculum course and clerkship directors of the George Washington University School of Medicine and Health Sciences for their participation in the revised course review process and the associated research study.


References

1. Diamond RM. Designing and Assessing Courses and Curricula: A Practical Guide. 3rd ed. San Francisco, Calif: Jossey-Bass; 2008.

2. Kogan JR, Shea JA. Course evaluation in medical education. Teach Teach Educ. 2007;23:251–264.

3. Morrison J. Evaluation. BMJ. 2003;326:385–387.

4. Billings-Gagliardi S, Barrett SV, Mazor KM. Interpreting course evaluation results: Insights from think-aloud interviews with medical students. Med Educ. 2004;38:1061–1070.

5. Combs KL, Gibson SK, Hayes JM, Saly J, Wend JT. Enhancing curriculum and delivery: Linking assessment to learning objectives. Assess Eval Higher Educ. 2008;33:87–102.

6. Elzubeir M, Rizk D. Evaluating the quality of teaching in medical education: Are we using the evidence for both formative and summative purposes? Med Teach. 2002;24:313–319.

7. Burke MJ, Bonaminio G, Walling A. Implementing a systemic course/clerkship peer review process. Acad Med. 2002;77:930–931.

8. Frankford DM, Patterson MA, Konrad TR. Transforming practice organizations to foster lifelong learning and commitment to medical professionalism. Acad Med. 2000;75:708–717.

9. Horowitz S, Van Eyck S, Albanese M. Successful peer review of courses: A case study. Acad Med. 1998;73:266–271.

10. Diamond RM. Faculty, instruction, and organizational development: Options and choices. In: Gillespie K, Hilsen L, Wadsworth E, eds. A Guide to Faculty Development. San Francisco, Calif: Jossey-Bass; 2002:2–8.

11. Senge P. The Fifth Discipline: The Art and Practice of the Learning Organization. New York, NY: Doubleday; 2006.

12. Knowles MS, Holton EF, Swanson RA. The Adult Learner. 5th ed. Woburn, Mass: Butterworth-Heinemann; 1998.

13. Brookfield SD. Developing Critical Thinkers: Challenging Adults to Explore Alternative Ways of Thinking and Acting. San Francisco, Calif: Jossey-Bass; 1987.

14. Westberg J, Jason H. Fostering Reflection and Providing Feedback. New York, NY: Springer; 2001.

15. Boud D, Keogh R, Walker D, eds. Reflection: Turning Experience Into Learning. London, UK: Kogan Page; 1985.

16. Illeris K. How We Learn: Learning and Non-learning in School and Beyond. New York, NY: Routledge; 2007.

17. Marquardt M. Leading With Questions. San Francisco, Calif: Jossey-Bass; 2005.

18. Mezirow J. Transformative learning theory. In: Mezirow J, Taylor EW, eds. Transformative Learning in Practice: Insights From Community, Workplace, and Higher Education. San Francisco, Calif: Jossey-Bass; 2009:18–31.

19. Schön DA. Educating the Reflective Practitioner. San Francisco, Calif: Jossey-Bass; 1987.

20. Huff A. Writing for Scholarly Publication. Thousand Oaks, Calif: Sage; 1999.

21. Kolb D. Experiential Learning. Englewood Cliffs, NJ: Prentice-Hall; 1984.

22. Silberman M. Active Learning: 101 Strategies to Teach Any Subject. Needham Heights, Mass: Allyn & Bacon; 1996.

Funding/Support:

None.

Appendix 1. Guidelines for Course Review (referenced in the text; not reproduced here).

Other disclosures:

None.

Ethical approval:

Determined to be exempt from review by the George Washington University institutional review board.

Previous presentations:

Parts of the process were presented at the Northeast Group on Educational Affairs (of the Association of American Medical Colleges) Annual Retreat, March 2011, Washington, DC.


© 2012 Association of American Medical Colleges
