Continuous Quality Improvement and Community-Based Faculty Development through an Innovative Site Visit Program at One Institution

Malik, Rebecca MD, CCFP, MHSc; Bordman, Risa MD, CCFP; Regehr, Glenn PhD; Freeman, Risa MD, CCFP, MEd

doi: 10.1097/ACM.0b013e31803ea942
Educational Strategies

This article describes and evaluates a unique site-visit process for community-based teaching sites. A continuous quality-improvement program was developed by the undergraduate program in the Department of Family and Community Medicine at the University of Toronto Faculty of Medicine to facilitate and document both self- and peer-assessment. A pilot program was launched in 2000, and, after some adjustments based on initial feedback, the program in its current form was implemented in 2002. This program provides individualized support mechanisms to address the faculty development needs and infrastructure requirements of community-based, mostly volunteer, teachers. It also trains participating reviewers to provide individualized faculty development at the point of teaching. During their training, reviewers receive a toolkit consisting of suggestions for initial contact with teachers, guidelines for peer assessments, previously completed previsit teacher surveys, reviewer checklists, postvisit feedback forms, sample thank-you letters, and a faculty development reference resource list. A two-year evaluation of the program demonstrated that faculty and reviewer participants perceived it to be comprehensive, consistent, informative, and an acceptable method of reviewing existing and prospective community-based teaching sites. This program should be transferable to other institutions that engage in community-based teaching.

Dr. Malik is electives coordinator, Undergraduate Family Medicine, chair, Site Visit Committee, and assistant professor, University of Toronto Faculty of Medicine, Toronto, Ontario.

Dr. Bordman was chair, Site Visit Committee, and district program director, Undergraduate Family Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario. Currently, she is district postgraduate program director, North York General Hospital, and assistant professor, University of Toronto Faculty of Medicine, Toronto, Ontario.

Dr. Regehr is Richard and Elizabeth Currie Chair in Health Professions Education Research, the Wilson Centre for Research in Education, and professor, University of Toronto Faculty of Medicine, Toronto, Ontario.

Dr. Freeman initiated the Site Visit Program, recently completed two terms as undergraduate program director in Family Medicine, and is associate professor, University of Toronto Faculty of Medicine, Toronto, Ontario.

Correspondence should be addressed to Dr. Malik, Davisville Family Practice, 1881 Yonge Street, Suite 600, Toronto, Ontario, Canada M4S 3C4; telephone: (416) 932-0409, ext. 232; fax: (416) 440-0238.

There have been a number of paradigm shifts in medical education in the past two decades. The impetus for many of these changes was Muller’s1 1984 comprehensive report for the Association of American Medical Colleges that encouraged medical schools to move from large-group to small-group teaching, from disease-centered to person-centered case material, from pedagogical to andragogical models, and from primarily inpatient teaching to a combination of inpatient, outpatient, and community-based teaching environments.

The clerkship in family medicine at the University of Toronto Faculty of Medicine, established in 1972 as the first clerkship of its kind in Canada, has led the way in adopting many of these changes. Originally, clinician–educators in large hospital-based teaching units supervised the entire family medicine curriculum, but in the last 15 years, the teaching has been increasingly shifted to community-based sites. Currently, a cadre of approximately 100 community-based physician–educators has been recruited to teach in the faculty of medicine’s core family medicine clerkship program, and an equal number supervise in the electives program.

Both the core clerkship and electives programs in family medicine at the University of Toronto Faculty of Medicine have consistently received excellent teacher and program evaluations from the students. However, program administrators had never formally scrutinized the quality of teaching and learning opportunities in the community-based sites. In an effort to redress this gap, a site visit working group was formed with a mandate to develop, implement, and evaluate a continuous quality-improvement (CQI) process for all established and potential core and elective community teaching sites.

A review of the literature supported the importance of site visits and the potential opportunity for faculty development that could stem from these visits.2–7 Suzewits2 outlines the key reasons to conduct site visits. Brand3,4 offers faculty members information about how to survive a site visit. Zayas et al5 outline the importance of monitoring the quality of medical student instruction in decentralized environments. The important goal of finding acceptable methods of CQI of community-based teaching opportunities is a recurring theme in the literature; however, authors have not described a standardized strategy for program administrators to design, implement, or monitor the usefulness and acceptability of these CQI programs.

Our goal was to create a review process for our community-based teaching sites that would be comprehensive, consistent, informative, and acceptable to teachers, reviewers, and program administrators. Although the traditional goal of CQI in medicine is to improve patient outcomes, here we would use CQI to improve the educational experience. In this article, we describe the development and implementation of our site review process, detail its unique features, and provide information to allow other schools to implement similar programs.


The Development Process

In designing our site visit program, we searched the literature for key factors to consider in conducting CQI visits and found the following recurring themes: the importance of the practice setting, the quality of the learning opportunities, and the key role of the preceptor.5–7 Distillation of this literature and consultation with our teachers and other educators led us to field-test two broad assessment components for our program: scope of practice (clinic setting, patient demographics, etc.), and clinical learning opportunities (teaching strategies, supervision and feedback, etc.).

Our site visit program had two components: a previsit survey to be completed by the community-based teacher being assessed, and an on-site visit from a faculty reviewer. We designed the previsit survey to gather information concerning the scope of practice and clinical learning opportunities (our two established assessment components) and to provide the teacher with an opportunity for self-reflection before the visit. The completed survey was made available to the reviewer before the visit to allow the reviewer to become familiar with the information and to focus questions during the visit.

The large geographic distribution of our program required that the university appoint 10 district program directors, each responsible for the coordination of all aspects of the teaching program in 1 of 10 districts. These individuals were considered the natural choice to serve as the reviewers for the program and to conduct the on-site visits in their respective districts.

Each on-site visit was divided into four parts and was guided by an instruction sheet that we designed for the reviewers’ benefit. First, the reviewer toured the premises with an eye to the physical learning environment (number of exam rooms, resources, space for learners, etc.). Second, the reviewer and teacher reviewed the previsit reflective survey together. They discussed the teaching and learning strategies employed at the site from the perspective of the goals and objectives of the teaching program. Third, the reviewer conducted brief chart reviews of five patients from the learner’s perspective. Finally, the reviewer completed a teacher feedback form and discussed with the teacher both the information on the feedback form and any faculty development issues that arose from the visit. Figure 1 provides an outline of the entire process.

Figure 1

A pilot project of 10 visits was launched in 2000 to determine the feasibility of fully implementing the site visit program. The process was generally acceptable to both reviewers and teachers. Two important issues arose that prompted revisions to the site visit process before implementing it on a larger scale. The first issue was the need to reduce the time required of participants (two hours for the reviewers; one and a half hours for the teacher). We addressed this burden by streamlining the paperwork. We preassembled teacher-specific site visit packages for the reviewers that included all necessary forms and a checklist for the reviewer to use at the site visit. Because all the necessary information and paperwork were centralized and readily available to the reviewer, the time required to prepare for and complete the visit was significantly reduced.

The second issue was that teachers were asking reviewers for feedback about their teaching methods and information regarding their personal faculty development needs. Though the literature we consulted in our planning process had noted this potential faculty development opportunity in the site visit model, we had not, in the initial planning stages, concentrated our efforts on developing this component. It became apparent, however, that our site visit program presented an ideal opportunity for individualized faculty development. Consequently, we set out to develop this unique aspect of the program, which would allow peer reviewers to help teachers with their personal educational needs.

One of the main challenges in developing this faculty development component was to defuse the reviewers’ concern regarding their role in “policing” their peers. In discussion with our university administrators and our local licensing body, we developed a policy statement to explain clearly that, although the site visit program would include a faculty development component for the benefit of our community-based teachers, the program was not designed to assess the teachers’ clinical skills or patient-care delivery.

Another opportunity for improvement was to provide reviewers with a more extensive training program for their roles as peer assessors of teachers and as providers of individualized faculty development. We designed a “train-the-reviewer” workshop to address both of these needs. We invited leaders in peer assessment from the local licensing board to share their expertise. On the basis of these leaders’ recommendations, a review of the literature, and current best practice models, we created an interactive “how to provide individualized faculty development” session for the workshop.8–12 At the end of the workshop, reviewers received a toolkit consisting of suggestions for initial telephone contact with teachers, guidelines for peer assessments, all previously completed previsit surveys, reviewer checklists, postvisit feedback forms, sample thank-you letters, and a faculty development reference resource list. A professional videographer recorded the workshop to allow new reviewers to benefit from the workshop in the future.

The improvements were made possible through a combination of funding opportunities that were granted after our pilot project was complete. These funds allowed for administrative support, the train-the-reviewer workshop, compensation for reviewer and teacher participation in the evaluation component, a research associate, and dissemination of findings.


Lessons Learned from the Implementation Stage

We launched the site visit program in September 2002 and tracked its implementation with an evaluation component for two years. Teachers, reviewers, and program administrators were asked to complete previsit, immediate postvisit, and one-year postvisit questionnaires to assess the impact of the program. All 15 reviewers and 22 of the 42 teachers consented to participate in the evaluation aspect of the project.

Several themes emerged from the evaluation process. Both reviewers and teachers accepted the site visit process and, after the encounter, felt increasingly positive about the experience: initial concerns dissipated, comfort grew, and perceptions of the program's value increased. Both groups noted that the visit influenced change. For example, teachers reported becoming more involved in faculty development, giving more feedback to students, and improving organization around teaching. According to respondents, the greatest challenge was finding the time to conduct the site visit and coordinating reviewer and teacher schedules. To improve this aspect of the process, respondents suggested having the central office set appointments or publish all available times for scheduling (as opposed to having the reviewer and teacher negotiate a time themselves).

The site visit program allowed for teacher self-reflection and for individualized faculty development at the point of teaching. Most important, reviewers enhanced their own knowledge of and connection with their community-based teachers, fulfilling the primary goal of any site visit initiative. Reviewers and teachers maintained their perception of the value of the program throughout a one-year follow-up.

We note that the evaluation results must be interpreted in the context of the participation rate. Although our participation rate was 100% (15/15) for reviewers, it was only 52% (22/42) for the teachers. We believe that this may reflect the teachers’ fear of being scrutinized, and it suggests the possibility of a response bias in the data regarding teachers’ attitudes toward this process. Future research efforts might be aimed at increasing participation in similar evaluation processes to determine the extent to which the current findings would generalize to the larger population of participants. Nonetheless, on the basis of the data collected, the site visit program can be considered a success.


A Successful and Generalizable Approach to CQI

Ambulatory teaching is becoming an important component of every discipline in undergraduate and postgraduate medical training. The need to ensure the appropriateness of distant teachers and sites requires rigorous and reproducible CQI methodologies and is becoming an accreditation standard.

As program administrators of a large, community-based teaching program, we were keenly aware of the need for a fair, thorough, and evaluated method to assess distant teaching sites. The site visit program we describe in this article provides an approach to addressing this important issue and a value-added component for the teachers in the form of individualized faculty development. The standardized forms, reviewer-training program, and clear instructions for reviewers facilitate the delivery of a comprehensive and consistently applied program. The information we have gathered will allow and inform ongoing curricular improvement and the development of enhanced faculty development programs at the University of Toronto Faculty of Medicine. We believe that our site visit program is generalizable to other settings. Program administrators in other institutions should be able to implement this program to satisfy accreditors’ requirements with ease. All materials developed will be made available for implementation at other universities on request.

We note that the success of our program relied on three essential elements:

  1. A pilot process with extensive committee and participant involvement to provide opportunities for the enhancement and improvement of the process.
  2. The presence of formal training for the reviewers to allow them to increase their knowledge and skills with respect to both peer assessment and provision of individualized faculty development.
  3. The availability of funding for the development and implementation of the program (administration of the visits, data collection) and support for faculty (remuneration, release time, CME credits, etc.).

In the near future, we plan to streamline the CQI process by using electronic data collection and by implementing the site visit program as a required component of the clerkship program.

The site visit program we have described is unique as a CQI initiative for evaluating distant teaching sites in its comprehensiveness: it incorporates, facilitates, and documents both self- and peer-assessment. It is also distinctive in detailing methods of training participating reviewers to provide individualized faculty development at the point of teaching, and it addresses the particular needs of community-based, mostly volunteer, teachers. Participants noted that the process was comfortable, comprehensive, useful, and an informative method of reviewing existing and prospective community-based teaching sites. The program should be transferable to other institutions that engage in community-based teaching as a CQI tool, and it may also enable curricular and faculty development improvements. As was the case in our experience, teachers are likely to benefit from the individualized faculty development and may feel more valued by their institution. These positive outcomes may aid in the retention of faculty and the recruitment of new community-based teachers.

Acknowledgments

The development and evaluation of this program was supported through a grant from The Dean’s Excellence Fund competition at the University of Toronto and through matching funds from the Department of Family and Community Medicine at the University of Toronto.

Dr. Regehr is supported as the Richard and Elizabeth Currie Chair in Health Professions Education Research.

The authors would like to thank Drs. Ann Kenney and Diane Kelsall for their participation in the early developmental stages of our site visit program and Rachel Ellis, Sharon Lee, Michael Kaftarian, and all of the teacher and reviewer participants for their assistance.

References

1 Muller S. Physicians for the twenty-first century: Report of the Project Panel on the General Professional Education of the Physician and College Preparation for Medicine. J Med Educ. 1984;59:1–208.
2 Suzewits J. Preceptor site visit. Fam Med. 2002;34:240–241.
3 Brand JL. When your practice is under the microscope: how to survive a rural preceptorship site visit. Fam Med. 1997;29:461–462.
4 Brand JL, Carpenter JM. Preparing for a site visit. In: Precepting Medical Students in the Office. Baltimore, MD: Johns Hopkins University Press; 2000.
5 Zayas LE, James PA, Shipengrover JA. Exploring instructional quality indicators in ambulatory medical settings: an ethnographic approach. Fam Med. 1999;31:635–640.
6 James PA, Osborne JW. A measure of medical instructional quality in ambulatory settings: the MedIQ. Fam Med. 1999;31:263–269.
7 Young BL, Graham RP, Shipengrover JA, James PA. Components of learning in ambulatory settings: a qualitative analysis. Acad Med. 1998;73(10 suppl):S60–S63.
8 Bergquist WH, Phillips SR. Components of an effective faculty development program. J Higher Educ. 1975;49:177–211.
9 Bergquist WH, Phillips SR. A Handbook for Faculty Development. Vol. II. Washington, DC: The Council for Advancement of Small Colleges; 1977.
10 Jason H, Westberg J. The process of faculty development. In: Teachers and Teaching in U.S. Medical Schools. New York, NY: Appleton-Century-Crofts; 1982:297–307.
11 Sheets KJ, Schwenk TL. Faculty development for family medicine educators: an agenda for future activities. Teach Learn Med. 1990;2:141–148.
12 Tiberius RG. From shaping performances to dynamic interaction: the quiet revolution in teaching improvement programs. In: Successful Strategies for Higher Education. Bolton, MA: Anker; 1995.
© 2007 Association of American Medical Colleges