Monitoring and Improving Resident Work Environment Across Affiliated Hospitals: A Call for a National Resident Survey

Byrne, John M. DO; Loo, Lawrence K. MD; Giang, Dan MD

doi: 10.1097/ACM.0b013e318193833b
Graduate Medical Education

Purpose To assess, monitor, and improve resident work environment and enhance compliance with Accreditation Council for Graduate Medical Education (ACGME) institutional requirements among the five Loma Linda University–affiliated hospitals using an annual survey.

Method From 2001 to 2005, residents completed an anonymous questionnaire assessing each hospital's educational and work environment regarding clinical services, attending physicians, learning opportunities, resident environment, and coordination of care. Resident perceptions were compared across hospitals and residency programs.

Results A total of 2,399 resident surveys were collected during a five-year period from 500 residents in 43 different residency programs. The overall response rate was 88%. Residents perceived deficiencies in clinical services at the Veterans Affairs (VA) hospital. After focused improvement efforts directed to VA clinical services, serial data showed significant improvements in residents' perceptions of nursing, phlebotomy, radiology, social work, case management, and overall clinical services in subsequent years (P < .001). The survey data identified resident concerns at other affiliated hospitals and were instrumental in persuading hospital administrators to improve the university hospital resident lounge and computerized lab results at a county hospital. Resident perceptions improved in subsequent years after the changes were made (P < .001). The survey also suggested improvements in residency-specific program educational issues.

Conclusions A survey administered across five consecutive years at five affiliated hospitals provided local benchmarks and serial data to monitor and significantly improve resident perceptions of their work environment and helped ensure compliance with ACGME institutional requirements. These results and the data of others support the idea of an expanded, national resident survey.

Dr. Byrne is associate chief of staff for education, VA Loma Linda Healthcare System, and assistant professor of medicine, Department of Medicine, Loma Linda University School of Medicine, Loma Linda, California.

Dr. Loo is director of graduate medical education, Riverside County Regional Medical Center, and associate professor of medicine, Department of Medicine, Loma Linda University School of Medicine, Loma Linda, California.

Dr. Giang is associate dean and director of graduate medical education, Loma Linda University Medical Center, and associate professor of neurology, Department of Neurology, Loma Linda University School of Medicine, Loma Linda, California.

Correspondence should be addressed to Dr. Byrne, Loma Linda VA Healthcare System, 11201 Benton Street (14A), Loma Linda, CA 92357; telephone: (909) 583-6004; fax: (909) 777-3828; e-mail: (john.byrne@med.va.gov).

The Accreditation Council for Graduate Medical Education's (ACGME's) institutional requirements state that an overriding responsibility of a graduate medical education (GME) program's sponsoring institution is to ensure that GME is conducted in a scholarly environment that is committed to medical education and excellent patient care.1 To that end, the ACGME requires sponsoring institutions to develop processes for “the resident work environment” that “minimize the work of residents that is extraneous to their graduate medical education program” by providing adequate food services, call rooms, security, support services such as phlebotomy, lab, radiology, messenger, and transport services, and medical records.1 The ACGME Institutional Review Document requires the designated institutional official (DIO) to address whether these services are adequately provided in all institutions participating in the GME program.2

In addition to the focused educational experience, residents must have a confidential and protected process to raise and resolve issues related to the educational and work environment. Although good communication among residents, GME officials, and hospital administrators is essential to addressing resident concerns effectively, informal processes alone may not be sufficient. Much can be gained from resident forums, confidential advisors, and informal communication, but these methods lack strict confidentiality and do not yield a quantifiable assessment of work-environment deficiencies that could help persuade hospital administrators and GME leaders to make improvements.

In the late 1990s, residents in the Loma Linda University Medical Center GME program expressed concerns about clinical services at the Veterans Affairs (VA) hospital (such as phlebotomy, nursing, radiology, and lab), the lounge at the university hospital, and access to lab data at an affiliated county hospital. However, we lacked a means to systematically evaluate their concerns or gauge their responses after implementing changes. To date, no national resident survey similar to the Association of American Medical Colleges (AAMC) Medical School Graduation Questionnaire exists to assess residents' specific concerns about the learning and work environment at affiliated hospitals. We hypothesized that gathering and evaluating objective data from our own affiliated hospitals would bring attention to our residents' concerns and motivate hospital administrations to improve hospital services and the resident work environment.

To address our residents' concerns in a confidential and quantifiable way, we developed an annual resident survey beginning in 2000. The purpose of the survey was to serially assess and compare the resident work environments across five affiliated hospitals and to use the data to assist hospital administrators and residency program directors in acknowledging, initiating, and monitoring improvement efforts and compliance with ACGME institutional requirements. The survey was conducted in the Loma Linda University Medical Center GME program, which consists of a university medical center, a VA medical center, two county medical centers, and one privately owned medical center; we refer to these as the university, VA, county 1, county 2, and private hospitals. Nearly 500 residents in more than 40 programs complete clinical rotations in one or more of these hospitals, with most residents training at two or three of them. In this report, we present the results of the survey and its role in improving the resident work environment in our affiliated hospitals.


Method

Survey development

The questionnaire was developed by the authors (the DIO and two designated educational officers at two affiliated hospitals) and was based on several factors: (1) our own residents' concerns about clinical services at affiliated hospitals, (2) ACGME institutional requirements regarding the work environment, and (3) a review of the literature. Based on this information, we grouped the questions into five categories: clinical services, attending physicians, learning opportunities, resident environment, and coordination of care. Respondents were asked to assess each item for each hospital on a 5-point Likert-type scale (0 = absent, 1 = poor, 2 = fair, 3 = good, and 4 = excellent). An option of “not applicable” was available for residents who did not rotate to an affiliated hospital. We collected narrative comments from three questions asking residents to identify at least one improvement, one strength, and any other comments about their residency experience. The institutional review boards of the affiliated hospitals were consulted, and an exemption was granted.
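
As a rough sketch of how such responses could be encoded and summarized (the hospitals, items, and ratings below are hypothetical placeholders, not values from the actual dataset), ratings on the 0–4 scale can be averaged by hospital and item while "not applicable" responses are treated as missing:

```python
import pandas as pd

# Hypothetical raw responses: one row per resident per item per hospital.
# Ratings use the survey's 0-4 scale; "NA" marks residents who did not
# rotate at that hospital and is excluded from the means.
raw = pd.DataFrame({
    "hospital": ["university", "VA", "VA", "county1"],
    "item": ["phlebotomy", "phlebotomy", "nursing", "lab_results"],
    "rating": ["3", "1", "2", "NA"],
})

raw["score"] = pd.to_numeric(raw["rating"], errors="coerce")  # "NA" -> NaN
item_means = raw.groupby(["hospital", "item"])["score"].mean()  # NaN ignored
print(item_means)
```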

A pilot survey was performed in 2000: the questionnaire was placed in all 549 residents' mailboxes with an invitation to respond, and 146 residents did so; three faculty members were also invited to take the survey and provide feedback. We revised several questions based on the analysis of the pilot study data. Summary questions regarding residents' overall impressions of clinical services, teaching, and coordination of care at each hospital were added. To evaluate our GME program's effectiveness regarding new ACGME initiatives, questions were added in 2004 to evaluate the ACGME General Competencies, an institution-specific curriculum designed to teach the competencies, whether residents met face-to-face with their program director twice annually, and residents' familiarity with their program's goals and objectives. These items were grouped into a category called ACGME General Competencies and other educational issues.

After the pilot, the survey was first administered in 2001 during the required annual daylong resident training session at the university medical center, held each year in late April or early May, and it was administered during this session in each subsequent year. On this day, residents receive training on information security, infection control, and other hospital policies. From 2001 to 2005, each year's graduating residents completed the survey as part of their exit process in June. The survey was administered anonymously, and the only demographic data collected were the residents' program and postgraduate year (PGY).

After the first study year in 2001, the VA medical center conducted focus groups from June through August 2002 to further address a number of concerns raised regarding clinical services and resident environment in that hospital. The focus groups were undertaken to validate and further assess the findings of the survey. VA staff who were formally trained in focus-group methodology and were unknown to the residents volunteered to conduct the focus groups. These volunteers were not directly involved in resident education and included quality improvement staff, a social worker, a patient educator, a data analyst, administrative assistants, and two patient advocates.


Validity

We established the validity of the survey in several ways: (1) the survey was derived largely from the ACGME requirements, giving it content validity; (2) analysis of the pilot data and of subsequent data found quantitative differences between hospitals that were known to exist (e.g., the variety of patients seen at the VA was much lower than at the other hospitals), giving it face validity; (3) the focus groups confirmed the data collected in the survey regarding clinical services and coordination of care at the VA medical center; and (4) the narrative comments on the questionnaire corroborated the survey's quantitative data, giving it concurrent validity.


Statistical method

Cross-sectional comparisons were used to identify deficiencies in resident perceptions at specific affiliated hospitals and residency programs. Serial data were used to evaluate the effectiveness of interventions addressing the residents' concerns; accordingly, ANOVA was used to analyze trends in responses over time at each of the five medical centers. An adjustment was made for multiple comparisons, so statistical significance was set at P < .001.
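
A minimal sketch of this analysis, assuming ratings for one survey item at one hospital are grouped by survey year (the values below are placeholders, not actual survey data), might look like the following; the significance threshold is lowered to P < .001 to reflect the multiple-comparison adjustment:

```python
from scipy import stats

# Placeholder ratings (0-4 scale) for one survey item at one hospital,
# grouped by survey year; real group sizes were far larger.
ratings_by_year = {
    2001: [1, 2, 2, 1, 3],
    2002: [2, 2, 3, 2, 3],
    2003: [3, 2, 3, 3, 4],
    2004: [3, 3, 4, 3, 4],
    2005: [4, 3, 4, 4, 3],
}

# One-way ANOVA testing for a difference in mean rating across years.
f_stat, p_value = stats.f_oneway(*ratings_by_year.values())

ALPHA = 0.001  # significance threshold adjusted for multiple comparisons
print(f"F = {f_stat:.2f}, P = {p_value:.4g}, significant: {p_value < ALPHA}")
```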


Results

Response rates

From 2001 to 2005, excluding the pilot survey, 2,399 questionnaires were collected. The overall five-year response rate was 88% (range 86%–92%). Residents from 43 different primary and secondary residency programs took the survey. The proportions of PGY 1, 2, 3, 4, 5, and 6 residents were 23%, 26%, 23%, 15%, 7%, and 4%, respectively, with 3% not reporting a PGY. The total numbers of responses were 2,262 for the university hospital, 1,434 for the VA, 1,068 for county hospital 1, 321 for county hospital 2, and 297 for the private hospital.


Resident work hours

The ACGME work hours limitations were instituted beginning in 2003, the third year of the survey. Each year, residents were asked how many hours they spent in each hospital during an average workweek over the course of the academic year. Averaged across all programs, the reported work hours showed no appreciable change. In 2001, the averages for the university hospital, VA hospital, county hospital 1, county hospital 2, and the private hospital were 70, 57, 72, 69, and 74 hours, respectively, and in 2005 they were 66, 58, 71, 76, and 69. However, some programs showed marked changes. For example, neurosurgery, whose residents rotate only at the university hospital, dropped from 92 to 80 hours per week from 2001 to 2005. General surgery average hours per week dropped at all hospitals: at the university, VA, county 1, county 2, and private hospitals, they were 93, 87, 87, 84, and 87 in 2001 and 82, 81, 82, 80, and 82 in 2005, respectively. Internal medicine did not show appreciable changes at two of the three hospitals where its residents rotate (the university and VA hospitals), but at county hospital 1, average hours per week dropped from 89 in 2001 to 74 in 2005.


Comparisons among hospitals

The face validity of the survey was confirmed by several comparisons between hospitals. For example, availability of medical records was rated highest at the VA for each year of the survey, with scores ranging from 3.47 to 3.73 and a five-year average of 3.61 on the five-point scale. The VA is the only hospital among the five that fully implemented an electronic health record (Computerized Patient Record System, or CPRS) before the survey period. Similarly, variety of patients was rated lowest at the VA, reflecting the VA patient demographics.

On cross-sectional analysis of the initial survey data, marked differences were found in resident perceptions of clinical services at the VA compared with the other affiliated hospitals. On questions related to nursing, availability and timeliness of laboratory results, availability and timeliness of imaging results, intravenous line placement, phlebotomy, patient transportation within the hospital, and social work and case management, the VA was rated lowest of all affiliated hospitals during the first four years of the survey. Among the other hospitals, the university hospital lounge and county hospital 1 lab services were rated comparatively low. Table 1 shows the hospital mean ratings for the entire survey period.

Table 1

In cross-sectional comparisons in the category of attending physician teaching and learning opportunities, residents' perceptions were generally good, comparable, and maintained longitudinally across the five hospitals throughout the survey period (Table 2). The opportunity to do research and the stimulus to read were rated relatively lower across all five hospitals.

Table 2

In response to the survey results and the resident focus groups, GME leadership presented the survey data to the three affiliates with the largest resident complements (the university, the VA hospital, and county hospital 1), emphasizing the deficiencies in clinical services and resident environment. Examples of how GME leadership at the VA, university, and county hospital 1 used the survey to address specific issues at their hospitals follow.


VA hospital

Survey data were presented to VA hospital administration, and, based on these data, a number of improvement efforts were undertaken simultaneously. The VA hospital nursing service developed internal surveys, conducted meetings with physician staff to improve communication, enhanced nursing competencies, increased nursing salaries, and aggressively recruited qualified nurses. The residents' perceptions of VA nursing improved during the course of the survey (P < .0001) (Figure 1).

Figure 1

Phlebotomy services were enhanced by increasing the number of routine blood draws on the ward, hiring additional phlebotomists, training nurses on the inpatient ward to perform phlebotomy, and holding meetings with the pathology and lab service to coordinate test ordering and cancellations with physicians. Collection of lab specimens outside scheduled lab blood draws was made the responsibility of nursing staff rather than the resident staff. The residents' perceptions of VA phlebotomy services consistently improved during the survey period (P < .0001) (Figure 1).

Availability and timeliness of imaging results was rated lowest at the VA, which was the only hospital that did not have images available electronically through a Picture Archiving and Communication System (PACS) in May 2001. PACS implementation was nationally mandated by the VA and was fully operational at the VA hospital by the end of 2001. In addition, the Veterans Health Administration (VHA) introduced a national performance measure in 2004 requiring imaging reports to be verified within two days. The VA hospital took several steps to meet this measure, including deploying voice-recognition software for report dictation and giving individual radiologists provider-specific feedback on the measure. At the end of fiscal year 2005 (September), the VA hospital ranked seventh out of 130 VA hospitals nationwide, with 96% of reports meeting the performance measure. Residents' perceptions of the timeliness and availability of imaging results increased during the period of the survey (P = .0002) (Figure 1).

The VA was rated lowest for case management and social work support in the initial years of the survey (Figure 1). Case managers were added to each of the inpatient ward teams to better assess patients' home care and extended care facility needs early in the course of hospitalization and to coordinate discharge planning. Resident perceptions of social work and case management support and discharge planning increased throughout the survey period (P < .0001 for both survey items).

During the course of the survey, the overall clinical services rating at the VA improved (P < .0001) (Figure 1). In the first year of the survey (2001), the VA's overall clinical support services rating (2.51) was the lowest among all five hospitals, but it improved throughout the study to a mean rating of 2.95 in 2005 that was exceeded only by the university and private hospitals. The VA hospital mean for all five years now exceeds that of county hospital 1 (Table 1).


County hospital 1

GME leaders at county hospital 1 used the survey results to address resident concerns about computerized lab results with hospital administration. County hospital 1 was the only hospital that did not have lab results available electronically, and it was rated lowest among the five hospitals on this item in the initial years of the survey. To address this, the hospital implemented electronic lab results in 2002. The residents' perceptions improved from 2001 to 2005, with mean ratings increasing from 2.01 to 2.53 (P < .0001).


University hospital

Residents' perceptions of the adequacy of the university hospital lounge were low (1.62) in 2001. GME leadership at the university presented the data to the administration in an appeal for funding; a new resident lounge was funded and built before the 2002 survey. The mean rating of the university lounge rose to 2.92 in 2002 (P < .00001).


Residency program data

Program directors were provided with individual program reports from the survey in the categories of attending physician teaching, learning opportunities, ACGME General Competencies, and other educational issues. Program directors have used these cross-sectional data to compare affiliates and to identify and address problems. For example, psychiatry residents' mean ratings at the VA in 2002 were very low for availability of the teaching physician, adequacy of supervision, and quantity of teaching (Table 3). In response to resident complaints, the program director and the VA service chief appointed a VA residency coordinator and created a residency training committee at the VA. The committee met regularly in 2002 to address educational issues and effected a change in personnel that led to the hiring of two new teaching physicians. Mean ratings on the survey subsequently improved. In the following two years, the program director requested that additional residents be allocated to the VA psychiatry rotation because of the improved teaching environment. The survey data were used to justify additional resident funding at the VA hospital.

Table 3


Discussion

In this report, we present the results of a five-year survey assessing and comparing the work and educational environments across five affiliated hospitals in one institution's GME program. Our results demonstrate that comparisons among affiliated hospitals lend credibility to resident complaints, identify problems at affiliates that may otherwise have gone unrecognized, create data-driven administrative incentives for improvement, and provide longitudinal monitoring of targeted improvement efforts. The survey has enhanced our sponsoring institution's ability to monitor ACGME requirements, to quantitatively assess residents' concerns, and to prepare for an ACGME institutional review.

A key strength of our survey is the comparison of residents' perceptions across affiliated hospitals. Comparing affiliates reveals disparities between training sites in residents' perceptions of their work environment that might otherwise go unrecognized or unquantified. Our survey provides local benchmarks and comparisons that may yield different, and important, results from surveys that focus on one type of affiliated hospital or on aggregated data from one GME program. For example, satisfaction with call rooms was rated low in VHA in the study by Keitz and colleagues,3 whereas in our survey the VA hospital call rooms were rated highest and the university's lowest, a finding that would perhaps be unexpected in other programs (Table 2). These types of comparisons among affiliates identify hospital-specific problems and provide data for directed process improvement in the training sites where it is needed. Our data and those of Keitz and colleagues underscore the need for GME leaders to examine residents' educational and work environments and the ACGME institutional requirements across all affiliated hospitals.

A second strength of our survey is the serial nature of its administration. During the course of five years, we have been able to identify not only problem areas with resident work environment and ACGME institutional requirements, but also hospital- and program-specific deficiencies for which specific actions have resulted in improved resident perceptions. In other areas, where system initiatives have not resulted in improved resident perceptions, such as timeliness of medication administration with physician order entry and bar-coded medication administration,11,12 the data can be used to identify and target further research and ongoing performance improvement. Also, our high response rate ensured adequate sample sizes to compare program-specific resident perceptions among affiliated hospitals.

Other surveys have been used in an effort to improve compliance with ACGME requirements or to examine resident perceptions across multiple institutions.3,4 Heard and colleagues4 developed a resident survey to assess compliance with four commonly cited ACGME educational requirements: supervision, feedback and evaluation, scholarly time, and duty hours. Reports were provided to program directors who were required to submit plans of correction. The authors demonstrated a decrease in the number of ACGME citations in programs with action plans. Although no formal action plans were required in response to our survey results, the data were disseminated to program directors, GME leaders, and hospital administrators to drive improvement efforts in a number of different venues both formal and informal. The survey played a major role in initiating and monitoring process improvement efforts at the VA and particularly in addressing residents' concerns about clinical services in the resident work environment.

Keitz and colleagues3 reported the first-year results of the VA's National Learners' Perceptions Survey (LPS) involving more than 1,400 resident respondents training in VA hospitals across the country. Residents were generally satisfied with the learning environment and with their faculty and preceptors, which is consistent with our results with regard to the VA. However, they were generally less satisfied with the working environment and reported lower levels of satisfaction for laboratory, radiology, and ancillary/support staff, which is also consistent with our results. The authors concluded that residents' perceptions “may reflect frustrations with a fiscally constrained system in which a fixed set of resources are available to provide care for a growing number of increasingly ill and complex patients.”3 However, during the course of our survey, the number of patients at our local VA increased by 8.3% per year, the number of staff decreased by 0.7% full-time equivalent employees per 1,000 patients, and funding increased minimally by about 4% per year. Our survey demonstrates that by serially quantifying perceptions of work environments, resident perceptions can improve through targeted interventions and process improvement despite a growing patient population and minimal increases in resources.

One possible explanation for the observed improvement at our local VA hospital is the overall VHA system reengineering.5–9 The residents' perceptions of timely imaging reports, social work support, case management, and discharge planning may reflect VHA performance measures and initiatives to reduce inpatient care and hospital length of stay.7 VHA is also acknowledged for having a very advanced and effective clinical information system.9–12 The VA hospital's medical record availability through CPRS received the single highest rating of all measures in our survey. Therefore, our survey results reflect in part the ongoing national VHA system improvement initiatives.

However, we believe that our survey is also measuring the effects of targeted interventions to address specific resident concerns within our affiliated hospitals. As an additional validation of our findings, we also reviewed our local VHA LPS results for 2003 to 2005, which show an increasing percentage of residents and medical students who are “very satisfied” or “somewhat satisfied” with the quality, timeliness, and availability of clinical services. These results mirror the improved resident perceptions reported on our survey. Figure 2 provides selected examples from the LPS of clinical services that were a concern at our local VA hospital. By comparison, our local LPS improvement in satisfaction is more dramatic than that of residents and medical students across the VA nationally (Figure 2). Therefore, although timely imaging results are a VA national performance measure, we believe the high rate of success at our VA hospital was mainly the result of local process improvement. Similarly, the introduction of case managers on ward teams was a local decision and has been previously shown to reduce hospital length of stay.13 Thus, serial comparisons of local data with national means help account for secular trends, thereby lending support to the effect of our local process improvements on residents' perceptions.
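
As a simple illustration of this secular-trend reasoning (the satisfaction percentages below are placeholders, not the actual local or national LPS values), the local change over the period can be contrasted with the national change to estimate improvement beyond the nationwide trend:

```python
# Placeholder percentages of residents "very" or "somewhat" satisfied;
# not the actual local or national LPS figures.
local_2003, local_2005 = 55.0, 80.0
national_2003, national_2005 = 60.0, 66.0

local_change = local_2005 - local_2003            # local improvement
national_change = national_2005 - national_2003   # secular (national) trend

# Improvement attributable to local efforts over and above the national trend.
excess_local_improvement = local_change - national_change
print(f"Local +{local_change:.1f} pts vs national +{national_change:.1f} pts; "
      f"excess local improvement = {excess_local_improvement:.1f} pts")
```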

Figure 2

Our study has limitations. First, we did not develop a specific protocol for process improvement or action plans in response to the survey results. However, as discussed, the survey played an important role in a number of improvement efforts across the five affiliated hospitals and could be adapted to a more structured quality improvement effort. Second, the survey is limited to one GME program and its five affiliated hospitals, so our survey instrument and findings may not be broadly applicable. However, we found multiple similarities between our results and the report by Keitz et al,3 and our VA hospital results are consistent with the local results from VHA's LPS. Furthermore, given that our survey is largely based on ACGME requirements and has been conducted in organizationally different hospitals, it may be useful to other institutions and their affiliates.

In conclusion, our survey has proven to be a valuable tool in assessing and monitoring resident perceptions across affiliated hospitals. Our survey and the work of others3,4 lend support to the idea of comparing resident work environments across institutions and with national benchmarks. An expanded national resident survey could provide these data. A model for this concept is the AAMC's national Medical School Graduation Questionnaire, which is an invaluable tool to compare individual institutions with national trends and is a rich source for tracking medical student educational perceptions and influences longitudinally.14,15 We believe that a similar annual national GME survey administered by the Council of Teaching Hospitals or the AAMC Group on Resident Affairs comparing affiliated hospitals would provide an equally invaluable opportunity for hospital administrators and GME leaders to improve residents' educational and work environments and to ensure compliance with ACGME institutional requirements in all hospitals where residents train.


Acknowledgments

This material is the result of work supported with resources and the use of facilities at the VA Loma Linda Health Care System.


References

1 Accreditation Council for Graduate Medical Education. Institutional Review Committee. Institutional requirements. Available at: (http://www.acgme.org/acWebsite/irc/irc_IRCpr703.asp). Accessed October 17, 2008.
2 Accreditation Council for Graduate Medical Education. Institutional Review Committee. Institutional Review Document—Part II. Available at: (http://www.acgme.org/acWebsite/irc/800cont-0707-071129.doc). Accessed October 17, 2008.
3 Keitz SA, Holland GJ, Melander EH, Bosworth HB, Pincus SH. The Veterans Affairs Learners' Perception Survey: The foundation for educational quality improvement. Acad Med. 2003;78:910–917.
4 Heard JK, O'Sullivan P, Smith CE, Harper RA, Schexnayder SM. An institutional system to monitor and improve the quality of resident education. Acad Med. 2004;79:858–864.
5 Asch SM, McGlynn EA, Hogan MM, et al. Comparison of quality of care for patients in the Veterans' Health Administration and patients in a national sample. Ann Intern Med. 2004;141:938–945.
6 Jha AK, Perlin JB, Kizer KW, Dudley RA. Effect of the transformation of the Veterans Affairs Health Care System on the quality of care. N Engl J Med. 2003;348:2218–2227.
7 Ashton CM, Soucheck J, Peterson NJ, et al. Hospital use and survival among Veterans Affairs beneficiaries. N Engl J Med. 2003;349:1637–1646.
8 Kerr EA, Gerzoff RB, Krein SL, et al. Diabetes care quality in the Veterans Affairs Health Care System and commercial managed care: The TRIAD study. Ann Intern Med. 2004;141:272–281.
9 Perlin JB, Kolodner RM, Roswell RH. The Veterans Health Administration: Quality, value, accountability, and information as transforming strategies for patient-centered care. Am J Manag Care. 2004;10:828–836.
10 Weber DO. Survey reveals physicians' love/hate relationship with technology. Phys Exec. March–April 2004:4–10.
11 Coyle GA, Heinen M. Evolution of BCMA in the Department of Veterans Affairs. Nurs Adm Q. January–March 2005;29:32–38.
12 Wright AA, Katz IT. Bar coding for patient safety. N Engl J Med. 2005;353:329–331.
13 Sivaram CA, Attebery S, Boyd AL, et al. Introducing case management to a general medicine ward team of a teaching hospital. Acad Med. 1997;72:555–557.
14 Rosenblatt RA, Andrilla CH. The impact of U.S. medical students' debt on their choice of primary care careers: An analysis of data from the 2002 medical school graduation questionnaire. Acad Med. 2005;80:815–819.
15 Pugnaire MP, Purwono U, Zanetti ML, Carlin MM. Tracking the longitudinal stability of medical students' perceptions using the AAMC graduation questionnaire and serial evaluation surveys. Acad Med. 2004;79(10 suppl):S32–S35.
© 2009 Association of American Medical Colleges