The 2007 and 2008 surveys encompassed residents in all core specialty programs, regardless of size, and fellows in subspecialty programs with four or more learners. Aggregated report data were provided to programs and institutions with at least four residents or fellows, as part of the ACGME's continuing efforts to safeguard resident confidentiality. Programs were randomly assigned to participate in either 2007 or 2008; assignment was made within each specialty, so that all programs in a given specialty were surveyed in the same year.
The ACGME collected survey data between January and early June of both 2007 and 2008. Because of the high volume of residents accessing the Web-based survey, we scheduled survey participation across four time periods, with roughly equal numbers of programs participating in each period. We wanted to give program directors ample time to inform residents of the upcoming survey and residents ample time to participate. Each program had approximately five weeks in which to complete the survey. Access to the survey was provided through a site secured with 128-bit encryption, which ensured the safe transfer of information. When surveys were complete, participants password-protected their responses to ensure confidentiality. In addition, residents were notified that no individual responses would be given to the program director, faculty, sponsoring institution, or ACGME residency review committee. They were also told that summary data could be used to inform ACGME policy decisions at the national level and might be published in a manner appropriate to furthering the quality of graduate medical education.
The American Institutes for Research in Washington, DC, conducted a retrospective review of this project with respect to the protection of human subjects. They deemed it exempt from further review.
The ACGME required programs to participate in the survey in both years if they were involved in specialty-specific pilot projects (n = 105), if they did not reach the required 70% response rate in the previous year (n = 60), if they were operating under an exception to the 80-hour duty week (n = 63), or if they were identified by their duty hours responses in the 2007 survey as potentially being significantly noncompliant with duty hours limits (n = 115). Several programs met more than one of these criteria. If a program participated twice in the survey for any of these reasons, we report only the first administration in the aggregate data below, with the exception of the test–retest results.
To assess validity, we compared data from the resident survey with the number and type of citations given by ACGME residency review committees. These data from the six-month period immediately preceding the resident survey were available for 1,140 programs. We believed that it was important to confine our data selection to that period because the resident survey results may be used by review committees to determine the number and/or type of citation to issue. In this way, we were able to ensure that the resident survey data had not influenced the type or number of citations given by the review committee but, rather, that the data reflected the circumstances in the program at or near the time of the site visit to the program.
To identify programs that potentially were substantially noncompliant, we aggregated noncompliant survey responses into two summary scores, each weighted by program size. One score was the mean of all noncompliant responses for the entire survey, and the other was the mean of the responses to the duty hours items only. We then compared the likelihood that a program would score at or above the 95th percentile (the programs that potentially were substantially noncompliant) or at or below the 5th percentile (the programs that potentially were substantially compliant) with the decisions of the review committee.
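The flagging step described above can be sketched in code. This is an illustrative reconstruction only, not the authors' procedure: the program names, response counts, per-program scoring, and nearest-rank percentile method are all assumptions made for the example.

```python
# Illustrative sketch (not the authors' code) of flagging programs:
# aggregate each program's noncompliant responses into a mean score,
# then mark programs at or above the 95th percentile (potentially
# substantially noncompliant) and at or below the 5th percentile
# (potentially substantially compliant). All data below are invented.

def percentile(sorted_vals, q):
    """Nearest-rank percentile of an ascending list (q in [0, 100])."""
    k = round(q / 100 * len(sorted_vals)) - 1
    k = max(0, min(len(sorted_vals) - 1, k))
    return sorted_vals[k]

# program id -> (noncompliant responses, total responses); hypothetical
programs = {f"prog_{i:02d}": (i % 7, 40 + i) for i in range(40)}

# mean fraction of noncompliant responses per program
scores = {p: nc / n for p, (nc, n) in programs.items()}
ordered = sorted(scores.values())

hi_cut = percentile(ordered, 95)  # potential substantial noncompliance
lo_cut = percentile(ordered, 5)   # potential substantial compliance

flagged_noncompliant = sorted(p for p, s in scores.items() if s >= hi_cut)
flagged_compliant = sorted(p for p, s in scores.items() if s <= lo_cut)
```

In the survey itself, these two flagged groups were then compared with the review committees' decisions.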
Our statistical analysis used simple correlations, Cronbach alpha, and common factor analyses with varimax rotation to assess internal consistency. To assess relationships between duty hours issues and educational environment, both within the survey and with external data, we used simple correlations, t tests, and logistic regression. All analyses were conducted with SAS software (version 9.1.3; SAS Institute, Inc., Cary, North Carolina).
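As a concrete illustration of the internal-consistency statistic named above, a minimal Cronbach alpha computation might look like the following. This is a sketch with invented item responses, not the authors' SAS code.

```python
# Minimal sketch of the Cronbach alpha statistic reported for the
# survey; the item response data used in testing are invented.

def cronbach_alpha(items):
    """items: one list of respondent scores per survey item,
    all lists the same length (one entry per respondent)."""
    k = len(items)       # number of items
    n = len(items[0])    # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_var = sum(var(it) for it in items)
    totals = [sum(it[j] for it in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / var(totals))
```

Perfectly parallel items yield an alpha of 1; observed values in the 0.79 to 0.84 range, as reported below, indicate good consistency without item redundancy.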
Response rate and sample description
Across the two years of the survey, 5,614 programs (102,757 residents) were selected for participation. Residents from 5,610 programs participated, for a program response rate of 99.9%. A total of 91,073 residents participated, for a resident response rate of 88.6%. The average completion time for the survey was 8.08 minutes (SD: 6.13 minutes).
Response rates for the participating programs ranged from 16% to 100%; 127 programs (2.3%) failed to achieve the 70% response rate. A large number of programs (n = 2,470; 44%) had response rates of 100%. Response rate frequencies are shown in Table 2.
The average age of the respondents was 34.1 years. Table 3 shows other demographic data for the sample. This sample is representative of the entire U.S. residency program population, and further demographic information about the population can be accessed at the ACGME Web site in the ACGME Data Resource Book (http://www.acgme.org/databook).
Table 1 shows the percentage of resident responses that indicated potential program noncompliance for each item. The item (i.e., question) that residents most frequently reported as noncompliant was Question 19 (Q19; rotations and other major assignments emphasize clinical education over any other concerns); the least frequently identified as noncompliant was Q29, which covered internal moonlighting. Of the duty hours standards, 1.7% of respondents reported noncompliance with the requirement that “call be no more frequent than every third night”; 8.5% identified “10 hours of rest” as an area of noncompliance.
In 169 programs, no residents identified any area of noncompliance (i.e., the program had zero noncompliant responses). This accounted for 3% of programs, but 30% of the residents. A total of 2,066 programs (37%) had zero noncompliant duty hours responses.
In the second year, 276 programs were resurveyed, according to the criteria noted above. We used only these programs' first-year (2007) responses in our analyses, except for the retest statistics presented below.
Internal consistency and reliability
The Cronbach alpha for the entire survey was 0.84, and the item–total correlations ranged from 0.12 (for Q16, interference from other learners) to 0.49 (for Q1, faculty teaching). For the duty hours section (Q20–Q29), the alpha was 0.80, and the item–total correlations ranged from 0.33 (for Q29) to 0.56 (for Q20). For the non-duty-hours educational environment section (Q1–Q19), the alpha was 0.79, and the item–total correlations ranged from 0.09 to 0.54 (for Q16 and Q1, respectively).
To ascertain the survey's test–retest reliability, we examined the subset of residents completing the survey within the same program during both administrations. Within this sample of 276 programs (3,403 residents), the correlations between the same items at time 1 (2007) and time 2 (2008) ranged from 0.19 (Q29, internal moonlighting) to 0.44 (Q17, mechanisms to resolve issues without fear of intimidation).
In the overall sample, common factor analysis with a varimax rotation revealed a two-factor solution. Eigenvalues for the first two factors (5.49 for the first factor and 2.42 for the second) accounted for 27% of the total variance. The proportion of variance accounted for by each of the remaining factors was ≤5%. Examination of the rotated factor pattern showed the first factor to be a general “educational environment” factor, with item loadings ranging from 0.32 to 0.65. The second factor was clearly a “duty hours” factor, and all of the duty hours items loaded heavily (0.46–0.65) on this factor. There were no items cross-loading on the two factors (i.e., no items loading at ≥0.3 on both factors). However, one item (Q16—do other trainees interfere with your education?) did not load appreciably on either factor. Loadings from the rotated factor pattern are shown in Table 2.
Relationship between duty hours and the educational environment
As evident in the correlations presented above, the survey has high internal consistency. We attempted to discern whether a relationship exists between duty hours item responses and broader educational environment issues.
Correlations between duty hours items and educational environment items ranged from 0.03 to 0.23. The strongest relationships were found for Q13 (fatigue and sleep deprivation) and Q20 (exceed the 80-hour week limit on duty hours)—a value of 0.23; Q17 (mechanism to resolve issues without fear of intimidation) and Q22 (inadequate time for rest between duty periods)—a value of 0.23; and Q13 (fatigue and sleep deprivation) and Q22 (inadequate time for rest between duty periods)—a value of 0.22. The correlation was 0.41 between the total number of noncompliant duty hours items and the total number of noncompliant educational environment items. All correlations were significant at P < .0001.
Logistic regression demonstrated that noncompliant duty hours item responses are strongly related to issues in other aspects of the education program. Residents not meeting at least one duty hours standard within their programs were much more likely also to have reported noncompliance with Q1 (faculty teaching) (OR: 1.96; 95% CI: 1.87, 2.06), Q2 (faculty supervision) (OR: 1.85; 95% CI: 1.73, 1.97), Q16 (service obligations) (OR: 1.42; 95% CI: 1.37, 1.48), and Q17 (mechanisms to resolve issues without fear of intimidation) (OR: 3.5; 95% CI: 3.38, 3.64).
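For a single binary predictor, the odds ratios reported above can be obtained directly from a 2x2 contingency table, because logistic regression with one dichotomous covariate reproduces the cross-product ratio. The sketch below, with invented cell counts, shows that computation and an approximate 95% confidence interval from the log-OR standard error (Woolf's method); it is not the authors' model.

```python
# Hypothetical sketch: odds ratio and approximate 95% CI from a 2x2
# table. Cell counts are invented for illustration only.
import math

def odds_ratio(a, b, c, d):
    """a: exposed with outcome, b: exposed without outcome,
    c: unexposed with outcome, d: unexposed without outcome."""
    return (a * d) / (b * c)

def or_ci(a, b, c, d, z=1.96):
    """Woolf-method confidence interval on the log-odds-ratio scale."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)
```

An interval excluding 1 (as with all four odds ratios reported above) indicates a statistically significant association.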
To ascertain the validity of survey responses, we compared program-level data on compliance with citations given by residency review committees in the review cycle that immediately preceded the administration of the survey for each program. A logistic regression analysis of these data shows that programs scoring at or above the 95th percentile on noncompliant duty hours responses were 2.04 times more likely to have at least one duty hours citation than were programs not scoring at or above the 95th percentile (95% CI: 1.03, 4.05). However, the two groups did not differ significantly in the total number of duty hours citations they received.
The logistic regression comparing the likelihood of any citation for those programs scoring at or above the 95th percentile on all the survey questions (not just the duty hours questions) was inconclusive (OR: 0.95; 95% CI: 0.49, 1.82). However, significantly fewer total citations (of any kind) were given to programs scoring at or below the 5th percentile of noncompliant responses on the total survey: M = 2.16 compared with M = 3.72; t(70.3) = 4.34; P < .0001.
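The fractional degrees of freedom reported above (70.3) indicate a Welch-type t test for unequal variances. A minimal sketch of that statistic follows; the input lists stand in for per-program citation counts and are hypothetical, not the study data.

```python
# Sketch of the Welch (unequal-variance) t statistic, whose
# Welch-Satterthwaite approximation produces fractional degrees of
# freedom like the df = 70.3 reported above. Inputs are hypothetical.
import math

def welch_t(xs, ys):
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)  # sample variances
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    se2 = vx / nx + vy / ny
    t = (mx - my) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df
```

When the two samples have equal variances, the approximation recovers the usual pooled degrees of freedom; unequal variances pull df below that value, which is why a non-integer df appears in the result above.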
Discussion
We describe here the ACGME's Resident/Fellow Survey with respect to its internal consistency, reliability, and relationships between duty hours and educational environment items. Our findings show the survey to be internally consistent and reliable as well as strongly related to ACGME residency review committee decisions. Because of the scope of the survey (88% of residents at almost 100% of residency programs with four or more residents or fellows were included), it is highly representative of residents' assessments of their graduate medical education experiences and training.
The survey demonstrates adequate internal consistency and reliability, with all but one item (Q16—interference from other learners) showing moderate-to-high internal correlations. The test–retest statistics show weak-to-moderate relationships; the strongest of these was for a structural element of residency (the mechanisms available to resolve issues), which logically should remain stable across a one-year period. Factor analyses reveal two separate factors (the educational environment and duty hours) that correspond to areas cited by review committees; thus, these distinctions may be useful in examining a program's pattern of compliance. However, some correlations among the items suggest that the areas are not mutually exclusive.
Issues identified by the survey are apparent not only to residents but also to ACGME residency review committees. The committees were more than twice as likely to cite duty hours issues in programs having noncompliant duty hours responses than in programs in which residents did not have noncompliant duty hours responses. Similarly, programs in which respondents identified no issues in any area on the survey received significantly fewer citations than did those programs for which residents indicated noncompliance.
Deficiencies in the educational environment were more likely to be identified by residents in programs identified as having potential duty hours violations. Our results are consistent with several studies that suggested that duty hours issues are related to other issues in residency education.16,17 Our data show that one of the strongest relationships is the relationship between inadequate time for rest between shifts and the lack of a mechanism to address issues without fear of intimidation. At least two possible explanations exist for this finding. First, it may be that resource-poor programs rely excessively on residents to provide patient care, and resource scarcity may also include the lack of structures to handle issues and complaints that arise. Second, residents who feel particularly overworked may perceive, perhaps incorrectly, that few channels exist through which they can address their concerns.
Another interesting finding in our data is the relatively large percentage (37%) of programs in which residents reported no duty hours noncompliance. In contrast, the percentage of programs in which residents reported no noncompliance of any kind was relatively small (3%). It may be that residency education and training have become increasingly structured to satisfy duty hours standards. The contrast between these findings may also reflect the recent and intense focus on reducing duty hours, perhaps to the exclusion of addressing other aspects of the educational environment.18,19 Limiting duty hours may not be a panacea, and awareness of the relationships between duty hours and other aspects of the educational environment may be useful in improving resident education and patient care.20
Our results show a significant proportion of residents who report deficiencies in other, non-duty-hours dimensions of the educational and learning environment. These reports involve concerns about the inability of residents to raise issues without fear of intimidation (Q17), the balance between excessive service and education (Q19), and the adverse impact of other trainees on their education (Q16).
The results of the survey show it to be a useful tool in the assessment of residency accreditation. These analyses do, however, have several limitations. First, the data, although representative of the larger U.S. residency programs, do not include the many subspecialty programs with fewer than four fellows. The programs training one or two fellows per year present a challenge to the collection of reliable, valid, and—perhaps most important—confidential information. The ACGME continues to work with its constituents to identify methods for confidentially collecting meaningful and usable accreditation data for these small programs.
Second, it is possible that residents may have misunderstood questions or been unaware of ACGME requirements, which may have affected their responses. Residents with questions were encouraged (within the survey) to contact us, and our help desk estimates that less than 5% of the calls were about question content or meaning.
Third, residents may have been concerned about the confidentiality of their responses, or they may have felt coerced by program directors to provide positive responses to the survey questions. Thus, they may not have offered completely candid assessments of their programs' functioning. However, our data do include critical evaluations of many programs, so this possible problem, though it may affect a few residents, does not seem to be widespread.
Fourth, our evidence for validity is quite strong, and there is a compelling relationship between the survey and residency review committee citations. At the same time, there is no source of external (non-ACGME) data against which to compare our findings. As we continue to effectively assess and monitor residency education, we will rely on the ACGME residency review committees to link decisions and citations to objective, measurable outcomes.
The survey will continue to play an important role in the ACGME's new and evolving accreditation model, which will shift the focus from an episodic review of program structure and function to a continuous assessment of program effectiveness and outcomes. In the interval between site visits, which is likely to lengthen in the proposed future accreditation model, data from annual resident surveys, together with other reliable and valid measures of program quality and functioning, will substantially inform the accreditation process.
The ACGME Resident/Fellow Survey is a reliable, valid, and useful tool for the evaluation of residency programs. Although formal data collection and assessment are becoming more common in medical education accreditation, this, to our knowledge, is the first such data-gathering tool that has been validated and reported in the medical literature. Program directors and designated institutional officials may find their program- and institution-specific data useful in internal program evaluations as well as for informing their improvement efforts. Our data show that duty hours issues are often linked with other aspects of the educational environment and, moreover, that duty hours issues identified by residents are often also noted by residency review committees. The ACGME will continue to examine thresholds of survey responses to allow for finer discrimination among residency programs. Residents are a pivotal source of information as we assess recent efforts to balance service and education through the limitation of duty hours. This survey, together with other tools under development at the ACGME, will permit ACGME residency review committees to assess the effects that educational program changes have on resident educational outcomes. We must heed the recent calls to reduce duty hours and increase the quality of patient care, but we also must do no harm to the educational system or to the caregivers themselves.
The authors wish to thank Nehemiah Ellison, Erle Fajardo, Christopher Jordan, and Steve Nash for their help in data preparation and application programming; the ADS team—Samantha Alvarado, Timothy Goldberg, Rachel Eng, Andrew Turkington, and Emilio Villatoro—for their support in resident survey data collection; and Kavitha Reinhold for her editorial support and advice.
1Byrne JM, Loo LK, Giang D. Monitoring and improving resident work environment across affiliated hospitals: A call for a national resident survey. Acad Med. 2009;84:199–205.
2Heard JK, O'Sullivan P, Smith CE, Harper RA, Schexnayder SM. An institutional system to monitor and improve the quality of residency education. Acad Med. 2004;79:858–864.
3Roth LM, Severson RK, Probst JC, et al. Exploring physician and staff perceptions of the learning environment in ambulatory residency clinics. Fam Med. 2006;38:177–184.
4Davenport DL, Henderson WG, Hogan S, Mentzer RM Jr, Zwischenberger JB; Participants in the Working Conditions of Surgery Residents and Quality of Care Study. Surgery resident working conditions and job satisfaction. Surgery. 2008;144:332–338.e5.
5Klessig JM, Wolfsthal SD, Levine MA, et al. A pilot survey study to define quality in residency education. Acad Med. 2000;75:71–73.
6Yudkowsky R, Elliott R, Schwartz A. Two perspectives on the indicators of quality in psychiatry residencies: Program directors' and residents'. Acad Med. 2002;77:57–64.
7Thrush CR, Hicks EK, Tariq SG, et al. Optimal learning environments from the perspective of resident physicians and associations with accreditation length. Acad Med. 2007;82(10 suppl):S121–S125.
8Daugherty SR, Baldwin DC Jr, Rowley BD. Learning, satisfaction, and mistreatment during medical internship: A national survey of working conditions. JAMA. 1998;279:1194–1199.
9Keirns CC, Bosk CL. Perspective: The unintended consequences of training residents in dysfunctional outpatient settings. Acad Med. 2008;83:498–502.
10Immerman I, Kubiak EN, Zuckerman JD. Resident work-hour rules: A survey of residents' and program directors' opinions and attitudes. Am J Orthop. 2007;36:E172–E179.
11Golub JS, Weiss PS, Ramesh AK, Ossoff RH, Johns MM 3rd. Burnout in residents of otolaryngology–head and neck surgery: A national inquiry into the health of residency training. Acad Med. 2007;82:596–601.
12Schneider JR, Coyle JJ, Ryan ER, Bell RH Jr, DaRosa DA. Implementation and evaluation of a new surgical residency model. J Am Coll Surg. 2007;205:393–404.
13Swide CE, Kirsch JR. Duty hours restrictions and their effect on resident education and academic departments: The American perspective. Curr Opin Anaesthesiol. 2007;20:580–584.
14Basu CB, Chen LM, Hollier LH Jr, Shenaq SM. The effect of the Accreditation Council for Graduate Medical Education duty hours policy on plastic surgery resident education and patient care: An outcomes study. Plast Reconstr Surg. 2004;114:1878–1886.
15Philibert I, Friedman P, Williams WT. New requirements for resident duty hours. JAMA. 2002;288:1112–1114.
16Strunk CL, Bailey BJ, Scott BA, et al. Resident work hours and working environment in otolaryngology: Analysis of daily activity and resident perception. JAMA. 1991;266:1371–1374.
17Baldwin DC Jr, Daugherty SR, Tsai R, Scotti MJ Jr. A national survey of residents' self-reported work hours: Thinking beyond specialty. Acad Med. 2003;78:1154–1163.
18Myers JS, Bellini LM, Morris JB, et al. Internal medicine and general surgery residents' attitudes about the ACGME duty hours regulations: A multicenter study. Acad Med. 2006;81:1052–1058.
19Mathis BR, Diers T, Hornung R, Ho M, Rouan GW. Implementing duty-hour restrictions without diminishing patient care or education: Can it be done? Acad Med. 2006;81:68–75.
20Longnecker DE. Resident duty hours reform: Are we there yet? Acad Med. 2006;81:1017–1020.
© 2010 Association of American Medical Colleges