Academic Medicine, March 2010 - Volume 85 - Issue 3
doi: 10.1097/ACM.0b013e3181ccc1db
Graduate Medical Education

Residents' Perspectives on the Learning Environment: Data From the Accreditation Council for Graduate Medical Education Resident Survey

Holt, Kathleen D. PhD; Miller, Rebecca S. MS; Philibert, Ingrid PhD, MBA; Heard, Jeanne K. MD, PhD; Nasca, Thomas J. MD

Author Information

Dr. Holt is senior data analyst, Accreditation Council for Graduate Medical Education, Chicago, Illinois, and adjunct research professor, Family Medicine, University of Rochester, Rochester, New York.

Ms. Miller is vice president, Applications and Data Analysis, Accreditation Council for Graduate Medical Education, Chicago, Illinois.

Dr. Philibert is senior vice president, Field Activities, Accreditation Council for Graduate Medical Education, Chicago, Illinois.

Dr. Heard is senior vice president, Accreditation Committees, Accreditation Council for Graduate Medical Education, Chicago, Illinois, and adjunct professor, Medical Education, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. Nasca is chief executive officer, Accreditation Council for Graduate Medical Education, Chicago, Illinois, and professor, Department of Medicine, Jefferson Medical College, Philadelphia, Pennsylvania.

Correspondence should be addressed to Dr. Holt, Accreditation Council for Graduate Medical Education, 515 North State Street, Suite 2000, Chicago, IL 60654; telephone: (312) 755-7481; fax: (312) 755-7493; e-mail: Kholt@acgme.org.

Abstract

Purpose: Residents' assessment of their learning environment is an important element of residency accreditation and a strong predictor of resident satisfaction. The authors examined the reliability and validity of a resident/fellow survey and explored the relationship between reported duty hours noncompliance and residents' perceptions of other aspects of their learning environments.

Method: The Accreditation Council for Graduate Medical Education (ACGME) administered a 29-item Web-based survey in 2007 and 2008 to 91,073 residents in 5,610 programs. Aggregate data from the survey comprised indicators of substantial compliance or noncompliance. The authors examined relationships among duty hours and aspects of the educational environment, as well as the relationship of the survey results to citations from accreditation reviews.

Results: The survey demonstrated a high degree of internal reliability (Cronbach alpha, 0.84). Common factor analysis revealed two factors, educational environment and resident duty hours (eigenvalues of 5.49 and 2.42, respectively). Programs having resident-identified duty hours issues were more likely than those without such issues to have received duty hours citations from residency review committees (odds ratio: 2.04; 95% CI: 1.03, 4.05).

Conclusions: The ACGME Resident/Fellow Survey is a reliable, valid, and useful tool for evaluating residency programs. There are strong relationships between duty hours noncompliance and noncompliance in other aspects of the program environment.

The Accreditation Council for Graduate Medical Education (ACGME) accredits specialty and subspecialty graduate medical education programs. The assessment of program quality is an important part of the accreditation process, which depends on systematic and comprehensive data collection. The ACGME collects accreditation data on an ongoing basis from residents and fellows, as well as from program directors and faculty. These data include information on residency program curriculum design and implementation, evaluation systems, institutional support and resources, clinical care systems, faculty and resident scholarly activities, residents' operative experiences, program administrative resources, and residents' evaluation of the program. The ACGME specialty-specific residency review committees use this information to make accreditation decisions (e.g., full accreditation or probation), to set the next date of on-site review, and to issue citations for areas that require improvement. The primary tool the ACGME uses for assessing residents' evaluation of their programs is the ACGME Resident/Fellow Survey.

Residents' evaluations of their programs are a vital source of information about residency program quality, and they have been useful for planning program evaluation and improvement.1–3 In addition to predicting resident satisfaction, residents' evaluations of their work environments are related to their perceptions of the quality of the patient care that they provide.4

Residents' views of their programs are corroborated by others in the educational environment. A survey of internal medicine residents and program directors found a high rate of agreement (r = 0.91) between the attributes that residents and program directors reported as important to program quality, including faculty characteristics, supervision, institutional support, and clinical skills.5 Program directors also agree with residents that curriculum and clinical resources contribute to a successful residency.6

Residents' opinions and assessment of faculty quality have been found to be associated with the length of the program accreditation cycle.7 Moreover, although most residents are satisfied with their residency programs, residents' dissatisfaction may affect learning and may even encourage residents to change their specialty or to select a subspecialty rather than primary care practice.8,9

Duty hours are a particularly salient aspect of residents' evaluations of their programs. A reduction in duty hours seems to positively affect residents' perceptions of their quality of life and may affect their assessments of other aspects of the educational program.10–12 However, there is some evidence that a reduction in duty hours in and of itself may not increase either educational effectiveness or residents' quality of life.13,14

This report presents national normative data from the ACGME Resident/Fellow Survey. This instrument, based on the ACGME common program requirements, gathers information relevant to program quality and resident satisfaction. To date, there have been no other published assessments of the reliability, consistency, and validity of the ACGME Resident/Fellow Survey data.

In our analyses, we explored the survey's internal consistency, reliability, and validity. The primary goal of these analyses was to show the strong relationship between residents' responses to the duty hours questions and their responses to questions about other aspects of the educational environment. Secondarily, we examined the relationship between noncompliant survey responses and ACGME accreditation decisions; those findings demonstrate the survey's validity. Finally, we explored the value of these survey data in monitoring duty hours compliance and in assessing residency programs' educational environments.

Evolution of the Resident Survey

To enhance and supplement the on-site resident interviews (the face-to-face meetings between site reviewers and a peer-selected cohort of residents), the ACGME began in 2003 to survey residents and fellows during program accreditation site visits. The initiation of this survey also coincided with implementation of the resident duty hours limits that apply to all residents and fellows.15 In the course of program review, ACGME site surveyors are asked to verify and clarify all information that a program provides to the ACGME, including information on faculty rosters, rotation schedules, curriculum design, documentation of resident evaluation, and residents' survey responses. A Web-based pilot test was conducted in the spring of 2003; in January 2004, the survey was administered to residents enrolled in approximately one-third of the ACGME-accredited programs. This initial group of programs was selected because site visits to these programs were scheduled for that year or the next; over the following two years, the survey was administered to the remaining two-thirds of the programs.

We asked that each program achieve a minimum completion rate of 70%. The ACGME established the 70% threshold to ensure that an adequate and representative sample of residents' opinions was obtained and to protect the confidentiality of each resident's response, especially in smaller programs with fewer residents. Although seeking accreditation is voluntary, the accreditation process carries expectations that must be met. Any program not meeting the 70% threshold received a letter from the chief executive officer of the ACGME explaining that a further failure to comply could result in an adverse accreditation action. This is the same data-collection policy that governs ACGME-requested data of any kind, including residents' case log data.

In 2006, after three years of initial data collection (2004, 2005, and 2006), we reviewed all of the survey feedback received from ACGME constituents. We then reassessed the survey's design and implementation and consulted with the University of Illinois Survey Research Laboratory to eliminate or rewrite poorly performing, confusing, or low-variability items.

The 29 items in this revised survey are shown in Table 1. One additional item (“If you noted any issues with duty hours in the section above, would you say that those issues occurred mostly on rotations to other services outside your specialty?”) was included for the programs' use only and is not included in these analyses. Item 17 (“Are there mechanisms within the institution available to you so that you may raise and resolve issues without fear of intimidation or retaliation?”) was worded differently in the initial year of the survey. However, the different versions of this question did not affect our results, and the data were combined for the analyses below. Additional information, including the actual survey, instructions for completion, and sample reports, is available on the ACGME Web site (https://www.acgme.org/acWebsite/Resident_Survey/res_Index.asp).

The 2007 and 2008 surveys encompassed residents in all core specialty programs, regardless of size, and fellows in subspecialty programs with four or more learners. Aggregated report data were provided to the programs and institutions with at least four residents or fellows, as part of the ACGME's continuing efforts to safeguard resident confidentiality. Programs were randomly assigned to survey participation in either 2007 or 2008; assignment was made within each specialty, so that all programs in a specialty were surveyed in the same year.

Method

The ACGME collected survey data between January and early June of both 2007 and 2008. Because of the high volume of residents accessing the Web-based survey, we scheduled survey participation across four time periods, with roughly equal numbers of programs participating in each period. This gave program directors ample time to inform residents of the upcoming survey and gave residents ample time to participate. Each program had approximately five weeks in which to complete the survey. The survey was accessed through a site secured with 128-bit encryption, which ensured the safe transfer of information. When surveys were complete, participants password-protected their responses to ensure confidentiality. In addition, residents were notified that no individual responses would be given to the program director, faculty, sponsoring institution, or ACGME residency review committee. They were also told that summary data could be used to inform ACGME policy decisions at the national level and might be published in a manner appropriate to furthering the quality of graduate medical education.

The American Institutes for Research in Washington, DC, conducted a retrospective review of this project with respect to the protection of human subjects. They deemed it exempt from further review.

The ACGME required programs to participate in the survey in both years if they were involved in specialty-specific pilot projects (n = 105), if they did not reach 70% compliance in the previous year (n = 60), if they were operating under an exception to the 80-hour weekly duty limit (n = 63), or if their duty hours responses in the 2007 survey identified them as potentially being significantly noncompliant with duty hours limits (n = 115). Several programs met more than one of these criteria. If a program participated in the survey twice for any of these reasons, we report only the first administration in the aggregate data below, with the exception of the test–retest results.

To assess validity, we compared data from the resident survey with the number and type of citations given by ACGME residency review committees. These data from the six-month period immediately preceding the resident survey were available for 1,140 programs. We believed that it was important to confine our data selection to that period because the resident survey results may be used by review committees to determine the number and/or type of citation to issue. In this way, we were able to ensure that the resident survey data had not influenced the type or number of citations given by the review committee but, rather, that the data reflected the circumstances in the program at or near the time of the site visit to the program.

To identify programs that potentially were substantially noncompliant, we aggregated noncompliant survey responses into two summary scores, each weighted by program size. One score was the mean of all noncompliant responses for the entire survey, and the other was the mean of the responses to the duty hours items only. We then compared the likelihood that a program would score at or above the 95th percentile (the programs that potentially were substantially noncompliant) or at or below the 5th percentile (the programs that potentially were substantially compliant) with the decisions of the review committee.
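
These summary scores are simple means; the sketch below illustrates the aggregation and percentile flagging on synthetic data. All variable names, the data layout, and the data itself are illustrative assumptions, not the ACGME's actual data structures or SAS code.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic stand-in for resident-level data: one row per resident, with
# 0/1 flags marking a noncompliant answer to each of Q1-Q29 (illustrative).
n_residents = 1000
df = pd.DataFrame(rng.binomial(1, 0.05, size=(n_residents, 29)),
                  columns=[f"q{i}" for i in range(1, 30)])
df["program_id"] = rng.integers(0, 50, size=n_residents)

all_items = [f"q{i}" for i in range(1, 30)]    # entire survey
duty_items = [f"q{i}" for i in range(20, 30)]  # duty hours items (Q20-Q29)

# Per-resident fraction of noncompliant responses, averaged by program;
# averaging over each program's residents weights the score by program size.
df["total_nc"] = df[all_items].mean(axis=1)
df["duty_nc"] = df[duty_items].mean(axis=1)
prog = df.groupby("program_id")[["total_nc", "duty_nc"]].mean()

# Flag potentially substantially noncompliant (>= 95th percentile) and
# potentially substantially compliant (<= 5th percentile) programs.
for col in ("total_nc", "duty_nc"):
    prog[f"{col}_p95"] = prog[col] >= prog[col].quantile(0.95)
    prog[f"{col}_p05"] = prog[col] <= prog[col].quantile(0.05)
```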

Our statistical analysis used simple correlations, Cronbach alpha, and common factor analyses with varimax rotation to assess internal consistency. To assess relationships between duty hours issues and the educational environment, both within the survey and with external data, we used simple correlations, t tests, and logistic regression. All analyses were conducted with SAS software (version 9.1.3; SAS Institute, Inc., Cary, North Carolina).
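
For reference, both reliability statistics reduce to simple operations on the respondents-by-items score matrix. A minimal sketch follows, in Python rather than the SAS actually used; the corrected item-total correlation shown (each item against the sum of the remaining items) is one common variant and is our assumption about the computation.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the total of the remaining items."""
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])
```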

Results

Response rate and sample description

Across the two years of the survey, 5,614 programs (102,757 residents) were selected for participation. Residents from 5,610 programs participated, for a program response rate of 99.9%. A total of 91,073 residents participated, for a resident response rate of 88.6%. The average completion time for the survey was 8.08 minutes (SD: 6.13 minutes).

Response rates for the participating programs ranged from 16% to 100%; 127 programs (2.3%) failed to achieve the 70% response rate. A large number of programs (n = 2,470; 44%) had response rates of 100%. Response rate frequencies are shown in Table 2.

The average age of the respondents was 34.1 years. Table 3 shows other demographic data for the sample. This sample is representative of the entire U.S. residency program population, and further demographic information about the population can be accessed at the ACGME Web site in the ACGME Data Resource Book (http://www.acgme.org/databook).

Noncompliant responses

Table 1 shows the percentage of resident responses that indicated potential program noncompliance for each item. The item (i.e., question) that residents most frequently reported as noncompliant was Question 19 (Q19; rotations and other major assignments emphasize clinical education over any other concerns); the least frequently identified as noncompliant was Q29, which covered internal moonlighting. Of the duty hours standards, 1.7% of respondents reported noncompliance with the requirement that “call be no more frequent than every third night”; 8.5% identified “10 hours of rest” as an area of noncompliance.

In 169 programs, no residents identified any area of noncompliance (i.e., the program had zero noncompliant responses). This accounted for 3% of programs, but 30% of the residents. A total of 2,066 programs (37%) had zero noncompliant duty hours responses.

In the second year, 276 programs were resurveyed, according to the criteria noted above. We used only these programs' first-year (2007) responses in our analyses, except for the retest statistics presented below.

Internal consistency and reliability

The Cronbach alpha for the entire survey was 0.84, and the item–total correlations ranged from 0.12 (for Q16, interference from other learners) to 0.49 (for Q1, faculty teaching). For the duty hours section (Q20–Q29), the alpha was 0.80, and the item–total correlations ranged from 0.33 (for Q29) to 0.56 (for Q20). For the non-duty-hours educational environment section (Q1–Q19), the alpha was 0.79, and the item–total correlations ranged from 0.09 to 0.54 (for Q16 and Q1, respectively).

To ascertain the survey's test–retest reliability, we examined the subset of residents completing the survey within the same program during both administrations. Within this sample of 276 programs (3,403 residents), the correlations between the same items at time 1 (2007) and time 2 (2008) ranged from 0.19 (Q29, internal moonlighting) to 0.44 (Q17, mechanisms to resolve issues without fear of intimidation).

In the overall sample, common factor analysis with a varimax rotation revealed a two-factor solution. Eigenvalues for the first two factors (5.49 for the first factor and 2.42 for the second) accounted for 27% of the total variance. The proportion of variance accounted for by each of the remaining factors was ≤5%. Examination of the rotated factor pattern showed the first factor to be a general “educational environment” factor, with item loadings ranging from 0.32 to 0.65. The second factor was clearly a “duty hours” factor, and all of the duty hours items loaded heavily (0.46–0.65) on this factor. There were no items cross-loading on the two factors (i.e., no items loading at ≥0.3 on both factors). However, one item (Q16—do other trainees interfere with your education?) did not load appreciably on either factor. Loadings from the rotated factor pattern are shown in Table 2.
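
As an illustration of this style of analysis, the sketch below extracts a two-factor varimax-rotated solution on synthetic data. It substitutes scikit-learn's maximum-likelihood factor analysis for the SAS common factor procedure the study used, so its output is not comparable to the values reported above.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Synthetic respondents x items matrix of 0/1 noncompliance indicators;
# a real analysis would use the actual survey response matrix.
X = rng.binomial(1, 0.1, size=(5000, 29)).astype(float)

# Eigenvalues of the item correlation matrix, in descending order, guide
# how many factors to retain (the survey data gave 5.49 and 2.42).
eigvals = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]

# Two-factor solution with a varimax rotation (requires scikit-learn >= 0.24).
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
loadings = fa.components_.T                           # items x factors
cross_loaded = (np.abs(loadings) >= 0.3).all(axis=1)  # loads on both factors
```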

Relationship between duty hours and the educational environment

As evident in the correlations presented above, the survey has high internal consistency. We attempted to discern whether a relationship exists between duty hours item responses and broader educational environment issues.

Correlations between duty hours items and educational environment items ranged from 0.03 to 0.23. The strongest relationships were found for Q13 (fatigue and sleep deprivation) and Q20 (exceed the 80-hour week limit on duty hours)—a value of 0.23; Q17 (mechanism to resolve issues without fear of intimidation) and Q22 (inadequate time for rest between duty periods)—a value of 0.23; and Q13 (fatigue and sleep deprivation) and Q22 (inadequate time for rest between duty periods)—a value of 0.22. The correlation was 0.41 between the total number of noncompliant duty hours items and the total number of noncompliant educational environment items. All correlations were significant at P < .0001.

Logistic regression demonstrated that noncompliant duty hours responses are strongly related to issues in other aspects of the educational program. Residents who reported noncompliance with at least one duty hours standard in their programs were much more likely also to have reported noncompliance on Q1 (faculty teaching) (OR: 1.96; 95% CI: 1.87, 2.06), Q2 (faculty supervision) (OR: 1.85; 95% CI: 1.73, 1.97), Q19 (service obligations) (OR: 1.42; 95% CI: 1.37, 1.48), and Q17 (mechanisms to resolve issues without fear of intimidation) (OR: 3.50; 95% CI: 3.38, 3.64).
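
The odds ratios and confidence intervals reported here are the exponentiated coefficient and interval bounds of a logistic regression; a sketch on synthetic data follows (the model specification, variable names, and data are illustrative, not the authors' code).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Synthetic stand-ins: duty_nc = 1 if a resident reported noncompliance
# with any duty hours standard; q1_nc = 1 for noncompliance on Q1.
n = 5000
duty_nc = rng.binomial(1, 0.3, size=n)
q1_nc = rng.binomial(1, np.where(duty_nc == 1, 0.20, 0.10))

exog = sm.add_constant(pd.Series(duty_nc, name="duty_nc"))
fit = sm.Logit(q1_nc, exog).fit(disp=0)

odds_ratio = np.exp(fit.params["duty_nc"])               # exponentiated slope
ci_low, ci_high = np.exp(fit.conf_int().loc["duty_nc"])  # 95% CI on OR scale
print(f"OR {odds_ratio:.2f} (95% CI {ci_low:.2f}, {ci_high:.2f})")
```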

Validity

To ascertain the validity of survey responses, we compared program-level data on compliance with citations given by residency review committees in the review cycle that immediately preceded the administration of the survey for each program. A logistic regression analysis of these data shows that programs scoring at or above the 95th percentile on noncompliant duty hours responses were 2.04 times more likely to have at least one duty hours citation than were programs not scoring at or above the 95th percentile (95% CI: 1.03, 4.05). However, the two groups did not differ significantly in the total number of duty hours citations they received.

The logistic regression comparing the likelihood of any citation for those programs scoring at or above the 95th percentile on all the survey questions (not just the duty hours questions) was inconclusive (OR: 0.95; 95% CI: 0.49, 1.82). However, significantly fewer total citations (of any kind) were given to programs scoring at or below the 5th percentile of noncompliant responses on the total survey: M = 2.16 compared with M = 3.72; t(70.3) = 4.34; P < .0001.
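
The fractional degrees of freedom (70.3) indicate a Welch-type unequal-variance t test. A brief sketch on synthetic citation counts (the counts below are placeholders, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder citation counts: programs at or below the 5th percentile of
# noncompliant responses versus all other programs.
cites_bottom5 = rng.poisson(2.2, size=60)
cites_rest = rng.poisson(3.7, size=1080)

# Welch's t test (unequal variances), matching the fractional df reported.
t_stat, p_value = stats.ttest_ind(cites_rest, cites_bottom5, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
```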

Discussion

We describe here the ACGME's Resident/Fellow Survey with respect to its internal consistency, reliability, and relationships between duty hours and educational environment items. Our findings show the survey to be internally consistent and reliable as well as strongly related to ACGME residency review committee decisions. Because of the scope of the survey (88% of residents at almost 100% of residency programs with four or more residents or fellows were included), it is highly representative of residents' assessments of their graduate medical education experiences and training.

The survey demonstrates adequate internal consistency and reliability, with all but one item (Q16, interference from other learners) showing moderate-to-high internal correlations. The test–retest statistics show weak-to-moderate relationships; the strongest of these was for a structural element of residency (the availability of mechanisms to raise and resolve issues), which logically should remain stable across a one-year period. Factor analyses reveal two separate factors, the educational environment and duty hours, which parallel the areas cited by review committees; these distinctions may therefore be useful in examining a program's pattern of compliance. However, some correlations among the items suggest that the two areas are not mutually exclusive.

Issues identified by the survey are apparent not only to residents but also to ACGME residency review committees. The committees were more than twice as likely to cite duty hours issues in programs having noncompliant duty hours responses than in programs in which residents did not have noncompliant duty hours responses. Similarly, programs in which respondents identified no issues in any area on the survey received significantly fewer citations than did those programs for which residents indicated noncompliance.

Residents in programs identified as having potential duty hours violations were also more likely to identify deficiencies in the educational environment. Our results are consistent with several studies suggesting that duty hours issues are related to other issues in residency education.16,17 Our data show that one of the strongest relationships is that between inadequate time for rest between shifts and the lack of a mechanism to address issues without fear of intimidation. At least two possible explanations exist for this finding. First, resource-poor programs may rely excessively on residents to provide patient care, and resource scarcity may also extend to the structures needed to handle issues and complaints as they arise. Second, residents who feel particularly overworked may perceive, perhaps incorrectly, that few channels exist through which they can address their concerns.

Another interesting finding in our data is the relatively large percentage (37%) of programs in which residents reported no duty hours noncompliance. In contrast, the percentage of programs in which residents reported no noncompliance of any kind was relatively small (3%). It may be that residency education and training have become increasingly structured to satisfy duty hours standards. The contrast between these findings may also reflect the recent and intense focus on reducing duty hours, perhaps to the exclusion of addressing other aspects of the educational environment.18,19 Limiting duty hours may not be a panacea, and awareness of the relationships between duty hours and other aspects of the educational environment may be useful in improving resident education and patient care.20

Our results show a significant proportion of residents who report deficiencies in other, non-duty-hours dimensions of the educational and learning environment. These reports involve concerns about the inability of residents to raise issues without fear of intimidation (Q17), the balance between excessive service and education (Q19), and the adverse impact of other trainees on their education (Q16).

Limitations

The results of the survey show it to be a useful tool in the assessment of residency programs for accreditation. These analyses do, however, have several limitations. First, the data, although representative of the larger population of U.S. residency programs, do not include the many subspecialty programs with fewer than four fellows. Programs training one or two fellows per year present a challenge to the collection of reliable, valid, and, perhaps most important, confidential information. The ACGME continues to work with its constituents to identify methods for confidentially collecting meaningful and usable accreditation data from these small programs.

Second, it is possible that residents may have misunderstood questions or been unaware of ACGME requirements, which may have affected their responses. Residents with questions were encouraged (within the survey) to contact us, and our help desk estimates that less than 5% of the calls were about question content or meaning.

Third, residents may have been concerned about the confidentiality of their responses, or they may have felt coerced by program directors to provide positive responses to the survey questions. Thus, they may not have offered completely candid assessments of their programs' functioning. However, our data do include critical evaluations of many programs, so this possible problem, though it may affect a few residents, does not seem to be widespread.

Fourth, our evidence for validity is quite strong, and there is a compelling relationship between the survey and residency review committee citations. At the same time, there is no source of external (non-ACGME) data against which to compare our findings. As we continue to effectively assess and monitor residency education, we will rely on the ACGME residency review committees to link decisions and citations to objective, measurable outcomes.

The survey will continue to play an important role in the ACGME's new and evolving accreditation model, which will shift the focus from an episodic review of program structure and function to a continuous assessment of program effectiveness and outcomes. In the interval between site visits, which is likely to lengthen in the proposed future accreditation model, data from annual resident surveys, together with other reliable and valid measures of program quality and functioning, will substantially inform the accreditation process.

Conclusions

The ACGME Resident/Fellow Survey is a reliable, valid, and useful tool for the evaluation of residency programs. Although formal data collection and assessment are becoming more common in medical education accreditation, this, to our knowledge, is the first such data-gathering tool that has been validated and reported in the medical literature. Program directors and designated institutional officials may find their program- and institution-specific data useful in internal program evaluations as well as for informing their improvement efforts. Our data show that duty hours issues are often linked with other aspects of the educational environment and, moreover, that duty hours issues identified by residents are often also noted by residency review committees. The ACGME will continue to examine thresholds of survey responses to allow for finer discrimination among residency programs. Residents are a pivotal source of information as we assess recent efforts to balance service and education through the limitation of duty hours. This survey, together with other tools under development at the ACGME, will permit ACGME residency review committees to assess the effects that educational program changes have on resident educational outcomes. We must heed the recent calls to reduce duty hours and increase the quality of patient care, but we also must do no harm to the educational system or to the caregivers themselves.

Acknowledgments:

The authors wish to thank Nehemiah Ellison, Erle Fajardo, Christopher Jordan, and Steve Nash for their help in data preparation and application programming; the ADS team—Samantha Alvarado, Timothy Goldberg, Rachel Eng, Andrew Turkington, and Emilio Villatoro—for their support in resident survey data collection; and Kavitha Reinhold for her editorial support and advice.

Funding/Support:

None.

Other disclosures:

None.

Ethical approval:

Not applicable.

References

1. Byrne JM, Loo LK, Giang D. Monitoring and improving resident work environment across affiliated hospitals: A call for a national resident survey. Acad Med. 2009;84:199–205.

2. Heard JK, O'Sullivan P, Smith CE, Harper RA, Schexnayder SM. An institutional system to monitor and improve the quality of residency education. Acad Med. 2004;79:858–864.

3. Roth LM, Severson RK, Probst JC, et al. Exploring physician and staff perceptions of the learning environment in ambulatory residency clinics. Fam Med. 2006;38:177–184.

4. Davenport DL, Henderson WG, Hogan S, Mentzer RM Jr, Zwischenberger JB; Participants in the Working Conditions of Surgery Residents and Quality of Care Study. Surgery resident working conditions and job satisfaction. Surgery. 2008;144:332–338.e5.

5. Klessig JM, Wolfsthal SD, Levine MA, et al. A pilot survey study to define quality in residency education. Acad Med. 2000;75:71–73.

6. Yudkowsky R, Elliott R, Schwartz A. Two perspectives on the indicators of quality in psychiatry residencies: Program directors' and residents'. Acad Med. 2002;77:57–64.

7. Thrush CR, Hicks EK, Tariq SG, et al. Optimal learning environments from the perspective of resident physicians and associations with accreditation length. Acad Med. 2007;82(10 suppl):S121–S125.

8. Daugherty SR, Baldwin DC Jr, Rowley BD. Learning, satisfaction, and mistreatment during medical internship: A national survey of working conditions. JAMA. 1998;279:1194–1199.

9. Keirns CC, Bosk CL. Perspective: The unintended consequences of training residents in dysfunctional outpatient settings. Acad Med. 2008;83:498–502.

10. Immerman I, Kubiak EN, Zuckerman JD. Resident work-hour rules: A survey of residents' and program directors' opinions and attitudes. Am J Orthop. 2007;36:E172–E179.

11. Golub JS, Weiss PS, Ramesh AK, Ossoff RH, Johns MM 3rd. Burnout in residents of otolaryngology–head and neck surgery: A national inquiry into the health of residency training. Acad Med. 2007;82:596–601.

12. Schneider JR, Coyle JJ, Ryan ER, Bell RH Jr, DaRosa DA. Implementation and evaluation of a new surgical residency model. J Am Coll Surg. 2007;205:393–404.

13. Swide CE, Kirsch JR. Duty hours restriction and their effect on resident education and academic departments: The American perspective. Curr Opin Anaesthesiol. 2007;20:580–584.

14. Basu CB, Chen LM, Hollier LH Jr, Shenaq SM. The effect of the Accreditation Council for Graduate Medical Education duty hours policy on plastic surgery resident education and patient care: An outcomes study. Plast Reconstr Surg. 2004;114:1878–1886.

15. Philibert I, Friedman P, Williams WT. New requirements for resident duty hours. JAMA. 2002;288:1112–1114.

16. Strunk CL, Bailey BJ, Scott BA, et al. Resident work hours and working environment in otolaryngology: Analysis of daily activity and resident perception. JAMA. 1991;266:1371–1374.

17. Baldwin DC Jr, Daugherty SR, Tsai R, Scotti MJ Jr. A national survey of residents' self-reported work hours: Thinking beyond specialty. Acad Med. 2003;78:1154–1163.

18. Myers JS, Bellini LM, Morris JB, et al. Internal medicine and general surgery residents' attitudes about the ACGME duty hours regulations: A multicenter study. Acad Med. 2006;81:1052–1058.

19. Mathis BR, Diers T, Hornung R, Ho M, Rouan GW. Implementing duty-hour restrictions without diminishing patient care or education: Can it be done? Acad Med. 2006;81:68–75.

20. Longnecker DE. Resident duty hours reform: Are we there yet? Acad Med. 2006;81:1017–1020.

© 2010 Association of American Medical Colleges
