The Association Between Residency Learning Climate and Inpatient Care Experience in Clinical Teaching Departments in the Netherlands

Smirnova, Alina, MD, PhD; Arah, Onyebuchi A., MD, PhD; Stalmeijer, Renée E., PhD; Lombarts, Kiki M.J.M.H., PhD; van der Vleuten, Cees P.M., PhD

doi: 10.1097/ACM.0000000000002494
Research Reports

Purpose To examine the association between residency learning climate and inpatient care experience.

Method The authors analyzed 1,201 evaluations of the residency learning climate (using the Dutch Residency Educational Climate Test questionnaire) and 6,689 evaluations of inpatient care experience (using the Consumer Quality Index Inpatient Hospital Care questionnaire) from 86 departments across 15 specialties in 18 hospitals in the Netherlands between 2013 and 2014. The authors used linear hierarchical panel analyses to study the associations between departments’ overall and subscale learning climate scores and inpatient care experience global ratings and subscale scores, controlling for respondent- and department-level characteristics and correcting for multiple testing.

Results Overall learning climate was not associated with global department ratings (b = 0.03; 95% confidence interval −0.17 to 0.23) but was positively associated with specific inpatient care experience domains, including communication with doctors (b = 0.11; 0.02 to 0.20) and feeling of safety (b = 0.09; 0.01 to 0.17). Coaching and assessment was positively associated with communication with doctors (b = 0.22; 0.08 to 0.37) and explanation of treatment (b = 0.22; 0.08 to 0.36). Formal education was negatively associated with pain management (b = −0.16; −0.26 to −0.05), while peer collaboration was positively associated with pain management (b = 0.14; 0.03 to 0.24).

Conclusions Optimizing the clinical learning environment is an important step toward ensuring high-quality residency training and patient care. These findings could help clinical teaching departments address those aspects of the learning environment that directly affect patient care.

A. Smirnova is a PhD researcher, School of Health Professions Education, Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, the Netherlands, and researcher, Professional Performance Research Group, Department of Medical Psychology, Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands.

O.A. Arah is professor, Department of Epidemiology, Fielding School of Public Health, University of California, Los Angeles, Los Angeles, California.

R.E. Stalmeijer is assistant professor, Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, the Netherlands.

K.M.J.M.H. Lombarts is professor, Professional Performance Research Group, Department of Medical Psychology, Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands.

C.P.M. van der Vleuten is professor and scientific director, School of Health Professions Education, Department of Educational Development and Research, Faculty of Health, Medicine, and Life Sciences, Maastricht University, Maastricht, the Netherlands.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: The institutional ethical review board of the Academic Medical Center of the University of Amsterdam, the Netherlands, confirmed that the Medical Research Involving Human Subjects Act did not apply to this study and provided a waiver of ethical approval for the overall study design (W14_065 no. 14.17.0090). However, written permissions were requested and granted from the departments using the online Dutch Residency Educational Climate Test (D-RECT) platform and from hospitals using paper D-RECT questionnaires. The authors obtained permissions from individual hospitals to use anonymized questionnaire data for this research and consulted a privacy officer at their institution to ensure that the data used in this study complied with the Dutch Personal Data Protection Act. Participating hospitals were recruited through the Miletus Foundation (www.stichtingmiletus.nl), a coordinating body of all Consumer Quality Index (CQI) evaluations within the Netherlands. A detailed research proposal was sent to all hospitals and subsequently discussed at the general meeting. Hospitals interested in participating in the study gave informed consent either via the Miletus Foundation or by directly contacting the primary researcher (A.S.). MediQuest (home.mediquest.nl), a company that processes patient evaluation data from these evaluations, provided the final dataset for this study.

Previous presentations: Preliminary results of this study were presented at the biannual meeting of the School of Health Professions Education Academy in Maastricht, the Netherlands, March 27–30, 2017; the Association for Medical Education in Europe conference in Helsinki, Finland, August 27–30, 2017; and the Netherlands Association for Medical Education conference in Egmond aan Zee, the Netherlands, November 16–17, 2017.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A608.

Correspondence should be addressed to Alina Smirnova, Professional Performance Research Group, Department of Medical Psychology, Academic Medical Center, PO Box 22660, Amsterdam 1100DD, the Netherlands; telephone: (+31) 20-566-1273; e-mail: a.smirnova@amc.uva.nl; Twitter: @asmirnova7.

Recent quality improvement efforts in graduate medical education (GME) have aimed to improve the clinical learning environment to both support residents’ learning and well-being and ensure patient safety and quality of care.1 Residency programs are increasingly turning to perception-based assessment tools completed by residents to evaluate the effectiveness of interventions to improve the clinical learning environment and to identify further areas for improvement.2–4 Residents’ collective perceptions of both the formal and informal aspects of their training, including the educational atmosphere and how educational policies and procedures are enacted, as reported on these assessment tools,2,5 constitute a department’s learning climate.6 Past research has demonstrated that optimizing the residency learning climate benefits residents’ learning7 and exam performance8 as well as their professional development9 and well-being.10 Ultimately, efforts to improve residency learning climates should consider the effects of any interventions on the quality of patient care. Although patient care should ideally benefit from improvements to the learning climate, only one study to our knowledge has researched this issue using patient outcomes.11 A better understanding of the relationship between residency learning climate and quality of care in clinical teaching departments is needed across various specialties using relevant patient outcome measures.

One such outcome identified as an important target for quality improvement is patient-centered care, defined as care that is “respectful of and responsive to individual patient preferences, needs, and values, and [that ensures] that patient values guide all clinical decisions.”12 Measuring patient experience, as opposed to patient satisfaction, has become the preferred method to capture patient-centered care in practice because patient experience better reflects the quality of care received during a hospitalization or outpatient visit, including the interpersonal aspects of the care and the extent to which the patient’s needs were met.13,14 In addition to their positive association with patient satisfaction measures,15,16 better patient experience measures also have been associated with higher rates of adherence to treatment and prevention guidelines, better clinical outcomes, better patient safety within hospitals, and less care utilization.14 Finally, because patient complaints usually revolve around the interpersonal aspects of care, such as communication and professionalism, improving patient experience also can play a role in reducing patient complaints.17 In the United States, patient experience has even been included in Medicare’s Hospital Value-Based Purchasing Program to calculate payments and evaluate quality of care.18

Previous studies of organizational climates in health care, which describe how employees experience their organization, have linked positive perceptions of organizational climate with improved employee functioning,19 quality of care,20 and patient care outcomes.21 In particular, organizational climates that support health care professionals’ well-being, autonomy, and professional development positively affect patients’ satisfaction with health care services.22

Teaching hospitals, however, present a unique situation. Carvajal and colleagues23 found that patient experience differs in teaching versus nonteaching hospitals. In teaching hospitals, patients tend to view residents as their primary caregivers during hospitalizations,24 so we hypothesized that residency learning climate, which reflects residents’ perceptions of their learning environment, might affect inpatient care experience. Because both the learning climate and patient experience are multidimensional concepts, it is not known which aspects of the learning climate better predict patient experience and which aspects of patient experience are more affected by the residency learning climate in clinical teaching departments.

In this study, we investigated the association between residency learning climate and inpatient care experience in both academic and nonacademic (community) teaching hospitals in the Netherlands. Additionally, to support targeted efforts to improve the residency learning climate in relation to patient experience, we aimed to identify which facets of the learning climate are most likely to be associated with patient experience. Therefore, we addressed the following research questions: (1) Is there a relationship between overall residency learning climate and patients’ experiences of their care during a hospitalization? and (2) Which facets of the learning climate are associated with inpatient care experience?

Method

Setting and data collection

This study was conducted in the clinical teaching departments of 18 hospitals in the Netherlands that evaluated their learning climate and inpatient care experience between January 2013 and December 2014.

The Netherlands has 28 hospital-based residency programs that are between four and six years in duration. Rotations are offered in both academic and nonacademic (community) teaching hospitals. Residents can spend up to two years in a single hospital. Depending on the residency program, residents also may spend at least three months training in a different specialty. Prior to entering a residency program, trainees commonly complete an internship in the same or a different specialty.

In the Netherlands, all clinical teaching department faculty, including clinical supervisors and program directors, are responsible for monitoring and improving the learning climate.25 As part of their quality improvement activities, faculty ask trainees (interns, residents, and fellows) who are currently on that rotation or who recently completed that rotation to evaluate the learning climate of the department using either an online or a paper-based version of the Dutch Residency Educational Climate Test (D-RECT) (depending on the department’s preferences).

In 2013–2014, patients who were 16 years or older and who spent at least 24 hours in the hospital in the previous 12 months were invited to fill out the Consumer Quality Index (CQI) Inpatient Hospital Care questionnaire, according to a prescribed protocol, which included up to three reminders. Patients could respond via e-mail or use a paper-based questionnaire. Respondents were assured that their participation was voluntary and anonymous.

The institutional ethical review board of the Academic Medical Center, University of Amsterdam, the Netherlands, provided a waiver of ethical approval for the overall study design.

Measures

Residency learning climate.

The D-RECT is a resident questionnaire originally developed using interviews, a Delphi panel, and a literature review.26 The questionnaire consists of 35 questions evaluated on a five-point Likert scale (1 = totally disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = totally agree) with an additional “not applicable” option. Questions cover nine subscales related to the learning climate: (1) educational atmosphere (level of respect among team members and the impact of differences of opinion between attendings on work and the educational climate as well as care); (2) teamwork (ability of other health care team members to work well together and contribute to residents’ learning); (3) role of the specialty tutor (involvement of the program director in monitoring and guiding residents’ performance and quality of the training); (4) coaching and assessment (involvement and initiative of attendings to assess and provide feedback on residents’ performance); (5) formal education (organization, relevance, and attendance at formal education sessions); (6) peer collaboration (teamwork among residents as a group); (7) adaptation of work to residents’ competence (adaptation of clinical work to residents’ needs and opportunities to follow up with patients); (8) accessibility of supervisors (guidelines for contacting supervisors and availability of supervisors when needed); and (9) patient signout (educational value of the patient handover process and residents’ role during these discussions) (see Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A608 for the complete questionnaire).6

Since its development, the D-RECT has been validated and widely used internationally in the context of quality improvement in GME.27,28 Based on the findings of a previous large-scale validation study, at least three completed questionnaires are required to reliably evaluate the overall learning climate and five to eight questionnaires to reliably evaluate the subscales.6 We followed these recommendations for minimum numbers of completed questionnaires to calculate the overall learning climate and subscale scores for each department.

Inpatient care experience.

Departments evaluated patients’ experiences of their care during a hospitalization using the CQI Inpatient Hospital Care questionnaire, a standardized patient questionnaire that was developed for the purposes of internal quality improvement; national benchmarking; and informing patients, regulators, and insurance companies.29 It is a validated questionnaire measuring experiences with inpatient hospital care on two sets of subscales. Seven subscales included response options from 1 (never) to 4 (always): (1) communication with nurses (whether nurses took time to listen and explain things to the patient) (3 items); (2) communication with doctors (whether doctors took time to explain things to the patient) (2 items); (3) patient’s contribution (whether the patient had a say in important matters around his/her care) (5 items); (4) explanation of treatment (whether doctors and nurses explained the treatment, its purpose, and possible side effects) (3 items); (5) pain management (whether doctors and nurses reacted quickly if the patient had pain and whether the pain was well controlled) (2 items); (6) communication about medication (whether the purpose and possible side effects of new or changed medication were explained) (2 items); and (7) feeling of safety (whether doctors and nurses checked the patient’s identity before giving him/her medication or treatment and paid sufficient attention to unsafe situations) (3 items). Two additional subscales included dichotomous response options (yes/no): admission (regarding the completeness of the information provided during admission to the hospital) (10 items) and discharge information (regarding the completeness of the information provided at discharge and the quality of the discharge planning) (5 items).30 Respondents also indicated their overall rating of the department (“How do you rate the department where you were admitted?”) using a 10-point rating scale. See Supplemental Digital Appendix 2 at http://links.lww.com/ACADMED/A608 for the complete questionnaire.

Covariates.

We tested several covariates for statistically significant associations with all outcome measures and included those that were statistically significant in our final analysis. The patient covariates we included were age (16–24, 25–34, 35–44, 45–54, 55–64, 65–74, 75–79, 80 years or older); sex; education level (lower secondary or less [equivalent to an elementary school diploma], upper secondary [equivalent to a high school diploma], tertiary education [equivalent to a college or university diploma]); self-reported overall physical health (bad, moderate, good, very good, excellent); self-reported mental health (bad, moderate, good, very good, excellent); total number of hospital admissions in the previous 12 months (1, 2, 3, 4 or more); whether the patient needed help filling out the questionnaire (yes/no); language spoken at home (Dutch/Fries/Dutch dialect or other); and whether the patient was born outside the Netherlands (yes/no).

Department covariates included in our models were total number of previous D-RECT evaluation cycles in the department31 as well as the sex mix of the D-RECT respondents. The average postgraduate training year of the D-RECT respondents and the type of hospital (academic, nonacademic, or top clinical) were not significantly associated with the outcome measures; therefore, we did not include them in our final analysis.

Data analysis

Because the percentage of missing data from the D-RECT questionnaires was considered ignorable (7.4% overall), we imputed all D-RECT data using the expectation maximization algorithm.32 We then calculated the mean overall learning climate score as well as the subscale scores for each department, if the minimum number of questionnaires per department for the subscale was met (see above).6
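
For illustration, this step could be carried out in R roughly as follows. This is a minimal sketch rather than the study’s own code: the data frame drect, its item columns, and the department identifier are hypothetical names, and the norm package is only one of several implementations of EM-based imputation.

```r
# Minimal sketch of the D-RECT scoring step (hypothetical data frame 'drect'
# with one row per returned questionnaire, item columns "item1" ... "item35",
# and a 'department' identifier). The 'norm' package provides EM estimation
# for incomplete multivariate normal data.
library(norm)

item_cols <- paste0("item", 1:35)
items <- as.matrix(drect[, item_cols])

s        <- prelim.norm(items)  # preliminary manipulations for the EM algorithm
thetahat <- em.norm(s)          # EM estimates of item means and covariances
rngseed(20130101)               # norm requires a seed before imputation
drect[, item_cols] <- imp.norm(s, thetahat, items)

# Department-level overall learning climate score: mean of all 35 items,
# retained only for departments meeting the minimum of 3 questionnaires
# (5 to 8 are required for the subscale scores, per the validation study).
drect$overall <- rowMeans(drect[, item_cols])
dept_scores <- aggregate(overall ~ department, data = drect, FUN = mean)
dept_n <- table(drect$department)
dept_scores <- dept_scores[dept_n[as.character(dept_scores$department)] >= 3, ]
```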

The dataset containing the CQI Inpatient Hospital Care questionnaire responses was first analyzed for nonresponse with respect to the available background variables (sex and age). Statistically significant variables were included in all subsequent analyses. Data were then excluded if the respondents did not answer yes to the item about whether they had been hospitalized in the previous 12 months or if they were missing responses to more than half of the items. Because some items had a relatively high percentage of missing responses (up to 80% for a single item; 14% overall), we chose multiple imputation as the missing data imputation method because it better reflects the inherent uncertainty in our analyses due to missing data in the sample.32 Multiple imputation creates several datasets, which can be subsequently analyzed separately, and their resulting parameter estimates and standard errors can then be pooled using Rubin’s rules to give the final result.32 To account for a potential drop in statistical power due to the relatively high percentage of missing responses in one item, we also opted for a higher number of imputed datasets (m = 80) and iterations (maxit = 20), using respondent age and sex as predictors.33 We checked the convergence of the imputation by calculating the Rhat statistic.34 Because our study had a nested design (patients nested within clinical teaching departments), only departments with multiple CQI Inpatient Hospital Care questionnaires were included (no departments had a single questionnaire).
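
For illustration, such an imputation can be specified with the mice package (cited in the software note below); the data frame cqi and its column names are hypothetical.

```r
# Sketch of the CQI imputation: every questionnaire item is imputed using
# only respondent age and sex as predictors, with m = 80 datasets and
# maxit = 20 iterations. 'cqi', 'age', and 'sex' are hypothetical names.
library(mice)

vars <- names(cqi)
pred <- matrix(0, length(vars), length(vars), dimnames = list(vars, vars))
pred[, c("age", "sex")] <- 1   # use age and sex to impute every item
pred[c("age", "sex"), ] <- 0   # age and sex are complete; nothing to impute
diag(pred) <- 0                # a variable never predicts itself

imp <- mice(cqi, m = 80, maxit = 20, predictorMatrix = pred,
            seed = 2014, printFlag = FALSE)
plot(imp)  # trace plots of the chain means/SDs as a visual convergence
           # check; the Rhat statistic can be computed on the same chains
```

Each of the 80 completed datasets is then analyzed separately, and the resulting estimates are pooled with Rubin’s rules (e.g., via pool(with(imp, ...)) in mice).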

We assessed the reliability of the scores by calculating the intraclass correlation coefficient (ICC) for patient evaluations of each subscale related to inpatient care experience (ICC1) and department evaluations of residency learning climate (ICC2).35
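
For illustration, both coefficients can be computed in R following Bliese’s formulation,35 for example with the multilevel package; the variable names here are hypothetical.

```r
# ICC1: proportion of variance in individual patient ratings attributable
# to department membership. ICC2: reliability of the department mean score.
# 'rating' and 'department' are hypothetical column names.
library(multilevel)

fit <- aov(rating ~ as.factor(department), data = cqi_complete)
ICC1(fit)
ICC2(fit)
```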

We conducted linear hierarchical panel analyses to take into account the nesting of patients within departments as well as the year that the questionnaire was completed. To answer our first research question, we regressed patients’ global department ratings and inpatient care experience subscale scores on the overall learning climate score and included the covariates. To answer our second research question, we regressed each inpatient care experience subscale score on each learning climate subscale score, adjusted for the other learning climate subscales, and included the covariates. Because multiple comparisons of learning climate subscale scores could increase Type I error, we applied the false discovery rate (FDR) correction to the P values and calculated false coverage rate (FCR)-adjusted confidence intervals (CIs).36 Associations were deemed significant if the FDR-adjusted P values were < .05.
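
For illustration, an analogous random-intercept model and FDR correction could be specified in R as follows; the models themselves were fit in SPSS (see the software note below), and all variable names here are hypothetical.

```r
# Sketch of one hierarchical model: patients nested within departments
# (random intercept), with the panel year and the respondent- and
# department-level covariates as fixed effects. lmerTest wraps lme4's
# lmer and adds P values for the fixed effects.
library(lmerTest)

fit <- lmer(global_rating ~ overall_climate + year +
              age + sex + education + physical_health + mental_health +
              admissions + help_filling + language + born_abroad +
              drect_cycles + resident_sex_mix +
              (1 | department),
            data = merged)
summary(fit)

# FDR correction across the learning climate subscale tests
# (Benjamini-Hochberg); 'p_values' collects the raw P values.
p_fdr <- p.adjust(p_values, method = "BH")
significant <- p_fdr < .05
```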

Finally, we repeated the main analysis using Bartlett factor scores instead of mean scores as a sensitivity analysis of our results.37 The advantage of using Bartlett factor scores over simple mean scores is that the factor scores take into account the strength of the association of each item with the subscale score, thereby minimizing potential bias caused by items that are only weakly associated with the subscale score when using simple mean scores.38
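
For illustration, Bartlett factor scores for one subscale can be obtained with R’s built-in factanal function, assuming a single-factor model and a subscale with at least three items (a one-factor model is not identified for the two-item subscales); subscale_items is a hypothetical data frame of that subscale’s items.

```r
# Bartlett factor scores weight each item by its factor loading, so items
# only weakly related to the subscale contribute less than in a simple mean.
fa <- factanal(na.omit(subscale_items), factors = 1, scores = "Bartlett")
bartlett_scores <- fa$scores[, 1]
```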

Missing data were imputed using mice (version 2.25), and FDR-adjusted P values were calculated using p.adjust in R statistical software (version 3.2.3).39,40 All other analyses were performed using SPSS version 23.0.0.2 (IBM Corp., Armonk, New York).

Results

In total, 86 clinical teaching departments across 15 specialties in 18 hospitals evaluated their residency learning climate and inpatient care experience between 2013 and 2014, of which 39 (45%) departments evaluated these characteristics in both years. Of the total 1,204 returned D-RECT questionnaires, 3 had more than 50% of responses missing, resulting in 1,201 D-RECT questionnaires included in our final analysis (by department: mean 10; standard deviation [SD] 6; minimum = 3; maximum = 42 questionnaires). Eighty-two percent (71) of departments used the online questionnaire; they had a 63% average response rate. The response rate for departments that used the paper-based questionnaire (15; 18% of departments) could not be reliably calculated because the number of invited residents was not collected. However, we expected a similar or even higher response rate for the departments that used the paper-based questionnaire based on evidence from the literature.41

Of the 6,853 returned CQI Inpatient Hospital Care questionnaires (gross response rate 30%), 87 were excluded because of a negative or missing response to the item about whether the patient had a hospital admission in the last 12 months, and 77 questionnaires were excluded because responses to more than 50% of items were missing. The final sample included 6,689 CQI Inpatient Hospital Care questionnaires (effective response rate 27%; by department: mean 54; SD 37; minimum = 3; maximum = 192 questionnaires). Table 1 lists the descriptive characteristics of our study sample.

Table 2 includes the overall and subscale learning climate and inpatient care experience scores. Department-level scores reliably distinguished between departments for the overall learning climate (ICC2 = 76%) and its subscales (ICC2 = 57% to 83%). In contrast, differences between departments explained only 2% to 6% of the variance in inpatient care experience scores (ICC1).

We found no significant association between departments’ overall learning climate scores and patients’ global department ratings (b = 0.03; 95% CI −0.17 to 0.23; P = .781). Table 3 reports the adjusted significant associations between overall and subscale learning climate scores and specific inpatient care experience scores. The overall learning climate was positively associated with communication with doctors (b = 0.11; 95% CI 0.02 to 0.20; P = .016) and feeling of safety (b = 0.09; 95% CI 0.01 to 0.17; P = .032). Among learning climate subscales, coaching and assessment exhibited the strongest association with communication with doctors (b = 0.22; 95% CI 0.08 to 0.37; P = .003; FCR-adjusted 95% CI 0.02 to 0.43; FDR-adjusted P = .027) and explanation of treatment (b = 0.22; 95% CI 0.08 to 0.36; P = .002; FCR-adjusted 95% CI 0.02 to 0.42; FDR-adjusted P = .018). Peer collaboration was positively associated with pain management (b = 0.14; 95% CI 0.03 to 0.24; P = .010; FCR-adjusted 95% CI −0.02 to 0.29; FDR-adjusted P = .045). In contrast, formal education was negatively associated with pain management (b = −0.16; 95% CI −0.26 to −0.05; P = .003; FCR-adjusted 95% CI −0.31 to −0.004; FDR-adjusted P = .027). Adjusted estimates for all learning climate subscales are provided in Supplemental Digital Appendix 3 (available at http://links.lww.com/ACADMED/A608). Our sensitivity analysis using Bartlett factor scores instead of mean scores for the CQI Inpatient Hospital Care questionnaire subscales produced the same results (see Supplemental Digital Appendix 4 at http://links.lww.com/ACADMED/A608).

Discussion

While the overall residency learning climate was not associated with global department ratings from patients, we found small but significant positive associations between the overall learning climate and communication with doctors and feeling of safety. Among learning climate domains, coaching and assessment was positively associated with communication with doctors and explanation of treatment, and peer collaboration was positively associated with pain management. Formal education, on the other hand, was negatively associated with pain management.

Explanation of findings

In this study, we found no association between a department’s overall learning climate and patients’ global ratings of the department. While the overall learning climate score was based on an average of the responses to the 35 items on the D-RECT questionnaire, patients’ global department ratings were based on the mean of the responses to a single item (rated 1–10) on the CQI Inpatient Hospital Care questionnaire, for which patients were asked to globally evaluate their experience. Previous studies comparing global ratings to overall questionnaire scores have favored overall questionnaire scores because global ratings show poor associations with the individual dimensions of experience.42,43 A recent validation study of the CQI Inpatient Hospital Care questionnaire similarly showed that global ratings had inconsistent relationships with the individual dimensions of the patient care experience.30 Therefore, global ratings may not provide an accurate evaluation of the overall patient experience, and the associations between the overall learning climate and specific dimensions of the inpatient care experience may reflect the nature of the relationship better than global ratings alone.

Two domains of inpatient care experience demonstrated small but significant positive associations with the overall learning climate—namely, communication with doctors and feeling of safety. The communication with doctors domain reflects patients’ perceptions of whether their doctors spent sufficient time with them and whether their doctors explained things in an understandable way. Previous research has shown that clinical supervisors facilitate residents’ development of clinical, professional, and communication competencies in the learning environment44,45 and that departments’ learning climate is positively associated with better teaching qualities in clinical supervisors.46 The results of our study are in line with these findings, demonstrating positive associations between coaching and assessment and both communication with doctors and explanation of treatment. Supervisors who actively engage in patient care with residents, such as through direct observation of residents’ practice, can better balance patient care and residents’ learning needs.47–49 Moreover, supervisors’ involvement in residents’ training may contribute not only to a better overall learning climate but also to patients’ perceptions of safety. In a recent study, Silkens and colleagues50 found that residency learning climate was associated with patient safety climate, which in turn was associated with better patient safety behavior and compliance in residents. Therefore, the association between the overall learning climate and feeling of safety could be mediated by an improved patient safety climate in clinical teaching departments.

The effect sizes for the associations between the overall learning climate and the inpatient care experience subscales ranged from 0.09 to 0.11, which reflects only small changes in inpatient care experience scores (on a four-point scale) relative to a one-unit increase in overall learning climate scores (on a five-point scale). These effect sizes may indicate a weak or absent relationship between residency learning climate and inpatient care experience. They also could reflect the limitations of the response scales: perception-based assessment tools suffer from ceiling effects, a limited number of response options, and generally negatively skewed distributions, all of which make detecting differences in performance difficult.38 A previous study assessing D-RECT scores also found only a small but significant increase in scores over time,31 which corroborates previous findings of studies using similar perception-based assessment tools.51

Our examination of the learning climate subscale scores showed that three of the nine subscales were significantly associated with inpatient care experience while the other six were not. This lack of association may indicate that these learning climate domains do not contribute to the inpatient care experience or that other elements influencing patient experience are at play (e.g., the role of the nurse) that may compensate for a poor learning environment or vice versa. The learning climate domains that were significantly associated with pain management showed mixed results. Formal education (i.e., the degree of organization, relevance, and attendance at formal education sessions by residents, doctors, and nurses) was negatively associated with patients’ perceptions of pain management, which may reflect doctors’ and nurses’ (in)ability to respond to acute patient concerns quickly and adequately while attending such sessions. These findings of potential tensions between residency training and patient care echo the results of our earlier study, in which we found higher odds of adverse perinatal outcomes in nonacademic obstetrics–gynecology teaching departments.11 On the other hand, we found a positive association between peer collaboration and pain management: in departments where residents perceived that they worked well together as a group, patients also reported better management of their pain. From these findings, we may infer that activities that are resident focused may conflict with immediate patient care needs, whereas activities integrated with patient care may offset these negative effects. These results add to the growing literature on the potential tensions between training and care in clinical teaching departments.52,53

Limitations

Our study has a few limitations. One is the possibility of ceiling effects due to the rating scales used in the questionnaires, which makes it more difficult to detect smaller changes in inpatient care experience, especially when ratings tend to be high. We also did not have information on nonrespondents. Nonresponse bias is likely less of a problem for the D-RECT questionnaire, which had a response rate of 63%, and more important for the CQI Inpatient Hospital Care questionnaire because of its low response rate (effective response rate 27%). We minimized the potential for selection bias in the CQI Inpatient Hospital Care questionnaire by adjusting for important nonrespondent characteristics (sex and age) in our analyses; however, we could not account for all potentially important characteristics.54 Prospective data collection could improve response rates and minimize nonresponse bias. Memory is another factor that may have influenced patients’ perceptions of their hospitalization: we could not record the time between a patient’s hospitalization and the completion of the questionnaire, which varied and may have affected the overlap between residents’ and patients’ perceptions. Finally, because of the cross-sectional nature of our study, we cannot establish causal relationships between residency learning climate and inpatient care experience. Well-designed longitudinal studies are needed to provide evidence of causality.

Implications for practice and research

Clinical teaching departments could potentially improve both their residency learning climate and inpatient care experience by emphasizing those aspects of the learning environment that are integrated with direct patient care, such as workplace coaching and assessment and collaboration between residents. They may need to critically assess those aspects of residency training that are not well integrated with clinical practice or direct patient care, such as formal education sessions, to identify potential tensions and to mitigate negative effects on clinical practice. Future research should examine the factors that influence the relationship between residency learning climate and inpatient care experience, which can shed light on the mechanisms behind these associations. Qualitative studies could be particularly well suited to identifying these mechanisms. Testing for interaction effects on inpatient care experience also can provide insight into how the learning climate and supervision can be optimized.

Conclusions

Optimizing the clinical learning environment is an important step toward ensuring high-quality residency training and patient care. This study adds to a growing body of evidence on how residency learning climate relates to patient care quality in clinical teaching departments. Future studies should confirm these results and explore the potential mechanisms behind the association between residency learning climate and inpatient care experience.

Acknowledgments:

The authors would like to acknowledge the Miletus Foundation (www.stichtingmiletus.nl) for their assistance in obtaining the data for this research, the individual hospitals that provided the data, and the patients who filled out the Consumer Quality Index Inpatient Hospital Care questionnaire. The authors also would like to thank the trainees who completed the Dutch Residency Educational Climate Test questionnaire. None received financial or other compensation for their contributions.

References

1. Nasca TJ, Weiss KB, Bagian JP. Improving clinical learning environments for tomorrow’s physicians. N Engl J Med. 2014;370:991–993.
2. Colbert-Getz JM, Kim S, Goode VH, Shochet RB, Wright SM. Assessing medical students’ and residents’ perceptions of the learning environment: Exploring validity evidence for the interpretation of scores from existing tools. Acad Med. 2014;89:1687–1693.
3. Philibert I. Satisfiers and hygiene factors: Residents’ perceptions of strengths and limitations of their learning environment. J Grad Med Educ. 2012;4:122–127.
4. Holt KD, Miller RS, Philibert I, Heard JK, Nasca TJ. Residents’ perspectives on the learning environment: Data from the Accreditation Council for Graduate Medical Education resident survey. Acad Med. 2010;85:512–518.
5. Schönrock-Adema J, Bouwkamp-Timmer T, van Hell EA, Cohen-Schotanus J. Key elements in assessing the educational environment: Where is the theory? Adv Health Sci Educ Theory Pract. 2012;17:727–742.
6. Silkens ME, Smirnova A, Stalmeijer RE, et al. Revisiting the D-RECT tool: Validation of an instrument measuring residents’ learning climate perceptions. Med Teach. 2016;38:476–481.
7. Delva MD, Kirby J, Schultz K, Godwin M. Assessing the relationship of learning approaches to workplace climate in clerkship and residency. Acad Med. 2004;79:1120–1126.
8. Shimizu T, Tsugawa Y, Tanoue Y, et al. The hospital educational environment and performance of residents in the General Medicine In-Training Examination: A multicenter study in Japan. Int J Gen Med. 2013;6:637–640.
9. Cross V, Hicks C, Parle J, Field S. Perceptions of the learning environment in higher specialist training of doctors: Implications for recruitment and retention. Med Educ. 2006;40:121–128.
10. van Vendeloo SN, Brand PL, Verheyen CC. Burnout and quality of life among orthopaedic trainees in a modern educational programme: Importance of the learning climate. Bone Joint J. 2014;96-B:1133–1138.
11. Smirnova A, Ravelli ACJ, Stalmeijer RE, et al. The association between learning climate and adverse obstetrical outcomes in 16 nontertiary obstetrics–gynecology departments in the Netherlands. Acad Med. 2017;92:1740–1748.
12. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
13. Beattie M, Murphy DJ, Atherton I, Lauder W. Instruments to measure patient experience of healthcare quality in hospitals: A systematic review. Syst Rev. 2015;4:97.
14. Anhang Price R, Elliott MN, Zaslavsky AM, et al. Examining the role of patient experience surveys in measuring health care quality. Med Care Res Rev. 2014;71:522–554.
15. Batbaatar E, Dorjdagva J, Luvsannyam A, Savino MM, Amenta P. Determinants of patient satisfaction: A systematic review. Perspect Public Health. 2017;137:89–101.
16. Bjertnaes OA, Sjetne IS, Iversen HH. Overall patient satisfaction with hospitals: Effects of patient-reported experiences and fulfilment of expectations. BMJ Qual Saf. 2012;21:39–46.
17. Wofford MM, Wofford JL, Bothra J, Kendrick SB, Smith A, Lichstein PR. Patient complaints about physician behaviors: A qualitative study. Acad Med. 2004;79:134–138.
18. Zhao M, Haley DR, Spaulding A, Balogh HA. Value-based purchasing, efficiency, and hospital performance. Health Care Manag (Frederick). 2015;34:4–13.
19. Bronkhorst B, Tummers L, Steijn B, Vijverberg D. Organizational climate and employee mental health outcomes: A systematic review of studies in health care organizations. Health Care Manage Rev. 2015;40:254–271.
20. Benzer JK, Young G, Stolzmann K, et al. The relationship between organizational climate and quality of chronic disease management. Health Serv Res. 2011;46:691–711.
21. MacDavitt K, Chou SS, Stone PW. Organizational climate and health care outcomes. Jt Comm J Qual Patient Saf. 2007;33(11 suppl):45–56.
22. Ancarani A, Di Mauro C, Giammanco MD. How are organisational climate models and patient satisfaction related? A competing value framework approach. Soc Sci Med. 2009;69:1813–1818.
23. Carvajal DN, Blank AE, Lechuga C, Schechter C, McKee MD. Do primary care patient experiences vary by teaching versus nonteaching facility? J Am Board Fam Med. 2014;27:239–248.
24. Dalia S, Schiffman FJ. Who’s my doctor? First-year residents and patient care: Hospitalized patients’ perception of their “main physician.” J Grad Med Educ. 2010;2:201–205.
25. Directive of the Central College of Medical Specialists [in Dutch]. Utrecht, the Netherlands: Royal Dutch Medical Association; 2009.
26. Boor K, Van Der Vleuten C, Teunissen P, Scherpbier A, Scheele F. Development and analysis of D-RECT, an instrument measuring residents’ learning climate. Med Teach. 2011;33:820–827.
27. Piek J, Bossart M, Boor K, et al. The work place educational climate in gynecological oncology fellowships across Europe: The impact of accreditation. Int J Gynecol Cancer. 2015;25:180–190.
28. Bennett D, Dornan T, Bergin C, Horgan M. Postgraduate training in Ireland: Expectations and experience. Ir J Med Sci. 2014;183:611–620.
29. Delnoij DM, Rademakers JJ, Groenewegen PP. The Dutch Consumer Quality Index: An example of stakeholder involvement in indicator development. BMC Health Serv Res. 2010;10:88.
30. Smirnova A, Lombarts KMJMH, Arah OA, van der Vleuten CPM. Closing the patient experience chasm: A two-level validation of the Consumer Quality Index Inpatient Hospital Care. Health Expect. 2017;20:1041–1048.
31. Silkens ME, Arah OA, Scherpbier AJ, Heineman MJ, Lombarts KM. Focus on quality: Investigating residents’ learning climate perceptions. PLoS One. 2016;11:e0147108.
32. Dong Y, Peng CY. Principled missing data methods for researchers. Springerplus. 2013;2:222.
33. Graham JW, Olchowski AE, Gilreath TD. How many imputations are really needed? Some practical clarifications of multiple imputation theory. Prev Sci. 2007;8:206–213.
34. Gelman A, Hill J. Data Analysis Using Regression and Multilevel/Hierarchical Models. Cambridge, UK: Cambridge University Press; 2007.
35. Bliese PD. Within-group agreement, non-independence, and reliability: Implications for data aggregation and analysis. In: Klein KJ, Kozlowski SWJ, eds. Multilevel Theory, Research, and Methods in Organizations: Foundations, Extensions, and New Directions. San Francisco, CA: Jossey-Bass; 2000.
36. Benjamini Y, Hochberg Y. Controlling the false discovery rate—A practical and powerful approach to multiple testing. J Roy Stat Soc B Met. 1995;57:289–300.
37. DiStefano C, Zhu M, Mindrila D. Understanding and using factor scores: Considerations for the applied researcher. Pract Assess Res Eval. 2009;14:1–11.
38. Boerebach BC, Arah OA, Heineman MJ, Lombarts KM. Embracing the complexity of valid assessments of clinicians’ performance: A call for in-depth examination of methodological and statistical contexts that affect the measurement of change. Acad Med. 2016;91:215–220.
39. van Buuren S, Groothuis-Oudshoorn K. mice: Multivariate imputation by chained equations in R. J Stat Softw. 2011;45(3):1–67.
40. R Foundation. The R project for statistical computing. https://www.R-project.org. Published 2015. Accessed September 26, 2018.
41. Yarger JB, James TA, Ashikaga T, et al. Characteristics in response rates for surveys administered to surgery residents. Surgery. 2013;154:38–45.
42. Krol MW, de Boer D, Rademakers JJ, Delnoij DM. Overall scores as an alternative to global ratings in patient experience surveys; a comparison of four methods. BMC Health Serv Res. 2013;13:479.
43. de Boer D, Delnoij D, Rademakers J. Do patient experiences on priority aspects of health care predict their global rating of quality of care? A study in five patient groups. Health Expect. 2010;13:285–297.
44. Subramaniam A, Silong AD, Uli J, Ismail IA. Effects of coaching supervision, mentoring supervision and abusive supervision on talent development among trainee doctors in public hospitals: Moderating role of clinical learning environment. BMC Med Educ. 2015;15:129.
45. Bates J, Ellaway RH. Mapping the dark matter of context: A conceptual scoping review. Med Educ. 2016;50:807–816.
46. Lombarts KM, Heineman MJ, Scherpbier AJ, Arah OA. Effect of the learning climate of residency programs on faculty’s teaching performance as evaluated by residents. PLoS One. 2014;9:e86512.
47. Farnan JM, Petty LA, Georgitis E, et al. A systematic review: The effect of clinical supervision on patient and residency education outcomes. Acad Med. 2012;87:428–442.
48. van der Leeuw RM, Lombarts KM, Arah OA, Heineman MJ. A systematic review of the effects of residency training on patient outcomes. BMC Med. 2012;10:65.
49. Piquette D, Moulton CA, LeBlanc VR. Model of interactive clinical supervision in acute care environments. Balancing patient care and teaching. Ann Am Thorac Soc. 2015;12:498–504.
50. Silkens MEWM, Arah OA, Wagner C, Scherpbier AJJA, Heineman MJ, Lombarts KMJMH. The relationship between the learning and patient safety climates of clinical departments and residents’ patient safety behaviors. Acad Med. 2018;93:1374–1380.
51. Van Der Leeuw RM, Boerebach BC, Lombarts KM, Heineman MJ, Arah OA. Clinical teaching performance improvement of faculty in residency training: A prospective cohort study. Med Teach. 2016;38:464–470.
52. Cleland J, Roberts R, Kitto S, Strand P, Johnston P. Using paradox theory to understand responses to tensions between service and training in general surgery. Med Educ. 2018;52:288–301.
53. Hoffman KG, Donaldson JF. Contextual tensions of the clinical environment and their influence on teaching and learning. Med Educ. 2004;38:448–454.
54. Zaslavsky AM, Zaborski LB, Cleary PD. Factors affecting response rates to the Consumer Assessment of Health Plans Study survey. Med Care. 2002;40:485–499.

© 2019 by the Association of American Medical Colleges