
Original Articles

A study of mini-CEX (mini-clinical evaluation exercise) as an assessment tool of postgraduate students in ophthalmology

Shinde, Chhaya A; Shirwadkar, Shruti P; Hegde, Rajarathna V; D'Cunha, Lynn L; Rao, Priyanka R; Kokate, Anupama A; Raghuwanshi, Akash O

D Y Patil Journal of Health Sciences 11(1):p 2-7, January-March 2023. | DOI: 10.4103/DYPJ.DYPJ_5_22


Introduction

Assessment of students plays an important role in the teaching–learning process. During the past few decades, there has been increasing emphasis on using real-life situations in assessment. The mini-clinical evaluation exercise (mini-CEX) is a variation of the traditional CEX method of assessment. It is a formative assessment tool designed to provide feedback on skills essential to good medical care.[1-3] It stimulates clinical reasoning in the face of a real situation.[4] It has high fidelity and is acceptable to both students and examiners.[5]

It is inexpensive and does not require elaborate preparation. The aims and objectives of the study were to assess first-, second-, and third-year postgraduate (PG) students of ophthalmology by the mini-CEX method, to record the improvement in the PG students' skills, and to provide the necessary guidance and support to help them improve their skills where required.

Materials and Methods

  • Study design: Prospective, longitudinal, analytical, interventional, randomized, and hospital-based.
  • Study setting: Ophthalmology outpatient department, ward, and casualty of a tertiary care hospital.
  • Study duration: 5 months.
  • Study participants: Five PG students of ophthalmology (two first-year, one second-year, and two third-year students).
  • Sample size: Five PG students of ophthalmology, each assessed over 30 cases = 150 encounters across three assessments over the study period.
  • Sampling method: Simple random sampling (lottery method) was used to allot the cases and the evaluator for each encounter, to avoid bias (a sketch of such an allotment follows this list).
  • Informed consent was obtained from the patients and the PG students.
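
To make the allotment concrete, here is a minimal sketch of how such a lottery-style allotment of cases and assessors could be simulated. The student labels, assessor names, and case numbers are illustrative assumptions, not the study's actual data; only the counts (five students, 30 cases each, 15 with the set A assessor and 15 with a set B assessor) follow the design described in this section.

```python
import random

# Hypothetical labels; only the counts mirror the study design.
students = ["PG1", "PG2", "PG3", "PG4", "PG5"]
set_a = ["assistant_professor"]                     # set A: one assessor
set_b = ["senior_resident_1", "senior_resident_2"]  # set B: two assessors
cases = list(range(1, 151))                         # 150 encounters in total

random.shuffle(cases)  # the "lottery": randomize the case order
allotment = []
for i, student in enumerate(students):
    assessor_b = random.choice(set_b)    # one set B assessor per student
    chunk = cases[i * 30:(i + 1) * 30]   # 30 cases per student
    for j, case in enumerate(chunk):
        # First 15 cases with the set A assessor, next 15 with the
        # student's set B assessor, as per the study design.
        assessor = set_a[0] if j < 15 else assessor_b
        allotment.append((student, case, assessor))

print(allotment[:2])  # e.g. [('PG1', 37, 'assistant_professor'), ...]
```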

Selection of cases/patients

Inclusion criteria for cases were as follows:

  • Patients presenting to the outpatient department/ward/casualty with ophthalmic complaints
  • Patients willing to participate in the study.

Exclusion criteria for cases were as follows:

  • Pregnant women, seriously or terminally ill patients, mentally challenged patients, and patients not willing to participate in the study.

Selection of assessors

  • For adequate reliability and to reduce assessor bias, two sets of assessors were formed. Set A had one assessor (an assistant professor), and set B had two assessors (senior residents).
  • Each student was given 15 cases with the set A assessor and 15 cases with one set B assessor (total = 30 cases per student).

Training of assessors and students about mini-CEX

At the beginning of the study, the assessors were trained in mini-CEX, followed by training of the PG students in the method.

Teaching–learning sessions were conducted in which the PG students were trained by the assessors about the following seven core clinical skills:

  1. Medical interviewing
  2. Physical examination
  3. Professionalism
  4. Clinical judgment
  5. Counseling
  6. Organization/efficiency
  7. Overall clinical competence.

The assessors demonstrated these skills in specific disease situations. After the teaching session, the students were questioned, and it was confirmed that they had understood.

Institutional ethics committee approval was obtained for the study.

Assessments

Three assessments (n = 150 encounters) were conducted over a study period of 5 months as follows:

  • Assessment 1: First 10 cases each to all five PG students (n = 50 clinical encounters) in the first month.
  • Assessment 2: Next 10 cases each to all five PG students (n = 50 clinical encounters) in the third month.
  • Assessment 3: Next 10 cases each to all five PG students (n = 50 clinical encounters) in the fifth month.

Students were given various types of cases with common diagnoses, such as cataract, glaucoma, trauma, and orbital, eyelid, and retinal disorders. Patients with similar diagnoses were included in both sets as far as possible.

The assessors observed the performance of the trainee during an actual clinical case encounter, scored it, and then provided contextual feedback. Student performance was observed for 10–12 min, and feedback was given for 3–5 min per patient clinical encounter.

Outcome measures

The assessors assessed the PG students on the same seven core clinical skills that had been taught to them.

The assessor marked the above on a standard rating form using a global rating on a nine-point scale: scores 1–3, unsatisfactory; scores 4–6, satisfactory; and scores 7–9, superior performance.
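
As a small illustration of these rating bands, the following sketch maps a 1–9 global rating to its band; the function name is a hypothetical convenience, but the cut-offs are exactly those of the rating form described above.

```python
def rating_band(score: int) -> str:
    """Map a 1-9 mini-CEX global rating to its performance band."""
    if not 1 <= score <= 9:
        raise ValueError("global ratings run from 1 to 9")
    if score <= 3:
        return "unsatisfactory"
    if score <= 6:
        return "satisfactory"
    return "superior"

assert rating_band(5) == "satisfactory"
assert rating_band(8) == "superior"
```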

The date; the patient's age and sex; the type of patient (outpatient, inpatient, emergency); the complexity of the case (low/moderate/high); the type of visit (new/follow-up); the number of minutes of observation and feedback; and the focus of the encounter (data gathering/diagnosis/treatment/counseling) were noted.

The trainee's satisfaction with the encounter was marked on a nine-point scale, and both the assessor and the student signed the form.
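
Taken together, the fields above amount to one record per encounter. A minimal sketch of such a record follows; the class name, field names, and sample values are illustrative assumptions, not the actual layout of the rating form.

```python
from dataclasses import dataclass, field

@dataclass
class MiniCexEncounter:
    """One mini-CEX encounter record, mirroring the fields noted above."""
    date: str
    patient_age: int
    patient_sex: str
    patient_type: str          # "outpatient" / "inpatient" / "emergency"
    complexity: str            # "low" / "moderate" / "high"
    visit_type: str            # "new" / "follow-up"
    focus: str                 # data gathering / diagnosis / treatment / counseling
    observation_minutes: int
    feedback_minutes: int
    skill_scores: dict = field(default_factory=dict)  # skill -> 1-9 rating
    trainee_satisfaction: int = 0                     # 1-9 scale

record = MiniCexEncounter(
    date="day 1", patient_age=62, patient_sex="F",
    patient_type="outpatient", complexity="moderate", visit_type="new",
    focus="diagnosis", observation_minutes=11, feedback_minutes=4,
    skill_scores={"medical interviewing": 6, "physical examination": 5},
    trainee_satisfaction=7,
)
```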

Statistical analysis

The data were entered in MS Excel 2007 and analyzed using SPSS 16 software. Descriptive statistics for numerical data are presented as mean ± SD (range). A one-way analysis of variance (ANOVA) test was used to compare scores between groups (i.e., the mean scores of assessments 1, 2, and 3). A P value less than 0.05 was taken as statistically significant.
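
For readers who want to reproduce this kind of comparison outside SPSS, here is a minimal sketch of a one-way ANOVA across the three assessments using SciPy; the score lists are made-up placeholders, not the study's data.

```python
from scipy.stats import f_oneway

# Placeholder mean skill scores for the five PG students in each
# assessment; these numbers are illustrative only.
assessment_1 = [4.8, 5.1, 4.6, 5.0, 4.9]
assessment_2 = [5.9, 6.2, 5.8, 6.1, 6.0]
assessment_3 = [7.1, 7.4, 7.0, 7.3, 7.2]

f_stat, p_value = f_oneway(assessment_1, assessment_2, assessment_3)
print(f"F = {f_stat:.2f}, P = {p_value:.4f}")  # P < 0.05 => significant
```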

Results

The scores of the PG students improved over the three assessments in all seven clinical skills observed, and the differences across successive assessments were found to be statistically significant (P < 0.05) [Tables 1 and 2] [Figure 1].

Table 1: Summary of scores of clinical skills observed in assessments 1, 2, and 3
Table 2: Average score of clinical skills observed in assessments 1, 2, and 3
Figure 1: Mean scores of clinical skills observed in assessments 1, 2, and 3

The difference in the mean student satisfaction score across the three assessments was not statistically significant (P > 0.05). The level of satisfaction was uniform in all assessments, even though they were conducted at different times [Table 3].

Table 3: Summary of student satisfaction score observed in assessments 1, 2, and 3

The mean observation time was 10.2 min in assessment 1, 10.1 min in assessment 2, and 10 min in assessment 3; this difference was not statistically significant (P > 0.05), indicating that uniformity was maintained in all three assessments [Table 4].

Table 4: Summary of observation time in assessments 1, 2, and 3

The mean feedback time was 4.9 min in assessment 1, 5.0 min in assessment 2, and 5.0 min in assessment 3; this difference was not statistically significant (P > 0.05) [Figure 2].

Figure 2: Feedback time in assessments 1, 2, and 3

Discussion

With multiple encounters, the mini-CEX method of assessment evaluated residents across a greater variety of clinical settings with a diverse set of patient problems, consistent with other studies in the literature.[6,7] Multiple encounters with different assessors and patients gave the students the opportunity to learn about various patient problems, and the feedback was very useful to them.

Overall uniformity was maintained throughout the study period for every encounter: the mean student satisfaction, observation time, and feedback time were the same in all three assessments even though they were conducted at different times, as the differences between the three assessments were not statistically significant for these measures.

A similar environment was provided in all three assessments, though they were conducted at different times and under different scenarios. This is the most important aspect of mini-CEX: uniform assessment of PG students despite different sets of cases and evaluators, which is difficult to ensure with traditional methods of assessment.

In a single-blind, randomized, parallel-group, controlled trial, “Are workplace-based assessment methods (DOPS and Mini-CEX) effective in nursing students’ clinical skills?”, conducted among 108 senior nursing students, mini-CEX and direct observation of procedural skills (DOPS) were used to evaluate clinical skills in the intervention group. The mean of students’ scores in all five procedures was significantly higher in the intervention group than in the control group, and students’ scores for the procedures rose significantly from the first to the third stage of DOPS and mini-CEX.

Thus, the use of DOPS and mini-CEX for the evaluation of clinical skills in nursing students was found to effectively enhance their learning.[8]

In a systematic review and meta-analysis, “The educational impact of mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) and its association with implementation: A systematic review and meta-analysis,” the authors noted that there were positive effects of mini-CEX and DOPS on trainee performance in meta-analyses.[9]

Norcini et al. conducted a study, “The mini-CEX (clinical evaluation exercise): A preliminary investigation,” covering five internal medicine training programs in Pennsylvania. The study comprised 388 mini-CEX encounters involving 88 residents and 97 evaluators. The encounters occurred in both inpatient and ambulatory settings and were longer than anticipated (median duration, 25 min). Residents saw either new or follow-up patients who collectively presented with a broad range of clinical problems. The median evaluator assessed two residents and was generally satisfied with the mini-CEX format; residents were even more satisfied with it. The authors concluded that the reproducibility of the mini-CEX is higher than that of the traditional CEX and that its measurement characteristics are similar to those of other test formats, such as standardized patients and standardized oral examinations.[10]

In a study by Jafarpoor et al., “The effect of direct observation of procedural skills/mini-clinical evaluation exercise on the satisfaction and clinical skills of nursing students in dialysis,” conducted in Iran in 2018 to evaluate the impact of DOPS and mini-CEX on nursing students’ clinical skills and satisfaction, it was shown that students evaluated by the DOPS and mini-CEX methods had higher clinical performance evaluation scores and a higher level of satisfaction.[11]

In a study, “Assessment of clinical competence using objective structured examination,” by Harden et al., students rotated around a series of stations in the hospital ward. At one station, they were asked to carry out a procedure, such as taking a history, undertaking one aspect of the physical examination, or interpreting laboratory investigations in the light of a patient’s problem; at the next station, they answered questions on the findings at the previous station and their interpretation. The students were observed and scored at some stations by examiners using a checklist. The authors commented that in the structured clinical examination, the variables and complexity of the examination are more easily controlled, its aims can be more clearly defined, and more of the student’s knowledge can be tested. The examination is more objective, and a marking strategy can be decided in advance. It also results in improved feedback to students and staff.[12]

In a research article “What is wrong with assessment in postgraduate training? Lessons from clinical practice and educational research,” Driessen and Scheele opined “In postgraduate specialty training, we propose to shift the emphasis in workplace-based assessment from assessment of trainee performance to the learning of trainees. Workplace-based assessment should focus on supporting supervisors in taking entrustment decisions by complementing their ‘gut feeling’ with information from assessments and focus less on assessment and testability. One of the most stubborn problems with workplace-based assessment is the absence of observation of trainees and the lack of feedback based on observations. Non-standardized observations are used to organize feedback. To make these assessments meaningful for learning, it is essential that they are not perceived as summative by their users, that they provide narrative feedback for the learner and that there is a form of facilitation that helps to integrate the feedback in trainees’ self-assessments.”[13]

A study by Kim et al., “Implementation of a mini-CEX requirement across all third-year clerkships,” assessed the impact of a mini-CEX requirement across all third-year clerkships on student reports of direct observation by faculty and on objectively measured clinical skills.

The authors concluded that instituting a mini-CEX requirement was feasible across all third-year clerkships and was associated with a significant increase in student reports of direct observation by faculty and a decrease in summative objective structured clinical examination failure rates.[14]

In a systematic review by Kogan et al., conducted to identify observation tools used to assess medical trainees’ clinical skills with actual patients and to summarize the evidence for their validity and outcomes, it was stated that the strongest validity evidence has been established for the mini-CEX.[3]

A study by Behere, “Introduction of Mini-CEX in undergraduate dental education in India,” supports the use of mini-CEX in dental education.[15]

Norcini and Burch in the study “Workplace-based assessment as an educational tool: AMEE guide no. 31” emphasized the need for formative assessment that offers trainees the opportunity for feedback.[16]

Miller and Archer, in the systematic review “Impact of workplace based assessment on doctors’ education and performance,” found no evidence that alternative workplace-based assessment tools (mini-clinical evaluation exercise, DOPS, and case-based discussion) lead to an improvement in performance, although subjective reports of their educational impact are positive.[17]

In the article “Focus on formative feedback,” Shute reviewed the corpus of research on feedback, with a focus on formative feedback, and stated that formative feedback has been shown in numerous studies to improve students’ learning and enhance teachers’ teaching, to the extent that learners are receptive and the feedback is on target (valid), objective, focused, and clear.[18]

Brazil et al. in a study “Mini-CEX as a workplace-based assessment tool for interns in an emergency department—Does cost outweigh value?” stated that the mini-CEX assessment process was perceived as generally positive. Both interns and assessors felt that it provided a valid assessment of intern performance and enabled timely and specific feedback.[19]

Castanelli et al., in a study, “Perceptions of purpose, value, and process of the mini-clinical evaluation exercise in anesthesia training,” found that a consistent time commitment is necessary to embed the mini-CEX in the culture of the workplace, to realize its full potential for trainee learning, and to reach decisions on trainee progression.[20]

Conclusion

The scores of the PG students improved across the three assessments conducted over a period of 5 months. Mini-CEX is advantageous as an assessment tool over traditional methods of assessment with regard to the uniformity of assessment.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

References

1. Durning SJ, Cation LJ, Markert RJ, Pangaro LN. Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training. Acad Med 2002;77:900-4
2. Holmboe ES, Huot S, Chung J, Norcini J, Hawkins RE. Construct validity of the miniclinical evaluation exercise (miniCEX). Acad Med 2003;78:826-30
3. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA 2009;302:1316-26
4. Marinho MCB, Castro Júnior EF de, Lauterbach G da P, Nunes M do PT, Augusto KL. Analysis of the perception of interns, residents, and preceptors through the mini-CEX evaluation method (mini-clinical evaluation exercise). Rev Bras Educ Med 2020;44:e094
5. Nair BR, Alexander HG, McGrath BP, Parvathy MS, Kilsby EC, Wenzel J, et al. The mini clinical evaluation exercise (mini-CEX) for assessing clinical performance of international medical graduates. Med J Aust 2008;189:159-61
6. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: A method for assessing clinical skills. Ann Intern Med 2003;138:476-81
7. Norcini JJ. The mini clinical evaluation exercise (mini-CEX). Clin Teach 2005;2:25-30
8. Jasemi M, Ahangarzadeh Rezaie S, Hemmati Maslakpak M, Parizad N. Are workplace-based assessment methods (DOPS and mini-CEX) effective in nursing students’ clinical skills? A single-blind randomized, parallel group, controlled trial. Contemp Nurse 2019;55:565-75
9. Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, et al. The educational impact of mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 2018;13:e0198009
10. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): A preliminary investigation. Ann Intern Med 1995;123:795-9
11. Jafarpoor H, Hosseini M, Sohrabi M, Mehmannavazan M. The effect of direct observation of procedural skills/mini-clinical evaluation exercise on the satisfaction and clinical skills of nursing students in dialysis. J Educ Health Promot 2021;10:74
12. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51
13. Driessen E, Scheele F. What is wrong with assessment in postgraduate training? Lessons from clinical practice and educational research. Med Teach 2013;35:569-74
14. Kim S, Willett LR, Noveck H, Patel MS, Walker JA, Terregino CA. Implementation of a mini-CEX requirement across all third-year clerkships. Teach Learn Med 2016;28:424-31
15. Behere R. Introduction of mini-CEX in undergraduate dental education in India. Educ Health (Abingdon) 2014;27:262-8
16. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no. 31. Med Teach 2007;29:855-71
17. Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: A systematic review. BMJ 2010;341:c5064
18. Shute VJ. Focus on formative feedback. Rev Educ Res 2008;78:153-89
19. Brazil V, Ratcliffe L, Zhang J, Davin L. Mini-CEX as a workplace-based assessment tool for interns in an emergency department—Does cost outweigh value? Med Teach 2012;34:1017-23
20. Castanelli DJ, Jowsey T, Chen Y, Weller JM. Perceptions of purpose, value, and process of the mini-clinical evaluation exercise in anesthesia training. Can J Anaesth 2016;63:1345-56
Keywords:

Assessment; mini-CEX; skills

© 2023 D Y Patil Journal of Health Sciences