Education systems have changed dramatically over the past 20 years. The growing technical complexity of science, the influx of information, and the influence of economics on scientific development have driven change not only in institutions but also in ways of thinking, expressed through new teaching and assessment methods grounded in an understanding of how children and adults learn differently.
Performance assessment should be a dynamic, systematic, and structured process that identifies and involves assessment objectives, the selection and use of multiple tools and instruments according to those objectives, and the application of behaviors derived from this process to optimize and guide learning.1,2
A literature search spanning 1999 to 2017 focused on the assessment of anesthesiology students' performance, seeking to describe its theoretical and pedagogical foundations, educational principles, assessment tools, and implementation strategies from the perspective of programmatic assessment and assessment for learning in this practical knowledge area. Of the 110 articles reviewed, 73 were considered relevant for the review (Table 1, Fig. 1).
Below, readers will find the most relevant results of this narrative (non-systematic) literature review. First, we explain how the concept of competency has modified the anesthesiology assessment process over the past two decades, through a brief description of the domains and learning theories applied in anesthesiology. We then present the assessment instruments and systems currently recommended for the performance assessment of anesthesiology graduate students.
Over their professional lives, anesthesiologists develop a number of complex skills that must be learned during training and honed with practice. Teachers have a responsibility to know which skills they should teach and how to teach them, when to delegate responsibilities, and when a resident is able to deal with the real world in unsupervised conditions.58
Some authors propose building on the classification proposed by Gaba et al,59 based on the concept of "situation awareness," which describes three basic aspects anesthesiologists should develop during training for conscious decision-making: interpretation of subtle signals, interpretation and management of evolving situations, and application of specialized knowledge.58,59,83
Gaba et al classify the competencies in which anesthesiologists should be trained in both technical and non-technical skills.6,7,58,59 The term “technical skills” refers to the execution of actions based on medical knowledge and technical perspective, focused on the control of the body and thought (Table 2).38 The most studied are orotracheal intubation, vascular catheterization, regional anesthesia, crisis management, pain management, patient assessment, and critical care management.60,71
The concept of non-technical skills refers to the development of cognitive and social skills and personal resources that enable safe and efficient task performance.6,80 The acquisition of these types of skills40 is what decreases the possibility of error and adverse events in patient care7 (Fig. 2).
Currently, there are multiple theoretical frameworks focused on the application of different competency-based models (ACGME, CanMEDS, Union of European Medical Specialists [UEMS], SCARE, etc.), which have reached different degrees of development and scope (Table 3).18–22,30 Perhaps the most current vision is the approach based on the entrustable professional activities proposed by Ten Cate since 2010, still under in-depth study in this area of medicine.27–29
How is it assessed in anesthesiology?
Purpose of the assessment
For years, assessment in anesthesia has focused on summative competency assessment related to clinical practice, patient interaction, and critical situation analysis, often at the end of rotations. Currently, it is proposed to emphasize real-time, sequential, and progressive process assessment and learning individualization, as well as the relevance of feedback within this process.4–6
Content of the assessment
The assessment of technical and non-technical performance should have equal weight when establishing judgment.59,70,75 Traditionally, assessment in anesthesia has been limited to theoretical knowledge tests as the main source of information, coupled with unstructured direct observation of daily work and isolated logs of information without feedback focused on technical skill acquisition (Fig. 3).
A study by Ross et al found that most assessments were related to the "patient care" and "medical knowledge" competencies (patient care, anesthetic plan, and behavior [35%], followed by use and interpretation of monitoring and equipment [8.5%]). A further 10.2% were related to practice-based learning and improvement, most commonly self-directed learning (6.8%), and 9.7% to the system-based practice competency.11
Although the educational literature supports the usefulness of multiple tools to assess performance, habit leads to a single tool being used to define performance (global rotation assessments and multiple-choice tests).4 This type of assessment suffers from well-known limitations: misuse and misinterpretation of rating scales, subjective performance judgments, and the "halo" effect, whereby the result is determined by what is known to have occurred in the past.4–6
Despite the interest of the European Society of Anaesthesiology (the UEMS/European Board of Anaesthesiology) in harmonizing assessment and certification tools for anesthesia programs in Europe,20–22,24 a recent study in the European Union10 found that the assessment and certification processes for anesthesia specialist training were diverse. In many countries the traditional time-based learning model remains active, with an average duration of 5 years (range 2.75–7). The programs with the greatest number of assessment tools were competency-based (mean 9.1 [SD 2.97] vs. 7.0 [SD 1.97]; p = 0.03). The most frequently mentioned tools were direct clinical observation, feedback, oral questions and/or multiple-choice tests, procedure logs, and portfolios. Most countries had a certification process at the national level.
Some competency-based anesthesia programs, such as that of the University of Ottawa in Canada,7 suggest simplified assessment tools like those used in the oral exams of the Royal College of Physicians and Surgeons of Canada, which serve to evaluate residents’ medical knowledge and critical thinking associated with questions intended to guide their learning.
Simulation-based assessment is perhaps one of the most evidence-based tools to acquire competencies in the management of simulated intraoperative events; however, further studies are needed to determine its validity in terms of clinical performance and knowledge transfer.60,62
Some studies focused on measuring the effectiveness and validity of methods such as the script concordance test, OSCE, Mini-CEX, and DOPS, among others, have shown their usefulness for assessing graduate anesthesiology students, but at a higher cost.4–6,65
In parallel with the difficulties of applying these "novel" assessment methods within anesthesia practice, Tetzlaff has demonstrated the virtues of problem-based assessment for the acquisition and assessment of both technical and non-technical competencies at a more reasonable cost.4–6
To summarize, anesthesiology assessment is characterized in most cases by a gap between what should be evaluated and what is ultimately evaluated, with greater importance given to theoretical knowledge than to procedural skills and clinical judgment when assessing residents. This prevents proper assessment of the competency level reached by the student, as it does not constitute a comprehensive assessment.
The advent of multiple assessment instruments designed under the precept of “evaluating to learn” and the assessment usefulness formula proposed by Van der Vleuten et al36 show that anesthesiology has a significant gap between the application of such instruments in specific teaching situations and competency-based assessment in this specialty10–17,25–32,37–46,56,57,60–67,71–78,82–89 (Table 4).
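The assessment usefulness formula cited above is commonly presented as a multiplicative model. As a conceptual sketch (the wording of the criteria here follows the commonly cited form, not necessarily the authors' exact notation), Van der Vleuten's criteria combine as:

Utility = Reliability × Validity × Educational impact × Acceptability × Cost-effectiveness

Because the criteria multiply rather than add, a near-zero value on any single criterion drives the overall utility of an assessment toward zero, which illustrates why no single instrument can be sufficient for all competencies.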
The need to assess anesthesiology residents in the clinical setting is evident; however, there is no consensus on the use, let alone the selection, of the best strategy to assess their performance and learning. Although the assessment trend is focused on the "patient care" and "medical knowledge" competencies, there is great interest in other types of tools and instruments for other competencies within the ACGME and CanMEDS theoretical frameworks. For example, to assess non-technical skills, the University of Aberdeen in Scotland designed the Anaesthetists' Non-Technical Skills (ANTS) tool, currently incorporated by the UK's Royal College of Anaesthetists for the routine assessment of anesthesia residents and as a possible national selection tool for future anesthesiologists.80,87
The development of an appropriate programmatic assessment system must start from the fact that no single assessment method or tool is intrinsically superior or sufficient to assess all competencies. Regardless of the proposed curricular model, programs should therefore ensure the design and implementation of assessment methods consistent with the curricular philosophy, according to their priorities and learning objectives.
The analysis of assessment from its educational impact and historical development indicates that the way assessment is carried out substantially influences the change in students’ learning styles; hence the importance of not making assessment an isolated measure of student performance.
Competency is content or context specific, and therefore more than one method or measurement is required to assess it, in addition to being appropriate to the learning level.13 This highlights the importance of an assessment program that includes—in a structured manner and in line with the curriculum philosophy—the use of multiple instruments to obtain the greatest amount of data and attributes related to student performance.
It is easier to recognize a competency when it has been developed than when it is absent,33 so it is important to assess all aspects of training, particularly in areas where procedural skill acquisition appears to be of most importance and less attention is paid to the acquisition of the trainees’ other professional skills.
Programs and teachers have the responsibility to define the complex competencies and skills to be learned35,36 and how to teach and evaluate them, to recognize when to delegate responsibilities and when the resident can face the real world in unsupervised conditions.
Every day there are more resources to turn assessment into a transformative tool for learning. Multiple competency measurement instruments based on the traditional Miller pyramid now make it possible to assess both technical and non-technical skills based on residents' process and progress, and to apply the often-discussed concept of student individualization more broadly.13,33,90
Although there is still a long way to go in the area of anesthesia, there is great concern for perfecting and studying the impact of other types of tools and instruments in specific scenarios of the specialty. Curricular reforms, a change of vision and the professionalization of the medical discipline have expanded the room for improvement in the teaching area, as well as the application of new assessment strategies and instruments that could be positive and increase the likelihood of “significant learning” in anesthesiology residents.
The article is based on and follows the "Scientific, technical and administrative standards for health research" established in Resolution 8430 of 1993 of the Ministry of Health of the Republic of Colombia. The study was deemed low-risk research requiring written informed consent, as it used private documents, as well as opinions and personal data, the use of which may cause psychological and/or social changes or modifications of human behavior.
Study assistance: none.
The authors have no funding to disclose.
Conflicts of interest
The authors have no conflicts of interest to disclose.
1. Van Der Vleuten CP. Revisiting assessing professional competence: from methods to programmes. Med Educ 2016;50:885–888.
2. Celman S. ¿Es posible mejorar la evaluación y transformarla en herramienta de conocimiento? In: Camilloni AW, ed. La evaluación de los aprendizajes en el debate didáctico contemporáneo. Buenos Aires: Paidós; 1998.
3. Colbert CY, Dannefer EF, French JC. Clinical competency committees and assessment: changing the conversation in graduate medical education. J Grad Med Educ 2015;7:162–165.
4. Tetzlaff JE. Assessment of competency in anesthesiology. Anesthesiology 2007;106:812–825.
5. Tetzlaff JE. Assessment of competence in anesthesiology. Curr Opin Anaesthesiol 2009;22:809–813.
6. Tetzlaff JE. Evaluation of anesthesia residents. In: Frost EAM, ed. Comprehensive Guide to Education in Anesthesia. New York: Springer; 2014:129–145.
7. Fraser AB, Stodel EJ, Jee R, et al. Preparing anesthesiology faculty for competency-based medical education. Surv Anesthesiol 2017;61:32–33.
8. Bould MD, Naik VN, Hamstra SJ. Review article: new directions in medical education related to anesthesiology and perioperative medicine. Can J Anesth 2012;59:136–150.
9. Boet S, Pigford AAE, Naik VN. Program director and resident perspectives of a competency-based medical education anesthesia residency program in Canada: a needs assessment. Korean J Med Educ 2016;28:157–168.
10. Jonker G, Manders L, Marty A, et al. Variations in assessment and certification in postgraduate anaesthesia training: a European survey. Br J Anaesth 2017;119:1009–1014.
11. Ross FJ, Metro DG, Beaman ST, et al. A first look at the Accreditation Council for Graduate Medical Education anesthesiology milestones: implementation of self-evaluation in a large residency program. J Clin Anesth 2016;32:17–24.
12. Yamamoto S, Tanaka P, Madsen MV, et al. Comparing anesthesiology residency training structure and requirements in seven different countries on three continents. Cureus 2017;9:e1060.
13. Durante E. Algunos métodos de evaluación de las competencias: Escalando la pirámide de Miller. Revista del Hospital Italiano 2006;55–61.
14. Ebert TJ, Fox CA. Competency-based education in anesthesiology: history and challenges. Anesthesiology 2014;120:24–31.
15. Frost E. Comprehensive Guide to Education in Anesthesia. New York: Springer Science+Business Media; 2014.
16. Baker K. Determining resident clinical performance: getting beyond the noise. Anesthesiology 2011;115:862–878.
17. Boulet JR, Murray D. Review article: assessment in anesthesiology education. Can J Anesth 2012;59:182–192.
18. The Accreditation Council for Graduate Medical Education and The American Board of Anesthesiology. The Anesthesiology Milestone Project. [Internet]. Accreditation Council for Graduate Medical Education. [Cited 2020 May 10]. Available at: https://www.acgme.org/
19. The Royal College of Physicians and Surgeons of Canada [Internet]. Royal College of Physicians and Surgeons of Canada. [Cited 2020 May 12]. Available at: http://www.royalcollege.ca/rcsite/canmeds-e
20. The Standing Committee on Education and Professional Development of the Section and Board of Anaesthesiology. European Training Requirement ETR in Anesthesiology [Internet]. Available at: https://www.uems.eu/about-us/medical-specialties
21. Larsson J, Holmström I. Understanding anesthesia training and trainees. Curr Opin Anaesthesiol 2012;25:681–685.
22. Gessel EV, Mellin-Olsen J, Østergaard HT, et al. Postgraduate training in anaesthesiology, pain and intensive care. Eur J Anaesthesiol 2012;29:165–168.
23. Chiu M, Crooks S, Tarshis J, et al. Simulation-based assessment of anesthesiology residents’ competence: development and implementation of the Canadian National Anesthesiology Simulation Curriculum (CanNASC). Can J Anesth 2016;63:1357–1363.
24. Carlsson C, Keld D, Van Gessel E, et al. Education and training in anaesthesia – revised guidelines by the European Board of Anaesthesiology, Reanimation and Intensive Care: Section and Board of Anaesthesiology, European Union of Medical Specialists. Eur J Anaesthesiol 2008;25:528–530.
25. Rebel A, Dilorenzo A, Nguyen D, et al. Should objective structured clinical examinations assist the clinical competency committee in assigning anesthesiology milestones competency? Anesth Analg 2019;129:226–234.
26. Cate OT. Entrustability of professional activities and competency-based training. Med Educ 2005;39:1176–1177.
27. Cate OT. Nuts and bolts of entrustable professional activities. J Grad Med Educ 2013;5:157–158.
28. Wisman-Zwarter N, Schaaf MVD, Cate OT, et al. Transforming the learning outcomes of anaesthesiology training into entrustable professional activities. Eur J Anaesthesiol 2016;33:559–567.
29. Jonker G, Hoff RG, Cate OTJT. A case for competency-based anaesthesiology training with entrustable professional activities. Eur J Anaesthesiol 2015;32:71–76.
30. The Anesthesiology Milestone Project. J Grad Med Educ 2014;6(1 suppl 1):15–28.
31. Sivaprakasam J, Purva M. CUSUM analysis to assess competence: what failure rate is acceptable? Clin Teach 2010;7:257–261.
32. Neira VM, Bould MD, Nakajima A, et al. GIOSAT: a tool to assess CanMEDS competencies during simulated crises. Can J Anesth 2013;60:280–289.
33. Jaramillo S, Vargas R, Cañadas R, Rincón R, et al. Cómo aprenden los adultos: una aproximación desde la enseñanza médica. In: Currículo nuclear en endoscopia digestiva: fundamentos teóricos y propuesta curricular. Bogotá: Panamericana; 2018:15–25.
34. Sociedad Colombiana de Anestesiología y Reanimación. Documento marco del Plan de Estudios y Competencias para un Programa de Anestesiología en Colombia. Bogotá: SCARE; 2017.
35. Van der Vleuten C, Schuwirth L, Driessen E, et al. A model for programmatic assessment fit for purpose. Med Teach 2012;34:205–214.
36. Van Der Vleuten CPM, Schuwirth LWT, Scheele F, et al. The assessment of professional competence: building blocks for theory development. Best Pract Res Clin Obstet Gynaecol 2010;24:703–719.
37. Bilotta F, Titi L, Lanni F, et al. Training anesthesiology residents in providing anesthesia for awake craniotomy: learning curves and estimate of needed case load. J Clin Anesth 2013;25:359–366.
38. Ramírez LJ, Moreno MA, Gartdner L, et al. Modelo de enseñanza de las habilidades psicomotoras básicas en anestesia para estudiantes de ciencias de la salud: sistematización de una experiencia. Colombian Journal of Anesthesiology 2008;36:85–92.
39. Aguirre Ospina OD, Ríos Medina ÁM, Calderón Marulanda M, et al. Cumulative Sum learning curves (CUSUM) in basic anaesthesia procedures. Colombian Journal of Anesthesiology 2014;42:142–153.
40. Stiegler MP, Tung A. Cognitive processes in anesthesiology decision making. Anesthesiology 2014;120:204–217.
41. Enser M, Moriceau J, Abily J, et al. Background noise lowers the performance of anaesthesiology residents’ clinical reasoning when measured by script concordance. Eur J Anaesthesiol 2017;34:464–470.
42. Echevarría Moreno M, Prieto Vera C, Martin Telleria A, et al. The objective structured clinical evaluation of teaching in anaesthesiology and resuscitation. Rev Esp Anestesiol Reanim 2012;59:134–141.
43. Ben-Menachem E, Ezri T, Ziv A, et al. Objective structured clinical examination-based assessment of regional anesthesia skills: the israeli national board examination in anesthesiology experience. Anesth Analg 2011;112:242–245.
44. Ahmed O, O’Donnell B, Gallagher A, et al. Development of performance and error metrics for ultrasound-guided axillary brachial plexus block. Adv Med Educ Pract 2017;5:257–263.
45. Cheung JJH, Chen EW, Darani R, et al. The creation of an objective assessment tool for ultrasound-guided regional anesthesia using the delphi method. Reg Anesth Pain Med 2012;37:329–333.
46. Chin KJ, Tse C, Chan V, et al. Hand motion analysis using the Imperial College surgical assessment device: validation of a novel and objective performance measure in ultrasound-guided peripheral nerve blockade. Reg Anesth Pain Med 2011;36:213–219.
47. Chuan A, Thillainathan S, Graham PL, et al. Reliability of the direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesth Intensive Care 2016;44:201–209.
48. Watson MJ, Wong DM, Kluger R, et al. Psychometric evaluation of a direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesthesia 2014;69:604–612.
49. Laurent DA, Niazi A, Cunningham M, et al. A valid and reliable assessment tool for remote simulation-based ultrasound-guided regional anesthesia. Reg Anesth Pain Med 2014;39:496–501.
50. Corvetto MA, Fuentes C, Araneda A, et al. Validation of the imperial college surgical assessment device for spinal anesthesia. BMC Anesthesiol 2017;17:131.
51. Chuan A, Wan AS, Royse C, et al. Competency-based assessment tools for regional anaesthesia: a narrative review. Br J Anaesth 2017;120:264–273.
52. Chuan A, Graham PL, Wong DM, et al. Design and validation of the Regional Anaesthesia Procedural Skills Assessment Tool. Anaesthesia 2015;70:1401–1411.
53. Hastie MJ, Spellman JL, Pagnano PP, et al. Designing and implementing the objective structured clinical examination in anesthesiology. Anesthesiology 2014;120:196–203.
54. Farnan JM, Petty LA, Georgitis E, et al. A systematic review: the effect of clinical supervision on patient and residency education outcomes. Acad Med 2012;87:428–442.
55. Moore DL, Ding L, Sadhasivam S. Novel real-time feedback and integrated simulation model for teaching and evaluating ultrasound-guided regional anesthesia skills in pediatric anesthesia trainees. Paediatr Anaesth 2012;22:847–853.
56. Riveros R, Kimatian S, Castro P, et al. Multisource feedback in professionalism for anesthesia residents. J Clin Anesth 2016;34:32–40.
57. Smith SE, Tallentire VR. The right tool for the right job: the importance of CUSUM in self-assessment. Anaesthesia 2011;66:747.
58. Gaba DM, Howard SK, Flanagan B, et al. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology 1998;89:8–18.
59. Gaba DM, Howard SK, Gan BF, et al. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Surv Anesthesiol 1999;43:111–112.
60. Murray DJ, Boulet JR, Avidan M, et al. Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology 2007;107:705–713.
61. Murray DJ, Boulet JR, Kras JF, et al. A simulation-based acute skills performance assessment for anesthesia training. Anesth Analg 2005;101:1127–1134.
62. Rábago JL, López-Doueil M, Sancho R, et al. Learning outcomes evaluation of a simulation-based introductory course to anaesthesia. Rev Esp Anestesiol Reanim 2017;64:431–440.
63. Fehr JJ, Boulet JR, Waldrop WB, et al. Simulation-based assessment of pediatric anesthesia skills. Anesthesiology 2011;115:1308–1315.
64. Lammers RL, Davenport M, Korley F, et al. Teaching and assessing procedural skills using simulation: metrics and methodology. Acad Emerg Med 2008;15:1079–1087.
65. Rebel A, DiLorenzo A, Fragneto RY, et al. Objective assessment of anesthesiology resident skills using an innovative competition-based simulation approach. A A Case Rep 2015;5:79–87.
66. Schwid HA, Rooke GA, Carline J, et al. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology 2002;97:1434–1444.
67. Byrne AJ, Greaves JD. Assessment instruments used during anaesthetic simulation: review of published studies. Br J Anaesth 2001;86:445–450.
68. Corvetto MA, Bravo MP, Montaña RA, et al. Bringing clinical simulation into an anesthesia residency training program in a university hospital. Participants’ acceptability assessment. Rev Esp Anestesiol Reanim 2013;60:320–326.
69. Hindman BJ, Dexter F, Smith TC. Anesthesia residents’ global (departmental) evaluation of faculty anesthesiologists’ supervision can be less than their average evaluations of individual anesthesiologists. Anesth Analg 2015;120:204–208.
70. Mitchell JD, Holak EJ, Tran HN, et al. Are we closing the gap in faculty development needs for feedback training? J Clin Anesth 2013;25:560–564.
71. O'Sullivan O, Shorten GD. Formative assessment of ultrasound-guided regional anesthesia. Reg Anesth Pain Med 2011;36:522–523.
72. Hindman BJ, Dexter F, Kreiter CD, et al. Determinants, associations, and psychometric properties of resident assessments of anesthesiologist operating room supervision. Anesth Analg 2013;116:1342–1351.
73. Bindal N, Goodyear H, Bindal T, et al. DOPS assessment: a study to evaluate the experience and opinions of trainees and assessors. Med Teach 2013;35:e1230–e1234.
74. De Oliveira Filho GR, Dal Mago AJ, Garcia JHS, et al. An instrument designed for faculty supervision evaluation by anesthesia residents and its psychometric properties. Anesth Analg 2008;107:1316–1622.
75. Ahmed K, Miskovic D, Darzi A, et al. Observational tools for assessment of procedural skills: a systematic review. Am J Surg 2011;202:469–480.e6.
76. Rebel A, DiLorenzo AN, Fragneto RY, et al. A competitive objective structured clinical examination event to generate an objective assessment of anesthesiology resident skills development. A A Case Rep 2016;6:313–319.
77. Norris A, McCahon R. Cumulative sum (CUSUM) assessment and medical education: a square peg in a round hole. Anaesthesia 2011;66:250–254.
78. Khaliq T. Reliability of results produced through objectively structured assessment of technical skills (OSATS) for endotracheal intubation (ETI). J Coll Physicians Surg Pak 2013;23:51–55.
79. Flin R, Patey R, Glavin R, et al. Anaesthetists’ non-technical skills. Br J Anaesth 2010;105:38–44.
80. Flin R, Patey R. Non-technical skills for anaesthetists: developing and applying ANTS. Best Pract Res Clin Anaesthesiol 2011;25:215–227.
81. Graham J, Hocking G, Giles E. Anaesthesia non-technical skills: can anaesthetists be trained to reliably use this behavioural marker system in 1 day? Br J Anaesth 2010;104:440–445.
82. Ahmed A. Assessment of procedural skills in anesthesiology trainees: changing trends. Anaesth Pain Intensive Care 2014;18:135–136.
83. Bould MD, Crabtree NA, Naik VN. Assessment of procedural skills in anesthesia. Br J Anaesth 2009;103:472–483.
84. Witt A, Iglesias S, Ashbury T. Evaluation of Canadian family practice anesthesia training programs: can the resident logbook help? Can J Anesth 2012;59:968–973.
85. Weller JM, Castanelli DJ, Chen Y, et al. Making robust assessments of specialist trainees’ workplace performance. Br J Anaesth 2017;118:207–214.
86. Weller JM, Jones A, Merry AF, et al. Investigation of trainee and specialist reactions to the mini-clinical evaluation exercise in anaesthesia: implications for implementation. Br J Anaesth 2009;103:524–530.
87. Kathirgamanathan A, Woods L. Educational tools in the assessment of trainees in anaesthesia. Contin Educ Anaesth Crit Care Pain 2011;11:138–142.
88. Castanelli DJ, Jowsey T, et al. Perceptions of purpose, value, and process of the mini-clinical evaluation exercise in anesthesia training. Can J Anesth 2016;63:1345–1356.
89. Colbert-Getz J, Ryan M, Hennessey E, et al. Measuring assessment quality with an assessment utility rubric for medical education. MedEdPORTAL 2017;13:10588.
90. Van Meeuwen LW, Brand-Gruwel S, Kirschner PA, et al. Fostering self-regulation in training complex cognitive tasks. Educ Tech Res Dev 2018;66:53.