
A case for competency-based anaesthesiology training with entrustable professional activities

An agenda for development and research

Jonker, Gersten; Hoff, Reinier G.; ten Cate, Olle Th. J.

European Journal of Anaesthesiology (EJA): February 2015 - Volume 32 - Issue 2 - p 71-76
doi: 10.1097/EJA.0000000000000109



Every day, attending anaesthesiologists delegate clinical tasks to residents. Most decisions to delegate are made informally and implicitly, often based on limited personal impressions of the knowledge, skills, attitudes and trustworthiness of the trainee. Attending anaesthesiologists should be able to justify these decisions. Such justification is of prime importance in guaranteeing quality of care, patient safety, the supervisor's liability and educational appropriateness. Justification becomes even more important as society increasingly expects the medical profession to account for the quality of its members, including that of residents and graduates.1,2 Demands are made on specialty-training programmes to demonstrate that their graduates have mastered all desirable abilities, or competencies, at completion of training.3,4 In addition, clear end-points in specialty training make benchmarking and comparison of training programmes at a national or international level possible.

This is what competency-based education is aimed at.5 However, competencies that set out the desirable traits of a doctor in general terms are difficult to observe and assess in day-to-day clinical practice,6–11 making proof of mastery of competencies and comparison of training programmes difficult. In this article, we discuss competency-based training and its problems, and move towards a potential solution: the emerging concept of entrustable professional activities (EPAs).12 We will describe how EPAs link competencies to clinical tasks and how EPAs allow justification of decisions to delegate through observation of clinical performance. Lastly, we will set an agenda for curriculum development and research on this topic.

Competency-based training

For more than a decade, training in medical specialties has been based around competency frameworks, such as that of the Accreditation Council for Graduate Medical Education (ACGME) in the United States or the CanMEDS framework of the Royal College of Physicians and Surgeons of Canada.13 These frameworks set out, for several competency domains, the abilities considered critical for professional practice.3 Competency-based training is intended to shift the emphasis from knowledge acquisition to knowledge application,14 with attained competence rather than time-in-training being the key.15 Competency-based training has its limitations. A potential pitfall is that the art of medicine is broken down into a detailed list of competencies taken out of context.12,15–17 Often, behavioural descriptors are formulated in universal terms that are not designed for a specific specialty.18 Many clinical educators see current competency frameworks as theoretical and somewhat detached from day-to-day practice,6–9 and competencies are difficult to assess as separate entities while supervising residents.10,11 The true goal of training should not be to attain competencies, but rather to become a doctor with expertise, who is ready to bear professional responsibility and who can be entrusted with the care of patients.1,12,16,18

A possible way to link competencies to clinical activities is the concept of EPA.19 When the performance of daily clinical activities is evaluated well, it is possible to draw inferences about the attainment of predefined competencies.12,18

Entrustable professional activities

An EPA is a task or responsibility essential to the practice of a specialty. It can be executed individually by a trained professional within a circumscribed time frame and requires specific knowledge, skills and attitudes. Each EPA encompasses competencies from different domains and can be observed and assessed by a supervisor. Once a resident has demonstrated an adequate level of performance, supervisors can entrust the resident to execute the EPA without direct supervision.18–21 EPAs remove competencies from a theoretical framework by attaching them to the familiar context of clinical practice.16,22 They clarify the learning objectives of a programme or rotation by displaying the training opportunities that exist in daily work, and offer guidance to assessors on what they should assess.19–21,23,24

Examples of EPAs in anaesthesiology are general anaesthesia in an American Society of Anesthesiologists (ASA) I to II day surgery patient, handover of a patient to the recovery room, epidural analgesia for labour, admitting a critically ill patient to an ICU, anaesthesia for a common procedure in an ASA I to II infant and trauma life support (Appendix 1, Supplemental Digital Content). One could think of approximately 5 to 10 EPAs per year of training, with the complete set of EPAs covering the breadth and depth of the specialty.

A practical procedure, such as inserting a central venous line, would not, on its own, be considered an EPA, but would be incorporated as an item (skill) in several EPAs across several subspecialties, which in this case might include intensive care, cardiac anaesthesia and neuro-anaesthesia. The same holds true for clinical knowledge: use of inotropes, for example, might feature as an item in several EPAs from different subspecialties, but would not stand alone as an EPA. Similarly, competencies such as participating in an interprofessional healthcare team may be observed in many EPAs, such as trauma life support, but would not constitute an independent EPA.

Competency domains and EPAs can be seen as the two dimensions of a grid, the competencies-activities matrix (Table 1).18,20 By observing performance of an EPA, one implicitly observes several domains of competence.20 Typically, several domains of competence are explicitly relevant for each EPA, whereas other domains weigh more heavily in other EPAs. If a resident is unable to perform an EPA, deficits in the competencies that underpin this activity can be identified and used for feedback and further learning.16,21,22

Table 1: The competencies-activities matrix

There may be both intra-individual and inter-individual differences in learning curves for EPAs, and residents may vary in the sequence in which they master them (Fig. 1).20 In a developmental trajectory, junior residents will master simple EPAs whilst only more advanced trainees will master complex EPAs.

Fig. 1: Acquisition of competence. (a) Acquisition of competence, showing the competence threshold (corresponding to proficiency level IV) and continuing growth of expertise after delegation of a clinical activity. (b) Acquisition of competence for 5 different EPAs. On reaching the competence threshold, an informed and justified delegation decision can be made. Adapted with permission from reference 20.

EPAs clarify the level of proficiency by indicating the resident's responsibilities and the necessary level of supervision. For any EPA, five levels of proficiency can be distinguished, which translate into the levels of supervision to be provided (Table 2).18,20,25,26 This level of proficiency should be recorded in the resident's portfolio.

Table 2: Levels of proficiency for an entrustable professional activity25,26

On reaching level IV, a formal decision is to be made to delegate an EPA to a resident to carry out without close supervision.18,20 This decision should be based on several assessments of the performance of that particular EPA by a number of assessors, and should be recorded in the resident's portfolio. Decisions to delegate may be designated as STARs (Statements of Awarded Responsibility)18 or digital badges.27 This method is in sharp contrast to the traditional situation, which is characterised by implicit, informal and uninformed delegation.

Although a resident may be viewed as competent after reaching level IV, the proficiency of this trainee is likely to further increase with added experience (Fig. 1).20 After the removal of close supervision, the resident should still be able to ask for help and assistance from someone more experienced if necessary, just as permanent staff might do.

Premature, unsupervised care provided by residents adversely influences patient safety, healthcare costs and liability of supervisors.28 Conversely, an overprotective approach until the completion of training means that junior attending specialists will not have learned to bear critical clinical responsibilities. With the use of EPAs, deliberate decisions to delegate based on many assessments by multiple assessors can be made for specific tasks. This makes graded assumption of responsibility during residency training possible16 and enhances learning.19,24,28

For a training programme, a developmental timeline of levels of supervision can be set out (Table 3). The timeline indicates which level of proficiency (Table 2) a resident would be expected to have reached at specified stages in training. In addition, it shows the expected timing of decisions to dispense with close supervision.

Table 3: Expected level of supervision by training stage

Future steps in anaesthesiology training: an agenda for development and research

Training programmes based on EPAs are being explored and developed, across the world and across specialties, such as in psychiatry in New Zealand and Australia,21,29 internal medicine,30 family medicine31 and paediatrics in the USA, orthopaedic surgery in Canada, obstetrics and gynaecology32 and physician assistant training33 in The Netherlands. Anaesthesiology training programmes throughout the world are organised in an outcome- and competency-based way,34–36 but, to our knowledge, to date, none have been based on EPAs. To build a training programme with EPAs, consensus should be sought locally, nationally or even internationally on which essential clinical tasks define the specialty.12,21,26,30

For each EPA, the task content and context should be set out. Each EPA should be mapped to the relevant competencies in a competencies-activities matrix (Table 1).12 These competencies must be set out in an observable behaviour format to make them implicitly assessable by observing the EPA.

How best to assess complex tasks robustly and accurately remains the central question.23 Currently, there is no single tool for assessing performance of EPAs; what is needed is the development of assessment systems that incorporate both quantitative and qualitative tools.1,37,38 Different EPAs may use different sets of tools, such as high- and low-fidelity simulation, Objective Structured Assessment of Technical Skills (OSATS), Direct Observation of Procedural Skills (DOPS), Mini Clinical Evaluation Exercise (Mini-CEX)38–40 and multisource feedback procedures. These or other instruments should cover the competencies that have been identified as crucial for the particular EPA being assessed. The role of different tools within an EPA assessment system needs to be established and tested in practice for validity, reliability, generalisability and educational outcome.40 Moreover, standards of good performance have to be defined,37 and judges should be trained and ‘calibrated’ with examples of performance levels for an EPA.1,23

Progress in mastery of EPAs, together with reports, records and comments, can be kept in a digital portfolio and used to assist formal decisions to delegate. Learning analytics technology can be used to inform learners about their progress and to establish benchmarks for justified delegation. Learning analytics may also provide insight into training performance at the institutional or (inter)national level, making comparison of institutions possible.

Finally, apart from knowledge and skill, the factors affecting the trustworthiness of a resident need further investigation.18,28 Trustworthiness is difficult to quantify, but incorporates understanding of one's limitations, conscientiousness and truthfulness.16,41 Although resident proficiency might be the central issue, trustworthiness remains a significant factor in delegation; its exact role needs to be elucidated.

A research and development agenda (Table 4) should therefore include the identification and sharing of essential EPAs for anaesthesiology training, the identification or construction of instruments to allow for justification of delegation and the adaptation of training programmes, gearing them to qualify anaesthesiologists with established high levels of competence.

Table 4: Development and research

Directly linked to the development of an EPA-based training programme are the challenges that come with the implementation of any new training system. There may be initial resistance before residents and supervisors accept the proposed changes.42 Organisational challenges might include allowing a variable time spent in training and the consequent problems with enrolment planning for subspecialty training posts.42 Residents’ acceptance depends on clear expectations of the programme and explicit learning objectives. Programme directors also need a well defined view on stimulating and rewarding advancement in training.42 It is essential that supervisors are convinced of the advantages of the new system and are well instructed in its use.42


Conclusion

The concept of EPAs attaches competencies to clinical practice in anaesthesiology. It appears to be a promising approach to demonstrating that the graduating resident has obtained the abilities that the profession, regulatory bodies and society expect. The EPA approach centres on observation of the performance of essential professional tasks in different contexts, from which the attainment of competencies can be inferred. EPAs may be a good way to make progress towards learning objectives transparent. They may also make possible justified, formal and informed delegation of clinical tasks to trainees. The mark of quality of training is demonstrable expertise in performance. A programme of research and development is necessary to elicit the effects of the use of EPAs in anaesthesiology training. The goal is to link quality of training with the ultimate outcome of training: quality of patient care.

Acknowledgements relating to this article

Assistance with the review: none.

Financial support and sponsorship: none of the authors were funded by sources other than departmental funds.

Conflicts of interest: none.


References

1. Holmboe ES, Sherbino J, Long DM, et al. The role of assessment in competency-based medical education. Med Teach 2010; 32:676–682.
2. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system – rationale and benefits. N Engl J Med 2012; 366:1051–1056.
3. Iobst WF, Sherbino J, Ten Cate O, et al. Competency-based medical education in postgraduate medical education. Med Teach 2010; 32:651–656.
4. Schartel SA, Metro DG. Evaluation: measuring performance, ensuring competence, achieving long-term excellence. Anesthesiology 2010; 112:519–520.
5. Frank JR, Snell LS, Ten Cate O, et al. Competency-based medical education: theory to practice. Med Teach 2010; 32:638–645.
6. Grant J. The incapacitating effects of competence: a critique. Adv Health Sci Educ Theory Pract 1999; 4:271–277.
7. Talbot M. Monkey see, monkey do: a critique of the competency model in graduate medical education. Med Educ 2004; 38:587–592.
8. Norman GR. Outcomes, objectives and the seductive appeal of simple solutions. Adv Health Sci Educ Theory Pract 2006; 11:217–220.
9. Brightwell A, Grant J. Competency-based training: who benefits? Postgrad Med J 2013; 89:107–110.
10. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the accreditation council for graduate medical education: a systematic review. Acad Med 2009; 84:301–309.
11. Lurie SJ, Mooney CJ, Lyness JM. Commentary: pitfalls in assessment of competency-based educational objectives. Acad Med 2011; 86:412–414.
12. Ten Cate O. Trust, competence, and the supervisor's role in postgraduate training. BMJ 2006; 333:748–751.
13. Frank JR. The CanMEDS 2005 physician competency framework: better standards, better physicians, better care. Ottawa: Royal College of Physicians and Surgeons of Canada; 2005.
14. Carraccio C, Wolfsthal SD, Englander R, et al. Shifting paradigms: from Flexner to competencies. Acad Med 2002; 77:361–367.
15. Frank JR, Mungroo R, Ahmad Y, et al. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach 2010; 32:631–637.
16. Jones MD, Rosenberg A, Gilhooly JT, Carraccio CL. Perspective: competencies, outcomes, and controversy – linking professional activities to competencies to improve resident education and practice. Acad Med 2011; 86:161–165.
17. Pangaro L, Ten Cate O. Frameworks for learner assessment in medicine: AMEE Guide no. 78. Med Teach 2013; 35:e1–e14.
18. Ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med 2007; 82:542–547.
19. Ten Cate O. Entrustability of professional activities and competency-based training. Med Educ 2005; 39:1176–1177.
20. Ten Cate O, Snell L, Carraccio C. Medical competence: the interplay between individual ability and the healthcare environment. Med Teach 2010; 32:669–675.
21. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ 2011; 11:96.
22. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ 2010; 2:419–422.
23. Hicks PJ, Englander R, Schumacher DJ, et al. Pediatrics Milestone project: next steps toward meaningful outcomes assessment. J Grad Med Educ 2010; 2:577–584.
24. Babbott S. Watching closely at a distance: key tensions in supervising resident physicians. Acad Med 2010; 85:1399–1400.
25. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ 2013; 5:157–158.
26. Chang A, Bowen JL, Buranosky RA, et al. Transforming primary care training – patient-centered medical home entrustable professional activities for internal medicine residents. J Gen Intern Med 2013; 28:801–809.
27. Mehta NB, Hull AL, Young JB, Stoller JK. Just imagine: new paradigms for medical education. Acad Med 2013; 88:1–6.
28. Sterkenburg A, Barach P, Kalkman C, et al. When do supervising physicians decide to entrust residents with unsupervised tasks? Acad Med 2010; 85:1408–1417.
29. Royal Australian and New Zealand College of Psychiatrists. EPA handbook. [Accessed 25 March 2014].
30. Hauer KE, Kohlwes J, Cornett P, et al. Identifying entrustable professional activities in internal medicine training. J Grad Med Educ 2013; 5:54–59.
31. Shaughnessy AF, Sparks J, Cohen-Osher M, et al. Entrustable professional activities in family medicine. J Grad Med Educ 2013; 5:112–118.
32. Scheele F, Teunissen P, Van Luijk S, et al. Introducing competency-based postgraduate medical education in The Netherlands. Med Teach 2008; 30:248–253.
33. Mulder H, Ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: the case of physician assistant training. Med Teach 2010; 32:e453–e459.
34. Ringsted C, Østergaard D, Van der Vleuten CPM. Implementation of a formal in-training assessment programme in anaesthesiology and preliminary results of acceptability. Acta Anaesthesiol Scand 2003; 47:1196–1203.
35. Van Gessel E, Mellin-Olsen J, Østergaard HT, et al. Postgraduate training in anaesthesiology, pain and intensive care: the new European competence-based guidelines. Eur J Anaesthesiol 2012; 29:165–168.
36. The Accreditation Council for Graduate Medical Education, The American Board of Anesthesiology. The Anesthesiology Milestone Project. [Accessed 25 March 2014].
37. Govaerts MJB, Van der Vleuten CPM, Schuwirth LWT, Muijtjens AMM. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ Theory Pract 2007; 12:239–260.
38. Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ 2009; 1:278–286.
39. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No 31. Med Teach 2007; 29:855–871.
40. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA 2009; 302:1316–1326.
41. Kennedy TJT, Regehr G, Baker R, Lingard L. Point-of-care assessment of medical trainee competence for independent clinical work. Acad Med 2008; 83:589–592.
42. Ebert TJ, Fox CA. Competency-based education in anesthesiology. Anesthesiology 2014; 120:24–31.
© 2015 European Society of Anaesthesiology