Special article

European Section/Board of Anaesthesiology/European Society of Anaesthesiology consensus statement on competency-based education and training in anaesthesiology

Shorten, George D.; De Robertis, Edoardo; Goldik, Zeev; Kietaibl, Sibylle; Niemi-Murola, Leila; Sabelnikovs, Olegs

European Journal of Anaesthesiology: June 2020 - Volume 37 - Issue 6 - p 421-434
doi: 10.1097/EJA.0000000000001201

Process in building this consensus statement

In July 2018, the European Society of Anaesthesiology (ESA) and the European Section/Board of Anaesthesiology (Anaesthesiology Section of the European Union Medical Specialists) (EBA) agreed to undertake production of a consensus statement based on a specific set of topics relating to competency-based medical education and training (CBMET) in anaesthesiology. Each organisation would nominate experts to contribute to the document. The Statement has been developed using a variation of a nominal group (or expert panel) approach.1 The decision to take this approach was based on the limited amount of empirical evidence currently available and the extensive experience within both organisations regarding changes in anaesthesiology training. An initial draft was prepared by one author (GS) and shared with all members of the expert group. Each expert was initially invited to revise a specific section or sections based on his/her particular expertise. The revised sections were integrated into a second draft and edited for coherence. The second and subsequent drafts were shared with the experts and contained specific comments and queries relating to the individual sections. These included requests for clarification, insertion of relevant sources and attempts to restructure the document in order to avoid omissions and redundancy. The expert responses were incorporated into a penultimate (sixth) draft. Experts were invited to identify or add text that represents ESA/EBA consensus on the most important issues, which should be included in the summary Consensus Statement. Experts were also invited to comment on the article as a whole, including structure, format, style and content. The format of the final Consensus Statement was modified to conform with the author instructions of the European Journal of Anaesthesiology.

This Statement refers primarily to graduate education and training in anaesthesiology, commencing with completion of internship (or change from training in other disciplines) to the point of graduation or licensing to independent practice of anaesthesiology. Although reference is made to other disciplines, the overall intent is to outline the current status of and principles relevant to anaesthesiology. The article contains reference to relevant evidence and important published work selected by the experts, argument or rationale supporting certain key statements and the statements themselves.

Although implementation of CBMET has implications for each of the following, it is envisaged that different regions (countries) will address them differently based on their diverse needs, resources and healthcare systems. For that reason, they have not been considered in detail in this article:

  • (1) manpower planning
  • (2) employment and reimbursement of trainees and trainers
  • (3) economic cost and funding models
  • (4) evaluation of economic value and health impact of CBMET
  • (5) regulatory framework

Historical context

For centuries, medical doctors were trained through one or other form of apprenticeship. The core elements were ‘person to person’ instruction, observation and the opportunity to practise medicine with progressively greater independence. Respected teacher physicians were influential, and centres of training (such as Bologna and Leyden) acquired reputations for excellence.2 In 1889, William Osler and colleagues at Johns Hopkins implemented the first ‘time based’ medical training programme. The residency comprised a scientifically sound and structured educational component, as well as hands-on experience in the care of patients. Residents acquired progressively greater responsibility over time and undertook a period of supervised practice after completion of medical school.3 Effectively, this became the model for training in medicine for a century. Modifications to the fixed time ‘residency’ model were introduced; in the USA, the Flexner (1910)4 and Millis (1966)5 reports resulted in efforts to ensure that a continuum existed across undergraduate education and postgraduate training into independent practice, and that advances in the science and practice of medicine were incorporated promptly throughout the training continuum.

In retrospect, consideration of the inherent variation in human performance (both interindividual and over time), medical information inflation (the ever increasing amount of scientific information available) and the increase in the number of diagnostic and therapeutic skills required of doctors of all disciplines might have prompted revision of medical training before reports of the scale of preventable harm and public demands did so. In fact, the primary force that has led to a change from time-based to competency-based medical education has been society's requirement for greater accountability of medical practitioners and of those responsible for training medical doctors.6 This, in turn, is part of a wider move to ensure that professional performance serves society's needs first, and consistently attains high standards. The dissemination of information on medical error and preventable harm associated with healthcare has led legislators and regulatory bodies to require new and measurable forms of training, assessment of performance and management of under-performance.7–9 Although society (or societies) has (have) clearly demanded greater accountability (both of medical education and practice), it is notable that government and regulatory policy to move to CBMET has preceded the publication of large-scale or ‘Level 1’ evidence that such a move improves patient safety or decreases the incidence of avoidable medical error. Indeed, legitimate concerns about CBMET have been expressed, both on theoretical and practical grounds. Demonstration of competence in discrete skills may not imply that a trainee has achieved overall competence as a physician.10,11 It has been suggested (based on attempts to introduce competence-based education in other domains) that faculty observation of the trainee behaviours assumed to represent a large number of competencies, subcompetencies or milestones simply is not feasible.12

Other forces that have tended to justify or support the move to CBMET include the need to ensure that state (or national) investment in medical education offers value for money,13 attempts to maximise the efficiency of training in terms of resource utilisation,14 the perceived difficulty in identifying and managing underperforming trainees and doctors, advances in simulation and technology enhanced learning that facilitate quantitative measurement of performance, and alterations in work hours with an associated diminution in clinical learning opportunities.15

In 1996, the Royal College of Physicians and Surgeons of Canada adopted ‘CanMEDS’, a framework for improving patient care through physician training. CanMEDS attempted to describe the abilities that physicians require to meet the health needs of those they serve. These abilities were grouped into seven defined ‘roles’.16 In 1998, the USA's Accreditation Council for Graduate Medical Education began an initiative, which became its ‘Outcomes Project’, specifying six Core Competencies intended to form the basis of formal training and evaluation.17 In addition to patient care and medical knowledge, these comprised practice-based learning and improvement, interpersonal and communication skills, professionalism and system-based practice. In order to evaluate these competencies, programmes evaluated resident achievement of specific ‘milestones’ during their training. In July 2015, a joint initiative between the Accreditation Council for Graduate Medical Education and the American Board of Anesthesiology, the ‘Anesthesiology Milestone Project’, was launched.18 The Milestones were designed to provide a framework for the assessment of residents’ progress during their training and were arranged into six levels. Broadly similar models of medical practice were being adopted by European countries. For instance, the Medical Council of Ireland published its framework document describing ‘8 domains of good professional practice’ in 2010.19

The move from policy to implementation of CBMET has been slow and challenging. Canada has pioneered the implementation of CBMET in graduate medical education, first in 2009 in Orthopaedic Surgery at the University of Toronto20 and in July 2015 at the Department of Anaesthesiology at the University of Ottawa.21 Fraser et al.21 have reported on the progress and practical challenges encountered in implementing Competency by Design in the Department of Anaesthesiology at Ottawa.22 The department has put in place a ‘spiral’ structure over a 5-year programme that entails separate exposures to elements of clinical practice at different points during the residency. Residents are required to meet predefined milestones as they progress through ‘Transition to discipline’, ‘Foundations of Anaesthesiology’, ‘Core of Anaesthesiology’ and ‘Transition to practice’. Fraser et al.21 emphasised the need for development of rigorous assessment tools that could be applied across programmes, the additional assessment workload which CBMET inevitably entails and the careful management of organisational change. Not surprisingly, they cited funding, maintenance of service and manpower planning as major challenges.21

In 2013, the EBA published its original training curriculum in Anaesthesiology.23 In July 2017, consensus was obtained within the EBA regarding a European Training Requirements (ETR) update. Consultation with the ESA resulted in minor revisions that were approved by the EBA in November 2017. These ETR were approved by UEMS Council in April 2018.24

The ESA and EBA recognise the value of the advances and ‘framing’ work done by learned bodies and academics around the world in preparing the ground to implement effective CBMET. In particular, the ESA and EBA endorse the general principles outlined by the International Competency based Medical Education Collaborators in 2016.25 These are

  • (1) ‘That medical education must be based on the health needs of the populations served.
  • (2) The primary focus of education and training should be the desired outcomes for learners rather than the structure and process of the educational system.
  • (3) The formation of a physician should be seamless across the continuum of education, training and practice’.

However, it will be necessary to develop CBMET for anaesthesiologists based on the specific requirements of the discipline, which take account of the spectrum of care provided and relevant care environments, as well as evolution of the practice of anaesthesiology.

The fundamental characteristics of competency-based medical education and training in anaesthesiology

Key advances in pedagogy and andragogy have been applied variably across higher and vocational education. Some have been successfully applied in domains relevant to medical education and training, such as adult and collaborative learning, workplace-based training, deliberate practice and feedback provision, peer-to-peer learning and reflective practice. The relevance of Bloom's taxonomy26 and Dreyfus’ model of skill acquisition27 to medical education has informed programme development, while others have developed models of learner progression specific to medicine.28,29

Consideration of educational principles also offers insight into challenges or unintended consequences that could accompany CBMET. These include management of an increased range of learning objectives while offering a decreased number of clinical learning opportunities. McKernan30 has argued that the segmentation and linear progression of learning assumed in outcomes-based education does not realistically reflect the natural learning process. He has also pointed out that the focus on measurable objectives often results in evaluating what students have not learned, rather than what they have.

Traditionally, qualification-based programmes have emphasised the training activity or process. Competency-based programmes, which use valid, reliable and complementary forms of assessment, aim to emphasise the result or effect of training. For a discipline such as anaesthesiology, we consider that the result or effect of training is critically important as a basis of entrustment decisions, being the direct determinant of the quality of care actually provided. Implicit in this quality of care is patient safety.

Anaesthesiology as a specialty has developed from a service-oriented activity, primarily based in the operating room, to include important responsibilities in other areas of medicine. Its traditional role included patient assessment, maintenance of organ function, as well as provision of analgesia and amnesia for patients undergoing procedures. Today, the practice of anaesthesiology has significantly changed towards a more holistic role in the care of hospital patients.

The Helsinki Declaration on Patient Safety in Anaesthesiology, launched in 2010 by the EBA in co-operation with the ESA, states that anaesthesiology shares responsibility for quality and safety in anaesthesia, intensive care, critical emergency medicine and pain medicine, including the whole peri-operative process and also in many other situations, inside and outside the hospital, in which patients are at their most vulnerable.31 In many countries, these different responsibilities are already integrated elements of the clinical specialty of anaesthesiology, which is reflected in the increased duration of some training programmes. In 2017, across the 36 countries represented on the UEMS, the duration of training had a median of 5 years and a range of 2.75 to 7 years.32

Building on the principles set out by the International Competency based Medical Education Collaborators (above), the ESA/EBA propose the following as central tenets for the practical implementation of CBMET programmes:

  • (1) That medical education and training require that learners themselves are the principal actors in, and are central to, their own development as professionals.
  • (2) That learning outcomes are more likely to be achieved if they are explicitly defined from the outset.
  • (3) That assessment requires the application of multiple, valid, reliable and complementary tools that can be used to enhance learning as well as to determine progression for individual trainees and to inform curriculum renewal as part of ongoing quality improvement.
  • (4) That programmes enunciate and demonstrate a responsibility for the wellbeing of trainees, which includes, but is not limited to, promotion of reflective practice.

The temptation to implement CBMET in anaesthesiology by repurposing or ‘tweaking’ traditional time-based programmes should be resisted. The Carnegie Foundation Report on reforming medical school and residency programmes set as one of its goals ‘the standardisation of learning outcomes and individualisation of the learning process’.33 This concisely states the requirement for a fundamental shift in each of the key elements. In effect, it calls for a programme to maximise the likelihood that a graduate achieves specific learning outcomes and to identify when the outcomes have not (yet) been achieved.

For CBMET (as for traditional training programmes), the following constituent elements are necessary:

  • (1) An explicit curriculum
  • (2) A clearly defined set of learning objectives
  • (3) A programme format based on a sound theoretical framework
  • (4) A programme of assessment
  • (5) Human and material resources
  • (6) Regulation and oversight

In Latin, ‘currere’, from which curriculum derives, means ‘to run’. The intention is to describe the course (as of a race). In an educational context, curriculum is variously used to describe all forms of learning, learning objectives, content and planned student experiences for a particular course or programme of study.34

The essence of CBMET is that the curriculum is built on or derives from shared, unambiguously defined learning objectives, each of which is associated with a measurable outcome. These specific outcomes are expressed as milestones or competencies, each of which relates to one or more overall domain(s) of competence that together characterise good professional behaviour. The requirements of CBMET have led to the development of more scientifically rigorous forms of training and assessment. The Objective Structured Assessment of Technical Skills35 and the Anaesthetists Non-Technical Skills36,37 tools have both been employed widely for assessment of professional performance and have acceptable validity and reliability in simulation and clinical settings. Proficiency-based progression training requires that an individual skill is characterised in terms of observable behaviours (metrics or errors).38 These metrics and errors then serve as deliberate practice targets and sources of formative feedback discussions; collectively, they can be used to render a valid ‘score’ for an individual performance which has meaning relative to an expert-derived proficiency standard. Proficiency-based progression training appears to result in consistently improved clinician performance39,40 and possibly in improved patient outcome.41
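
To make this logic concrete, the following minimal sketch (Python) illustrates how observable metrics and errors might be compared against an expert-derived proficiency standard to support a progression decision. All names, counts and thresholds here are hypothetical illustrations, not the published scoring rules of any cited study.

    # Minimal sketch of a proficiency-based progression decision.
    # All metric names and thresholds are hypothetical illustrations.
    from dataclasses import dataclass

    @dataclass
    class PerformanceRecord:
        steps_completed: int   # predefined procedural metrics observed
        errors: int            # predefined errors observed
        critical_errors: int   # errors that fail the attempt outright

    # Hypothetical expert-derived benchmark: the performance of
    # experienced practitioners on the same task.
    BENCHMARK_STEPS = 18
    BENCHMARK_MAX_ERRORS = 2

    def meets_proficiency(record: PerformanceRecord) -> bool:
        """Progression is permitted only when the demonstrated
        performance matches or exceeds the expert-derived standard."""
        if record.critical_errors > 0:
            return False
        return (record.steps_completed >= BENCHMARK_STEPS
                and record.errors <= BENCHMARK_MAX_ERRORS)

    # Example: this attempt would not yet permit progression.
    print(meets_proficiency(PerformanceRecord(16, 3, 0)))  # False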

To implement CBMET from an existing time-based programme, a curriculum must change to become focused on specific elements of competence (whatever nomenclature is applied to those elements). Specifically, a competency-based curriculum should not be limited to a list of subject areas or a specific time or number of exposure(s) to clinical events. For a Competency-Based Training Programme, assessors should be trained in the use of valid and reliable assessment tools. In particular, knowledge, attitudes, communication and professionalism should be assessed in their applied forms (in groups or individually) in clinical and/or simulated clinical environments. Learning objectives and assessment tools should be subject to review and change over time (see also Assessment below).

The UEMS24 identifies four important generic competences for any European specialist in Anaesthesiology, namely

  • (1) Expert clinician: competence enabling one to fulfil an expert role and function in the multidisciplinary settings in anaesthesia, intensive care, critical emergency medicine and pain medicine.
  • (2) Professional leader: competence in communication that enables one to deal with different aspects of human interactions and relationships. Furthermore, competences that permit effective organisation and management tasks to take place during professional activities.
  • (3) Academic scholar: responsibility to develop and maintain a high degree of professional competence, to facilitate development of colleagues and other groups of professionals and to promote development of the specialty itself.
  • (4) Inspired humanitarian: exhibit irreproachable behaviour and be aware of duties and responsibilities.

However, detailed definitions of individual skills and attributes are required to facilitate training and assessment. Such skills have been itemised as milestones, competencies, subcompetencies or Entrustable Professional Activities (EPA).42 This poses the question ‘Is overall competence as an anaesthesiologist simply the sum of the composite measurable competencies?’

In anaesthesiology, Blum et al.43,44 have set out to identify critical ‘gaps’ or skill deficits relating to complex behaviour performance by soliciting the expert opinion of experienced educators of trainees. They asked the question, ‘What traits characterise trainees who, upon graduation, have not achieved a minimum level of competency?’ Through a Delphi process, they condensed the responses to five key behaviours, namely

  • (1) Synthesises information to form a clear anaesthetic plan
  • (2) Implements a plan based on changing conditions
  • (3) Demonstrates effective interpersonal and communication skills with patients and staff
  • (4) Identifies ways to improve performance
  • (5) Recognises own limits.

A clear distinction exists between the reductivist approach with an emphasis on a repertoire of discrete skills and that of some experienced clinician-educators with an emphasis on integrated ‘skill-complexes’. Insufficient evidence exists at present to support one or other of these approaches. It has become possible to categorise clinician performance using artificial intelligence.45,46 In the future, it should be possible to employ a machine learning/Big Data approach to examine meaningful clinical outcomes attributable to an individual clinician's overall performance for association with that clinician's particular skills.47

A CBMET programme requires a clear structure within which training activities and assessments ‘map to’ a particular learning objective (or objectives); these should take the form of specific competencies, their constituent milestones or other predefined tasks (such as EPAs). These competencies, in turn, support achievement within one or more predefined Domain(s) of Competence. This mapping exercise must be made clear in explicit, published form to trainees and trainers.

Ideally, each programme will adopt this mapping approach so that a trainee can identify her own degree of progress in skill acquisition, and towards graduation and independent practice. In addition to attaining fundamental competencies, specific skills or possibly greater levels of performance are necessary for those training for particular scopes of practice, including clinical subspecialisation. Training in these additional skills might be undertaken by those in higher subspecialty training or who have achieved the core requirements quickly during basic training (see Training formats below).
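
Purely as an illustration of the mapping described above, the sketch below (Python) represents activities, competencies and domains as simple look-up tables so that a given training activity can be traced to the domains of competence it supports. All names are hypothetical.

    # Hypothetical sketch: mapping training activities and assessments
    # to competencies, and competencies to domains of competence.
    from typing import Dict, List

    # Each competency maps to one or more domains of competence.
    competency_domains: Dict[str, List[str]] = {
        "rapid_sequence_induction": ["Clinical Skills", "Patient Safety"],
        "structured_handover": ["Communication", "Collaboration and Teamwork"],
    }

    # Each training activity or assessment maps to the competencies it targets.
    activity_competencies: Dict[str, List[str]] = {
        "simulated_emergency_airway_scenario": ["rapid_sequence_induction"],
        "observed_icu_handover": ["structured_handover"],
    }

    def domains_addressed(activity: str) -> List[str]:
        """Trace an activity through its competencies to the domains it supports."""
        domains: List[str] = []
        for competency in activity_competencies.get(activity, []):
            domains.extend(competency_domains.get(competency, []))
        return sorted(set(domains))

    print(domains_addressed("observed_icu_handover"))
    # ['Collaboration and Teamwork', 'Communication']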

Domains of competence

Some authorities (e.g. the Royal College of Physicians and Surgeons of Canada48 and Accreditation Council for Graduate Medical Education49) have developed useful frameworks to describe overall competence in medical practice. It is notable that these different frameworks share elements that are intuitively characteristic of the ‘good doctor’.

The General Medical Council, UK, and the Medical Council of Ireland have adopted a broadly similar approach. The Medical Council of Ireland defined and adopted eight domains of good professional practice in 2010.19 These domains apply through lifelong professional development and are intended to capture those outcomes that all doctors should strive to achieve.

  • (1) Patient Safety and Quality of Patient Care
  • (2) Relating to Patients
  • (3) Communication and Interpersonal Skills
  • (4) Collaboration and Teamwork
  • (5) Management (including Self-Management)
  • (6) Scholarship
  • (7) Professionalism
  • (8) Clinical Skills

With respect to training in anaesthesiology, the generic competencies set out in the EBA's ETR23,24 demonstrate substantial overlap with the Domains of Good Professional Practice (UK and Ireland), Core Competencies (USA) and the Framework of Physician Competencies (Canada). The ETR for Anaesthesiology further specifies 11 Domains of General Core Competencies [Appendix 1 (supplemental digital file, http://links.lww.com/EJA/A288), such as airway management and regional anaesthesia] and five Domains of Specific Core Competencies [Appendix 1 (supplemental digital file, http://links.lww.com/EJA/A288), such as paediatric anaesthesia and multidisciplinary chronic pain management].24 This provides programme directors with a practical tool with which to set out specific learning objectives for acquisition of knowledge, skills and attitudes relevant to each of these Domains. It is intended that trainees progress towards these objectives through a hierarchy of levels of competency:

  • (1) Observer level (has knowledge of, describes)
  • (2) Performs, manages, demonstrates under direct supervision
  • (3) Performs, manages, demonstrates under distant supervision
  • (4) Performs, manages, demonstrates independently.

Based on the ETR, higher levels of attainment in the domains of specific core competencies should be achievable by enabling the fully trained specialist to reach level 4 through structured additional training (fellowship) of 1 to 2 years' duration; these fellowships should be developed at hospital level. The aim of this additional training is to enable an anaesthetist to achieve a higher level of the competencies within certain field-domains. Therefore, depending on the flexibility of the programme and the individual capabilities of the specialist, the time required to achieve these higher levels will vary, but will be at least 1 year. We strongly encourage the directors of specific programmes and heads of Departments of Anaesthesia to document the level of attainment in a form that clearly refers to the relevant ETR and states the level achieved.

Terminology and concepts: competence, competencies, entrustable professional activities and milestones

In recent years, several authors have provided detailed accounts of how certain terms which describe level of attainment have come to be used.6,50

We concur with the distinction made by ten Cate and Scheele50 between, on the one hand, competence or competency as an ability or attribute of an individual and, on the other hand, the individual's activity or behaviour. For training programmes, the implication of this is that competencies are not assessed directly but inferred from certain of a trainee's activities, such as predefined EPAs. ten Cate and Scheele50 further assert that one should only trust trainees to carry out a specific clinical activity once they have attained each of the individual competencies that are needed to adequately complete that activity.
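
As a minimal sketch of this entrustment logic, assuming purely hypothetical EPA and competency names, the inference might be expressed as follows: a trainee is trusted with an EPA only once every constituent competency has been attained.

    # Hypothetical sketch of entrustment inference: competencies are not
    # observed directly; trust in an EPA is inferred from attained ones.
    from typing import Dict, List, Set

    # Competencies required for each EPA (hypothetical examples).
    epa_requirements: Dict[str, List[str]] = {
        "unsupervised_rapid_sequence_induction": [
            "airway_assessment",
            "drug_selection_and_dosing",
            "failed_intubation_drill",
        ],
    }

    def may_entrust(epa: str, attained: Set[str]) -> bool:
        """Entrust only when every required competency has been attained."""
        return all(c in attained for c in epa_requirements[epa])

    print(may_entrust("unsupervised_rapid_sequence_induction",
                      {"airway_assessment", "drug_selection_and_dosing"}))
    # False: the failed intubation drill has not yet been attained.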

The ESA and EBA also concur with the definition of Competency-Based Medical Education put forward by Frank et al.6

‘Competency-based education is an approach to preparing physicians for practice that is fundamentally oriented to graduate outcome abilities and organised around competencies derived from an analysis of societal and patient needs. It de-emphasises time-based training and promises greater accountability, flexibility and learner centeredness’.6

Englander et al.51 have proposed a common taxonomy for competencies. They compared competency lists from across the medical education continuum, physician specialties and subspecialties, countries and healthcare professions to produce a Reference List of General Physician Competencies (58 competencies in eight domains).51

An EPA is a key task of a discipline (i.e. specialty or subspecialty) that an individual can be trusted to perform in a given healthcare context, once certain criteria have been met.42 Although the term ‘competency’ refers to the individual trainee, the EPA is a unit of work or practice. The Royal College of Physicians and Surgeons of Canada employ both EPAs and milestones within its Competency by Design programmes: ‘Milestones provide learners and supervisors with discrete information about the relevant skills of the discipline. Milestones that have been linked to an EPA are the individual skills that are needed to perform that task’.52 The key difference between EPAs and milestones is that EPAs are the tasks or activities that must be accomplished, whereas milestones refer to abilities of the individual. The EPA construct is intended to enable a trainee to demonstrate her competence in a practical way. A single EPA may require her to apply (and demonstrate) several different competencies.

The EPA is a useful concept or tool that obliges both trainer and trainee to address what is required in a real clinical setting (and therefore what will benefit a patient); retains the value of a trainer's ‘clinical wisdom’ in interpreting what she sees; and draws on the sense of trust one experiences in observing a competent colleague. ‘Entrustability scales’ have been proposed to improve the reproducibility of entrustment decisions in anaesthesiology and other disciplines.53,54 Recently, Programme Directors in the Netherlands employed a modified Delphi approach to identify a core set of 45 EPAs to enable transition from a traditional curriculum to one based on competencies.55

These explanations illustrate the inherent challenge in attempting to train to a ‘global’ competence (which implies a single, relatively stable overarching state that is associated with, or results in, multiple ‘correct’ behaviours and attributes) when the elements of any practicable programme include a curriculum and set of assessments, each composed of a finite number of testable skills.

The relative value of designing a programme around specific competencies/milestones,56 around EPAs, or both, is currently undetermined. One might anticipate that a programme based on specific competencies will take account of the individual trainee's ‘readiness’ to acquire a new skill, including her personal aptitudes, prior learning and experience, and motivation. The resulting emphasis on the trainee's ‘zone of proximal development’ might be achieved in the form of highly personalised formative feedback.57 Assistance provided to the trainee would be highly responsive to the learner's level of performance.58 A programme based on achievement of EPAs could provide a trainee with detailed information on any differences between her performance and the ideal performance of the same task.41

At this point in the development of CBMET, it is vital that programmes, accrediting bodies and independent researchers report the success or otherwise of different approaches to implementing CBMET, particularly in terms of effect on clinical outcome and preventable harm. These reports should enable a movement over time towards standardisation of how programmes operate. On the basis of our current collaborative approach, the discipline of anaesthesiology could play a leading role in such progress.

Material resources and faculty development

The key resources necessary to develop and maintain a CBMET programme in anaesthesiology include human capital in the form of expertise and commitment, access to clinical learning opportunities, information technology and data management systems, as well as physical infrastructure dedicated to training/education that may be embedded in clinical environments. The adequacy of these for a given programme's activity should be the subject of a stringent quality assurance and accreditation process. Two resources require specific attention, simulation facilities and technology, and a faculty development programme.

Simulation facilities

The use of simulation, particularly technology enhanced simulation, has proven to be effective in improving learners’ knowledge, skills and behaviours.59 In particular, it offers valuable learning opportunities for complex behavioural skills such as teamwork and communication, for clinical events that occur infrequently, and for embedding discrete skills (drug administration, airway management) in complex and dynamic environments for the purpose of education. Over time, the digitisation of video images and ‘datafication’ of learners’ performances will generate a valuable database that can further improve training efforts, both at the level of the individual learner and of the programme. The major challenges associated with simulation-based training lie in determining the effect on clinical outcome, matching the activity to specific learning objectives and optimising resource utilisation (including faculty and training) for training effect.

One particular problem facing training programmes is the provision of learning opportunities, and the performance of assessment, for clinical competencies that arise infrequently or would pose an inherent risk to a patient. At face value, simulation provides a possible solution. However, the evidence supporting the validity of assessment performed in a simulation setting is limited. Isaak et al.60 have demonstrated moderate to strong associations between simulation-based milestone assessment and experience level, time in training and clinical evaluations. Blum et al.61 have provided important evidence for one simulation-based performance assessment system for anaesthesiology trainees. Crucially, these investigators (the Harvard Assessment of Anaesthesia Resident Research Group) have detailed a rigorous and feasible methodology for examining the validity, reliability and generalisability of other assessment tools.61,62 Chiu et al.63 have demonstrated the feasibility of developing a national simulation curriculum for anaesthesiology by generating consensus around needs analysis, scenario development and assessment type.

Simulation facilities offer a valuable resource to anaesthesia CBMET programmes. However, their application requires the design of training interventions (e.g. scenarios and tasks) that address specific predefined competencies, and of assessment tools that are valid and reliable. The determinants of consistent and effective ‘transfer’ of skills acquired through simulation-based training to the clinical environment require greater elucidation.

A faculty development programme

For any given CBMET programme, faculty will contribute at one or both of two levels: as programme developers or leads, and as teachers and trainers. At hospitals in which CBMET in anaesthesiology takes place, all faculty should contribute at one or other level. Teachers/trainers will be familiar with the programme construct (and how it differs from traditional time-based training) and with the specific competencies to be acquired by trainees. This will entail re-education for some anaesthesiologists, and openness to this will be a desirable attribute of faculty in teaching hospitals. In managing organisational change, Fraser et al.21 emphasised the importance of ‘buy in’ of stakeholders, especially faculty and trainees.

Teachers/trainers will supervise trainees, provide instruction and targeted feedback (to trainees and programme directors), and conduct standard assessments, including assessments of nontraditional competencies such as communication and leadership. To fulfil these roles, faculty will themselves acquire and develop teaching and communication skills. They should also contribute to the overall quality of the programme by being role models of continuous improvement and of reflective practice. Crucially, they will judge the level of independence and nature of supervision provided to trainees, taking account both of the pre-eminence of patient safety and of the need to extend the range of responsibility granted to trainees as they progress towards independent practice. The timing, content and motivational aspects of formative feedback provided by trainers represent a critically important component of any training programme.

Faculty will be supported in fulfilling these roles by provision of a faculty development programme. This programme will support a practitioner in maintaining her own professional competence and contributing to a culture of lifelong learning. Certain faculty members will make particular contributions in specific elements of the programme depending on their scope of practice and expertise. In that case, their development programme will be customised for their individual needs. Overall, the training programme will benefit as individuals make complementary contributions co-ordinated by a programme director. Any resource planning for CBMET will need to include investment in an effective and sustainable faculty development programme.

Competency-based medical education and training programme formats

Ideally, each CBMET programme format should be clearly documented and include the elements set out in Table 1 (Supplemental Digital File, http://links.lww.com/EJA/A289).

Heterogeneity of competency-based medical education and training programmes

As national or institutional CBMET programmes in anaesthesiology develop, the regular publication of programme quality and performance reports will enhance planning and refinement efforts internationally.

It is inevitable that different CBMET programmes will vary considerably in scale, rate of change and educational structure.6,24,50 For instance, the ETR cannot be applied in all European countries to the same degree, even though its authors and the EBA aimed to produce a document that could harmonise training in various European settings.24 Variation in programmes is inevitable and probably of benefit to the overall education and anaesthesiology communities, considering the heterogeneous demands of society. It is reasonable that the medical education community shares experience across disciplines and over time, as new CBMET programmes are put in place and evaluated.64,65 Pioneering centres across North America and Europe, specifically in the Netherlands, have made significant advances in how best to deliver CBMET for anaesthesiologists.32,40,41,61,62 Further research in this field is warranted in order to develop evidence-based and experience-based practical guidelines on CBMET programme development and implementation.

With respect to EU countries, the ETR 201824 is unlikely to be applied in all programmes to the same degree, at least initially. A substantial degree of harmonisation of training outcomes is necessary however, as EU law66 requires that equivalent qualifications be recognised in EU member states. There is current and legitimate debate across the EU regarding the status of intensive care medicine as a discrete medical specialty in some countries and as a supra-specialty in others, and the future of training and status of nurse anaesthetists. Although both of these issues lie outside the scope of this article, they will have important and direct implications for CBMET programmes in anaesthesiology. Competencies for interprofessional collaboration and teamwork will be required in all countries, and will differ in settings in which health professional roles differ. We are confident that explicit interprofessional and interdisciplinary education will become a necessary and valuable part of all medical training.

Shared characteristics of competency-based medical education and training methods

Although the ETR in its current form (update 2018) does not include a precise description of how to deliver CBMET, it does set out requirements for training institutions, clinical activities, infrastructure and processes, trainers and tutors, quality management and the assessment of trainees’ competence gain.24 It is intended that the ETR can serve as a useful resource in addressing the practical challenges of planning and implementing an institutional CBMET.

Time-variable competency-based medical education and training and minimum duration of training

An efficient CBMET programme will be ‘time-variable’ for trainees, because it identifies an individual's progress promptly and can then provide the rapidly progressing trainee with the next relevant learning opportunity.67 The time required for a trainee to complete a CBMET programme necessarily varies because trainees will achieve different competencies (or milestones) at different rates.67 It is notable that the current ETR in anaesthesiology defines a minimum training duration (5 years),24 as does the Competency by Design system at the University of Ottawa.21 This (presumably) represents a reasonable effort to manage risk during a transition period in terms of cultural change within organisations. It will also enable ‘early adopting’ institutions to acquire experience of trainee progression rates in settings wherein service delivery depends on the employment of predictable numbers of anaesthetic staff. As so little is known about the rate at which skill attrition occurs after a period of training, we suggest that a minimum quantum of exposure to specific subdisciplines may be necessary to enable a trainee to integrate new skills into her work practice in a way that is stable and sustainable.

The frequency of opportunities for clinical learning for individual trainees has decreased with the implementation of working time restrictions, such as the European Working Time Directive.15 Such restrictions pose significant challenges to training programmes of all types. They necessarily result in prolonged training duration in ‘count based’ systems (unless a trainee's case load per unit time increases proportionally). Programmes that are strictly time-based may not provide trainees with sufficient clinical exposure to acquire the necessary skills for independent practice. Ideally, CBMET can address the challenge of working time restrictions by maximising the learning benefit of each case. This would entail a deliberate practice approach that explicitly targets defined skills and in which supervisors provide real time formative feedback. This would require a substantial change to the training/service provision balance that currently exists in many teaching hospitals. Retrospective review of clinical performance (in the future, making widespread use of video, audio and other records), simulated workplace learning, e-learning modules and other educational activities will all serve to offset the effects of a decrease in working time, but must be considered as adjuncts, rather than alternatives, to clinical learning. As working time restrictions are implemented, complex competencies relating to specific core domains may require modules of greater duration. Similar arguments apply to trainees who choose to train on a part-time basis. Their overall duration of training will inevitably increase. It will be particularly important to consider how the inter-dependence of certain skills influences the overall progression of the part-time trainee.

Workplace learning

A CBMET programme should describe and adhere to standards for clinical workplace teaching performed by medical professionals. Clinical learning by anaesthesiologists occurs (and should occur) in a wide variety of settings (e.g. ward bed-side, operating theatre, labour ward, emergency department, out of hours) and takes (and should take) many forms (e.g. facilitated by direct instruction, didactic teaching, individual clinical supervision, provision of formative feedback). When sufficient resources are in place, teaching in a CBMET programme should be rewarding and satisfying68; the achievement of competencies by trainees correlates strongly with the input of individual teachers and teaching efficiency. Substantial guidance is both necessary and now available to support competency-based teaching methodology.69–71 The EBA is currently preparing a handbook on this issue and is working with the ESA on hospital visiting and exchange programmes for faculty (‘teach the tutors’ programmes) in order to facilitate change in educational culture and to empower specialists by transfer of knowledge, skills and specific attitudes in clinical workplace teaching.

Impact of competency-based medical education and training format on learner-centredness, trainee wellbeing, flexibility and subjectivity of assessment

A characteristic feature of effective CBMET is individualisation. The timing and rate of progress for a trainee through specific modules needs to take account of her concurrent learning of generic and specific competencies at increasing levels of depth or complexity, her experience and ‘readiness’ for new skill acquisition and the programme's capacity to provide clinical learning opportunities under supervision.

Training in anaesthesiology is demanding and onerous; the stresses associated with matching one's peers in rate of progress and learning can undermine a trainee's confidence and wellbeing. Educating trainees about reflection and the active promotion of reflective practice are important elements that support wellbeing and resilience as well as the achievement of professional goals.72

The degree of flexibility required of an international CBMET programme is an order of magnitude greater than that of traditional time or count-based training.73 Unlike time and ‘count-based’ progression based on simple, objective criteria, one anticipates that different trainees will take quite different routes even through a single CBMET programme, depending on their particular aptitudes, motivations and the opportunities presented. This feature of CBMET represents a meaningful educational advantage, but requires staff and scheduling plans of greater flexibility. A flexible or adaptive progression route offers benefits, not only in terms of learning but also for the wellbeing of trainees, an increasingly visible consideration which lies within the scope of responsibilities of trainers. From the trainee's point of view, it is essential that learning opportunities and teaching resources are made available in a way that is fair and seen to be fair, based on individual needs.

In practical terms, CBMET teachers are required to evaluate trainee performance in order to make entrustment decisions. These cannot be based on a single case, or a single formal assessment, but require multiple sources of information. The University of Ottawa has put in place a ‘spiral’ curriculum in which trainees are exposed to certain elements of anaesthesia practice at different points in their residency.74 Decision variables consist of ability, humility, integrity, reliability and adequate exposure. The ETR aims to introduce objectivity into training and workplace evaluations by proposing simulated scenarios at least once or twice a year, for example, for training the management of rare complications and for identifying partial knowledge and latent errors. The REVIEW method has been developed to facilitate reflection on and discussion of the hidden curriculum in the very specific microculture of a clinical team.75 Currently, the tools and evidence base available are insufficient to eliminate subjectivity from entrustment decisions, and this should be acknowledged transparently by programmes for the benefit of trainers and trainees.

Programme format and the healthcare environment

CBMET programmes provide measurable benefit to anaesthesiology departments and the hospitals in which they are offered. Trainees’ feedback indicates that they regard hospitals or countries with CBMET programmes as superior.76 Trainees prefer the opportunity to mature in a competency-oriented setting with individualised educational care. The quality of education and training methods available influences trainees in their choice of medical discipline and hospital. A well structured CBMET programme should serve to attract graduating doctors to anaesthesiology as a discipline and to the hospitals in which it is offered.

Expertise in the management of organisational change is very desirable for hospitals converting from time or count-based training to CBMET.21 Hospital administrators have to set the stage to enable CBMET implementation and support their faculty in building the professional ‘micro-culture’ necessary to implement a new training programme. This will require that the rationale for CBMET and the support of the hospital are made explicit and disseminated to patients, staff and the wider community. Implementation of CBMET at scale requires genuine collaboration between hospitals, universities, government agencies, regulatory bodies and policy-makers in order to optimise training and fulfil a society's needs.6,77,78 It is in a hospital's interest to offer training of such quality as to attract and retain first class clinicians and trainees. In documenting a CBMET programme, it is necessary to state explicitly the role and responsibilities of the employing institution. Health technology assessment in this field is warranted in order to explore the economic and social impact of investment in CBMET programmes.

Programme of assessment

Ronald Epstein's excellent review of ‘Assessment in Medical Education’ included a summary of the principles that apply specifically to assessment in the setting of CBMET.79 These principles hold good today. [Table 2. In Supplemental Digital Files, http://links.lww.com/EJA/A290. Reproduced with permission of Massachusetts Medical Society]

Which tools? Matching assessment tools or methodologies to programme objectives

Psychometrically robust tools are necessary to inform high stakes decisions, such as individual trainee advancement (within the programme), graduation (from the programme) or the need for additional training or support. In general, summative assessments serve this purpose best; in order to increase reliability, summative assessment at any point in time should include reference to the results of previous assessments, including records of clinical and scientific activities and so on (e.g. a portfolio). Formative assessments are designed to provide a trainee with detailed insights into her performance and a practical basis for improvement. Certain assessment modalities meet both of these needs. Holmboe et al.80 state that ‘CBME further requires assessment processes that are more continuous and frequent, criterion-based, developmental, work-based where possible, use assessment methods and tools that meet minimum requirements for quality, use both quantitative and qualitative measures and methods, and involve the wisdom of group process in making judgments about trainee progress’.

This insightful statement poses a number of practical challenges to those responsible for assessment of, and provision of feedback to, anaesthesiology trainees. Foremost among these is the need to examine tools for validity, in particular construct validity (i.e. does the tool truly measure what it purports to measure?). Most tools applied in clinical medicine have not been examined for construct validity.60,81

For technical skills, methodologies that have been developed primarily for assessment of surgical procedures may be useful. These include ‘objective structured assessment of technical skills’ (OSATS)82 and proficiency-based progression.39 Both methodologies have had construct validity demonstrated repeatedly for different technical procedures, in different clinical settings. The Anaesthetists Non-Technical Skills (ANTS) behavioural marker system has undergone similar evaluation for nontechnical skills.83 Simulation environments can be used to examine more complex ‘integrated’ skills applied to anaesthesiology and bring the potential to assess elements of a curriculum that occur rarely in clinical practice. Chiu et al.63 have demonstrated the feasibility of applying this approach to a nationally implemented (Competency by Design) curriculum. It will be necessary to examine each specific simulation-based assessment tool for validity, reliability and generalisability as Blum et al.43,44 have done in the Harvard Anesthesiology residency programmes.

Workplace-based assessment of clinical performance poses a different set of challenges (including balancing patient care with the value of assessment, selection of frequency and form, and assessor training). The ‘dos’, ‘don’ts’ and ‘don’t knows’ of direct observations of clinical skills have been thoroughly explored and summarised.80

A multitude of factors influence the educational impact of workplace-based assessment.84 Improved understanding of these factors will lead to greater uniformity of impact and refinement of the tools themselves. ‘Teach the teachers’ courses should enhance the understanding and practice of those staff who carry out workplace-based assessment.

It is likely that, as the evidence base grows, certain robust assessment tools can be accepted as standard. For anaesthesiology, there is a need to examine whether particular forms of assessment that provide formative feedback improve performance in a simulated setting, improve performance in a clinical setting and/or improve clinical/patient outcome. A programme of assessments in clinical and simulation settings will be necessary to meet the requirements of CBMET through a general application for core competencies and, sometimes in a customised form, for individual trainees. The suite of assessments should evolve over time based on development of improved tools and as critical gaps are identified in training programmes.43

Just as an individual who has acquired several discrete skills cannot assume (or be assumed) to function as a proficient doctor, assessment tools hold validity only for those skills and attributes that they have been designed to measure. Ideally, a training programme will apply different forms of assessment (some of which target integrated skills, such as handover and drug prescribing in the setting of a rapidly deteriorating patient) to different trainees, at different points in their development. The schedule of assessments itself should change and improve over time. Dauphinee et al.85 have challenged the medical assessment community ‘to initiate an integrated plan to implement Competency-Based Assessment as a sustainable innovation within existing educational programs and self-regulatory enterprises’. Supranational examinations (e.g. the ESA European Diploma in Anaesthesiology and Intensive Care)86 that apply rigorous methodology, informed by a growing scientific evidence base, could serve to harmonise summative assessment across different countries.

When to assess and how often?

The timing of formal assessment will influence its effect on learning. The type and timing (whether fixed or discretionary) of summative assessment (sometimes based on predefined ‘milestones’) should be set out in the programme's Assessment Plan. The Assessment Plan should map to the programme's learning outcomes (e.g. those of the ETR in Anaesthesiology) and describe how and when specific formative and summative assessment tools will be used. It should also lay out clearly how specific assessments will influence trainee progression throughout the programme. Ideally, a trainee who has not achieved a proficiency standard is not simply required to ‘repeat the module’, but is provided with an individualised training plan to address specific competencies.
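
Purely as a sketch, an Assessment Plan fragment might be made explicit and machine-readable along the following lines (Python); the assessment names, timings and progression rules are hypothetical and are not prescribed by the ETR.

    # Hypothetical, machine-readable fragment of an Assessment Plan,
    # mapping each assessment to learning outcomes and progression rules.
    assessment_plan = [
        {
            "assessment": "workplace_based_airway_assessment",
            "type": "summative",
            "learning_outcomes": ["airway_management_level_2"],
            "timing": "end of foundations phase",
            "on_failure": "individualised training plan targeting the "
                          "specific unmet competencies",
        },
        {
            "assessment": "simulated_crisis_scenario",
            "type": "formative",
            "learning_outcomes": ["crisis_resource_management"],
            "timing": "twice yearly",
            "on_failure": "targeted feedback and a repeat opportunity",
        },
    ]

    # A trainee who misses a summative standard is not simply asked to
    # 'repeat the module'; the plan points to an individualised response.
    for item in assessment_plan:
        if item["type"] == "summative":
            print(item["assessment"], "->", item["on_failure"])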

The feedback that will enable a trainee to improve performance should be provided at the time that it is most useful. Ideally, this is reasonably proximate to the assessment that elicited it, at a time when the trainee can direct attention to it, when an opportunity exists to ‘interrogate’ it (‘Does that mean that ..?’), and when an opportunity exists to act upon it in the fairly near future. At least for task-based learning, timely expert input can enable a trainee to progress efficiently (as with Vygotsky's Zone of Proximal Development theory),87 but if mis-timed, can be counterproductive.88 Providing appropriately timed feedback to a trainee should provide the ‘scaffolding’ to enable her progress to a higher level of performance. Thus, design of an effective schedule of assessment must take into account four dimensions: a wide range of competencies, mini-competencies, skills or milestones of varying complexity; the need to provide summative and formative information to the trainee and the programme; synchronisation of assessments with specific clinical or programmed events; and the ‘readiness’ of an individual trainee to benefit from feedback.

The scale of this challenge could result in two undesirable outcomes: ‘over assessment’, in which the operation of the programme is impaired by an excessive number of distracting and resource intensive assessments, and a ‘hidden curriculum’, in which trainees devote attention and effort to meeting the requirements of the ‘test’ to the detriment of their learning.

The expertise of faculty (individually and collectively) should offset these risks: an attuned supervisor may identify opportunistic events as sources of valuable feedback. Conversely, the supervisor may judge that a trainee is overwhelmed by events and is not ‘ready’ to learn a particular skill. Overlaying a basic assessment schedule with opportunistic feedback and formative assessments requires that faculty have a working understanding of the overall programme, as well as their individual teaching responsibilities.

‘Datafication’ of training and assessment

As assessments are performed and recorded, a repository of information is created that will increase over time. Much of this information exists in digital form (text, images, video and audio, some in the form of an e-portfolio) and almost all can be created in or converted to digital form. Different ‘owners’ of these data might benefit if the resulting dataset is examined. Assessment data of trainees in a postgraduate training programme will have greatest value if they are combined with or examined beside other datasets, such as undergraduate and postgraduate academic performance, on-call schedules, progression through training and subsequent clinical events. An individual trainee might access such an aggregate dataset selectively to understand her personal patterns of learning, ‘blindspots’ or performance predispositions (such as the various effects of fatigue). A programme director might identify which training interventions or assessments deliver greatest value across a cohort of trainees. In the future, combining such datasets across training programmes offers the potential to accelerate the development of CBMET programmes as effective and efficient training enterprises. A certain level of uniformity of the data collected (e-portfolio content, metrics and others) is necessary to make benchmarking available. Although this ‘datafication’ of training and assessment information is not a customary activity for most medical training bodies, it is likely that they will benefit from the advances offered by others, such as edX and Khan Academy.
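
For illustration only, the following sketch (Python) joins hypothetical assessment records with an on-call rota to flag post-call performances, one example of the patterns a trainee or programme director might look for in such an aggregate dataset. All field names are invented, and any real implementation would require the ethical and governance safeguards discussed below.

    # Hypothetical sketch: joining assessment records with rota data to
    # flag assessments performed the day after an on-call night.
    from datetime import date, timedelta

    assessments = [
        {"trainee": "T01", "date": "2020-03-02", "score": 78},
        {"trainee": "T01", "date": "2020-03-09", "score": 64},
    ]
    on_call_nights = {("T01", "2020-03-08"), ("T02", "2020-03-01")}

    def post_call(trainee: str, day: str) -> bool:
        """Was the assessment performed the day after an on-call night?"""
        previous = date.fromisoformat(day) - timedelta(days=1)
        return (trainee, previous.isoformat()) in on_call_nights

    for record in assessments:
        flag = "post-call" if post_call(record["trainee"], record["date"]) else "baseline"
        print(record["trainee"], record["date"], record["score"], flag)
    # T01 2020-03-02 78 baseline
    # T01 2020-03-09 64 post-call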

Competency-based medical education and training: implications for lifelong learning

The ‘formation’ of a specialist physician should be seamless throughout the continuum of undergraduate education, postgraduate training and independent practice. The challenges to achieving such a continuum have been well articulated elsewhere.89,90 These challenges can be classified as regulatory (discrete oversight bodies with differing legal and social imperatives), educational (the complexity of matching learning supports to different phases of an individual's development) and operational (management of organisational and system change across cultural and administrative environments).90 In different environments, doctors require differing and appropriately matched learning support at different phases of their careers. In order to improve the congruity of, and transitions within, medical education, training and development, endorsement of overarching goals (such as the triple aim of improving healthcare, improving the health of populations and reducing the costs of healthcare) has been proposed.14,91 Crucially, such a unified approach would ensure that regulatory and training bodies across jurisdictions and across the learning continuum re-emphasise universally shared, transparent objectives. Regulatory bodies have a responsibility to ensure that these objectives are clearly stated in policy and implemented so as to enable the necessary balance between healthcare delivery and education. One piece of ‘common ground’ on which the regulatory and training bodies could operate is an agreed set of quality attributes. This would entail the sharing of data and collaboration on quality improvement initiatives. In addition to minimising the problems currently associated with phase-to-phase transitions, an effective continuum could promote the progressive development of generic skills that enable physicians to adapt to different work environments, teams, advances in practice and stressors. Chief amongst these skills are reflection/reflective practice, communication (including assertiveness) and teamwork. Such a coherent approach would also facilitate examination of different data streams over time, provide tools to individualise training, and inform career choice and selection for entry to anaesthesiology programmes. Of course, the development of propensity scores based on an individual's ‘history’ and a Big Dataset requires careful consideration of ethics, of ownership of and access to the data, and of the duration of its usefulness.

Upon certification for independent practice as an anaesthesiologist, education and learning should continue throughout the clinician's professional life. In developing the ETR, the EBA has emphasised the individual's responsibility to develop and maintain a high degree of professional competence, to facilitate the development of colleagues and other groups of professionals, and to promote the development of the specialty itself.24 CBMET programmes will help to establish a culture of lifelong learning within the anaesthesiology community. Specialist physicians will undergo ‘professional development’, including acquisition of higher-level competence or mastery in specific core domains (such as cardiac anaesthesiology or paediatric anaesthesiology) and of nontraditional competencies (such as the roles of trainer and teacher of medical trainees and students, trainer for patient education, and researcher). CBMET-trained doctors will continue to develop the teaching and communication skills they acquired during training. Ideally, this development will be customised to enhance their particular training practice. They will serve as role models of commitment to continuous quality improvement and of reflective practice.

As adult learners, specialist physicians will benefit from, and engage with, material that is based on practical experience, goal-oriented and relevant to their workplace. Professional development of specialists is not yet as clearly defined or structured as CBMET programmes for trainees, or as undergraduate curricula; this applies particularly to the definition and measurement of outcomes. Fellowship programmes and the accumulation of Continuing Medical Education points are two well established elements of professional development for those who have completed primary training. Continuing Medical Education points are awarded for attendance at, or participation in, congresses, local audit and educational sessions, e-learning modules and/or blended learning programmes. Specified point accumulations are required by regulatory authorities in many countries for maintenance of certification and for eligibility for liability insurance. This approach to maintenance of professional competence resembles count-based or time-based graduate education, in that activity, rather than outcome, is at its core. Implementation of adult learning modalities could increase the efficiency and value of continuing medical education materials.91 Over time, it is likely and desirable that professional competence systems will evolve into competency-based professional development. The definition of a set of competencies for subspecialty or supraspecialty training, and standards in the assessment of professional development, are on the agenda of the EBA.
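As a minimal sketch of the count-based logic described above (the activity types, point values and the 50-point threshold are invented for illustration and do not correspond to any actual regulatory scheme):

```python
# Count-based CME tracking: credit accrues for activity completed,
# independent of any measured change in competence or outcome.
CME_POINTS = {            # hypothetical point values per activity type
    "congress": 8,
    "audit_session": 2,
    "elearning_module": 1,
}
REQUIRED_POINTS = 50      # hypothetical threshold for maintaining certification

def recertification_met(activities):
    """True once accumulated activity points reach the threshold.

    Note what is *not* an input: assessment scores, patient outcomes or any
    other measure of competence -- the contrast drawn in the text above."""
    return sum(CME_POINTS[a] for a in activities) >= REQUIRED_POINTS

print(recertification_met(["congress"] * 6 + ["audit_session"] * 2))  # True (52 points)
```

A competency-based scheme would instead condition recertification on demonstrated performance, which is the evolution the text anticipates.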

Acknowledgements relating to this article

Assistance with the article: the authors would like to acknowledge the invaluable leadership and support of the ESA and EBA Executives in the generation of this Consensus Statement.

Financial support and sponsorship: none.

Conflicts of interest: none.

EDR is the president of the European Board of Anaesthesiology (EBA-UEMS) and a co-opted member of the ESA Board of Directors; ZG is the immediate past president of the ESA; SK is the chairperson of the European Board of Anaesthesiology Standing Committee on Education & Professional Development; LNM is the chair of the ESA eLearning Committee and a council member of the ESA; OS is the immediate past chairperson of the European Board of Anaesthesiology (EBA-UEMS) Standing Committee on Education and Professional Development and a council member of the ESA.

References

1. Jones J, Hunter D. Consensus methods for medical and health services research. BMJ 1995; 311:376–380.
2. Fulton JF. History of medical education. BMJ 1953; 2:457–460.
3. Ebert TJ, Fox CA. Competency based education in anesthesiology. Anesth Analg 2014; 120:24–31.
4. Flexner A. Medical education in the United States and Canada: a report to the Carnegie Foundation for the Advancement of Teaching, bulletin no. 4. New York City: The Carnegie Foundation for the Advancement of Teaching; 1910.
5. American Medical Association. Council on Medical Education and Hospitals of the American Medical Association, Citizens Commission on Graduate Medical Education. The Graduate Education of Physicians: the report of the Citizens Commission on Graduate Medical Education (Millis Report). Chicago, IL: American Medical Association; 1966.
6. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach 2010; 32:638–645.
7. Makary M, Daniel M. Medical error – the third leading cause of death in the US. BMJ 2016; 353:i2139.
8. Institute of Medicine. To err is human: building a safer health system. Washington, DC: National Academy Press; 1999. http://www.nap.edu/books/0309068371/html/ [Accessed 4 March 2020].
9. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med 2014; 371:1803–1812.
10. Huddle TS, Heudebert GR. Taking apart the art: the risk of anatomising clinical competence. Acad Med 2007; 82:536–541.
11. Regehr G, Eva K, Ginsburg S, et al. Assessment in Postgraduate Medical Education: Trends and Issues in Assessment in the Workplace. A Paper Commissioned as part of the Environmental Scan for the Future of Medical Education in Canada Postgraduate Project. Members of the FMEC PG consortium (2011). https://www.semanticscholar.org/paper/13-Assessment-in-Postgraduate-Medical-Education-%3A-Regehr-Eva/4a4d70d9775a68f9b85a4076d7c6b362ea8b82ed [Accessed 4 March 2020].
12. Klamen DL, Williams RG, Roberts N, et al. Competencies, milestones, and EPAs – are those who ignore the past condemned to repeat it? Med Teach 2016; 38:904–910.
13. Eden J, Berwick D, Wilensky G, editors. Committee on the Governance and Financing of Graduate Medical Education, Board on Healthcare Services. Graduate medical education that meets the nation's health needs. Washington, DC: Institute of Medicine of the National Academies, The National Academies Press; 2014. ISBN-13: 978-0-309-30355-2.
14. Berwick DM, Nolan TW, Whittington J. The triple aim: care, health, and cost. Health Aff (Millwood) 2008; 27:759–769.
15. Directive 2003/88/EC of the European Parliament and of the Council of 4 November 2003 concerning certain aspects of the organisation of working time. Official J L 299, 18 November 2003: 0009–0019. https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32003L0088&from=EN [Accessed 9 September 2019].
16. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 2007; 29:642–647.
17. Swing SR. The ACGME outcomes project: retrospective and prospective. Med Teach 2007; 29:648–654.
18. ACGME Milestones Project. https://www.acgme.org/What-We-Do/Accreditation/Milestones/Overview [Accessed 4 March 2020].
19. Eight domains of good professional practice as devised by the Medical Council. https://www.medicalcouncil.ie/Existing-Registrants-/Good-Professional-Practice/Eight-Domains-of-Good-Professional-Practice-as-devised-by-Medical-Council.pdf [Accessed 4 March 2020].
20. Sonnadara RR, Van Vliet A, Safir O, et al. Orthopedic boot camp: examining the effectiveness of an intensive surgical skills course. Surgery 2011; 149:745–749.
21. Fraser AB, Stodel EJ, Chaput AJ. Curriculum reform for residency training: competence, change, and opportunities for leadership. Can J Anesth 2016; 63:875–884.
22. Levine MF, Shorten GD. Competency based medical education; its time has arrived. Can J Anaesth 2016; 63:802–806.
23. European Training Requirements. Training requirements for the speciality of Anaesthesiology, Pain and Intensive Care Medicine. European Board of Anaesthesiology (a Division of European Union Medical Specialists); 2013. http://www.eba-uems.eu/resources/PDFS/Training/Anaesthesiology-Training-Requirements-March-2013.pdf [Accessed 4 March 2020].
24. European Training Requirements. Training requirements for the speciality of Anaesthesiology, Pain and Intensive Care Medicine. European Board of Anaesthesiology (a Division of European Union Medical Specialists); 2018. https://www.uems.eu/__data/assets/pdf_file/0003/64398/UEMS-2018.17-European-Training-Requirements-in-Anaesthesiology.pdf [Accessed 4 March 2020].
25. Carracio C, Englander R, Van Melle E, et al. Advancing competency based medical education: a charter for clinician educators. Acad Med 2015; 91:645–649.
26. Bloom BS, Engelhart MD, Furst EJ, et al. Taxonomy of educational objectives: the classification of educational goals. Handbook I: Cognitive domain. 1st ed. New York: David McKay Company; 1956.
27. Dreyfus SE, Dreyfus HL. A five stage model of the mental activities in directed skill acquisition. Operations Research Centre, University of California Berkeley; 1980. https://apps.dtic.mil/dtic/tr/fulltext/u2/a084551.pdf. [Accessed 4 March 2020].
28. Spady WG. Organizing for results: the basis of authentic restructuring and reform. Educ Leadership 1988; 46:4–8.
29. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65 (Suppl 9):S63–S67.
30. McKernan J. Perspectives and imperatives: some limitations of outcome-based education. J Curric Supervision 1993; 8:343–353.
31. Mellin-Olsen J, Staender S, Whitaker DK, et al. The Helsinki declaration on patient safety in anaesthesiology. Eur J Anaesthesiol 2010; 27:592–597.
32. Jonker G, Manders LA, Marty AP, et al. Variations in assessment and certification in postgraduate anaesthesia training: a European survey. Br J Anaesth 2017; 119:1009–1014.
33. Cooke M, Irby DM, O’Brien BC. Educating physicians: a call for reform of medical school and residency. San Francisco, CA: Jossey-Bass; 2010.
34. Kelly AV. The curriculum: theory and practice. 6th ed. London: SAGE Publications Ltd; 2009. pp. 1–55.
35. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997; 84:273–278.
36. Fletcher G, Flin R, McGeorge P, et al. Anaesthetists’ Non-Technical Skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth 2003; 90:580–588.
37. Rutherford JS, Flin R, Irwin A, et al. Evaluation of the prototype Anaesthetic Nontechnical Skills for Anaesthetic Practitioners (ANTS-AP) system: a behavioural rating system to assess the nontechnical skills used by staff assisting the anaesthetist. Anaesthesia 2015; 70:907–914.
38. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002; 236:458–463.
39. Angelo RL, Ryu RK, Pedowitz RA, et al. A proficiency-based progression training curriculum coupled with a model simulator results in the acquisition of a superior arthroscopic Bankart skill set. Arthroscopy 2015; 31:1854–1871.
40. Breen D, O’Brien S, McCarthy N, et al. Effect of a proficiency-based progression simulation programme on clinical communication for the deteriorating patient: a randomised controlled trial. BMJ Open 2019; 9:e025992.
41. Kallidaikurichi Srinivasan K, Gallagher A, O’Brien N, et al. Proficiency-based progression training: an ’end to end’ model for decreasing error applied to achievement of effective epidural analgesia during labour: a randomised control study. BMJ Open 2018; 8:e020099.
42. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ 2005; 39:1176–1177.
43. Blum RH, Boulet JR, Cooper JB, et al. Harvard Assessment of Anesthesia Resident Performance Research Group. Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesthesiology 2014; 120:129–141.
44. Blum RH, Muret-Wagstaff SL, Boulet JR, et al. Harvard Assessment of Anesthesia Resident Performance Research Group. Simulation-based assessment to reliably identify key resident performance attributes. Anesthesiology 2018; 128:821–831.
45. Winkler-Schwartz A, Yilmaz R, Mirchi N, et al. Machine learning identification of surgical and operative factors associated with surgical expertise in virtual reality simulation. JAMA Netw Open 2019; 2:e198363.
46. Shorten G. Artificial intelligence and training physicians to perform technical procedures. JAMA Netw Open 2019; 2:e198375.
47. Shorten G, Srinivasan KK, Reinertsen I. Machine learning and evidence-based training in technical skills. Br J Anaesth 2018; 121:521–523.
48. Royal College of Physicians and Surgeons of Canada. CanMEDS: better standards, better physicians, better care. http://www.royalcollege.ca/rcsite/canmeds/canmeds-framework-e [Accessed 30 August 2019].
49. Malik MU, Diaz Voss Varela DA, Stewart CM, et al. Barriers to implementing the ACGME Outcome Project: a systematic review of Program Director Surveys. J Grad Med Educ 2012; 4:425–433.
50. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med 2007; 82:542–547.
51. Englander R, Cameron T, Ballard AJ, et al. Toward a common taxonomy of competency domains for the health professions and competencies for physicians. Acad Med 2013; 88:1088–1094.
52. Royal College of Physicians and Surgeons of Canada. EPAs and milestones. http://www.royalcollege.ca/rcsite/cbd/implementation/cbd-milestones-epas-e [Accessed 6 September 2019].
53. Weller JM, Castanelli DJ, Chen Y, et al. Making robust assessments of specialist trainees’ workplace performance. Br J Anaesth 2017; 118:207–214.
54. Rekman J, Gofton W, Dudek N, et al. Entrustability scales: outlining their usefulness for competency-based clinical assessment. Acad Med 2016; 91:186–190.
55. Wisman-Zwarter N, van der Schaaf M, Ten Cate O, et al. Transforming the learning outcomes of anaesthesiology training into entrustable professional activities: a Delphi study. Eur J Anaesthesiol 2016; 33:559–567.
56. Andolsek K, Padmore J, Hauer KE, et al. Clinical Competency Committees – a guidebook for programs [2nd ed.]. Chicago, IL: Accreditation Council for Graduate Medical Education; 2019.
57. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005; 80:549–553.
58. Dunphy BC, Dunphy SL. Assisted performance and the zone of proximal development (ZPD): a potential framework for providing surgical education. Aust J Educ Dev Psychol 2003; 3:48–58.
59. Cook DA, Hatala R, Brydges R, et al. Technology enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011; 306:978–988.
60. Isaak RS, Chen F, Martinelli SM, et al. Validity of simulation-based assessment for Accreditation Council for Graduate Medical Education Milestone Achievement. Simul Healthc 2018; 13:201–210.
61. Blum RH, Boulet JR, Cooper JB, et al. Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesthesiology 2014; 120:129–141.
62. Blum RH, Muret-Wagstaff SL, Boulet JR, et al. Simulation-based assessment to reliably identify key resident performance attributes. Anesthesiology 2018; 128:821–831.
63. Chiu M, Tarshis J, Antoniou A, et al. Simulation-based assessment of anesthesiology residents’ competence: development and implementation of the Canadian National Anesthesiology Simulation Curriculum (CanNASC). Can J Anaesth 2016; 63:1357–1363.
64. Ferguson PC, Kraemer W, Nousiainen M, et al. Three-year experience with an innovative, modular competency-based curriculum for orthopaedic training. J Bone Joint Surg Am 2013; 95:e166.
65. Stodel EJ, Wyand A, Crooks S, et al. Designing and implementing a competency-based training program for anesthesiology residents at the University of Ottawa. Anesthesiol Res Pract 2015; Article ID: 713038.
66. European Commission. Recognition of professional qualifications in practice. Directive 2005/36/EC. https://ec.europa.eu/growth/single-market/services/free-movement-professionals/qualifications-recognition_en [Accessed 9 September 2019].
67. Teunissen PW, Kogan JR, Ten Cate O, et al. Learning in practice: a valuation of context in time-variable medical training. Acad Med 2018; 93:S22–S26.
68. van den Berg JW, Mastenbroek NJJM, Scheepers RA, et al. Work engagement in health professions education. Med Educ 2017; 39:1110–1118.
69. Walsh A, Koppula S, Antao V, et al. Preparing teachers for competency based medical education: fundamental teaching activities. Med Teach 2018; 40:80–85.
70. Steinert Y. Faculty development: from program design and implementation to scholarship. GMS J Med Educ 2017; 34:Doc49.
71. Aronson L. Twelve tips for teaching reflection at all levels of medical education. Med Teach 2011; 33:200–205.
72. DunnGalvin A, Cooper JB, Shorten G, et al. Applied reflective practice in medicine and anaesthesiology. Br J Anaesth 2019; 122:536–541.
73. Hoff RG, Frenkel J, Imhof SM, et al. Flexibility in postgraduate medical training in the Netherlands. Acad Med 2018; 93:S32–S36.
74. Stodel EJ, Wyand A, Crooks S, et al. Designing and implementing a competency-based training program for anesthesiology residents at the University of Ottawa. Anesthesiol Res Pract 2015; Article ID: 713038.
75. Mulder H, Ter Braak E, Chen HC, et al. Addressing the hidden curriculum in the clinical workplace: a practical tool for trainees and faculty. Med Teach 2019; 41:36–43.
76. Kietaibl S, Blank A, De Robertis E. Medical training in anaesthesiology. Updated European requirements. Eur J Anaesthesiol 2019; 36:473–476.
77. Ferguson PC, Caverzagie KJ, Nousiainen MT, et al. Changing the culture of medical training: an important step toward the implementation of competency-based medical education. Med Teach 2017; 39:599–602.
78. Caverzagie KJ, Nousiainen MT, Ferguson PC, et al. Overarching challenges to the implementation of competency-based medical education. Med Teach 2017; 39:588–593.
79. Epstein RM. Assessment in medical education. N Engl J Med 2007; 356:387–396.
80. Holmboe ES, Sherbino J, Long DM, et al. The role of assessment in competency-based medical education. Med Teach 2010; 32:676–682.
81. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA 2009; 302:1316–1326.
82. Faulkner H, Regehr G, Martin J, et al. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med 1996; 71:1363–1365.
83. Fletcher G, Flin R, McGeorge P, et al. Anaesthetists’ Non-Technical Skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth 2003; 90:580–588.
84. Lörwald AC, Lahner FM, Greif R, et al. Factors influencing the educational impact of Mini-CEX and DOPS: a qualitative synthesis. Med Teach 2018; 40:414–420.
85. Dauphinee WD, Boulet JR, Norcini JJ. Considerations that will determine if competency-based assessment is a sustainable innovation. Adv Health Sci Educ Theory Pract 2019; 24:413–421.
86. European Society of Anaesthesiology. European Diploma in Anaesthesiology and Intensive Care. https://www.esahq.org/education/edaic/about/ [Accessed 9 September 2019].
87. Vygotsky LS. Thought and language. Cambridge, MA: MIT Press; 1962.
88. Kneebone RL. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005; 80:549–553.
89. Sklar DP. Reflections on the medical education continuum and how to improve it. Acad Med 2014; 89:1311–1313.
90. Sehlbach C, Balzan M, Bennett J, et al. ‘Certified … now what?’ On the challenges of lifelong learning: report from an AMEE 2017 Symposium. J Eur CME 2018; 7:1428025.
91. Taylor DCM, Hamdy H. Adult learning theories: implications for learning and teaching in medical education: AMEE Guide No. 83. Med Teach 2013; 35:1561–1572.
Copyright © 2020 European Society of Anaesthesiology. All rights reserved.