
The European Hematology Exam

The Next Step toward the Harmonization of Hematology Training in Europe

Navarro, José-Tomás1; Birgegård, Gunnar2; Strivens, Janet3; Hollegien, Wietske W.G.4; van Hattem, Naomi4; Saris, Manon T.4; Wondergem, Marielle J.5; Toh, Cheng-Hock6; Almeida, Antonio M.7

doi: 10.1097/HS9.0000000000000291
HemaTopics (EHA Only)
Open

1ICO-Germans Trias i Pujol Hospital, Josep Carreras Research Institute, Universitat Autònoma de Barcelona, Spain

2Institute for Medical Sciences, Uppsala University, Sweden

3Centre for Higher Education Studies, The University of Liverpool, UK

4European Hematology Association, Executive Office, The Hague, The Netherlands

5VU University Medical Center, Amsterdam, the Netherlands

6Roald Dahl Haemostasis & Thrombosis Centre, Royal Liverpool University Hospital, UK

7Hospital da Luz, Lisboa, Portugal.

Correspondence: José-Tomás Navarro (e-mail: jnavarro@iconcologia.net).

Citation: Navarro JT, Birgegård G, Strivens J, Hollegien W, van Hattem N, Saris M, Wondergem M, Toh CH, Almeida A. The European Hematology Exam: The next step toward the harmonization of Hematology training in Europe. HemaSphere 2019;00:00. http://dx.doi.org/10.1097/HS9.0000000000000291

The authors have indicated they have no potential conflicts of interest to disclose.

Online date: September 27, 2019

This is an open access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal. http://creativecommons.org/licenses/by-nc-nd/4.0


Introduction

Following the implementation of the European Hematology Curriculum,1 the creation of an exam based on it emerged as the next logical step toward harmonizing Hematology training throughout Europe. In 2015, the European Hematology Exam project was launched by the Education and Curriculum committees of the European Hematology Association (EHA). The purpose was not to substitute for the national authorities’ certification role but, instead, to produce a tool that assesses knowledge and provides a stamp of quality for those who, by passing the exam, meet the standards of knowledge set out in the European Curriculum.

As with all EHA harmonization programs, the final aim of the exam is to ensure high-quality patient care throughout Europe. In addition, as the exam gains recognition and prestige in different countries, professional mobility within Europe will be facilitated. The exam is open to all professionals with an interest in hematology, but the main target group is those who have just completed their specialty training.

With the above-mentioned objectives, the European Exam Working Group was created, consisting of hematologists with education experience and educationalists; it held its first meeting in November 2015. This group recommended that the aim of the European Exam should be to test knowledge according to the topics and recommended levels of knowledge outlined in the European Hematology Curriculum. The first step was benchmarking among national hematology societies and European societies of other medical specialties. A survey performed in 2015 among the presidents of 25 European national hematology societies showed that 52% of countries conducted an end-of-training exam. Additionally, the majority (72%) were willing to endorse a European hematology knowledge test. Given the long experience of Sweden and Belgium with knowledge assessments, their national exams were used for more in-depth benchmarking. A second benchmarking step focused on European exams run by other specialty societies. The structure and implementation of final exams held by the European Respiratory Society, the European Society of Cardiology, and the European Society of Anesthesiology were reviewed, and information was exchanged with those responsible for these exams.


The exam preparation

As a result of the above preliminary research, the Exam working group decided that the most appropriate approach was to create a knowledge test with multiple-choice questions (MCQs). The strength of this method is that it allows a large number of test items even within a limited testing time, tests knowledge in several areas with high reproducibility, and can be used in a web-based format with automatic, objective correction and scoring.2,3 The questions were to be in English, as this is the official language of EHA and the recognized scientific language worldwide. Additionally, mobility requires a good level of English, and translating questions into different languages would raise important interpretation difficulties. Following careful consideration, it was felt that competence assessment would be beyond the remit of this project and is best assessed by national training programs.

Nevertheless, to ensure the high quality of the exam, the MCQs should test the highest classification level that can be tested with this type of examination. For this purpose, question-writing guidelines were developed by the educational experts of the Exam working group. The Guidelines for the development of Questions & Answers for the European Exam in Hematology give guidance on how to construct questions that test skills beyond simple data recall, such as applying knowledge to practice. Existing guidelines for MCQ writing were studied and validated recommendations taken into account.4 The questions should depict a realistic context and consist of 3 parts (a schematic sketch of such an item follows the list):

  • The stem: Typically comprises a clinical scenario that provides the background of the knowledge item being tested. The stem must be clinically plausible and comprise a few lines or paragraphs providing all the information required for the correct answer. It should also include information that makes each of the wrong answer options plausible. It may incorporate laboratory results or still images (such as morphology slides, scans or blood films). Questions without a clinical stem may be used sparingly for specific knowledge recall testing.
  • The question: The question should be direct, unambiguous and should address a single issue. The knowledgeable participant should be able to answer the question from the stem before reading the options.
  • The answer options: Five options for the participant to choose from, one of which is the ‘best’ answer (clearly correct), while the others are plausible alternatives (‘distractors’).
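
By way of illustration only, this three-part structure maps naturally onto a simple data record. The sketch below uses hypothetical field names rather than the actual exam database schema, and the scoring method merely reflects the automatic, objective correction that the MCQ format permits.

    from dataclasses import dataclass

    @dataclass
    class ExamQuestion:
        stem: str                # clinical scenario giving the background of the knowledge item
        question: str            # direct, unambiguous question answerable from the stem alone
        options: list            # exactly 5 options: the 'best' answer plus 4 plausible distractors
        correct_index: int       # position of the best answer within options
        curriculum_section: str  # (sub)section of the European Hematology Curriculum it covers

        def __post_init__(self):
            assert len(self.options) == 5, "guidelines specify five answer options"
            assert 0 <= self.correct_index < len(self.options)

        def score(self, chosen_index: int) -> int:
            # Automatic, objective correction: 1 point for the best answer, 0 otherwise.
            return 1 if chosen_index == self.correct_index else 0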

The guidelines are the pivotal tool for the question-writing process, following a set format designed to be in line with current thinking on best practice.

The next step was to recruit a group of question writers to produce questions based on the above-mentioned guidelines. Following suggestions from the Scientific Working Groups of EHA, 20 experts on the different sections of the European Curriculum were selected to participate in creating a database of MCQs. The question writers were trained in the writing process by the educational experts of the working group in face-to-face workshops. To ensure the quality and reliability of the questions, a peer-review system was set up: question writers were paired according to sections of the Curriculum, and the 2 writers exchanged and reviewed each other's questions as many times as needed before submitting a “final” version to the database for later review by an expert panel.

To produce each exam, questions from the database were selected semi-randomly, balancing question difficulty and coverage of the different curriculum sections. At a final review meeting, 120 to 130 selected questions were reviewed by a panel of 6 experts. The final exams comprised 100 questions representing all (sub)sections of the European Hematology Curriculum V3. Examinees were given 2.5 hours to answer all questions.
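
As a rough illustration of what semi-random selection under these constraints could look like, the sketch below stratifies a question bank by curriculum section and difficulty and samples from each stratum; the field names and quotas are hypothetical, and the actual exam assembly additionally involves the expert review meeting described above.

    import random
    from collections import defaultdict

    def assemble_exam(question_bank, per_stratum):
        """question_bank: list of dicts with 'section' and 'difficulty' keys.
        per_stratum: {(section, difficulty): number of questions to draw}."""
        strata = defaultdict(list)
        for q in question_bank:
            strata[(q["section"], q["difficulty"])].append(q)
        exam = []
        for key, n in per_stratum.items():
            pool = strata[key]
            # Draw at random within each stratum so section coverage and difficulty stay balanced.
            exam.extend(random.sample(pool, min(n, len(pool))))
        random.shuffle(exam)
        return exam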

The exam was run on a secure online platform, and each applicant answered the exam questions on an individual computer provided for this purpose. The platform allowed simple statistics to be collected during the exam and a deeper analysis afterwards.


The results of the European Hematology Exam

Participation and profile of the cohort.

The exam has had 3 sittings to date. The first took place at the 22nd Congress of EHA, in Madrid in 2017, the second in Stockholm during the 23rd EHA Congress in 2018, and the third, recently in Amsterdam at the 24th Congress of EHA.

In addition, in 2018 and 2019, parallel sessions were organized by national societies. In 2018, the Swiss Society of Hematology (SSH) took the initiative to organize a parallel session in Bern, Switzerland. This session was open to all Swiss candidates and served as a successful pilot for the implementation of the European Hematology Exam as part of the official Swiss end-of-training exam.

In 2019, 4 more national hematology societies organized a parallel session, to give local participants who could not access the congress the possibility to participate. At the same time as in Amsterdam, candidates in Bern (Switzerland), Madrid (Spain), Lisbon (Portugal), Athens (Greece) and Ankara (Turkey) took the exam.

Two hundred ninety-three candidates have so far participated in the 3 iterations of the exam, including the parallel sessions. These candidates have come from 48 countries representing 5 continents (Table 1).

Table 1

The median age of the applicants was 36 years (27–73) and 54.5% were female. The profession, main field of interest, and main work activity are shown in the graphics (Fig. 1).

Figure 1

In the first 2 years, it was noted that a relatively high number of registered candidates did not ultimately participate in the exam (42% in 2017 and 41% in 2018). EHA aims to provide access to the exam to as many people as possible. Therefore, after the 2018 exam, the reasons for cancellation or non-attendance were investigated through a survey, to identify any causes that could be mitigated. The majority of no-show candidates did have the intention to participate. Reasons for cancelling or not showing up were mostly related to not being able to travel to the congress, such as lack of funding, travel or visa issues, or inability to take time off. A small majority of the candidates who could not take the exam expressed the view that they would prefer to have the exam later in the congress.

The first 2 iterations took place on the day before the start of the Congress. The third iteration took place on the second day of the Congress, as the expectation was that this would make it easier for candidates to attend, taking into account their opportunities for taking days off and receiving travel funding to visit the congress. As expected, the percentage of registered candidates who did not participate in the exam decreased (28%).


Quality control and methodology

To guarantee independent judgment of the quality of the exam and the passing score definition, EHA collaborates with CITO, an internationally recognized professional research and knowledge institute in the field of testing and educational measurement.

To define the cut-off score for the first exam, the Angoff method was used.5,6 With this method, 6 judges (the same experts who performed the final review of the questions) evaluated the difficulty of each question, leading to the calculation of a final passing score of 64. Fifty-two candidates (81.2%) passed the exam.
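
In the Angoff method as commonly described, each judge estimates, question by question, the probability that a borderline candidate would answer correctly, and the cut score is the sum of these estimates averaged across judges. The following is a minimal sketch with hypothetical ratings; the actual exam used 6 judges, 100 questions, and CITO's oversight of the procedure.

    def angoff_cut_score(ratings):
        """ratings[j][i]: judge j's estimated probability that a borderline
        (minimally competent) candidate answers item i correctly."""
        n_judges, n_items = len(ratings), len(ratings[0])
        item_means = [sum(judge[i] for judge in ratings) / n_judges for i in range(n_items)]
        # The cut score is the expected number of items a borderline candidate gets right.
        return sum(item_means)

    # Hypothetical example: 3 judges rating 5 items.
    ratings = [
        [0.70, 0.50, 0.80, 0.60, 0.90],
        [0.60, 0.60, 0.70, 0.50, 0.80],
        [0.80, 0.40, 0.90, 0.60, 0.85],
    ]
    print(round(angoff_cut_score(ratings), 1))  # -> 3.4 correct answers required, on this toy scale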

For the 2018 exam, 2 standard-setting methods were combined: first, the Angoff method, and second, an equating procedure based on item response theory, in which the results of the 2018 exam could be placed on the same ability scale as the 2017 exam by means of a number of overlapping questions used in both exams. This means that, independent of the year in which candidates participate, their chance of passing the exam is the same. For 2018, this resulted in a cut-off score of 58 or more items correct to pass the exam, meaning that 64 candidates (80.0%) passed.
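
The equating idea can be illustrated, in a deliberately simplified form, by a mean-shift linking on the overlapping ("anchor") questions under a Rasch-type model; the numbers below are hypothetical, and CITO's actual item response theory procedure is more elaborate than this sketch.

    def linking_shift(anchor_difficulty_2017, anchor_difficulty_2018):
        """Estimate the scale shift between two sittings from item difficulties
        (in logits) of the questions that appeared in both exams."""
        diffs = [b18 - b17 for b17, b18 in zip(anchor_difficulty_2017, anchor_difficulty_2018)]
        return sum(diffs) / len(diffs)

    # Hypothetical difficulty estimates for 4 overlapping questions.
    b_2017 = [-0.5, 0.2, 1.1, -1.0]
    b_2018 = [-0.3, 0.5, 1.2, -0.8]
    shift = linking_shift(b_2017, b_2018)
    print(f"2018 scale sits {shift:+.2f} logits relative to 2017")
    # The 2018 cut score is then chosen so that the ability required to pass,
    # expressed on this common scale, matches the 2017 standard.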

CITO's Psychometric Research and Knowledge Center performed a psychometric analysis of the results of both iterations, testing the discriminatory power of the questions and other markers of test quality. In both cases, CITO deemed the exam to be a robust method of knowledge testing.
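
One common marker of a question's discriminatory power is the correlation between getting that item right and the candidate's total exam score (a point-biserial correlation). The sketch below uses hypothetical responses and shows only one of the many indicators a full psychometric analysis such as CITO's would examine.

    from statistics import mean, pstdev

    def point_biserial(item_scores, total_scores):
        """item_scores: 1 if a candidate answered the item correctly, else 0;
        total_scores: that candidate's total number of correct answers."""
        mi, mt = mean(item_scores), mean(total_scores)
        cov = mean((i - mi) * (t - mt) for i, t in zip(item_scores, total_scores))
        return cov / (pstdev(item_scores) * pstdev(total_scores))

    # Hypothetical example: 6 candidates' results on one item versus their exam totals.
    item = [1, 1, 0, 1, 0, 0]
    totals = [82, 75, 60, 70, 55, 58]
    print(round(point_biserial(item, totals), 2))  # strongly positive -> the item discriminates well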

At the time of writing this article, the outcomes of the 2019 exam are still being analyzed. These will be published in the Exam report on the EHA website.


Questionnaires

Directly after finishing the exam questions, the participants were asked to complete a short survey on the same platform as the exam. In 2018, 97% found the exam relevant (78%) or partly relevant (19%) to their hematology training, 87% rated the exam setup as good or very good, and 95% of candidates were able to finish the exam within the 2.5 hours.

To gain insight into the impact of passing the European Hematology Exam, the candidates who passed the 2017 and 2018 editions were asked to complete a short survey by e-mail. Thirty-nine candidates (33.6%) responded. Of these, 62% expected that passing the exam would help their career in the future, and 10% reported that it had helped already. Seventy-two percent indicated that passing the exam helped their personal development. Figure 2 shows the main reasons for participating in the exam.

Figure 2


The future of the European Exam project

Annual exam and National parallel sessions

After 3 successful iterations of the Exam, the EHA Curriculum Committee and the European Exam Working Group were encouraged to pursue this project further. The fourth iteration will be in Frankfurt, Germany, during the 25th Congress of EHA, and the aim is to hold a yearly exam during the annual EHA meeting.

After the successful pilot with the Swiss Society of Hematology (SSH) in 2018, EHA has invited the national societies of other European countries to organize a parallel session as well. This will give candidates who cannot travel to the EHA Congress the possibility to still participate in the exam. Five national societies have decided to team up with EHA to offer this opportunity in their country in 2019, and more societies are considering this for future iterations, including societies from countries outside of Europe.

With the European Curriculum as its backbone, EHA has been developing an extensive educational program with the final aim of harmonizing Hematology training throughout Europe. The European Exam is the next step toward this objective, which will ultimately generate higher-quality patient care. The strong interest in participation from hematologists inside and outside of Europe gives firm support to a wider harmonization.


Further development – the progress test

The development of trainees’ knowledge over the years of training can be monitored by progress testing, a method used in undergraduate medical training for many years. A progress test can be used as an examination7 but also as a learning tool, testing knowledge and giving trainees feedback on their knowledge development. Such a progress test has been used successfully in Sweden since 2013. Using MCQs from the large question databank developed for the EHA exam, such a learning tool will now be implemented on the EHA Campus online platform and will help trainees to follow the development of their knowledge, identify areas in need of extra study, and prepare for the final exam.
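
Purely as an illustration of the kind of feedback such a tool could give, the sketch below tracks a trainee's per-section scores across sittings and flags areas that may need extra study; the section names, scores, and threshold are hypothetical and do not describe the EHA Campus implementation.

    def progress_feedback(history, threshold=0.6):
        """history: {curriculum section: [fraction correct at sitting 1, sitting 2, ...]}."""
        report = {}
        for section, scores in history.items():
            trend = scores[-1] - scores[0] if len(scores) > 1 else 0.0
            report[section] = {
                "latest": scores[-1],
                "trend": round(trend, 2),
                "needs_extra_study": scores[-1] < threshold,
            }
        return report

    # Hypothetical trainee results over 3 sittings.
    history = {
        "Myeloid malignancies": [0.45, 0.60, 0.72],
        "Thrombosis and hemostasis": [0.40, 0.45, 0.50],
    }
    for section, info in progress_feedback(history).items():
        print(section, info)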


References

1. Almeida A, Ar C, Hellström-Lindberg E, et al. The European hematology curriculum: an electronic passport promoting professional competence and mobility. HemaSphere. 2018;2:e49.
2. Little JL, Bjork EL, Bjork RA, et al. Multiple-choice tests exonerated, at least of some charges: fostering test-induced learning and avoiding test-induced forgetting. Psychol Sci. 2012;23:1337–1344.
3. Pham H, Trigg M, Wu S, et al. Choosing medical assessments: Does the multiple-choice question make the grade? Educ Health. 2018;31:65–71.
4. Case SM, Swanson DB. Constructing Written Test Questions for the Basic and Clinical Sciences. 3rd ed. Philadelphia: National Board of Medical Examiners; 2002. Retrieved from: https://www.nbme.org/pdf/itemwriting_2003/2003iwgwhole.pdf (accessed 30/04/19).
5. Angoff WH. Scales, norms, and equivalent scores. In: Thorndike RL, ed. Educational Measurement. 2nd ed. Washington, DC: American Council on Education; 1971.
6. Ricker KL. Setting cut-scores: a critical review of the Angoff and modified Angoff methods. Alberta J Educ Res. 2006;52:53–64.
7. Muijtjens AM, Timmermans I, Donkers J, et al. Flexible electronic feedback using the virtues of progress testing. Med Teach. 2010;32:491–495.
Copyright © 2019 The Authors. Published by Wolters Kluwer Health Inc., on behalf of the European Hematology Association.