Internal Assessment in New MBBS Curriculum: Methods and Logistics : International Journal of Applied and Basic Medical Research


Educational Forum

Internal Assessment in New MBBS Curriculum

Methods and Logistics

Badyal, Dinesh Kumar1,2; Sharma, Monika3

Author Information
International Journal of Applied and Basic Medical Research 10(2):p 68-75, Apr–Jun 2020. | DOI: 10.4103/ijabmr.IJABMR_70_20



Assessment plays a crucial role in the implementation of a competency-based curriculum. The word "assessment" derives from the Latin "assidere," meaning "to sit with." Assessment is therefore something we do with or for students, not to students.[1] Assessment in a competency-based curriculum focuses on improving learning through ongoing, longitudinal assessment, so that facilitators can identify the needs of the learner, plan remedial measures, and provide learning opportunities to improve learning.[2] A new competency-based curriculum has been implemented in undergraduate medical education, i.e., the Bachelor of Medicine, Bachelor of Surgery (MBBS), from 2019 in India. There are major changes in assessment, and the internal assessment (IA) methods and logistics are to be decided by universities and medical colleges.[3]

Internal Assessment: Concept

IA is the assessment that is conducted throughout the professional year. Its primary purpose is to provide constructive feedback for improving learning; therefore, this assessment continues at various formal and informal venues. IA is to be conducted by the teachers teaching that particular subject in a particular institute, so that they can track the progress of the learner and provide continuous support.[4] The continuous nature of IA will help in assessing all competencies, which is postulated to be a factor in the successful implementation of a competency-based curriculum.

The garden analogy, comparing the growth of a plant with that of a learner, provides an important insight into the implementation of the IA concept.[5] The gardener takes care of the needs of the plant by observing its leaves, branches, and color. In the same way, the teacher can take care of the growing learner by observing many direct and indirect markers of growth. Thus, IA has the potential to help teachers provide timely and appropriate remedial action and guide learning.


IA involves a longitudinal process to assess all competencies. All competencies need to be assessed; however, this is easier said than done. In its formal part, the assessment can be a collective effort that assesses a number of competencies together. Since competencies are a mix of all domains of learning, assessment should target procedural skills, such as intubation, minor surgeries, and emergency care, as well as soft skills, such as communication, professionalism, and ethics. Assessing all competencies is not possible on the day of the university (summative) examination, but it is possible with IA because IA is continuous. IA overcomes the limitation of day-to-day variability and allows larger sampling of topics, competencies, and skills.[4] It can be used to provide feedback to students to improve their performance throughout the course.

Internal assessment in competency-based curriculum

IA will help the achievement of competencies, as continuous assessment provides opportunities for feedback, and the feedback gives directions for remedial measures. This cycle can be repeated till the competency is achieved.[2,6,7] In other words, assessment helps learning, which leads to the achievement of competencies. This is also known as assessment for learning (AFL).

In the new UG medical curriculum in India, apart from a minimum of 50% IA marks and a minimum attendance percentage, certifiable competencies are to be marked in the logbook by teachers. Each subject has specific certifiable competencies; for example, pharmacology has four.[8] The number of assessment opportunities in IA makes these assessments low stakes, so there is less stress on students. The nonthreatening environment can help learners perform to their optimum level. In the new curriculum, IA marks will not be added to university marks.[3] This dissociation takes away the stress on learners and also on teachers. It also empowers learners and, at the same time, asks for greater accountability, flexibility, and learner-centeredness. The importance of IA is retained as a qualifying criterion and through a separate mention on the marksheet.

Type of Methods

IA in competency-based medical education (CBME) should focus on assessing a student's level of achievement of competencies, i.e., what the student can do. To target multiple competencies and multiple domains of learning, such as cognitive and behavioral competencies, IA needs to be frequent and multifaceted. Multiple assessment methods improve the content-related evidence for validity and give the teacher more information about the learning level and needs of the students.[9]

Several methods of IA can be utilized for undergraduate students. Cognitive competencies can be assessed by constructing modified case vignette-based multiple-choice questions (MCQs) and case-based discussions targeted at assessing a student's ability to analyze and interpret a clinical or practical problem. Other commonly used methods include theory tests (including short answer questions (SAQs), long answer questions (LAQs), and reasoning questions) and viva voce. All of these methods should be used in IA. Where competencies are skill based, assessment must include direct observation of the skill performance. This, followed by feedback, helps involve the student in identifying his/her level of achievement and further learning needs. Some of the methods to assess practical and clinical skills are the objective structured clinical/practical examination (OSCE/PE), direct observation of procedural skills (DOPS), and the mini-clinical evaluation exercise (mini-CEX). Let us look at the utility of these methods in IA.

Multiple choice questions

MCQs are most useful when they are scenario based rather than single liners. However, avoid "window dressing," i.e., lengthening the stem merely to make the question look scenario based. The construction of good MCQs takes time and can be regarded as an art. In the new curriculum, a part of the assessment of knowledge can be done with MCQs. Figure 1 shows the various parts of an MCQ. The stem should be aligned with the learning objective and should have a single objective. The language used in the stem should be simple to understand; hence, avoid negative and double-negative terms as far as possible. Use common principles for ordering answer choices, such as ascending/descending or sequential order. The quality of an MCQ is judged by the strength of its distractors: if a student does not know the correct answer after reading the question, good distractors should draw the student away from the right answer. Higher levels of the cognitive domain can also be assessed by MCQs.

Figure 1:
Structure of multiple choice question

Short answer questions

These provide an opportunity to include more content areas in a theory test. Care should be taken to make sure that the answer is substantial and does not finish in a few words only. Therefore, choose the competency/learning objective carefully for the construction of good SAQs. If an SAQ is vague, students will write all they know irrespective of what is being asked, and assessors will find it difficult to justify their own criteria for the contents of the answer.

Objective structured clinical examination/objective structured practical examination

One of the most important aspects of the training of a medical student is the acquisition of practical/clinical skills. An OSCE/PE consists of a set of "stations" with predesigned objective skill tasks that a student must perform sequentially; it provides examiners with an opportunity to objectively assess the competencies expected of an undergraduate student. The stations may include psychomotor skills, such as performing an abdominal examination or administering intramuscular injections, or soft skills, such as communicating a treatment plan or taking a history. Stations can be constructed as observed stations, where an examiner scores the student on a structured checklist or global rating scale that lists the units of action a student must perform to be considered competent in the skill. Besides being objective, this method of assessment helps in standardizing and uniformly assessing all the students.

Utmost care must be taken while designing the station tasks and checklists to ensure that a task is not objectified to the extent that the assessment becomes a scoring exercise rather than an assessment of competence. The OSCE/PE is often misunderstood as a spotting test, in which the student is expected to respond to a very specific question. While spotting asks the student to identify specimens/objects or recognize pathological changes, the OSCE/PE focuses on the assessment of the achievement of competencies or skills. The expert panel designing the stations needs to ensure that the stations do not transform into merely an interesting format for assessing knowledge; testing knowledge and its application can be reserved for other formats of assessment, such as MCQs or structured essays. A set of 12–20 stations can be optimal for assessment at the UG level, as it allows better sampling and improves the reliability of the assessment. However, feasibility restricts this number in medical colleges according to the availability of infrastructure and other resources.[10]

OSCE/PEs can be used for skill assessment as ward-leaving or send-up examinations. They not only give inputs on a student's skill performance but also provide feedback to teachers and help identify gaps in teaching. Designing an OSPE for pre- and para-clinical subjects is a challenging exercise: the teachers need to identify the essential practical skills in their respective subjects and develop a "station bank" for further use. The problems with the OSCE/PE lie in the need for faculty training in its design and conduct and in the resource-intensive nature of the test.[11,12]

Mini-clinical evaluation exercises

The mini-CEX is a tool for the direct observation of a doctor-patient encounter, during which the observing examiner rates the student on various aspects such as history taking, physical examination skills, professional behavior, analytical skills, communication, and the overall organization of the patient encounter. The observed encounter is followed by a feedback session in which the teacher and student can identify areas of performance that the student must work on. Within each encounter, the teacher and student may start by agreeing on which area of competence will be assessed, to allow better feedback on that specific area. Multiple such encounters over the period of training not only give a more reliable and valid assessment of student performance but also give the student ample opportunities to work on and improve it. Given the multiple settings available for observation, such as outpatient, ward, or emergency settings, multiple mini-CEX encounters will yield a more reliable assessment of student competence.

As the mini-CEX focuses less on specific psychomotor skills, it is a good tool for assessing a student's ability to manage a case, along with his/her professional behavior and communication skills. The mini-CEX has been studied by various subject experts and has been found to be a feasible and acceptable method of assessment.[13]

Direct observation of procedural skills

DOPS is a method of assessing technical skills. Where the OSCE/PE assesses a fragment of a skill, DOPS gives the opportunity to observe the student performing a complete skill. DOPS was introduced earlier for surgical programs, but it has now been found useful in the assessment of skills in medical and para-clinical subjects as well. It continues to be frequently used in postgraduate programs and is considered useful for undergraduate programs too. The student is observed performing a skill and rated on a structured format; the feedback following the observation gives the student inputs about his/her performance and gaps. The Medical Council of India (MCI) competency document for each subject lists the skills an undergraduate student must master. Students can be assessed on these skills as part of a ward-leaving examination, and multiple such examinations can be used to assess the various skills required for the subject, thus ensuring that competence has been achieved. This wider sampling improves the validity of the assessment. The issue with the wider use of DOPS lies in the need for observer training.[14]

Components of Internal Assessment

There has to be a minimum number of tests per learner for each of theory and practical, per subject, in preclinical, para-clinical, and clinical subjects. An end-of-posting clinical assessment should be conducted for each clinical posting in each professional year.[8]

Pre-university send-ups are to be held in all subjects. Before university examinations, departments can conduct additional tests as and when required, with the purpose of providing formative feedback to the students. In subjects that are taught in more than one phase, proportionate weightage must be given to the IA of each phase. The most important decision now is what is to be included in calculating the IA percentage. A formal IA plan should be framed by universities for all affiliated colleges.
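For a subject taught in more than one phase, the proportionate weightage described above amounts to a simple weighted average. The sketch below illustrates the arithmetic only; the phase weights and marks shown are hypothetical examples, not figures prescribed by the regulations.

```python
# Illustrative sketch of proportionate weightage across phases.
# Phase weights here are hypothetical, not taken from the MCI regulations.

def combined_ia_percentage(phases):
    """Each phase is (marks_obtained, max_marks, weight); weights must sum to 1.

    Returns the weighted IA percentage across all phases.
    """
    assert abs(sum(w for _, _, w in phases) - 1.0) < 1e-9, "weights must sum to 1"
    return sum((obtained / maximum) * weight * 100
               for obtained, maximum, weight in phases)

# Example: a subject taught in two phases, weighted 40%/60% (hypothetical).
# Phase 1: 70/100 (70%); Phase 2: 120/200 (60%).
score = combined_ia_percentage([(70, 100, 0.4), (120, 200, 0.6)])
print(round(score, 1))  # 70*0.4 + 60*0.6 = 28 + 36 = 64.0
```

A university's IA plan would fix the actual weights for each phase; the function merely shows that each phase contributes its percentage scaled by its weight.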

Assessment of knowledge can include theory tests, send-ups, seminars, quizzes, interest in the subject, scientific attitude, etc. The assessment of practical/clinical skills and attitudinal domains can include practical/clinical tests, OSCE/OSPE, DOPS, and the mini-CEX. Other components can include self-study of selected case-related issues, records maintenance (including logbooks), and attitudinal assessment of attributes such as sincerity and ethics. IA should be a continuous process that considers both routine activities and periodic examinations.

Feedback in Internal Assessment

Each department needs to plan a structured feedback system so that learners get regular feedback about their learning. The plan should include feedback to students at regular intervals and the provision of remedial measures, based on feedback, for all learners. Faculty, as well as students, must be trained in giving and receiving feedback, and a culture of feedback must be created in the college. The results of IA should be shared with students periodically (and on demand). As far as possible, the entire faculty of the department should be involved in IA.[4] The new curriculum emphasizes making IA accessible to students so that they get enough opportunities and time to improve.

Remedial Actions

A student whose final IA in a subject is less than that required to appear in the university examination should be given a chance to improve the IA. The colleges should provide enough support to students to implement remedial measures, which should be specific and targeted to the deficiencies. The colleges should also make sure that these remedial measures are not misused, for example, extra classes held just to complete attendance, in which students make up a large percentage of attendance in a few days across all subjects. There should be regular classes for students with deficiencies to improve their learning. Similarly, tests should be conducted at appropriate intervals, not one after another merely to complete the IA marks.

It is recommended that "Universities shall guide the colleges regarding formulating policies for remedial measures for students who are either not able to score qualifying marks or have missed on some assessments due to any reason."[8,9] All students who are detained or fail for various reasons should be provided with:

  1. Regular classes in that subject at appropriate intervals. These classes should be spread over time if multiple subjects are involved. The students should not simply be clubbed with the next batch of students to sit in and complete attendance; the classes should be scheduled for improvement in specific subjects and topics
  2. Regular tests, planned at intervals of 2–3 weeks between tests. Tests should include theory as well as practical/clinical components
  3. Attendance added to the previous attendance to calculate the percentage. The absolute number of classes attended should be added to the earlier attended classes, and the denominator should be as given in the regulations.
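The attendance recalculation in point 3 can be sketched as below. All numbers are hypothetical; the point is that remedial classes add to the absolute count of classes attended, while the denominator remains the total classes conducted, as per the regulations.

```python
# Minimal sketch of the attendance recalculation after remedial classes
# (point 3 above). Figures used in the example are hypothetical.

def updated_attendance_percent(attended_before, attended_remedial, total_conducted):
    """Recompute attendance percentage after remedial classes.

    attended_before: absolute classes attended earlier
    attended_remedial: additional classes attended during remedial measures
    total_conducted: denominator as given in the regulations
    """
    attended = attended_before + attended_remedial
    return round(100.0 * attended / total_conducted, 1)

# Example: 140 of 200 classes attended earlier (70%); 20 remedial classes
# attended out of 20 additional classes conducted (denominator now 220).
print(updated_attendance_percent(140, 20, 220))  # → 72.7
```

The same arithmetic applies per subject, which is why the text cautions against cramming a large attendance percentage into a few days across all subjects at once.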

The source data (SD) should be maintained properly. The data should be accessible to students and shown to them regularly, as planned by the departments. Students should be asked to sign off on their IA marks regularly and should be provided remedial measures as needed [Figure 2]. Signing at regular intervals safeguards the departments from legal hurdles later, when students/parents raise objections and cite unawareness as the main reason for not taking remedial actions.

Figure 2:
Internal assessment page for each student in records (page 1)

Example of Marks Distribution in Internal Assessment

The total marks for IA can be kept as 100 or 200.[7] Let us take the example of a subject with a total of 100 IA marks. Figure 3 shows a model for including various components in calculating IA, and a detailed generic model that can be used by colleges is given in Figure 4. It is recommended to keep a hard-bound IA record register with multiple sheets, as per the number of students, for each subject in the course.

Figure 3:
Division of marks in internal assessment
Figure 4:
Components and marks of internal assessment for colleges (page 2)

A one-page IA sheet should be used for each student. The SD (detailed data of tests, etc.) will be filled in the SD sheet by the department, and the main sheet will display the IA for theory and practical. The entire SD used to fill in the various marks on the IA form should be maintained properly.

Theory Examination

The MCI document provides detailed changes to the theory question papers in university examinations. These can also be used for the IA theory tests conducted by departments. It is recommended to use a combination of various types of questions, for example, structured essays (LAQs), SAQs, and MCQs.

The marks in each theory paper total 100. If a subject has two papers, the marks will be 100 + 100 = 200. The distribution of marks is to be decided by universities. An example of the distribution of marks in a theory paper of 100 marks is provided in Table 1.[9] The distribution of topics as per the new curriculum must also be done by universities; an example of the distribution of Paper A and B topics, reviewed by subject experts, is given in Tables 2-4.

Table 1:
Sample theory paper marks distribution (for sample questions ref to MCI-CBA)
Table 2:
Topic wise division of paper A and B of first professional theory papers based on new MCI curriculum
Table 3:
Topic wise division of paper A and B of second professional theory papers based on new MCI curriculum
Table 4:
Topic wise division of paper A and B of third professional theory papers based on new MCI curriculum

In a Nutshell

Medical colleges and universities need to collaborate in planning the format of IA for the new MBBS curriculum. However, the IA is to be conducted by the departments for their subjects in each college and should be displayed regularly to learners. Use multiple methods, multiple teachers, multiple test items, and multiple venues for authentic IA. Planning of IA methods and logistics must take into consideration the MCI guidelines, the available infrastructure and resources, and the number of trained faculty. The faculty in medical colleges must be trained in assessment so that they use multiple encounters for AFL.

Financial support and sponsorship


Conflicts of interest

There are no conflicts of interest.


1. A Short Glossary of Assessment Terms. Last accessed on 2020 Feb 15. Available from: https://serc.carleton.edu/introgeo/assessment/glossary.html
2. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638-45.
3. Badyal DK, Singh T. Internal assessment for medical graduates in India: Concept and application. CHRISMED J Health Res. 2018;5:253-8.
4. Rusman E. Growing and Watering Plants. Last accessed on 2020 Feb 15. Available from: https://lilab.eu/growing-and-watering-plants-assessment-for-learning-complex-skills-with-video-enhanced-rubrics/
5. Modi JN, Gupta P, Singh T. Competency-based medical education, entrustment and assessment. Indian Pediatr. 2015;52:413-20.
6. Shah N, Desai C, Jorwekar G, Badyal D, Singh T. Competency-based medical education: An overview and application in pharmacology. Indian J Pharmacol. 2016;48:S5-9.
7. Medical Council of India. Competency Based Assessment Module for Undergraduate Medical Education. 2019. Last accessed on 2020 Feb 15. Available from: https://www.mciindia.org/CMS/information-desk/for-colleges/ug-curriculum
8. Medical Council of India. Regulations on Graduate Medical Education (Amendment): Addition as Part-II for MBBS Course Starting from Academic Year 2019-20 Onwards. 2019. Last accessed on 2020 Feb 15. Available from: https://mciindia.org/ActivitiWebClient/open/getDocument?path=/Documents/Public/Portal/Gazette/GME-06112019.pdf
9. Badyal DK, Singh S, Singh T. Construct validity and predictive utility of internal assessment in undergraduate medical education. Natl Med J India. 2017;30:151-4.
10. Gruppen LD, Davis WK, Fitzgerald JT, McQuillan MA. Number of stations and examination length in an objective structured clinical examination. In: Scherpbier AJ, van der Vleuten CP, Rethans JJ, van der Steeg AF, editors. Advances in Medical Education. Dordrecht: Springer; 1997.
11. Gupta P, Dewan P, Singh T. Objective structured clinical examination (OSCE) revisited. Indian Pediatr. 2010;47:911-20.
12. Badyal DK. Practical Manual of Pharmacology. 2nd ed. New Delhi: Jaypee Brothers; 2018.
13. Singh T, Sharma M. Mini-clinical examination (CEX) as a tool for formative assessment. Natl Med J India. 2010;23:100-2.
14. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Last accessed on 2020 Feb 15. Available from: https://www.researchgate.net/publication/5690073

Keywords: Internal assessment; MBBS; methods; new curriculum

© 2020 International Journal of Applied & Basic Medical Research | Published by Wolters Kluwer – Medknow