Academic Medicine: August 2014 - Volume 89 - Issue 8
doi: 10.1097/ACM.0000000000000362
Research Report

SNAPPS-Plus: An Educational Prescription for Students to Facilitate Formulating and Answering Clinical Questions

Nixon, James MD, MHPE; Wolpaw, Terry MD, MHPE; Schwartz, Alan PhD; Duffy, Briar MD, MS; Menk, Jeremiah MS; Bordage, Georges MD, PhD


Author Information

Dr. Nixon is vice chair for education, Department of Medicine, University of Minnesota Medical School, Minneapolis, Minnesota.

Dr. Wolpaw is vice dean for education, Pennsylvania State University College of Medicine, Hershey, Pennsylvania.

Dr. Schwartz is professor, Department of Medical Education, University of Illinois at Chicago College of Medicine, Chicago, Illinois.

Dr. Duffy is assistant professor, Department of Medicine, University of Minnesota Medical School, Minneapolis, Minnesota.

Mr. Menk is research fellow, Biostatistical Design and Analysis Center, University of Minnesota Medical School, Minneapolis, Minnesota.

Dr. Bordage is professor, Department of Medical Education, University of Illinois at Chicago College of Medicine, Chicago, Illinois.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: The institutional review boards at the University of Minnesota and the University of Illinois at Chicago approved this study.

Previous presentations: Information contained in this paper was presented in part at the following meetings: International Association of Medical Science Educators conference, St. Andrews, Scotland, June 2013; MHPE Conference, Chicago, Illinois, July 2013; Association for Medical Education in Europe Conference, Prague, Czech Republic, August 2013; Alliance for Academic Internal Medicine Conference, New Orleans, Louisiana, October 2013; and Association of American Medical Colleges Annual Meeting, Philadelphia, Pennsylvania, November 2013.

Correspondence should be addressed to Dr. Nixon, University of Minnesota Medical School, Department of Medicine, 420 Delaware St. SE, MMC 741, Minneapolis, MN 55455; e-mail:

Abstract


Purpose: To analyze the content and quality of PICO-formatted questions (Patient–Intervention–Comparison–Outcome), and subsequent answers, from students’ educational prescriptions added to the final SNAPPS Select step (SNAPPS-Plus).

Method: Internal medicine clerkship students at the University of Minnesota Medical Center were instructed to use educational prescriptions to complement their bedside SNAPPS case presentations from 2006 to 2010. Educational prescriptions were collected from all eligible students and coded for topic of uncertainty, PICO conformity score, presence of answer, and quality of answer. Spearman rank–order correlation coefficient was used to compare ordinal variables, Kruskal–Wallis test to compare distribution of PICO scores between groups, and McNemar exact test to test for association between higher PICO scores and presence of an answer.

Results: A total of 191 educational prescriptions were coded from 191 eligible students, of which 190 (99%) included a question and 176 (93%, 176/190) an answer. Therapy questions constituted 59% (112/190) of the student-generated questions; 19% (37/190) were related to making a diagnosis. Three-fifths of the questions (61%, 116/190) were scored either 7 or 8 on the 8-point PICO conformity scale. The quality of answers varied, with 37% (71/190) meeting all criteria for high quality. There was a positive correlation between the PICO conformity score and the quality of the answers (Spearman rank–order correlation coefficient = 0.726; P < .001).

Conclusions: The SNAPPS-Plus technique was easily integrated into the inpatient clerkship structure and guaranteed that virtually every case presentation following this model had a well-formulated question and answer.

SNAPPS is a six-step, learner-centered technique for student case presentations.1 The steps are Summarize the findings, Narrow the differential, Analyze the differential by comparing and contrasting alternatives, Probe the preceptor about uncertainties, Plan management, and Select an area for further learning related to the patient. By design, this technique shifts the focus of the presentation from a delivery of facts obtained from the patient, to a discussion of the student’s differential diagnosis, management plan, and areas of uncertainty related to the case.2 Shifting the focus in this manner is desirable for many reasons. First, it provides a window into the student’s thinking, allowing the preceptor to better assess the student’s clinical reasoning skills.1 Additionally, when students express their uncertainties, preceptors will respond by providing targeted teaching tailored to the student’s uncertainties.3 An assumption with SNAPPS is that, once students select an area for further reading, they will actually research and read about their chosen topic. In this manner, SNAPPS also serves as a means of providing targeted self-directed learning to develop skills for handling uncertainty in the future.

If clerkship students are similar to practicing physicians, however, about 80% of these questions may remain unanswered.4 These unanswered questions are then lost opportunities to develop the skills to handle uncertainty and learn in the context of care. Key reasons for failing to answer clinical questions relate to asking unclear questions and an inability to find the information needed to answer the questions.5

Questions that physicians and laypersons ask using natural language usually lack the clarity necessary to search medical databases for relevant answers.6 However, questions that clearly characterize the Patient, Intervention considered, Comparison of alternatives, and desired Outcome (PICO-formatted questions) return a higher percentage of relevant citations when searching medical databases.7 Medical students can be taught to formulate questions according to the PICO format,8 but it is unclear whether once taught they will use this skill to ask and answer clinical questions in the course of routine patient care.

“Educational prescriptions” have been proposed as a way to ensure that questions are asked according to the PICO format and are subsequently answered.9–11 PICO-formatted educational prescription templates are typically slips of paper, similar to a traditional paper prescription pad, preprinted with the four PICO components. Students can then fill in each component of their question and subsequent answers. The completed educational prescription is seen as a contract between the student and the attending to ensure that a question is answered. To date, to our knowledge, whether the use of educational prescriptions increases the likelihood of obtaining answers has not been tested.

We propose “SNAPPS-Plus” as a modification of the SNAPPS case presentation technique that formalizes the last Select step by having students complete an educational prescription. This modified technique builds on SNAPPS successes at facilitating students’ identification of uncertainties by improving the students’ ability to ask clear questions and to follow through on answering their questions. In this manner, SNAPPS-Plus has the potential to develop students’ skills to deal with uncertainty once identified.

The purpose of our study was to analyze the content and quality of the questions and subsequent answers of students’ educational prescriptions. We raised five research questions:

* What is the nature of the uncertainties (topics) expressed by students when formulating a question using an educational prescription in the context of SNAPPS presentations during inpatient clerkship rotations?

* How well do the questions conform to the PICO format?

* Are answers provided?

* What is the quality of the answers?

* What is the relationship between the PICO conformity score and the presence and quality of answers?

We hypothesized that training students to formulate questions according to the PICO format on an educational prescription would result in high-quality questions and high likelihood of providing quality answers.

Method


Since 2006, students at the University of Minnesota Medical Center (UMMC) have been generating PICO-formatted questions on educational prescription forms as a complement to their SNAPPS case presentations during their third-year internal medicine inpatient clerkships. Sixty-four students complete their third-year clerkship each year at this 850-bed hospital located in Minneapolis, Minnesota. Students were expected to attend a one-hour lecture during the first week of the clerkship, although attendance was not taken. This lecture described how to give a SNAPPS case presentation, how to formulate questions following the PICO format, and how to use resources such as Medline, MD Consult, and StatRef to answer the questions. During the session, students were given the opportunity to practice a SNAPPS presentation and to write their own PICO questions. Following this session, students were told to use the SNAPPS technique for case presentations.

For the last step of SNAPPS, the Select step, they were instructed to complete an educational prescription for six of their case presentations, but were not graded on these prescriptions. For the educational prescription forms we used a two-ply, carbon-copy pad similar in size to a traditional paper pharmacy prescription pad. The front of the form was a template with a space for the students to indicate the Patient, the Intervention, the Comparison intervention, and the intended Outcome. The back of the form had space for the student to indicate the source searched, the level of evidence, and the “Clinical Bottom Line.” The student formulated a question according to the PICO template on the educational prescription form before presenting the patient and the question on rounds. The student was then responsible for finding an answer to the question, writing the answer on the back of the educational prescription, and presenting these results to the team before rounds the next day.
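Conceptually, each completed educational prescription captures a small, fixed set of fields. The sketch below, written in R (the software later used for the analysis), shows one way such a record could be represented; the field names and placeholder values are ours, not the wording of the study's actual form.

```r
# A minimal sketch of the fields captured by one educational prescription,
# based on the form described above; names and placeholders are illustrative.
prescription <- list(
  # Front of the form: the PICO-formatted question
  patient      = "<patient or problem>",
  intervention = "<intervention being considered>",
  comparison   = "<comparison or alternative>",
  outcome      = "<outcome of interest>",
  # Back of the form: completed after the student's search
  source_searched      = "<database or resource searched>",
  level_of_evidence    = "<level of evidence found>",
  clinical_bottom_line = "<answer to present to the team>"
)
str(prescription)
```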

Copies of the educational prescriptions were collected over a four-year period (May 2006–May 2010) from a total of 191 students. This included all students assigned to UMMC for their third-year internal medicine clerkship. UMMC is one of the five sites at which University of Minnesota Medical School students could complete their third-year medicine clerkship. Each student submitted six educational prescriptions. We numbered each set of prescriptions from 1 to 6 and randomly selected one for coding using a random number generator. We identified prescriptions by student, but did not code student names as part of the data analysis and reporting. The coders and the researcher observed strict confidentiality. Students at the University of Minnesota, as part of a general procedure, gave their consent for the use of this data set for research purposes. The institutional review boards at the University of Minnesota and the University of Illinois at Chicago approved this study.

We coded the educational prescriptions for four outcome variables: the topic of the uncertainty, PICO conformity score, presence of an answer, and quality of the answer. Two coders (J.N. and B.D.) independently coded each educational prescription. We checked for concordance between the two independent coders for each variable every 30 educational prescriptions. The coders met to discuss any discrepancies in coding and reached a consensus.

The coders used a rubric developed by Sackett9 to categorize the students’ topics of uncertainty into the following categories: etiology, clinical diagnosis, therapy, harm, prognosis, prevention, cost analysis, and others. See Table 1 for details of the seven main topics and subtopics.


We adapted the PICO conformity score from a scoring rubric described previously by Thomas and Cofrancesco8 and modified on the basis of pattern analysis developed by Huang and colleagues.12 Students received zero to two points for each of the following four areas:

* Patient type clearly stated: 2 = yes, 1 = somewhat, 0 = no

* Intervention clearly stated: 2 = yes, 1 = somewhat, 0 = no

* Comparison clearly stated: 2 = yes, 1 = somewhat, 0 = no

* Patient outcome clearly stated: 2 = yes, 1 = somewhat, 0 = no

The final PICO conformity score is the sum of the four components for a possible total score of eight points (Table 2).
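As a concrete illustration of this rubric, the following R sketch computes a conformity score from the four component ratings; the function name and validation checks are ours, not the study's coding instrument.

```r
# Sketch of the PICO conformity score described above: each component is
# rated 0 (no), 1 (somewhat), or 2 (yes); the total ranges from 0 to 8.
pico_conformity_score <- function(patient, intervention, comparison, outcome) {
  ratings <- c(patient, intervention, comparison, outcome)
  stopifnot(all(ratings %in% 0:2))
  sum(ratings)
}

# Example: Patient and Intervention clearly stated, Comparison somewhat
# stated, Outcome missing -> a score of 5 out of 8
pico_conformity_score(patient = 2, intervention = 2, comparison = 1, outcome = 0)
```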


We coded the presence or absence of an answer as 0 = no and 1 = yes. We evaluated the quality of the answer based on three characteristics:

* Directness: Does it directly answer the question posed on the educational prescription? 0 = no or not clear, 0.5 = generally, 1 = specifically

* Evidence: Is there evidence provided to support the answer? 0 = no, 1 = yes

* Preferred management: Does the answer indicate a preferred management for this patient? 0 = no, 1 = yes

The answer quality score is the sum of the directness, evidence, and preferred management scores for a possible total score of three points.
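In the same spirit, a minimal R sketch of the answer quality score is shown below; again, the function and argument names are illustrative rather than taken from the study materials.

```r
# Sketch of the answer quality score described above: directness is rated
# 0, 0.5, or 1; evidence and preferred management are each rated 0 or 1;
# the total ranges from 0 to 3.
answer_quality_score <- function(directness, evidence, preferred_management) {
  stopifnot(directness %in% c(0, 0.5, 1),
            evidence %in% c(0, 1),
            preferred_management %in% c(0, 1))
  directness + evidence + preferred_management
}

# Example: an answer that generally addresses the question and cites
# evidence but names no preferred management -> a score of 1.5 out of 3
answer_quality_score(directness = 0.5, evidence = 1, preferred_management = 0)
```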

We presented counts and percentages for the categorical variables, used the Spearman rank–order correlation coefficient to compare ordinal variables, and used the Kruskal–Wallis test to compare the distribution of the PICO scores between answer groups. We used the McNemar exact test to test the association between higher PICO scores (7 or greater) and the presence of an answer. We performed all analyses using R statistical software (version 2.15.1; R Core Team, Vienna, Austria, 2013).
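For readers who wish to reproduce this style of analysis, the sketch below shows how the three tests could be run in base R on hypothetical per-prescription vectors. It is not the authors' analysis code, and base mcnemar.test() gives the chi-squared rather than the exact version reported in the paper.

```r
# Hedged sketch of the analyses named above on hypothetical data
# (one element per educational prescription); not the authors' code.
pico_score     <- c(8, 7, 7, 6, 3, 8, 2, 7)    # PICO conformity, 0-8
answer_quality <- c(3, 3, 2, 1.5, 1, 3, 0, 2)  # answer quality, 0-3
answered       <- c(TRUE, TRUE, TRUE, TRUE, FALSE, TRUE, FALSE, TRUE)

# Spearman rank-order correlation between PICO conformity and answer quality
cor.test(pico_score, answer_quality, method = "spearman")

# Kruskal-Wallis comparison of PICO scores between prescriptions
# with and without an answer
kruskal.test(pico_score ~ answered)

# Association between a high PICO score (>= 7) and the presence of an
# answer; base mcnemar.test() is the chi-squared (not exact) version,
# so an exact test would require a different function or package.
mcnemar.test(table(pico_score >= 7, answered))
```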

Results


We coded a total of 191 educational prescriptions from 191 eligible students (100% participation), of which 190 (99%, 190/191) included a question (1 was left blank). Of the 190, 176 (93%, 176/190) included an answer to the question. Concordance between the two coders was 85% for the seven main topics of uncertainty and 83% (to within one point) for the PICO scores. Concordance for the presence of an answer was 94%; for the quality of answers, concordance was 93% for directness, 78% for evidence, and 70% for preferred management.

Topics of uncertainties

Fifty-nine percent (112/190) of the questions generated by the students addressed therapeutic issues. Of these, 72% (81/112) were related to medication. Clinical diagnosis accounted for 19% (37/190) of the questions, and the majority of these (73%, 27/37) were related to optimal data acquisition, with the remainder related to data interpretation (Table 1).

Conformity with PICO

Three-fifths of the questions (61%, 116/190) received either seven or eight points on the eight-point conformity scale (Table 3). Of the 190 questions, 184 (97%) included the Patient, 171 (91%) an Intervention, 132 (69%) a Comparison, and 156 (82%) the Outcome.

Quality of answers

The quality of the answers varied, with 37% (71/190) of the answers meeting all the criteria for high-quality responses by providing direct answers to the questions, evidence to support the answer, and a preferred management for the patient (Table 4).

Relationship between PICO conformity and presence and quality of the answers

Students providing an answer had a median PICO score of 7, whereas those without answers had a median score of 3 (Kruskal–Wallis test, P < .001). Students with a PICO score of 7 or greater were 61 times more likely to provide an answer than students with a score of 6 or less (confidence interval = 10.5–2448, P < .001). There was a positive correlation between the PICO conformity scores and the quality of the answers (Spearman rank–order correlation coefficient = 0.726, P < .001; r2 = 0.53); see Figure 1.

Discussion


A previous study by Wolpaw and colleagues2 has shown that students expressed uncertainties only 10% to 30% of the time during traditional case presentations. With SNAPPS-Plus presentations, participating students identified uncertainties 99% of the time, and by design each uncertainty was identified as an area for further reading. Furthermore, 93% of the students using SNAPPS-Plus found answers to their uncertainties. Although a direct comparison is not possible, a previous study evaluating SNAPPS without educational prescriptions in the outpatient clinic found that uncertainties were identified 84% of the time and that 51% of students identified resources for finding an answer.2 It is important to note that in the original SNAPPS study uncertainties were defined as questions or requests for teaching arising in the context of the clinical presentation, whereas in the current SNAPPS-Plus study we defined uncertainties as PICO-formulated questions appearing on the educational prescription. In our study, with the addition of an educational prescription to SNAPPS presentations during inpatient medicine clerkships, students generated questions on their PICO-formatted educational prescriptions and answered virtually all of these questions raised about their patients.

The topics of uncertainty that we catalogued on the inpatient clerkship using SNAPPS-Plus (i.e., 59% related to therapeutic decisions and 19% to diagnosis) were different from those in the study by Wolpaw and colleagues2 in family medicine outpatient clinics (55% related to diagnosis). The differences between the two studies could be due to the question that is being analyzed (question raised during presentation versus that appearing on the educational prescription), the difference in clerkship structure (ambulatory clerkship versus inpatient clerkship), or differences in types of patients (patients presenting to clinic versus those who are admitted).

Wolpaw and colleagues2 identified uncertainties from transcripts of SNAPPS presentations and found that the preceptor provided targeted, on-the-spot teaching related to these uncertainties 80% of the time. The questions we analyzed were those left at the end of the presentation and discussion with the preceptor. These remaining questions were written by the students as educational prescriptions to be researched and the answers subsequently presented to the team. By their very nature, questions arising during the SNAPPS case presentations, and discussed with the preceptors, may be different from those that are left after the presentation. It is also possible that when converting a question to the PICO format, students favor therapy-type questions because the PICO framework may be better suited to these types of questions.12

Furthermore, the difference in topics between the two studies could relate to the structure of the inpatient setting compared with the outpatient setting. Diagnoses have often been assigned and tests have often been ordered by the emergency room or clinic prior to the patient arriving on the ward for the student to interview. Additionally, students will typically “staff” their patients with their resident before presenting them to the attending. It is possible that many questions related to diagnosis are already answered in the students’ minds or that the questions that make it onto the educational prescription form are actually “team” questions rather than solely “student” questions.

Finally, it is possible that the types of patients and problems encountered in each setting are different. Crowley and colleagues13 found that therapy questions accounted for 53% of questions raised by residents in the inpatient setting. Those that remained were diagnostic test questions (22%), etiology questions (9%), prognosis questions (9%), and harm questions (5.1%). The similarity between these results and ours raises the possibility that therapy questions may come to the forefront for learners when caring for hospitalized patients.

As we hypothesized, teaching students to formulate questions using the PICO format on educational prescriptions resulted in the majority of their questions being of very high quality. Only 3% of clinical questions posed by physicians in practice contain all four PICO components,12 compared with 62% of the questions asked using educational prescriptions in the present study. As in prior studies, the most frequently omitted components were the comparison (omitted from 31% of questions) and the outcome (omitted from 18%).

The great majority of students in our study (93%) provided answers to their questions. Although there was variability in the quality of the students’ answers, there was a positive correlation between higher-quality questions (those more closely corresponding to the PICO format) and higher-quality answers (those directly answering the question, providing evidence, and describing a preferred management). Although it is possible that higher-performing students are better at both asking questions and finding answers, it is also possible that the PICO format helped students to formulate clear questions that in turn helped find relevant answers.

The SNAPPS-Plus technique is relatively simple and required only a one-hour introductory session. The intervention requires few resources and could easily be transferred to other settings. Additionally, it is targeted for use at the bedside (rather than the classroom), where teaching interventions are best integrated into clinical practice and are more likely to change attitudes and behaviors.14

Our study has limitations. First, our method for grading the quality of the answers had not been previously validated, and there was a learning curve for coding the educational prescriptions. Also, we conducted our study at a single institution in a single clerkship; future studies should address whether these findings can be replicated in other disciplines and institutions. The generalizability of our findings may also be limited by the period during which our data were collected and by ongoing curricular reforms nationwide. Additionally, we do not know much about the students' approach to answering questions, the extent to which they may have asked peers or others for help as they worked on their questions, or how the time of year may have affected their approach. These questions should be explored in future studies of this topic. Although we found a correlation between high-quality PICO-formatted questions and high-quality answers, it is possible that better students both ask higher-quality questions and provide better answers. Future studies should separate these two factors and determine the extent to which PICO formatting produces a greater likelihood of finding quality answers.

Studies should also be directed toward a direct comparison between SNAPPS and SNAPPS-Plus, particularly looking at differences between questions raised in the context of the presentation and those that make it onto an educational prescription. Of particular interest is whether the same rich conversations related to uncertainty occur in the context of a SNAPPS-Plus presentation, or whether the educational prescription as a concluding step elicits learning issues that focus more closely on patient management. Also worth exploring is whether the PICO format itself narrows the question type and whether more openly formatted educational prescriptions are better suited for students in different settings or at different levels of training. From a curricular standpoint, one could ask whether it is appropriate for the majority of third-year students to ask questions related to therapeutic and management decisions, or whether the focus should be more on diagnostic reasoning. Further exploration of the uncertainties raised by students should be very informative regarding the types of learning that take place in the context of caring for patients.

SNAPPS-Plus is a relatively simple modification of the SNAPPS case presentation technique and can be easily integrated into the current inpatient clerkship structure. For our students, adding an educational prescription to the SNAPPS approach guaranteed that virtually every encounter had a question and that almost all of the questions appearing on the educational prescriptions were answered. SNAPPS-Plus could also promote the development of lifelong learning skills by teaching students to ask clear questions that facilitate the search for relevant answers concerning self-identified areas of uncertainty that remain after discussions with the attending. For meaningful learning to occur in the clinical setting, physicians need to identify, express, and resolve their uncertainties. SNAPPS-Plus could help students develop the skills to target learning where it is most needed throughout their careers.

Acknowledgments: The authors would like to thank Bradley J. Benson, MD, for developing the educational prescription template used in this study.

References


1. Wolpaw TM, Wolpaw DR, Papp KK. SNAPPS: A learner-centered model for outpatient education. Acad Med. 2003;78:893–898

2. Wolpaw T, Papp KK, Bordage G. Using SNAPPS to facilitate the expression of clinical reasoning and uncertainties: A randomized comparison group trial. Acad Med. 2009;84:517–524

3. Wolpaw T, Côté L, Papp KK, Bordage G. Student uncertainties drive teaching during case presentations: More so with SNAPPS. Acad Med. 2012;87:1210–1217

4. González-González AI, Dawes M, Sánchez-Mateos J, et al. Information needs and information-seeking behavior of primary care physicians. Ann Fam Med. 2007;5:345–352

5. Ely JW, Osheroff JA, Ebell MH, et al. Obstacles to answering doctors’ questions about patient care with evidence: A qualitative study. BMJ. 2002;324:1–7

6. Lloyd K, Cella M, Tanenblatt M, Coden A. Analysis of clinical uncertainties by health professionals and patients: An example from mental health. BMC Med Inform Decis Mak. 2009;9:34

7. Schardt C, Adams MB, Owens T, Keitz S, Fontelo P. Utilization of the PICO framework to improve searching PubMed for clinical questions. BMC Med Inform Decis Mak. 2007;7:16

8. Thomas PA, Cofrancesco J Jr. Introduction of evidence-based medicine into an ambulatory clinical clerkship. J Gen Intern Med. 2001;16:244–249

9. Sackett DL. Evidence-Based Medicine: How to Practice and Teach EBM. 2nd ed. New York, NY: Churchill Livingstone; 2000

10. Ismach RB. Teaching evidence-based medicine to medical students. Acad Emerg Med. 2004;11:e6–e10

11. Schneider SS, Bazarian J, Spillane L, Zwemer FL. Educational prescriptions: Problem-based learning in the ED. Acad Emerg Med. 2008;9:1053

12. Huang X, Lin J, Demner-Fushman D. Evaluation of PICO as a knowledge representation for clinical questions. AMIA Annu Symp Proc. 2006;2006:359–363

13. Crowley SD, Owens TA, Schardt CM, et al. A Web-based compendium of clinical questions and medical evidence to educate internal medicine residents. Acad Med. 2003;78:270–274

14. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329:1017

© 2014 by the Association of American Medical Colleges

