A Crowdsourcing Model for Creating Preclinical Medical Education Study Tools

Bow, Hansen C. PhD; Dattilo, Jonathan R.; Jonas, Andrea M.; Lehmann, Christoph U. MD

Academic Medicine 88(6):766–770, June 2013. DOI: 10.1097/ACM.0b013e31828f86ef

Abstract

Through the preclinical medical school curriculum, students build an essential knowledge base that will serve as the foundation for subsequent, more advanced lessons in clinical decision making. Students generally complete one to two years of preclinical course work, in which they learn the basic science of anatomy, physiology, and pathology. Students then transition to clinical, or hospital-based, course work to build on this basic science foundation and learn the more intricate details of diagnosing and managing clinical disease. Building an adequate foundation in the basic sciences during the preclinical years is paramount to success in medical school and patient care. However, during preclinical course work, students have only a limited time to learn new information, which they then must be able to recall both for exams during the course and when making diagnostic decisions later in their careers.

Research has shown that students score higher on exams when they actively engage with the content while studying.1 Using flashcards to study, for example, not only is effective in facilitating students’ short-term recall2–4 but also is superior to other, more elaborate study techniques, including concept mapping.5 Unlike computer-aided simulations, flashcards can be quickly and easily developed and disseminated.6 Several reports have demonstrated the efficacy of using flashcards specifically in medical education.1,7 Medical students have limited time to master a large amount of material and often employ flashcards as a means of self-assessment, which helps them prioritize topic areas during their study time.3

Recent trends in medical education emphasize scholarly collaboration; for example, more of the medical school curriculum now is devoted to team-based learning.8,9 Also supporting this trend is the shift to pass–fail grading for preclinical courses at the majority of medical schools in the United States, including the Genes to Society (GTS) curriculum at the Johns Hopkins University School of Medicine. In contrast to other grading systems that pit students against each other, pass–fail grading encourages students to collaborate and share resources without adversely affecting their success.

In the world of technology, the trend toward collaboration has been characterized by the crowdsourcing movement. Crowdsourcing is a term originally coined by Jeff Howe and Mark Robinson10,11 of Wired magazine to describe a problem-solving approach that outsources tasks to an undefined, often anonymous, population. Howe and Robinson identified two crucial components of crowdsourcing: the use of the open call format (a public solicitation to participate) and access to a large network of potential laborers. Whereas crowdsourcing has received media attention for its implementation in popular business models, including iStockphoto, Threadless, and the Goldcorp Challenge,12 it also holds great potential for peer collaboration in educational contexts. Studies have shown that crowdsourced collaborations among a diverse group generate more complete and higher-quality ideas than collaborations among only the top-performing individuals in a field.12

The trend toward collaboration in medical education, combined with the rise of user-friendly collaborative software, such as Google Drive, prompted us to take an innovative approach to team-based learning at the Johns Hopkins University School of Medicine. Our model differs from previous strategies in two ways. First, in contrast to past approaches to mastering large volumes of material, which involved a hybrid of note taking, outlines, and flashcards, we redefined how students study preclinical material. Second, our work builds on the principle of shared effort. In contrast to previous models of distributing course notes, in which a single top performer distributed his or her notes to the entire class, our model relies on a diverse group of students with different strengths and weaknesses collectively creating a public resource. In this article, we describe our crowdsourcing model for creating study materials, emphasize its simplicity and user-friendly features, and discuss how students in any educational discipline could implement a similar model.

Developing and Implementing a Crowdsourcing Model for Creating Study Materials

We initially envisioned a system in which collaboration between students could occur unobtrusively in real time, instead of through the exchange of periodic mass e-mails. Through our personal experiences as medical students, we recognized that one form of effective collaboration was getting together with classmates and friends to quiz each other on the course material. We believed that the addition of technology to this process could make it more efficient, effective, and accessible.

Because the GTS course examinations were based solely on the material covered in course lectures, in January 2011 we started by writing a set of about 20 questions and answers for each lecture. We chose to write questions about topics that struck a balance between being too obvious and too esoteric. We wanted these sets of questions and answers to be available immediately (or at least within 24 hours of a lecture) so that the content of the lecture would be fresh in students’ minds and so that the students who reviewed lectures daily could have the most recent information available. Furthermore, we wanted students who were interested in contributing and editing the question and answer sets to be confident that the system was reliable and up to date.

We chose to use Google Drive as the online forum for our question and answer sets for several reasons: (1) multiple people can simultaneously view and edit a document, spreadsheet, or presentation; (2) it is user-friendly; and (3) nearly all students are familiar with it. We created a publicly accessible and modifiable central Google spreadsheet. We shared the Web address of this spreadsheet via e-mail with the entire Johns Hopkins University School of Medicine class of 2014 at the start of the project. Students could access the most recent question and answer sets by going first to the central Google spreadsheet, in the same way that online readers of a newspaper could find the most recent articles by going to the newspaper’s homepage. By using this model, we avoided cluttering the inboxes of uninterested students with e-mails each time new content was added.

This central Google spreadsheet contained links to additional Google spreadsheets, which included the question and answer sets for each lecture. Similar to other online collaboration platforms, such as Wikipedia, both the central Google spreadsheet and the spreadsheets of questions and answers could be modified by anyone with the link. Thus, following the crowdsourcing model, any student could add material, correct answers, and build custom question sets to aid his or her classmates in understanding the topics. We noticed that some students would write questions and answers in real time during a lecture; other students in the lecture hall would then edit them instantaneously.

We intentionally avoided barriers that we felt would deter adoption of our system, including the requirement that students sign in to view or modify documents, additional buttons to change from view mode to edit mode, and layers of Web pages to reach one’s target document (e.g., central document > neurology > movement disorders > Parkinson disease lecture versus central document > Parkinson disease lecture). We addressed concerns that students might adversely alter the shared documents, or that the documents might be lost, by using the revision history as a periodic backup of previous versions and by restoring adversely altered documents to an earlier version.

After accessing the central Google spreadsheet, students could choose to use the question and answer sets however they wanted. Some read them like a set of notes, whereas others desired more interaction, similar to a friend or classmate quizzing them. Although we could have referred students in this latter group to Web sites such as StudyBlue13 or Anki,14 which convert spreadsheets to digital flashcards, we felt that these online resources had several limitations: (1) They required Internet access, which may not be available at all locations where students study, (2) they were relatively complicated to use, requiring registration, sign-in, and multiple button clicks to get to the intended selection, and (3) they did not present information in a clean and clutter-free interface. Furthermore, these online resources could not easily create flashcards from PowerPoint slides, a medium some students preferred to the spreadsheets. Unlike in a spreadsheet, students could embed an image in a PowerPoint slide, which was an important capability when creating questions for image-dense topics, such as anatomy, histology, dermatology, and radiology.

Instead of referring students to outside Web sites, we developed and freely distributed a Java-based flashcard program that translates an Excel or PowerPoint file into a set of digital flashcards. More information about the software itself is available on request from the authors. For Excel files, the program recognizes the text in Column A of Sheet 1 as the list of questions and the corresponding text in Column B as the list of answers. For PowerPoint files, the program recognizes the title of each slide as the answer and the text and images in the lower portion of the slide as the question. We purposefully limited the program to a simple set of controls. Initially, only the question appears; pressing the down arrow on the keyboard reveals the answer. To keep score and identify topics for further study, the user indicates that his or her answer is wrong by pressing the left arrow and correct by pressing the right arrow. Pressing backspace returns the user to questions he or she has previously seen. After cycling through all of the questions in a random order, the program returns to the questions the user answered incorrectly, again in a random order. The program also records the number of times the user answers each question incorrectly by marking it in Column D of the spreadsheet.
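To make the quiz flow concrete, the sketch below approximates the behavior described above in plain Java: load question and answer pairs, present them in random order, reveal the answer on a keypress, and cycle back through the questions answered incorrectly. It is a minimal illustration, not the distributed program itself; it assumes the question and answer set has been exported as a comma-separated text file (one "question,answer" pair per line, corresponding to Columns A and B) and substitutes typed responses for the arrow-key controls. The file name and class name are hypothetical.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Scanner;

// Minimal console sketch of the flashcard flow: show a question, reveal the
// answer on request, let the user mark it right or wrong, and repeat missed
// questions until none remain. Hypothetical input format: "question,answer"
// per line, exported from Columns A and B of the shared spreadsheet.
public class FlashcardSketch {

    private record Card(String question, String answer) {}

    public static void main(String[] args) throws IOException {
        if (args.length != 1) {
            System.err.println("Usage: java FlashcardSketch cards.csv");
            return;
        }

        // Load the question-answer pairs.
        List<Card> deck = new ArrayList<>();
        for (String line : Files.readAllLines(Paths.get(args[0]))) {
            String[] parts = line.split(",", 2);
            if (parts.length == 2) {
                deck.add(new Card(parts[0].trim(), parts[1].trim()));
            }
        }

        Scanner in = new Scanner(System.in);
        Map<Card, Integer> missCounts = new HashMap<>();
        List<Card> remaining = new ArrayList<>(deck);

        // Keep cycling until every card has been answered correctly once.
        while (!remaining.isEmpty()) {
            Collections.shuffle(remaining);              // random order each pass
            List<Card> missed = new ArrayList<>();
            for (Card card : remaining) {
                System.out.println("\nQ: " + card.question());
                System.out.print("[Enter] to reveal the answer... ");
                in.nextLine();
                System.out.println("A: " + card.answer());
                System.out.print("Did you get it right? (y/n): ");
                if (!in.nextLine().trim().toLowerCase().startsWith("y")) {
                    missed.add(card);                    // missed cards return next pass
                    missCounts.merge(card, 1, Integer::sum);
                }
            }
            remaining = missed;
        }

        // The authors' program writes these counts back to Column D of the
        // spreadsheet; here they are simply printed.
        System.out.println("\nDone. Questions missed at least once:");
        missCounts.forEach((card, n) -> System.out.println(n + "x  " + card.question()));
    }
}
```

In the authors’ actual tool, the same loop operates on Excel and PowerPoint files directly and records each question’s miss count in Column D so that students can target weak areas on a later pass; the sketch only prints those counts.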

Impact of Our Crowdsourcing Model

During the 13 months of their preclinical course work, the 120 students in the class of 2014 created 16,150 questions total (an average of 135 questions per student). They wrote an average of 1,346 questions per organ system block (range, 319–2,316) and 36 questions per lecture (range, 10–68). In addition to questions based directly on lectures, they also created flashcard sets based on the information in the required textbooks and on the material covered on the United States Medical Licensing Examination Step 1.

Results of an analysis of changes in exam scores

We evaluated the effectiveness of our flashcard system by analyzing the differences in exam scores between the class of 2013 and the class of 2014 at the Johns Hopkins University School of Medicine. The content and questions on exams do not change substantially between years, so a very similar test is given at the end of each organ system block.15 We hypothesized that our flashcard system would have an impact on overall student performance on exams. We estimated that one-third to one-half of the class of 2014 used our Google Drive and flashcard program system, based on the number of users accessing the documents in the days prior to exams.

We obtained exam score averages and standard deviations for the classes of 2013 and 2014 from the Office of Curriculum Development. Prior to the introduction of our flashcard system, the exam scores for both classes were very similar, with the class of 2013 performing slightly better than the class of 2014. However, after the introduction of our flashcard system, the exam scores for the class of 2014 were higher than those for the class of 2013 on all subsequent exams. Because of our desire to protect the anonymity of the students, we did not collect demographic data; thus, we were unable to parse students into user versus nonuser groups for further analysis. Consequently, although the trends we observed are consistent with our hypothesis that the resources we provided positively influenced student performance, we cannot definitively claim causality.

Results of a student survey

In addition to our analysis of exam scores, we also administered a survey in 2012 to the 120 medical students in the class of 2014. This survey explored their personal experiences with both the Google Drive and flashcard program (see Supplemental Digital Appendix 1, https://links.lww.com/ACADMED/A125). We obtained institutional review board approval for our survey. At the end of the GTS course, we invited students by e-mail to complete a Google survey. The survey assessed the frequency of students’ use of the flashcard system, their perception of its value, its similarity to actual exam questions, and their anticipated use of the system in the future. The survey did not request personal information, such as name, gender, or age, to keep students’ responses confidential and to avoid deterring students from further use of the flashcard system.

We received 32 responses to our survey over the course of three days, resulting in a 27% response rate. However, on the basis of our estimation that about half of the students used the flashcard system, this response rate represents about 53% of perceived users. All of the respondents used the flashcards to study. Ninety-seven percent reported using the question and answer sets at some point, and 75% reported still using them at the conclusion of the GTS course.

Of the 32 respondents, 56% used Google Drive to study for more than half of the exams. The majority (56%) used it once a week. Seventy-four percent rated the helpfulness of the flashcards for learning new material at 7 or higher out of 10, and 90% rated their helpfulness for retaining material at 7 or higher. Lastly, 87% of respondents rated their belief that the flashcards improved their exam scores at 7 or higher out of 10.

Several respondents also provided narrative comments regarding their experiences:

I also think a lot of the questions in the flashcards are better questions (either more focused on the main points or require more application of concepts) than what ended up on our exam and that faculty should look at the flashcards for ideas about how they should write their questions.

I used the flashcards when I had time to study extra materials (outside of the [lecture notes]) before exams. Overall, I found them extremely helpful in illustrating the main points of every lecture. They have definitely helped me add a few points to my exam scores.

I love the flashcards program and cards…. The way that I used them is I would copy the cards from a given lecture into an Excel file. Then I would review the lecture slides and correct, modify, and expand the cards provided as well as write new questions of my own. I would then come back and review all the cards in the day or two leading up to the test. I thought this system worked great for me.

The last comment validates our belief that the design of the flashcard system made question and answer sets easily accessible so that students were able to either modify them on the central Google spreadsheet or download them and make their own additions and deletions.

Unique Features of Our Crowdsourcing Model

Often, before tests, students get together to quiz and teach each other the course materials. Although this method of studying builds camaraderie, identifies students’ weaknesses, and disseminates knowledge, it also has several limitations: Groups with more than a few people are less successful, students find it difficult to agree on a time to meet, and, often, the discussions stray from the subject matter. We believe that an online repository of questions and answers preserves many of the benefits of classmates quizzing each other while also overcoming several of the limitations. More specifically, the central Google spreadsheet and the question and answer sets incorporated features that we believe are crucial to its success: (1) It is easy to use, (2) anyone can make modifications, (3) users are anonymous, and (4) users can see modifications in real time.

First, from the beginning, we knew that user-friendliness would be an essential feature of a system that students would widely adopt. Many students bookmarked the central Google spreadsheet and visited it frequently, as they would the homepage of an online newspaper. Because no sign-in is required, a single click from a student’s bookmarks menu is the only step needed to access the spreadsheet. This easy access contrasts with some schools’ proprietary resources, which require users to enter a username and password each time they access the Web site, even if it is multiple times in one day.

Next, anyone who is able to access our resources also is able to modify them. We were pleasantly surprised when several students created their own sets of questions and answers and added links to them from the central Google spreadsheet. Each had a different emphasis; for example, a concise, “high-yield” version included fewer questions, and another version included multiple-choice questions. Furthermore, other students added to the spreadsheet links to glossaries that they had created. Similarly, students provided commentary regarding whether they thought a question was of high quality or had the wrong answer, and they added links to additional resources clarifying their point.

In addition, we believe that the ease of making modifications substantially increased students’ level of engagement. The anonymity afforded to students accessing, posting, and modifying the questions and answers likely encouraged their critique and discussion of the resources in addition to their use of them. We also permitted students to download the spreadsheets so that they could make their own private additions, deletions, and modifications.

Finally, whereas students in previous classes shared static documents and outlines, ours was the first attempt to build a “living” learning resource. Because changes and commentary can be seen in real time, and a substantial number of students use the spreadsheet during class, we believe that it creates a forum for live chatting during a lecture so that students can clarify misunderstandings as they happen. More broadly, course content and lectures change each year. To keep the spreadsheet relevant from one year to the next, all students are able to update the question and answer sets.

We believe that several features of our flashcard program facilitated its adoption by a large number of students. First, we chose to use Excel and PowerPoint because we knew that nearly all students would be comfortable using these programs. Because changes to questions and answers can be made in Excel and PowerPoint, rather than through our flashcard program, students had no learning curve to overcome. Second, we recognized that speed was important, so we intentionally designed our program so that users would need only single keyboard strokes (up, down, left, right, backspace) to interact with the flashcards, rather than on-screen buttons, which require more precision and slow interaction. Third, an Internet connection may not be available in all locations where students study, such as on the subway, on an airplane, or in parts of the home and school campus. Because our program uses local copies of both the software and the questions and answers, students can run it without an Internet connection, facilitating studying whenever and wherever they want. Fourth, we aimed for a simple design in developing the program. The interface is not cluttered with fancy designs, advertising, or company logos; it provides only the information students need.

Finally, we developed our model based on the premise that each student’s success is more valuable than the adoption of the software, not on the principle of software market domination. We not only permitted the use of other software to run the flashcards but also actively encouraged it by providing links to these resources.

Limitations to Our Crowdsourcing Model

Despite our crowdsourcing model’s popularity and success, it and the subsequent student survey have several limitations. First, we implemented the flashcard system with a single medical school class at a single institution, limiting our ability to extrapolate how successful it would be at other schools with different educational priorities. Second, we acknowledge that the students who responded to our survey likely were those who used our flashcard system, affecting our ability to examine why students chose not to use it. Perhaps the greatest limitation, however, is that we did not objectively assess the changes in students’ performance. We did not ask students to identify themselves either while using the flashcard system or completing the survey, so we were unable to discern user versus nonuser differences and to draw conclusions regarding improvements in students’ performance as a result of using our flashcard system. Although we acknowledge that asking students to identify themselves could have eliminated this limitation, we chose not to do so because we believed that it also would have deterred them from using the flashcard system and completing the survey. Future studies of such models should include both an examination of how question and answer sets are modified or corrected over time and the development of a method to track how many students participate in this process of refinement.

One feature limiting the adoption of our flashcard system at other medical schools is the fact that someone (either a student or an administrator) must create the central Google spreadsheet. In addition, medical schools may be uncomfortable with students sharing material based on their proprietary curricula. Also, the community must continually edit the material so that it remains relevant and concise. Finally, although we are fortunate at our medical school that the atmosphere is one in which students collaborate to contribute to the success of their peers without jeopardizing their own success, we recognize that other schools may have different approaches to student collaboration.

Our flashcard program itself also has several limitations. First, users must exit the program to make changes to the questions and answers. In addition, they must make each change twice (once to the local copy saved with the Java software and once to the online copy). Next, users must periodically download the most up-to-date questions and answers that reflect their classmates’ changes.

In Conclusion

Despite the limitations, our crowdsourcing model of developing electronic, sharable flashcards in medical education is novel, and the results of our exam score comparison and student survey show promise for its future. We began by building an infrastructure that facilitated the use of question and answer sets as flashcards, and then we developed foundational content based on the curriculum of a first-year, basic science course. Students in the class of 2014 at the Johns Hopkins University School of Medicine subsequently improved and modified this content to produce a continually evolving, high-quality resource. Our flashcard system has the potential to decrease the amount of test preparation each student must do by enabling students to share in the effort of generating concise, high-yield study materials. In the future, we hope to focus our efforts on expanding the compatibility of our flashcard system with mobile devices.

Our crowdsourcing model for creating educational material could be extended beyond the generation of question and answer sets and flashcard programs for use by medical students alone. Aside from slight modifications to the technology that we developed, few barriers exist to expanding our model to the collaborative generation of lecture summaries, clinical guidelines, and annotated figures. We believe that the simplicity of the software we developed greatly contributed to the success of our model, and others interested in implementing a similar model should take advantage of the capabilities of software with which students are already familiar.

In conclusion, we created a novel educational model that incorporates recent collaboration trends in both medical education and technological advancement. Our question and answer sets and flashcard program could serve as a model for collaborative learning in other educational systems that require the assimilation of large amounts of information.

Acknowledgments: The authors wish to thank the Johns Hopkins University School of Medicine Class of 2014 and course directors Henry Fessler, MD, and Michael Borowitz, MD, for their participation and feedback in this study.

Funding/Support: None.

Other disclosures: None.

Ethical Approval: The Johns Hopkins University institutional review board granted ethical approval for this study.

References

1. Allen EB, Walls RT, Reilly FD. Effects of interactive instructional techniques in a Web-based peripheral nervous system component for human anatomy. Med Teach. 2008;30:40–47.
2. Cantillon P. Do not adjust your set: The benefits and challenges of test-enhanced learning. Med Educ. 2008;42:954–956.
3. Kornell N, Son LK. Learners’ choices and beliefs about self-testing. Memory. 2009;17:493–501.
4. Larsen DP, Butler AC, Roediger HL 3rd. Test-enhanced learning in medical education. Med Educ. 2008;42:959–966.
5. Karpicke JD, Blunt JR. Retrieval practice produces more learning than elaborative studying with concept mapping. Science. 2011;331:772–775.
6. Lehmann HP, Lehmann CU, Freedman JA. The use of simulations in computer-aided learning over the World Wide Web. JAMA. 1997;278:1788.
7. Bottiroli S, Dunlosky J, Guerini K, Cavallini E, Hertzog C. Does task affordance moderate age-related deficits in strategy production? Neuropsychol Dev Cogn B Aging Neuropsychol Cogn. 2010;17:591–602.
8. Seidel CL, Richards BF. Application of team learning in a medical physiology course. Acad Med. 2001;76:533–534.
9. Thompson BM, Schneider VF, Haidet P, et al. Team-based learning at ten medical schools: Two years later. Med Educ. 2007;41:250–257.
10. Howe J. The rise of crowdsourcing. Wired. June 2006;14. http://www.wired.com/wired/archive/14.06/crowds.html. Accessed February 6, 2013.
11. Howe J. Crowdsourcing: A definition. Crowdsourcing. June 2, 2006. http://crowdsourcing.typepad.com/cs/2006/06/crowdsourcing_a.html. Accessed April 2, 2013.
12. Brabham DC. Crowdsourcing as a model for problem solving. Convergence. 2008;14:75–90.
13. StudyBlue. http://www.studyblue.com. Accessed February 6, 2013.
14. Elmes D. Anki. http://ankisrs.net. Accessed February 6, 2013.
15. Bow H. Personal communication with H. Fessler, M. Borowitz, Johns Hopkins University School of Medicine curriculum directors. February 2012.

© 2013 Association of American Medical Colleges