Simulation-based research requires the coordinated effort of research teams to design projects, recruit subjects, and carry out performance assessments of individuals or teams. These efforts are labor intensive, time consuming, and logistically challenging, especially in the context of multicenter simulation-based research trials.1 In many studies, data are collected using performance-based assessment tools to rate subject or team performance via videotaped sessions.2–5 These videos often need to be reviewed independently by multiple reviewers, with the data subsequently scored, collected, and tabulated for statistical analysis. This process can become cumbersome for studies involving a high volume of subjects and videos to review or a large number of video reviewers who are geographically isolated from one another.1 Currently, several simulation companies offer simulation-based Learning Management Systems (LMS) designed primarily to facilitate simulation center management, video capture, simulation-based education, and performance assessment.* None of these existing systems was specifically developed to facilitate the simulation-based research process. In this article, we describe an internet-based research portal, free for use and accessible from www.cesei.org, that manages the research process for simulation-based studies and streamlines the video review process by providing an online environment for training and calibrating reviewers, who are then able to view and rate videos.
LEARNING MANAGEMENT SYSTEMS
Faced with the challenge of conducting simulation-based research, we developed a research website as part of a previously existing LMS to help facilitate simulation-based research. LMS software is specifically designed to deliver, track, and manage the training of learners.6–9 Several commercial options are also available, developed to help manage the administrative and educational aspects of simulation centers.* The Center for Excellence in Simulation Education and Innovation (CESEI) at Vancouver General Hospital (VGH) has designed and implemented a fully functional LMS that has been in use for several years. The LMS educational portal was developed first and contains curriculum development, delivery, assessment, tracking, and administrative capabilities, in addition to an internet-based peer review process for new educational modules. At CESEI, this dynamic website has been successfully used for various interprofessional courses involving both practical knowledge delivery and experiential learning via screen-based simulations.
Despite the widespread use of LMS across educational institutions, there have been no published descriptions of an LMS equipped with the functionality to overcome the multiple logistical and administrative challenges associated with simulation-based research projects.1 Specifically, existing simulation-oriented LMS do not provide a research-oriented user interface, do not allow for customizable study design and subject code allocation, and do not allow customizable assignment of videos to both teams and individuals based on the study arm.* In addition, existing LMS do not enable users to customize data spreadsheets based on research arms or outcome measures. To help overcome these challenges, we developed an LMS-based research portal specifically designed to manage and facilitate single and multicenter simulation-based research projects. Located on the internet at www.cesei.org, the portal requires users simply to follow the link to the research portal and register as new users to obtain a login and password. The research website provides a user orientation to the functionality of the website and a step-by-step description of how to use the resource. This research portal helps researchers design their projects, set up data collection using customized assessment tools, upload videos for performance assessment, and finally, download data-filled spreadsheets for statistical analysis. Table 1 provides a comparative summary of the research-specific functionality of existing simulation-oriented LMS and the research portal described in this article.
When initiating projects, principal investigators can create a study team online and assign administrative privileges to a predetermined number of team members (eg, subsite investigators or research coordinators). The research portal is encrypted so that only the principal investigator and the individuals he or she chooses can access study data via password-protected login. In addition, different levels of access to the website can be assigned and customized for each team member depending on the specific needs of the project. Anonymity is ensured either by making identifiable information available only to preselected project administrators or by stripping identifiers completely, depending on the investigator's choice and the study's institutional review board protocols. Demographic data can be collected online by the system, with individual variables customized to the needs of the project. Alternatively, demographic data can be entered into the system manually after being collected by research assistants. Once the project is created on the system, individual subject details, including demographics and study data, can be accessed online by preselected project administrators via password-protected access. The system can also be used as a communication tool, capable of sending predrafted emails to all team members either intermittently or at preselected intervals.
Once the project team is created, researchers can use the portal to design their project via automated schema generation based on an arm-group-subject hierarchy. The research portal was designed to accommodate both small and large projects, whether single center or multicenter. The user simply specifies the number of research arms and the number of subjects per arm, and the system then creates a project homepage displaying all the subjects in each arm. Assignment of subject codes and randomization of subjects to the different study arms can be done either automatically or manually. Subgroups of subjects within each study arm can also be created as needed, allowing for flexibility of study design and subcategorization of recruited subjects. Although this does not differ much from the usual nested subgroup or group reports available on existing LMS, it does provide additional functionality that is specific to research projects. For example, multicenter studies can create subgroups within each arm comprising all subjects recruited at a particular site. In addition, unlike many existing LMS, data collected from recruitment sites can be uploaded to the research portal from various locations in a seamless manner. Finally, this design also allows for customizable assignment of data (eg, for video review) to either individuals or groups of subjects, providing the flexibility required to study individual performance (eg, task training) or team performance (eg, team-based simulation).
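The automated assignment of subject codes and randomization of subjects to study arms described above can be sketched as follows. This is an illustrative assumption, not the portal's actual implementation; the function name, arm labels, and subject code format are all hypothetical:

```python
import random

def randomize_subjects(subject_codes, arms, seed=None):
    """Shuffle coded subjects and deal them evenly into the named study arms."""
    rng = random.Random(seed)  # seedable for a reproducible allocation
    shuffled = list(subject_codes)
    rng.shuffle(shuffled)
    allocation = {arm: [] for arm in arms}
    for i, code in enumerate(shuffled):
        allocation[arms[i % len(arms)]].append(code)
    return allocation

# Example: 12 anonymized subject codes randomized across three study arms.
codes = [f"S{n:03d}" for n in range(1, 13)]
allocation = randomize_subjects(codes, ["arm_A", "arm_B", "arm_C"], seed=7)
```

Subgroups within an arm (eg, all subjects from one recruitment site) could then be modeled as a further keyed partition of each arm's list.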
PROJECT MANAGEMENT—ASSESSMENT TOOLS
Simulation-based studies often require the use of tests or assessment tools to collect outcomes data. The research portal allows for data collection to be conducted online and distributed based on specified study requirements. Data collection tools, such as tests or surveys, can be manually created online within the research portal itself. Various testing formats can be applied, including multiple-choice tests, short-answer tests, Likert scale ratings, or open-field questions. Upon creation of a test, correct answers and relative values can be entered for each question, thus enabling the portal to automatically mark or assign a score for each completed test. Once tests or surveys are created, they can then be automatically distributed to subjects and/or video reviewers at predetermined, scheduled intervals (eg, 0, 3, and 6 months), thus allowing for follow-up assessment, often important in simulation studies to assess retention of knowledge or skills. Notification of testing phases is sent out by the system to subjects via email reminders. Custom test schedules can be created according to study arm, subgroup, or even individualized according to the subject.
Multiple assessment tools or tests can be created for a particular research project. When multiple tools are used, they can be tagged or linked to specific subjects via study codes (to maintain confidentiality and anonymity). When subjects complete a test, the system will automatically mark the test and report the results online in tabular format. The time and date of test completion will be noted for each event. The research portal has a built-in system to minimize subject dropout. Subjects who do not complete study tests in a timely fashion are automatically sent email reminders at predefined intervals. Subjects can also be given the option of opting out of a certain test after a set number of email reminders, with study administrators being notified of this via email.
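Automatic marking of a completed test against an answer key with per-question point values, as described above, can be sketched as follows. The key structure, function name, and return format are hypothetical illustrations, not the portal's documented behavior:

```python
def score_test(answer_key, responses):
    """Mark one completed test.

    answer_key maps question id -> (correct_answer, point_value);
    responses maps question id -> the subject's answer.
    Returns (points earned, points possible, percentage score).
    """
    earned = total = 0.0
    for qid, (correct, points) in answer_key.items():
        total += points
        if responses.get(qid) == correct:  # unanswered questions score zero
            earned += points
    return earned, total, round(100 * earned / total, 1)

# Example: three questions, the third weighted double.
key = {"q1": ("b", 1), "q2": ("d", 1), "q3": ("a", 2)}
result = score_test(key, {"q1": "b", "q2": "c", "q3": "a"})  # → (3.0, 4.0, 75.0)
```

The timestamped tabular report described above would simply attach the completion date and subject code to each such result row.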
PROJECT MANAGEMENT—VIDEO REVIEW
As studies in simulation often require performance evaluation, the research portal has been designed to accommodate the uploading of video, which can then be attached to data collection tools and distributed online for video review. Videotaped simulation sessions can be uploaded to the secure server and then posted online for assignment of a file name and a random file code. File names are visible to selected project administrators, while random codes permit identification of videos when they are posted for performance assessment by video reviewers. Once uploaded, videos can be played, paused, rewound, and fast-forwarded, allowing reviewers ample opportunity to process thoughts and comments when completing performance assessments. A built-in timer is attached to each video, making it easy to assess and rate the time-specific items often associated with performance assessment tools.
Once uploaded, videos are manually tagged to study subject(s) and subsequently assigned for review by specific video reviewers using predefined assessment tools that have been created online and entered into the system. Managing and distributing videos in this fashion allows videos to be rated with different tools and by various reviewers. For example, a videotaped recording of a healthcare team managing a simulated trauma scenario can be reviewed using both a clinical performance tool and a separate behavioral performance tool, either by the same reviewer or by multiple different reviewers. Data collected from multiple reviewer ratings of the same video can subsequently be used to calculate inter-rater reliability. Each video reviewer is given limited access to the research portal and can log in to view his or her list of assigned tasks. Project administrators can also log in and follow up on reviewer progress, as the system tags each video asset with a timestamp when it is viewed and rated by a particular video reviewer. New video assets can be uploaded and assigned intermittently, and data from completed assessments are collected automatically and available for viewing by project administrators.
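Inter-rater reliability from two reviewers' ratings of the same videos can be computed in several ways; one common statistic for paired categorical scores is Cohen's kappa, sketched below. The portal does not specify which statistic its users apply, so this is an external illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items categorically:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from each rater's marginal category frequencies.
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Example: two reviewers rating four videos pass/fail, disagreeing on one.
kappa = cohens_kappa(["pass", "pass", "fail", "pass"],
                     ["pass", "pass", "fail", "fail"])  # → 0.5
```

For more than two reviewers or ordinal/continuous scores, statistics such as Fleiss' kappa or the intraclass correlation coefficient would be used instead.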
Data collected from tests and assessment tools on the research portal are stored and available for access online by selected project administrators or, alternatively, downloadable in Excel spreadsheet format for statistical analysis. When downloading data in spreadsheet format, project administrators can sort data according to arm, group, subject, assessment tool, or video reviewer. This allows rapid and efficient creation of outcomes-specific spreadsheets, saving researchers hours of valuable time otherwise spent manually entering data into spreadsheet files. The presentation of spreadsheets can also be customized, with data sorted by subject into rows or columns. For individual tests or assessment tools, researchers can choose the extent of detail included on each spreadsheet. For example, spreadsheets can be customized to include or exclude question text, response text versus numeric variables, score per question, time stamps, subject demographics, reviewer notes, and weights or percentages of performance scores.
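Generating a sorted, outcomes-specific spreadsheet could be sketched as follows. This writes CSV (readable by Excel) with hypothetical field names, since the portal's actual export schema is not published:

```python
import csv
import io

def export_scores(records, sort_keys=("arm", "subject_code")):
    """Serialize collected score records to CSV, sorted by the chosen keys
    (eg, arm then subject, or assessment tool then video reviewer)."""
    rows = sorted(records, key=lambda r: tuple(r[k] for k in sort_keys))
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Example: two subjects' scores, sorted into arm order for the spreadsheet.
records = [
    {"arm": "B", "subject_code": "S002", "score": 88},
    {"arm": "A", "subject_code": "S001", "score": 75},
]
csv_text = export_scores(records)
```

Including or excluding columns such as question text or reviewer notes would simply change which keys appear in each record before export.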
The research portal has fairly basic technical requirements. The operating systems supported include Microsoft Windows Server 2003 Service Pack 2, Windows Server 2008, and Linux Red Hat 4 or 5.2. Other software requirements include PHP 5, Apache or IIS, and Adobe Flash Media Interactive Server 3. An internet browser such as Firefox 3.x, Safari 4.x, or Internet Explorer 8.x is required, with plug-ins such as Adobe Flash Player 10.x or QuickTime installed to view videos online. The servers for the EXPRESS project were physically located at the Vancouver General Hospital, Center for Excellence in Simulation Education and Innovation. The minimum server requirements include a 3.2-GHz Intel Pentium 4 processor (dual Intel Xeon or faster recommended), 2 GB of RAM (4 GB recommended), and a 1-Gb Ethernet card. Our server capacity exceeds 62.5 GB on Ultra SCSI or SATA drives. At CESEI, the LMS sits behind a firewall, so specific ports had to be opened on the firewall for the LMS to be functional. In our case, the ports opened for this project included HTTP port 80, RTMP port 1935, and SMTP port 465.
USE OF RESEARCH PORTAL FOR EXPRESS SCRIPTED DEBRIEFING STUDY
This research portal was first used by the EXPRESS research collaborative for a multicenter study assessing the benefits of (a) scripted debriefing for novice facilitators and (b) simulator realism.1 Use of the research portal allowed us to successfully manage data collected from more than 400 study subjects at 15 different recruitment sites across North America. Subject demographics and data from multiple-choice tests were collected automatically and seamlessly as subjects logged into the system to complete the tests. Simulation and debriefing videos were collected and processed by individual recruitment sites and then uploaded to the portal for review. In total, more than 350 videos were uploaded to the website. Videos were reviewed by experts using three different assessment tools to rate different categories of performance (clinical performance, behavioral performance, and debriefing performance). Our team of 24 video review experts comprised individuals from Canada, the United States of America, Australia, and Japan. The videos were grouped and randomly assigned online to specific video review experts. The research portal enabled each reviewer to complete assigned tasks by logging into the system from his or her own city, viewing the videos, and rating each video with a preassigned assessment tool. The portal also enabled rapid extraction of data via the video review process, with reporting of these data via download in Excel spreadsheet format.
CHALLENGES AND FUTURE DIRECTIONS
We have successfully developed and used an internet-based research portal to facilitate simulation-based research. To help disseminate this free and valuable tool to the simulation-based research community, we have set up a link to the research portal, accessible from www.cesei.org, which outlines the functionality of the portal and provides a step-by-step guide on how to use this resource effectively. In anticipation of questions and technical support requirements from new users, we plan to post videos online answering common questions related to use of the website. New users with projects requiring specific or unique needs (eg, expanded website functionality, enhanced technical support, or large data storage requirements) will be considered on a case-by-case basis. Details regarding technical support and data storage capacity can be found online. As demand for the website grows, a screening or prioritization process may be needed in the future; at the moment, however, all users are able to access this resource without screening.
The use of this portal has enabled timely execution of a multicenter research trial and, in the process, demonstrated its potential impact for simulation-based research. In addition, this portal could be used to evaluate simulation-based educational programs. We hope to further enhance the capabilities of this resource in the future by integrating real-time data collected by simulators (eg, performance tasks such as pulse check, chest compressions) with the data management software already built into the portal. We envision that the use of this portal will enable novice and expert researchers alike to carry out their projects in an efficient, coordinated, and timely manner, thus ultimately helping to advance the field of simulation worldwide.
Kristen Nelson, MD; Walter Eppich, MD; Jenny Rudolph, PhD; Robert Simon, EdD; JoDee Anderson, MD; Aaron Donoghue, MD; Akira Nishisaki, MD; Mike Moyer, BS, MS; Marisa Brett-Fleegler, MD; Monica Kleinman, MD; Judy Leflore, PhD; Matthew Braga, MD; Susanne Kost, MD; Glenn Stryjewski, MD; Steve Min, MD; John Podraza, MD; Joseph Lopreiato, MD; Melinda Fiedor Hamilton, MD; Jonathan Duff, MD; Jeffrey Hopkins, RN; Kimberly Stone, MD; Jennifer Reid, MD; Douglas Leonard, MD; Kathleen Ventre, MD; Laura Corbin, MD; Kristine Boyle, MS; Marino Festa, MBBS; Frank Overly, MD; Stephanie Sudikoff, MD; Takanari Ikeyama, MD; Louis Halamek, MD; Stephen Schexnayder, MD; Jack Boulet, PhD; Liana Kappus, MEd; John Gosbee, MD, MS; Laura Gosbee, MASc; Jennifer Manos, RN; Matthew Richard, BSc.
The authors thank Byron Tredwell for his innovative design of the research portal, Anne Marie White, EXPRESS research coordinator, for her hard work and commitment to this project, the AV/IT team at CESEI for their support—Albert Ho and Ferooz Sekandarpoor for setting up the videotaping and processing hundreds of videos and Albert Ho for revising the research portal homepage.
1. Cheng A, Hunt EA, Donoghue A, et al. EXPRESS—Examining Pediatric Resuscitation Education using Simulation and Scripting. The birth of an international pediatric simulation research collaborative—from concept to reality. Simul Healthc
2. Devitt JH, Kurrek MM, Cohen MM, et al. The validity of performance assessments using simulation
3. Mayo PH, Hackney JE, Mueck JT, et al. Achieving house staff competence in emergency airway management: results of a teaching program using a computerized patient simulator. Crit Care Med
4. Lee SK, Pardo M, Gaba D, et al. Trauma assessment training with a patient simulator: a prospective, randomized study. J Trauma
5. Carbine DN, Finer NN, Knodel E, et al. Video recording as a means of evaluating neonatal resuscitation performance. Pediatrics
6. Dumpe ML, Kanyok N, Hill K. Use of an automated learning management system to validate nursing competencies. J Nurses Staff Dev
7. Knobl B, Paiva T, Jungmann D, et al. ENN-ICS—implementation and evaluation of a multilingual learning management system for sleep medicine in Europe. Stud Health Technol Inform
8. Zajaczek JE, Gotz F, Kupka T, et al. eLearning in education and advanced training in neuroradiology: introduction of a web-based teaching and learning application. Neuroradiology
9. Vollmar HC, Schurer-Maly CC, Frahne J, Lelgemann M, Butzlaff M. An e-learning platform for guideline implementation—evidence and case-based knowledge translation via the internet. Methods Inf Med
*Available at: www.meti.com, www.ems-works.com, www.blinemedical.com, www.cae.com.