Pulley, Jill M. MBA; Harris, Paul A. PhD; Yarbrough, Tonya RN; Swafford, Jonathan; Edwards, Terri RN; Bernard, Gordon R. MD
Ms. Pulley is director of research support services and implementation manager, Vanderbilt Clinical and Translational Science Award, Vanderbilt Institute for Clinical and Translational Research (VICTR), Vanderbilt University Medical Center, Nashville, Tennessee.
Dr. Harris is director, Office of Research Informatics, and operations director of biomedical informatics, VICTR, Vanderbilt University Medical Center, Nashville, Tennessee.
Ms. Yarbrough is research manager, VICTR, Vanderbilt University Medical Center, Nashville, Tennessee.
Mr. Swafford is health systems analyst programmer, Office of Research Informatics, VICTR, Vanderbilt University Medical Center, Nashville, Tennessee.
Ms. Edwards is assistant director, Research Support Services, Vanderbilt University Medical Center, Nashville, Tennessee.
Dr. Bernard is associate vice chancellor for research and principal investigator, CTSA and VICTR, Vanderbilt University Medical Center, Nashville, Tennessee.
Correspondence should be addressed to Ms. Pulley, Vanderbilt Institute for Clinical and Translational Research, Vanderbilt University Medical Center, 2525 West End Avenue Sixth Floor, Nashville, TN 37203; telephone: (615) 343-2842; e-mail: Jill.email@example.com.
Regulatory complexities and new and changing federal and institutional policies in the research environment have increased significantly over the past decade, expanding the number of reviews and approvals required to initiate and conduct clinical and translational research. Each review provides a critical function, for example, in protecting the rights and welfare of human subjects,1 protecting vulnerable populations, ensuring ethical principles are upheld, maintaining financial feasibility and security, maintaining compliance with regulations, ensuring biosafety, and protecting privacy. However, responsibility for the numerous regulatory review processes is typically not centralized within institutions, forcing research teams to interact with multiple departments independently to gain approval to initiate a study. Ambiguity concerning submission requirements and the expected duration of the review process can create additional work for research teams and ultimately delay important scientific projects.
The problem is exacerbated when federal and institutional policies are added or modified, essentially creating a “moving target” for researchers. Regulatory requirements can be particularly burdensome and/or confusing to new or junior investigators, interdisciplinary research teams, and investigators who initiate their own research. These are critical populations within Clinical and Translational Science Award (CTSA) programs. These challenges could theoretically lead to a scenario in which a scientist with a novel research idea is dissuaded from conducting studies at least in part because of the complexity of the regulatory approvals process.2
Researchers need assistance with research initiation processes. In response, academic research institutions have recently begun to create offices devoted to providing assistance to researchers with such processes.3,4 Vanderbilt (which, because this is a joint initiative, refers in this paper to both Vanderbilt University and Vanderbilt Medical Center) initiated the Research Support Services office in 2004 to develop enterprise-wide programs to facilitate clinical and translational research, and to work directly with investigators on study initiation and conduct.
In addition, many institutions have realized the need for informatics to provide technical solutions for the creation and management of administrative applications for researchers.5 Vanderbilt also created an Office of Research Informatics in 2007 focused on providing informatics tools and services for the clinical and translational research domain. The pairing of these complementary departments was designed to generate comprehensive initiatives with the goal of reducing researchers' administrative burden. Generally, we believe systems that effectively provide researchers with routine and frequently needed administrative support increase staff efficiency.
To further assist researchers, an assessment of the Vanderbilt regulatory review and approval process was conducted in 2007 and identified up to 20 potential applications, authorizations, or agreements required prior to initiating research. Streamlining the regulatory approval process for scientific investigators was therefore identified as a critical target for improvement. We examined all required review and approval procedures within the Vanderbilt research enterprise and found that requisite authorizations were triggered by a limited number of basic study characteristics. Our hypothesis was that an interactive informatics tool collecting only those specific characteristics that generate the need for a specific approval could efficiently produce a tailored, accurate list of required authorizations. We report here details concerning the informatics tool allowing researchers to receive a Vanderbilt Customized Action Plan (V-CAP) for individual research studies, a reference workflow associated with Vanderbilt regulations and policies, and researcher usage statistics for 36 months of operation. We also offer recommendations for other academic medical centers considering a similar systems-based approach to assisting researchers with the regulatory approvals process.
Workflow for a Proposed Informatics Solution
The V-CAP research planning assistant was designed around the concept of allowing investigators to describe characteristics of research they are proposing via a series of dialogue screens. The V-CAP system is accessed through Vanderbilt's centralized online research initiation, planning, and support researcher portal. Users register a project name and are then presented with a series of 26 questions concerning a specific research study (see Appendix 1). Questions are shown sequentially to users, and embedded program branching logic ensures that researchers are only presented questions that are relevant based on answers to previous questions. For example, questions regarding investigational drugs are never displayed if the user answers “no” to the question that asks whether the study will use investigational drugs. Figure 1 shows how the branching logic is configured (the full detailed branching logic diagram can be viewed at http://www.mc.vanderbilt.edu/victr/pub/newspub/vcap.html).
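The branching behavior described above can be sketched as a simple question graph walked by the interview engine. This is an illustrative sketch only, assuming a dictionary-based representation; the question identifiers and wording are hypothetical and are not the actual V-CAP content:

```python
# Hypothetical question graph: each node maps an answer to the next
# question to show (None ends the interview). The investigational-drug
# follow-up is reached only when the trigger question is answered "yes".
QUESTIONS = {
    "uses_inv_drugs": {
        "text": "Will the study use investigational drugs?",
        "next": {"yes": "ind_number", "no": "uses_devices"},
    },
    "ind_number": {
        "text": "Does the study have an IND number?",
        "next": {"yes": "uses_devices", "no": "uses_devices"},
    },
    "uses_devices": {
        "text": "Will the study use investigational devices?",
        "next": {"yes": None, "no": None},
    },
}

def interview(answers, start="uses_inv_drugs"):
    """Walk the question graph, returning only the questions a user
    would actually be shown, given their answers so far."""
    shown, current = [], start
    while current is not None:
        shown.append(current)
        node = QUESTIONS[current]
        current = node["next"].get(answers.get(current, "no"))
    return shown
```

With this structure, answering "no" to the investigational-drug question means the IND follow-up question is never displayed, mirroring the branching logic described above.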
During the interview process, researchers may view context-sensitive help for each question. Each help screen provides a brief description of the relevant regulation or policy, ancillary or educational information related to the needed approvals, and relevant assistance available from the institution.
Once all relevant questions have been answered during the interview process, the user is presented with a review/validation screen and can modify answers as needed. After reviewing, the user chooses to initiate the V-CAP and is presented with a printable list of required approvals, an electronic link to each required form, and a link to an associated document called “What to provide/What to expect,” which describes the process by which a researcher seeks and receives the particular approval (view screen shots at http://www.mc.vanderbilt.edu/victr/pub/newspub/vcap.html). Users are also provided with links to more detailed instructions if available on the individual department's Web site and a reference or link to the national or local policy that requires the action/approval, if applicable. The generation of a V-CAP also automatically triggers an IRB-approved, anonymous follow-up survey to assess the user's experience with the software.
Informatics System Design
Assessment of System Implementation and Utilization
The V-CAP system was launched in October 2006 and was designed to be continually upgraded through results of evaluation efforts and as new regulations arise or institutional policies are created. A versioning concept was implemented in the database architecture that allows branching logic to change over time without affecting users' preexisting V-CAPs.
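The versioning concept can be illustrated by stamping each V-CAP with the logic version in force when it was created, so that later rule changes never alter preexisting plans. The following is a minimal sketch under that assumption; the rule names and structure are hypothetical, not the actual V-CAP schema:

```python
# Hypothetical versioned rule sets: each version maps an approval to the
# interview answers that trigger it. Version 2 adds a rule without
# touching version 1, so older V-CAPs are evaluated unchanged.
LOGIC_VERSIONS = {
    1: {"radiation_safety": ["uses_radiation"]},
    2: {"radiation_safety": ["uses_radiation"],
        "biosafety": ["uses_recombinant_dna"]},  # rule added later
}

def create_vcap(answers, current_version):
    """Stamp a new V-CAP with the logic version in force at creation."""
    return {"answers": answers, "logic_version": current_version}

def required_approvals(vcap):
    """Evaluate a V-CAP against the logic version it was created under,
    not the latest one, so existing plans are stable over time."""
    rules = LOGIC_VERSIONS[vcap["logic_version"]]
    return sorted(
        approval for approval, triggers in rules.items()
        if any(vcap["answers"].get(q) for q in triggers)
    )
```

A V-CAP created under version 1 continues to yield the version-1 approval list even after version 2 introduces new rules.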
Approximately 650 V-CAP studies have been created by over 400 unique users at Vanderbilt University (see Figure 2). On the basis of the first 550 V-CAP studies, the average number of authorizations required per study was 3.6 (range: 1–7). Of those that started and finished in one sitting (∼414), the average time to answer questions was 3 minutes 38 seconds. Overall, the help dialogue screens were accessed 244 times. Figure 3 provides a frequency histogram of the context-sensitive help topics selected by end users. The most frequently selected help topics related to research on human subjects, IRB exemption criteria, and information on the translational technologies and resources available through the institution's system of shared core facilities.
We recently implemented an automated feedback survey after a V-CAP is generated. Since implementing this feature, 79 people have used the current version of the V-CAP and received an automated feedback request e-mail, and 25 completed the survey, for an estimated response rate of 32%. Additional responses came from a separate mass solicitation of feedback sent to 248 individuals, for a total combined sample size of 34. User feedback shows that half (17) of the respondents say they learned something new from the process. Most (n = 27; 79%) would use this tool again for future research projects. Most (n = 30; 88%) found the interactive format easy to use. A majority (n = 27; 79%) responded that the V-CAP was helpful in directing their approval process and that the questions were applicable to their research. A majority also found the standardized “What to provide/What to expect” forms, which accompany the V-CAP, helpful.
Lessons Learned and the Evolution of the V-CAP System
Although the sample size is small and not representative of the entire clinical and translational research community, the number of respondents who reported they learned something new from the V-CAP system suggests that the tool serves not only a compliance function but also an educational one. To assess this impact, we recently added a question asking users, before they enter study details into the V-CAP interface, how many approvals they believe will be required. Among the 11 individuals asked to date, the average number of approvals believed necessary beforehand was 2.0, whereas the average number actually required according to their responses to the interactive survey questions was 3.6. These data are very preliminary but will continue to be assessed.
By referring users to support staff very early in the approval process for questions and support regarding the regulations and requirements for approval, the V-CAP might help to reduce time spent on unnecessary approvals, although we did not measure that variable. Future development will include institutional support modules providing access to ancillary support, in real time, based on user needs.
There are limitations to these findings. Because the V-CAP utilization and feedback data were collected using an evolving system rather than a snapshot in time, there may be nuances resulting from a particular system modification or sequence of events. The generalizability of findings might be limited to comparable academic medical institutions, because some of the questions are associated with local policies and requirements specific to Vanderbilt as an academic medical center. However, most questions are based on national policies and regulations and are likely relevant at other institutions with similar research portfolios. The fact that the applicability of a large number of complicated regulations across various institutional and governmental bodies can be synthesized and determined from a small number of questions potentially has significant value for other institutions. Thus, we believe the overall approach and informatics system have broad scalability and applicability for implementation at other institutions in the support of clinical and translational research.
Recommendations for Implementing a Similar System
We have compiled a short list of recommendations for institutions wishing to develop a similar system. Our own system is continually evolving as we add features and functions, but the following recommendations represent design principles that have been important in the V-CAP system implementation and ongoing development:
* Design from the standpoint of making things as easy as possible for the research team.
* Create a partnership of regulatory content experts and informatics technical experts.
* Separate the content (questions, branching logic, resulting recommendations) from the presentation layer (visual user interface) used by the research team by storing content in a relational database.
* Create easy methods for content experts to modify and test questions, branching logic, and recommendations independently. This will free informatics staff to develop new features while giving content experts unhindered access to methods for continuously improving the project for researchers.
* Build in metrics (real-time dashboards) for evaluation by program management experts. Monitoring all aspects of program usage by researcher end users will enable continuous quality improvement and allow ready assessment of resource utilization during regular prioritization planning efforts.
* Create easy methods for researcher end users to ask questions and suggest improvements, both in the interview process and during feedback surveys.
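The third recommendation above, separating content from presentation by storing questions and branching in a relational database, can be sketched as follows. This is an illustrative sketch only, assuming SQLite; the table names, columns, and rows are hypothetical and not the actual V-CAP schema:

```python
# Hypothetical schema: questions and branching live in data, so content
# experts can add or reroute questions without any code change.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE question (
    id TEXT PRIMARY KEY,
    text TEXT NOT NULL
);
CREATE TABLE branch (
    question_id TEXT REFERENCES question(id),
    answer TEXT,
    next_question_id TEXT  -- NULL ends the interview
);
""")
conn.executemany("INSERT INTO question VALUES (?, ?)", [
    ("q1", "Will the study enroll human subjects?"),
    ("q2", "Will the study use investigational drugs?"),
])
conn.executemany("INSERT INTO branch VALUES (?, ?, ?)", [
    ("q1", "yes", "q2"),
    ("q1", "no", None),
])

def next_question(question_id, answer):
    """Presentation layer: looks up the next question from data rather
    than hard-coded logic, so editing rows changes the interview."""
    row = conn.execute(
        "SELECT next_question_id FROM branch WHERE question_id=? AND answer=?",
        (question_id, answer)).fetchone()
    return row[0] if row else None
```

Because the interview flow is just rows in a table, content experts can test a modified branch by editing data, leaving the user-interface code untouched.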
The V-CAP system is just one solution to the administrative barriers related to conducting clinical and translational research, but it is one that has been received favorably within our institution. Our data suggest that the V-CAP application is an important tool supporting regulatory compliance because it provides investigators and study personnel with a checklist of the regulatory requirements for approval to conduct research, theoretically reducing the chance that an approval might be overlooked. It also provides the researcher with information necessary to initiate the various approvals necessary for regulatory compliance simultaneously rather than sequentially. The informatics-systems-based approach of the V-CAP is scalable to other academic medical centers and may serve as a useful model to help researchers navigate the complexity of the regulatory approvals process.
This work was supported in part by Vanderbilt CTSA grant 1 UL1 RR024975 from NCRR/NIH.
1Parvizi J, Tarity TD, Conner K, Smith JB. Institutional review board approval: Why it matters. J Bone Joint Surg Am. 2007;89:418–426.
2Zerhouni EA. Translational and clinical science—Time for a new vision. N Engl J Med. 2005;353:1621–1623.
3Arbit HM, Paller MS. A program to provide regulatory support for investigator-initiated clinical research. Acad Med. 2006;81:146.
4de Melo-Martín I, Palmer LI, Fins JJ. Viewpoint: Developing a research ethics consultation service to foster responsive and responsible clinical research. Acad Med. 2007;82:900–904.
5DiLaura R, Turisco F, McGrew C, Reel S, Glaser J, Crowley WF Jr. Use of informatics and information technologies in the clinical research enterprise within US academic medical centers: Progress and challenges from 2005 to 2007. J Investig Med. 2008;56:770–779.
Appendix 1: Table of V-CAP interview questions.