To achieve the health goals of the 21st century, researchers from multiple disciplines must bridge their differences and together address the challenging problems that face us.
—The Institute of Medicine, 20011
In 2005, the Institute of Medicine and the National Academy of Engineering called for a more robust engagement between medicine and systems engineering, management science, and information science to facilitate more rigorous, efficient, modern methods of research.2 Fundamental to the necessary expansion and interdisciplinary character of the successful future clinical research enterprise will be the development and support of clinical investigators who not only are capable of rigorous hypothesis generation, study design, and data analysis within their own research area but also are proficient at moving beyond it, integrating advances from other disciplines to implement, evaluate, and spread creative interdisciplinary solutions. However, developing a formal infrastructure to provide, manage, and fund this transformed research process across academic disciplines has been challenging for academic health centers.
In 2007, we developed a novel model at Vanderbilt University specifically to address these challenges, based on the premise that timely, focused, expert guidance from faculty representing multiple perspectives and disciplines would improve the quality and potential impact of today’s clinical and translational research. As part of Vanderbilt’s National Institutes of Health (NIH) Clinical and Translational Science Award (CTSA) grant and its program, Translating Discovery into Practice, we invested the resources and organizational structure to implement and evaluate an innovative model of internal research support, which we have called a “Studio.” Herein, we describe the Studio Program as well as the results of our evaluation of its first four years.
Studio Definition and Purpose
The Studios are a series of integrated, dynamic roundtable discussions that bring together relevant research experts from diverse academic disciplines to focus on a specific project or investigation at a specific stage of research. These sessions are intended to refine hypotheses and research questions; to promote the most appropriate and most rigorous study designs and research methods; to ensure the most effective and efficient approaches to study implementation; to examine and consider new study analyses in an effort to maximize both the amount and the rigor of information a project generates; and to facilitate the translation of research findings into publication, practice, and policy.
We have classified Studios into two broad categories for purposes of administrative management. The “bench to bedside” (hereafter referred to as T1) Studio Program captures research and proposed research that involve uncovering pathophysiology and mechanisms of disease, as well as early-phase feasibility, safety, and efficacy trials. These T1 Studios include the type of research typically conducted within the general clinical research center. The “bedside to practice and policy” (hereafter referred to as T2) Studio Program captures research and proposed research that involve clinical and comparative effectiveness research, epidemiology, health services research, health behavior and health education research, implementation science, community-based research, and health policy.
Key personnel are the two Studio directors—one for T1 research (I.B.) and one for T2 research (R.S.D.)—plus Studio managers (T.T.H. and L.R.B.) and Studio moderators (see “Studio types and expert panels” below). The Studio directors (two senior research faculty members) consult with the Studio managers to examine Studio requests to ensure that each request is both appropriate and sufficiently focused for a successful Studio session. Studio directors also work with the Studio managers to identify a Studio moderator who is suitable to lead a discussion on the given topic and to select appropriate faculty experts. Finally, the Studio directors guide program evaluation. The Studio managers have experience in different facets of clinical research (e.g., clinical epidemiology/trials, health services research, community and behavioral health), hold master’s degrees in public health, and devote 50% of their time to Studio work. The Studio managers schedule the Studio sessions, compile the expert review forms and notes from each session into a report with useful feedback for the requesting investigator (hereafter simply “investigator”), implement program evaluation, and use Studio evaluation feedback to continually improve the Studio Program and its processes. The CTSA principal investigator (G.R.B.) also sends letters to department chairs semiannually to recognize repeat participants (experts, moderators) for their contribution (in the form of institutional service), which factors into faculty performance reviews.
Studio Processes and Description
Studios are available, free of charge, to all Vanderbilt and Meharry Medical College investigators, and we put no limit on the number of Studio sessions an investigator or department can request. Meharry Medical College and Vanderbilt University are funded jointly through the CTSA grant and, since 1999, have had a formal alliance to support and promote collaborative research. List 1 provides a summary of the Studio process.
Investigators who believe that their research might benefit from rigorous, interdisciplinary review may request a Studio on a voluntary basis at any stage of their project. The CTSA Scientific Review Committee, which allocates pilot grant resources, may also refer investigators for a Studio. Either way, all investigators initiate the process by completing an online Studio request form that takes less than 20 minutes to complete. This form, built into StarBRITE,3 the Vanderbilt one-stop, Web-based research portal, requires investigators to provide a brief summary of their project as well as the specific questions that they would like the experts to help answer. Within one to three business days of submission, one of the Studio managers contacts the investigator to start the scheduling process. The Studio session is scheduled for a date approximately three to six weeks after the request, depending both on the availability of the investigator and panel members and on the urgency of the need. Although the Studio managers do not reject applications, they may ask the investigator to defer the Studio until he or she is more prepared.
Studio types and expert panels
Investigators select one of seven Studio types: hypothesis generation, study design, grant review, implementation, analysis and interpretation, manuscript review, or translation (Table 1). Investigators are also asked to suggest faculty experts either by name or by specific area of expertise (e.g., a biostatistician with expertise in propensity score analysis). The Studio directors (working with one of the Studio managers) identify, on the basis of their personal knowledge of local expertise, additional experts who are appropriate for the stage and focus of the research. They also conduct a PubMed search of Vanderbilt investigators and/or search an NIH database listing the 553 Vanderbilt faculty members who have served on one or more NIH study sections (i.e., grant review committees) in the past 10 years. Members of a core team of senior faculty associated with the Studio Program (CTSA leadership) also serve as experts.
Most experts are Vanderbilt or Meharry faculty members, but occasionally external experts participate, usually by phone. Studio managers consider previous expert participation to avoid overtaxing any individual faculty member. Junior faculty members are purposely invited to participate as experts for the educational or professional development value of the experience. To serve as moderator, the Studio manager recruits a senior investigator who has both experience with the Studio process and expertise in the type of research being discussed. New moderators receive specific instructions the first time they guide a session. The Studio moderator may also assist in selecting experts, and he or she leads the session. Every Studio’s expert panel includes at least one biostatistician, assigned by the chair of the Department of Biostatistics. Also, a member of Vanderbilt’s ethics faculty reviews every Studio request to determine whether the study may benefit from ethics input.
Before each Studio session, experts receive via e-mail all documentation relevant to the research. The investigator who requests the Studio provides these documents, including any specific questions, concerns, or areas of focus he or she would like addressed.
Studio managers record the sessions, which are scheduled for 90 minutes. To ensure efficiency within each session and consistency across the series of Studios, we developed a Studio moderator checklist (see Supplemental Digital Form 1, http://links.lww.com/ACADMED/A94). Each Studio session begins with a brief introduction of the participants and their roles/areas of expertise. Next, the moderator provides an overview of the Studio’s purpose and reviews the agenda for the session. Then (with the understanding that the assembled experts have read the relevant documents before the session), the investigator provides a 15-minute summary of the project, including the specific questions, concerns, or areas of focus that he or she would like to address.
The moderator guides the discussion that follows, allowing each expert to provide feedback and ask questions. If appropriate, the moderator incorporates ethics and compliance with regulatory requirements into the discussion. The moderator’s goal is to guide the session so that it maintains focus, stays solution-oriented, and remains nonthreatening. The aim is to create a safe space for interdisciplinary discussion in a context-free zone that allows for focused, uninterrupted thinking. We define “context-free zone” as a neutral environment, away from one’s office, laboratory, department, and home, that allows for more creative thinking. Although Studio sessions are not open to the community, interested parties may attend and observe, provided the investigator agrees. At the conclusion of the session, the moderator asks for final comments and summaries and attempts to resolve any conflicting advice. Before leaving the room, experts are expected to complete the “Vanderbilt Institute for Clinical and Translational Research (VICTR) Expert’s Comments Form,” which requires them to provide a brief, free-text evaluation of the strengths and weaknesses of the Studio research (see Supplemental Digital Form 2, http://links.lww.com/ACADMED/A95).
After each Studio session, the Studio manager compiles the recommendations of the experts and, within a few days, provides the investigator with a transcript and a summary of the session.
Studio Costs and Funding
Operating costs for the Studio Program result from three basic expenses: the Studio managers’ salaries, the honorarium for the experts and moderator, and, when appropriate, meals for participants. To demonstrate that we value their time, we offer all moderators and experts (regardless of their experience or expertise) $150; experts accept this honorarium when appropriate, within compliance policies for effort reporting. Participants receive meals only when Studio sessions occur at breakfast or lunch time. The NIH CTSA grant serves as the primary funding mechanism of the Studio Program. Institutional funds cover grant review Studios and Studios that are not permitted under the CTSA guidelines (e.g., animal studies). The CTSA grant and the institution bear the entire cost of the Studio Program.
To illustrate the annual costs for the program, we examined the expenses for 2009 when we conducted 55 Studios. The total cost that year was approximately $85,025. Thus, each Studio session cost approximately $1,546 (Table 2). During the first four years of the program, 44% of experts (362/822) have received payments, totaling $41,700. Given the value of a successful investigator and project, these costs are minimal.
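The per-session figure follows directly from the aggregate numbers above; a back-of-the-envelope sketch (using only the totals reported in the text, not the Table 2 line items):

```python
# Back-of-the-envelope check of the Studio cost figures reported in the text.
# Only aggregate numbers are used; the line-item breakdown appears in Table 2.
total_cost_2009 = 85_025   # approximate total program cost for 2009, USD
sessions_2009 = 55         # Studio sessions conducted in 2009

cost_per_session = total_cost_2009 / sessions_2009
print(f"~${cost_per_session:,.0f} per session")   # ~$1,546

# Honoraria over the first four years, also from the text.
paid_experts, total_experts = 362, 822
print(f"{paid_experts / total_experts:.0%} of experts received payments")  # 44%
```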
Studio Promotion and Incorporation Into Other Programs
We advertise and promote the Studio Program at Vanderbilt University and Meharry Medical College through various means: e-mail announcements, town hall educational sessions, research-skills workshops, and departmental overviews of Vanderbilt’s CTSA program. Further, as mentioned, the CTSA Scientific Review Committee refers investigators to Studios. Lastly, faculty experienced with the Studio Program regularly encourage investigators to bring a project to the Studio Program.
In 2008, in an effort to support master of science in clinical investigation (MSCI) trainees as they conducted their research projects, we required the Studio process for first-year MSCI trainees. Each new trainee was automatically scheduled for a Studio session in the first few months of his or her program. On the basis of positive feedback from the trainees, in 2010 we incorporated the Studios into the curriculum for second-year MSCI trainees as well. We are now developing plans to incorporate the Studio Program into the Emphasis Research Program (i.e., a program of required, mentored longitudinal research)4 for the medical students.
Studio Program Evaluation and Results
In an effort to continually improve the Studio Program, we evaluate each Studio session. The Vanderbilt University institutional review board has deemed this evaluation exempt research.
We ask investigators, experts, and moderators to complete a survey about their perceptions of the value of the Studio process after each Studio session (see Supplemental Digital Form 3, http://links.lww.com/ACADMED/A96, and Supplemental Digital Form 4, http://links.lww.com/ACADMED/A97). These evaluations are not anonymous to the Studio managers because an electronic linkage to the appropriate Studio session is required for outcomes analysis. In January 2011, the Studio managers assessed the outcomes of the first four years of the Studio Program. They compiled data on usage, data from the investigator and expert surveys, and data on the number of CTSA grants awarded, using REDCap electronic data capture tools hosted at Vanderbilt.5
The results presented here (and in “Lessons Learned” below) emanate from this January 2011 evaluation as well as from the periodic discussions that Studio leaders convene to gather feedback and improve the program.
Studio Program usage
We hosted a total of 157 Studio sessions during the first four years: 11 in 2007, 44 in 2008, 55 in 2009, and 47 in 2010 (Table 3). Among the seven Studio types, study design was by far the one that investigators most often requested (n = 90). Of the 157 Studio sessions, 121 were for T1 research, and 36 were for T2 research. Although assistant professors (n = 43) and fellows (n = 36) requested the most Studios, the program supported the full range of medical researchers from medical students to full professors and department chairs (Table 3). Thirteen investigators have requested subsequent Studios: 11 requested a second, and 2 have requested three sessions.
The number of Studio sessions that each expert participated in ranged from 1 to 29. Approximately 30% of the experts participated in only 1 Studio session; nine experts participated in 10 or more Studio sessions. Except for reasons of travel or a perceived subject matter mismatch, no expert or moderator has thus far declined to participate.
Time from request to session
The median time from request to Studio session was 41 days (95% bootstrap confidence interval: 35–44), with a range of 6 to 140 days; the mode was 28 days, and the interquartile range was 28 to 58 days (Figure 1). The period between request and session includes preparation time requested by investigators. MSCI program trainees often request that their Studio sessions occur several months in the future to align better with their training schedules. The median time from request to Studio session for the non-MSCI investigators was 39.0 days (range: 6–134) versus 50.5 days (range: 15–140) for MSCI investigators (P < .001). Nearly all of the investigators who completed their evaluation (98%; 117/120) agreed that the scheduling/communication for their Studio session was handled in a timely and efficient manner (Table 4).
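The article does not specify which bootstrap variant produced the 35 to 44 day interval; as an illustrative sketch only, a basic percentile bootstrap for a median's 95% confidence interval can be computed as follows. The wait times below are synthetic stand-ins, not the study data.

```python
import random
import statistics

def bootstrap_median_ci(data, n_resamples=10_000, alpha=0.05, seed=0):
    """Percentile-bootstrap 95% CI for the median.

    Illustrative sketch only: the article does not state which bootstrap
    procedure was used for its reported interval.
    """
    rng = random.Random(seed)
    # Resample with replacement, record the median of each resample.
    medians = sorted(
        statistics.median(rng.choices(data, k=len(data)))
        for _ in range(n_resamples)
    )
    lo = medians[int(n_resamples * (alpha / 2))]
    hi = medians[int(n_resamples * (1 - alpha / 2)) - 1]
    return lo, hi

# Synthetic request-to-session wait times (days) -- NOT the study data.
rng = random.Random(1)
waits = [rng.randint(6, 140) for _ in range(157)]

low, high = bootstrap_median_ci(waits)
print(f"median = {statistics.median(waits)} days, 95% CI ~ ({low}, {high})")
```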
Satisfaction with the Studio Program
The response rate for our survey was 78% (122/157) for investigators and 49% (406/822) for experts and moderators. Both groups perceived the Studios as a valuable addition to Vanderbilt’s clinical research infrastructure (Table 4). Investigators were almost unanimously satisfied with the Studio sessions. Of those who responded, all but one (121/122; 99%) agreed that the Studio Program improved the quality of their science, and all reported that they would recommend the Studio Program to their colleagues. Similarly, 98% of the experts who contributed to the Studio Program (398/406) responded in their evaluations that the process was worth their time. Experts reported anecdotally that they enjoyed participating in the intellectual debates about the formulation of research questions and the design, implementation, and analysis of specific research projects.
As mentioned, Vanderbilt’s CTSA Scientific Review Committee referred investigators with pilot research proposals that were deemed not yet ready for funding to the Studio Program. Studio sessions proved effective: on subsequent review, pilot funding was awarded for nearly three-fourths of these proposals; the remaining proposals were sufficiently revised through the process that the investigators refocused their efforts in a different direction.
Evaluation limitations and conclusions
Our estimation of the Studio Program’s success could be biased because those who were unsatisfied with the program might have failed (given that feedback was not completely anonymous) to complete the evaluation forms or may have been unwilling to openly criticize the program. Also, we have neither a control group nor—to date—any long-term end points; however, the process improvements reported are likely to lead to long-term success.
Through our evaluation of the first four years of the VICTR Studio Program, we found that investigators have been highly satisfied with the Studio Program and have reported that the sessions have improved the quality of their science. We also found that the vast majority of faculty members selected to serve as experts reported that participating in the Studios was worth their time and that they believed that their input improved the quality of the science. These are noteworthy findings because they suggest that creating an effective, widely accepted internal research support system is possible.
One of the goals of the academic health centers, as outlined by the CTSA program, is to remove barriers for clinical and translational researchers.6 These barriers generally fall into three categories: the research workforce, research operations, and organizational silos.7 The Studio Program offers universities and their investigators a means of overcoming each of these barriers.
The Studio Program constitutes a form of faculty development for investigators. Studio sessions provide valuable and needed support for both junior and senior investigators; the sessions help investigators to effectively develop high-quality clinical research and, in turn, succeed in academia. In particular, we have received positive feedback from the MSCI program trainees, who now enthusiastically participate in the Studio process as a routine part of their research projects. The Studios therefore facilitate building the research workforce.
The Studio Program has also contributed to improving research operations, most significantly by bringing together study investigators and faculty with expertise in such areas as study design, data acquisition (including effective use of existing databases), measurement, study implementation, data analysis, informatics, regulatory affairs, and ethics. Rich discussion on these topics during Studio sessions informs the research under consideration in an efficient and effective way.
Although investigators who bring their research to Studios have mentors and receive feedback from within their laboratory or division, the Studio sessions can provide deeper and broader expertise and further improve their projects. Studio leaders believe that it is vital for the experts in the Studios to actually hear one another’s feedback, rather than forcing the investigator to relay one expert’s feedback to another. Without the Studio Program, investigators would face an endless cycle of conflicting advice, in which resolving one expert’s concern only raises a conflict with another’s. When together, experts and investigators can arrive at a common solution, one that evolves from their interdisciplinary interactions and that they might not have considered otherwise. Further, bringing many experts together prevents junior investigators from becoming caught in a maelstrom of mentoring.
Lastly, Studios also serve to break down the third barrier—organizational silos. As Califf et al8 suggest, silos continue to be a major challenge: “The siloed nature of academic institutions can render fundamental communication among researchers difficult.” Studios have routinely brought together researchers with expertise in clinical and translational sciences such as biostatistics, education, engineering, epidemiology, and ethics. The Studio Program has also brought together professionals from the fields of clinical health (e.g., medicine, nursing, and allied health), informatics, management science, public health, and the social sciences (including anthropology, economics, psychology, and sociology). Finally, Studios have brought together clinical investigators and scientists with expertise in basic laboratory science such as molecular physiology and biophysics, cell and developmental biology, microbiology, and pharmacology. Anecdotal evidence indicates that new synergistic collaborations have organically emerged from the Studio Program.
The Studios reduce not only intrainstitutional barriers but also interinstitutional barriers. Specifically, the Studio Program has strengthened the Meharry–Vanderbilt partnership by facilitating effective collaborations and enhancing the exchange of information about institutional capabilities.
Although the Studio Program has been successful, Studio managers note three specific challenges to the implementation of an effective Studio session. First, investigators need to be properly prepared for the session. On rare occasions, inadequate preparation has reduced the effectiveness of a session. Second, some investigators have failed to keep their presentations brief, limiting the time for feedback. Third, on one occasion, too many (n = 10) experts participated in a session and the pre-review material was underdeveloped, which reduced the value of the session. The ideal number of experts depends on the topic, the investigator, and the moderator, but we have found that five experts per session generally provide the right balance.
We also acknowledge that this Studio Program may not be as successful at other academic health centers that have a different academic culture or that do not have a CTSA grant or other source of support. Vanderbilt has a collaborative, generous, supportive, and open culture of inquiry in which the type of effort and interactions that are generated by the Studio Program are embraced.
Finally, the Studio process as described is an ongoing, evolving program—not a static entity. We use ongoing feedback for continual improvement. However, the basic elements (see below) remain constant and serve as the guiding principles of the program.
Although it is difficult to measure the impact of the Studios without a control group, our experiences with the program over its first four years suggest that it has improved the research process and the quality of the science generated through at least four mechanisms. First, bringing a diverse set of experts into a room to brainstorm together about a particular research issue has allowed the Studio Program to function as an incubator, often creating a synergy that generates novel hypotheses and study design features. The moderators play an important role in making sure each expert receives time to provide input and is heard. The success of the Studio Program relies on the ability to keep existing experts returning for, and new experts willing to attend, future Studio sessions.
Second, connecting investigators to collaborators with complementary expertise has established new connections among individuals who might not otherwise have met. Third, the panel of Studio experts often represents more than a century of collective experience in clinical research, which creates a think-tank environment that enhances research quality and ensures appropriate protocol design and scientific rigor. Having an ethicist on the panel often helps investigators improve their research design by addressing ethical issues early in the planning phase, just as including biostatisticians helps guarantee the quality of the quantitative analyses. Finally, the formal Studio process provides a timely mechanism to gather busy experts together to assist an investigator in a focused manner. Given the very busy schedules of such experts, a single investigator acting alone would have a difficult, if not impossible, task scheduling such a discussion in a timely manner without the infrastructure and culture provided by the Studio Program.
In Table 5, we provide guidelines for implementing a Studio Program based on our experience.
In conclusion, with the support of the CTSA grant and institutional funding, Vanderbilt has created a comprehensive, fully funded, sustainable review and research support mechanism that provides robust interdisciplinary engagement among faculty from multiple schools and departments to improve the quality of clinical research. Investigators and participating faculty experts have enthusiastically endorsed the Studio Program.
Acknowledgments: The authors thank David Robertson, MD, Vivian Siegel, PhD, Lynda Lane, MS, RN, Tonya Yarbrough, RN, Li Wang, MS, and Paul Harris, PhD, for their constructive feedback on this manuscript during a Studio session.
Funding/Support: This work was supported in part by Vanderbilt Clinical and Translational Science Award grant 1 UL1 RR024975 from the National Center for Research Resources, National Institutes of Health.
Other disclosures: None.
Ethical approval: The Vanderbilt University institutional review board determined that the evaluations described in this article are exempt research.
1. Pellmar TC, Eisenberg L; Institute of Medicine (US) Committee on Building Bridges in the Brain, Behavior, and Clinical Sciences. Bridging Disciplines in the Brain, Behavioral, and Clinical Sciences. Washington, DC: National Academy Press; 2000. http://books.nap.edu/openbook.php?record_id=9942. Accessed April 26, 2012.
2. Reid PP, Compton WD, Grossman JH, Fanjiang G; Committee on Engineering and the Health Care System, Institute of Medicine and National Academy of Engineering. Building a Better Delivery System: A New Engineering/Health Care Partnership. Washington, DC: National Academies Press; 2005. http://www.nap.edu/catalog.php?record_id=11378. Accessed April 26, 2012.
3. Harris PA, Swafford JA, Edwards TL, et al. StarBRITE: The Vanderbilt University Biomedical Research Integration, Translation and Education portal. J Biomed Inform. 2011;44:655–662.
4. Gotterer GS, O’Day D, Miller BM. The Emphasis Program: A scholarly concentrations program at Vanderbilt University School of Medicine. Acad Med. 2010;85:1717–1724.
5. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377–381.
6. Zerhouni EA, Alving B. Clinical and translational science awards: A framework for a national research agenda. Transl Res. 2006;148:4–5.
7. Heller C, de Melo-Martín I. Clinical and Translational Science Awards: Can they increase the efficiency and speed of clinical and translational research? Acad Med. 2009;84:424–432.
8. Califf RM, Berglund L; Principal Investigators of National Institutes of Health Clinical and Translational Science Awards. Linking scientific discovery and better health for the nation: The first three years of the NIH’s Clinical and Translational Science Awards. Acad Med. 2010;85:457–462.
© 2012 Association of American Medical Colleges