Innovation Reports

A Tiered Quality Assurance Review Process for Clinical Data Management Standard Operating Procedures in an Academic Health Center

Ittenbach, Richard F., PhD; Baker, Cynthia L.; Corsmo, Jeremy J.

doi: 10.1097/ACM.0000000000000225

Abstract

Problem

In pharmaceutical research, standard operating procedures (SOPs) have long been considered the principal mechanism through which safe, effective clinical trials can be conducted and reported to the scientific community.^1 Enactment of the Food and Drug Administration (FDA) Safety and Innovation Act of 2012^2 has empowered the FDA to offer guidance and enforce standards for FDA submissions involving the development of drugs, biologics, and devices. Implicit in this legislation is the recognition of the need for improved data-related processes and standards for all FDA-regulated research. This increases the likelihood that sponsors outside the pharmaceutical industry (e.g., Centers for Disease Control and Prevention, National Institutes of Health, National Science Foundation) will adopt the FDA’s standards for data acquisition and review. SOPs can be a tremendous asset in instituting minimum standards of data quality across the entire spectrum of research studies.^1

At academic health centers such as ours, which already operate with SOPs related to clinical care, creating SOPs for clinical data management (CDM) may be a logical next step in an institution’s ongoing effort to implement a reliable quality assurance program. However, those academic health centers for which SOPs represent unknown territory may need guidance.^3–5 Care must be taken to develop new SOPs in a rigorous, professionally acceptable, and institutionally appropriate manner. The process should take into account several layers of involvement, expertise, and buy-in from the scientific and academic communities.^3,5

As the nature and scope of research operations at the Cincinnati Children’s Hospital Medical Center (CCHMC) expanded, so too did the degree of sophistication among our data management staff. Few divisions, however, had researchers with well-defined working knowledge of good clinical data management practices (GCDMP),^5 which resulted in a high degree of variability in CDM operations enterprise-wide. The CCHMC set out to develop SOPs that would raise the level of rigor of CDM operations in accord with GCDMP and reduce unintended variability. Here, we describe how the CCHMC used an innovative, tiered quality assurance review process involving teams of internal and external data management professionals to develop, review, approve, and implement CDM SOPs.

Approach

Developing a verifiable, tiered review process

Quality assurance principles are assumed to be a part of all clinical and research operations at the CCHMC. Before we could use SOPs to facilitate the collection and dissemination of quality research data, we needed to make certain that they were developed using a rigorous quality assurance process. Consequently, in December 2011 CCHMC leadership established and empowered an executive steering committee to oversee efforts aimed at developing CDM capacity and expertise as a part of the 2015 strategic plan. It was imperative that the SOPs be in place and ready for implementation by fall 2012.

The executive steering committee then prioritized SOP development as an immediate need and appointed an SOP oversight subcommittee, which was charged with designing and implementing a strategy to develop, approve, and implement a series of CDM SOPs enterprise-wide. This subcommittee identified the priority areas that should be addressed in the SOPs and designed a quality assurance review process with three tiers of review panels (see Figure 1): (1) the SOP development team; (2) the faculty review panel, composed of active investigators and senior administrators invested in clinical research; and (3) an external advisory panel (EAP) of established national experts.

Figure 1: Tiered quality assurance review process for clinical data management standard operating procedures (SOPs), Cincinnati Children’s Hospital Medical Center.

Roles of the three review panels

SOP development team.

This team consisted of 16 clinical data managers and technical experts from across the CCHMC, representing 9 clinical/research divisions. It was led by a certified clinical data manager (C.L.B.) who was also a member of the SOP oversight subcommittee. To address all priority areas identified by the oversight subcommittee, the development team determined that they would create an initial set of 12 SOPs and a data management plan template. The SOPs were fairly traditional in nature and consistent with CDM SOPs used in large pharmaceutical organizations and contract research organizations (e.g., archiving study data, database audits; see List 1).

List 1 Standard Operating Procedures (SOPs) for Clinical Data Management, Cincinnati Children’s Hospital Medical Center^a

Archiving study data

Case report form: Creation, approval and release

Case report form: Tracking, storage, and archival

Database audit

Database design and setup^b

Database lock

Data discrepancies^b

Data entry processes^b

Data exports

Data management plans

Handling external electronic data

Medical coding

^a The drafts of these 12 SOPs plus a clinical data management plan template were reviewed using a tiered quality assurance review process (outlined in Figure 1).

^b Work instructions were written to accompany this SOP, as requested by the external advisory panel in its second round of reviews.

Faculty review panel.

Review of the proposed SOPs by a panel of eight senior faculty members and senior administrators provided the faculty perspective on new data management standards and encouraged buy-in and adoption by senior investigators. This panel, which was chaired by the senior director of the Office of Research Compliance and Regulatory Affairs (J.J.C.), included faculty from five clinical divisions who were actively engaged in clinical research as well as legal counsel for the research enterprise. The faculty review panel served three primary purposes: It identified discipline-specific concerns with the SOPs, validated the importance of CDM SOPs, and provided input on the strategy for implementing the SOPs. The faculty review panel also highlighted several important benefits associated with CDM SOPs, including more consistent, rigorous, efficient, and easily reconciled data-related operations.

External advisory panel (EAP).

Involvement of the EAP was viewed as crucial to minimizing the impact of individual bias and unit-specific needs on the CCHMC’s long-term goals. A team of three highly experienced CDM professionals with extensive experience in SOP development and implementation was assembled to make certain that the SOPs would be consistent with national best practice standards. Requirements for inclusion on the EAP included a minimum of 10 years of active CDM experience in both pharmaceutical and academic settings, regulatory and/or auditing experience, and active and established national leadership in the field of CDM. The CCHMC’s expectations of the EAP members were as follows:

  1. Participation in an introductory telephone conference with the chair of the SOP oversight subcommittee and other EAP members for orientation to the EAP’s mission, tasks, and expected timelines
  2. Completion of formative evaluation of the first round of draft SOPs within 7 to 10 days following receipt
  3. Participation in an interim telephone conference with the chair of the SOP oversight subcommittee and other EAP members to identify themes and subthemes across all external reviewers’ feedback and to develop a mechanism for addressing content gaps
  4. Completion of a summative evaluation of the revised SOPs within 7 to 10 days following receipt, for final review and approval by the executive steering committee, along with recommendations for implementation
  5. Participation in a final telephone conference, with open invitations to all SOP participants from all three panels, to officially close the process and clarify concerns relative to any outstanding issues

In addition to providing written comments on each SOP, EAP members rated the SOPs individually and collectively, using the following 5-point Likert-type scale: very undeveloped (1), somewhat developed but with substantive weaknesses (2), addresses basic attributes across all relevant areas (3), relatively well developed with only one or two weaknesses (4), and exceptionally strong across all relevant areas (5). They also rated the general quality of the SOPs and the composition, structure, and naming convention. Finally, they were asked to identify any additional SOPs that needed to be developed. They were encouraged to provide comments to support their ratings.

Outcomes

The EAP members’ written comments were summarized, and median scores were calculated for each SOP. The decision was made by the SOP development team, and endorsed by the oversight subcommittee, to focus subsequent revisions and development efforts on those SOPs with a median score of 2 (somewhat developed) or lower. This evaluation approach was used for both rounds of review by the EAP.
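The decision rule described above can be sketched as a short script. The SOP names and individual reviewer ratings below are hypothetical placeholders (the report does not publish the raw scores), and `REVISION_THRESHOLD` is our own label for the median-of-2 cutoff:

```python
from statistics import median

# Hypothetical ratings from the three EAP reviewers on the 5-point
# Likert-type scale (1 = very undeveloped ... 5 = exceptionally strong).
ratings = {
    "Archiving study data": [4, 3, 4],
    "Database audit": [2, 1, 2],
    "Data entry processes": [3, 2, 2],
}

# SOPs with a median score of 2 ("somewhat developed") or lower are
# targeted for refinement and further development.
REVISION_THRESHOLD = 2

needs_revision = {
    sop: median(scores)
    for sop, scores in ratings.items()
    if median(scores) <= REVISION_THRESHOLD
}

print(needs_revision)
```

With these illustrative numbers, "Database audit" and "Data entry processes" (both with a median of 2) would be flagged for revision, while "Archiving study data" (median of 4) would not.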

As part of this process, 6 SOPs were targeted for refinement and further development, but none of the 12 proposed SOPs were dropped from further consideration, and no additional SOPs were recommended by the EAP. Once the SOPs were modified, the SOP development team shared them with the faculty review panel for their input and evaluation.

Following the second (and final) round of EAP review, the 12 finished SOPs and the data management plan template were presented by the chair of the executive steering committee (R.F.I.) to the institutional administrators and the research committee of the CCHMC Board of Trustees along with a plan for rolling them out to the institution. The SOPs and rollout plan were approved in late August 2012 by the chair of the CCHMC Research Foundation; they were then reviewed and signed off by general counsel in early October 2012.

The entire process took approximately 10 months. The first 2 months (December 2011–January 2012) were devoted to committee formation, planning, and contracting with the external advisors. Creating, reviewing, and evaluating the 12 SOPs and data management plan template took approximately 8 months (December 2011–July 2012); part of this period ran concurrently with the planning and contracting phase. Once the SOPs were finished, it took just over 2 months for completion of the institutional review and approval process (August–early October 2012).

Implications for practice

The CCHMC’s dedication to providing high-quality, valid, and reliable data for all types of clinical research is demonstrated here in the effort invested in the innovative, multitiered process used to develop and review CDM SOPs. Once these SOPs are fully implemented across the CCHMC, they will establish a foundation for data quality expectations and provide guidance to all researchers enterprise-wide. Many academic health centers today could benefit from adopting a similar quality assurance review process as they develop their own CDM SOPs. These SOPs should address the issues that arise when multiple studies attempt to collect the same data, often from the same subjects, using minimally different forms and protocols; the resulting redundancy of effort delays science and opportunities for improved care. Well-designed SOPs can allow for more thoughtful data management, including more efficient study protocols, faster turnaround on data acquisition and processing, increased opportunities for larger combined databases, and quicker dissemination of results.

Despite our relatively smooth process, challenges emerged that had to be dealt with as they arose. Involving such a large number of researchers (nearly 30 overall) resulted in a wide range of opinions about what was appropriate and necessary. SOP development team meetings were at times demanding, but they allowed for discussions of how procedures could be implemented, maintained, and accepted by research staff. Operating under strict timelines meant that there was limited time to make large-scale revisions. However, the need to have standards, even with recognized weaknesses, in place sooner rather than later outweighed the need for “perfect” SOPs. Everyone involved in the process, from the SOP development team to the Board of Trustees, understood that the SOPs would need to be implemented and adhered to for a period of time, to test them and identify areas in need of further development. All review comments and discussion notes were archived as a resource for use later, when the SOPs will undergo reevaluation following implementation (described below).

Implementation and quality assurance strategy

The SOP oversight subcommittee strongly believed that a phase-in process would be critical to success. As such, the following staggered, risk-based approach to SOP implementation was developed:

  • Phase 1: Data Management Center staff (compliance date of January 2013)
  • Phase 2: All FDA-regulated multisite studies and all federally funded, multisite studies where the CCHMC is considered responsible for CDM (rolled out in spring 2013; six-month adoption period began in July 2013; enforcement began January 1, 2014)
  • Phase 3: All other FDA-regulated studies (rolled out immediately after phase 2; six-month adoption period began in January 2014; planned enforcement date of July 2014)
  • Phase 4: All other investigator-initiated studies (rollout to immediately follow phase 3; six-month adoption period scheduled to begin in July 2014; planned enforcement date of January 2015)

In July 2013, the CCHMC’s existing research compliance auditing and quality assurance program began to include institutional SOPs in the audit process. During each of the phased adoption periods described above, noncompliance with the CDM SOPs will be noted on audit findings for educational/awareness purposes for the research projects included in that phase. After the phase’s enforcement date, a research project’s noncompliance with institutional CDM SOPs will be considered an audit deficiency requiring mandatory follow-up, per institutional policies and procedures.

At the time of this writing, we are on schedule with respect to the implementation phases and enforcement dates noted above. Most important, the Data Management Center has been operating under the CDM SOPs since early 2013. In addition, all FDA-regulated studies working in conjunction with the Data Management Center are operating under the SOPs. Audits of these studies for compliance with the SOPs are proceeding as planned. With respect to non-FDA-regulated, multisite studies, for which these standards are not yet being enforced, adoption of the SOPs has been met with less enthusiasm, in part because of cost constraints and the nonregulated nature of the studies. Efforts are under way to devise modifications that retain the integrity of the SOPs but lessen investigator burden in specific areas for these studies.

Next Steps

All processes, no matter how well intentioned, are only as good as the products that result and the efforts put into their development. As such, our CDM SOPs and data management plan template will need to be revisited and revalidated once research teams have had the opportunity to use them across settings, conditions, and a range of clinical studies. We recommend that others developing SOPs establish a formal evaluation system, similar to the one described here, with multiple tiers of support and verification. Future evaluations at the CCHMC should allow for the inclusion of measures of receptivity and satisfaction with the SOPs and the broader process. Finally, a strategy is being developed to calculate the return on investment of personnel, training, processes, and equipment using econometric modeling; it will be the subject of future research.

Acknowledgments: The authors wish to extend a special note of thanks to the three review panels: the external advisory panel members, who ensured that the SOPs met national standards of best practice; the Cincinnati Children's Hospital Medical Center (CCHMC) faculty review panel, who ensured that efforts were consistent with best scientific practices; and, especially, the SOP development team members—the core team of CCHMC clinical data managers who worked tirelessly to compose, compile, revise, and resubmit draft versions of the SOPs and their supporting documents: Rachel Akers, MPH, CCDM, Lynn Darbie, MS, Daniel Jeffers, MS, CCDM, Eileen King, PhD, Carolyn Powers, and Jason Stock.

References

1. Fong DYT. Data management and quality assurance. Drug Inf J. 2001;35:839–844.
2. U.S. Food and Drug Administration. Fact Sheet: Reauthorization of User Fees for Prescription Drugs Will Ensure a Predictable and Efficient Human Drug Review Program. Washington, DC: U.S. Department of Health and Human Services; 2012. http://www.fda.gov/RegulatoryInformation/Legislation/FederalFoodDrugandCosmeticActFDCAct/SignificantAmendmentstotheFDCAct/FDASIA/ucm311238.htm. Accessed January 23, 2014.
3. Gough J, Hamrell M. Standard operating procedures (SOPs): Why companies must have them, and why they need them. Drug Inf J. 2009;43:69–74.
4. Gough J, Hamrell M. Standard operating procedures (SOPs): How to write them to be effective tools. Drug Inf J. 2010;44:463–468.
5. Society of Clinical Data Management. Measuring data quality. In: Good Clinical Data Management Practices. 2011 ed. Brussels, Belgium: Society of Clinical Data Management; 2011:1–11.
© 2014 by the Association of American Medical Colleges