
Creating an Infrastructure for Training in the Responsible Conduct of Research: The University of Pittsburgh's Experience

Barnes, Barbara E. MD, MS; Friedman, Charles P. PhD; Rosenberg, Jerome L. PhD; Russell, Joanne RN, MPPM; Beedle, Ari MBA; Levine, Arthur S. MD

Research Issues at AHCs

In response to public concerns about the consequences of research misconduct, academic institutions have become increasingly cognizant of the need to implement comprehensive, effective training in the responsible conduct of research (RCR) for faculty, staff, students, and external collaborators. The ability to meet this imperative is challenging as universities confront declining financial resources and increasing complexity of the research enterprise.

The authors describe the University of Pittsburgh's design, implementation, and evaluation of a Web-based, institution-wide RCR training program called Research and Practice Fundamentals (RPF). This project, established in 2000, was embedded in the philosophy, organizational structure, and technology developed through the Integrated Advanced Information Management Systems grant from the National Library of Medicine. Utilizing a centralized, comprehensive approach, the RPF system provides an efficient mechanism for deploying content to a large, diverse cohort of learners and supports the needs of research administrators by providing access to information about who has successfully completed the training. During its first 3 years of operation, the RPF served over 17,000 users and issued more than 38,000 training certificates. The 18 modules that are currently available address issues required by regulatory mandates and other content areas important to the research community. RPF users report high levels of satisfaction with content and ease of using the system. Future efforts must explore methods to integrate non-RCR education and training into a centralized, cohesive structure. The University of Pittsburgh's experience with the RPF demonstrates the importance of developing an infrastructure for training that is comprehensive, scalable, reliable, centralized, affordable, and sustainable.

Dr. Barnes is associate professor of medicine and assistant vice chancellor for continuing education in the health sciences, University of Pittsburgh, Pittsburgh, Pennsylvania.

Dr. Friedman is professor of medicine and associate vice chancellor and director of the Center for Biomedical Informatics, University of Pittsburgh, Pittsburgh, Pennsylvania. He is currently on leave as Senior Scholar at the National Library of Medicine.

Dr. Rosenberg is research integrity officer and chair, Conflict of Interest Committee, University of Pittsburgh, Pittsburgh, Pennsylvania.

Ms. Russell is administrator of research education, Office of Clinical Research, University of Pittsburgh, Pittsburgh, Pennsylvania.

Mr. Beedle is systems program manager, Research Conduct and Compliance Office, University of Pittsburgh, Pittsburgh, Pennsylvania.

Dr. Levine is professor of medicine and molecular genetics and biochemistry, senior vice chancellor for the health sciences, and dean, School of Medicine, University of Pittsburgh, Pittsburgh, Pennsylvania.

Correspondence should be addressed to Dr. Barnes, Center for Continuing Education in the Health Sciences, 200 Lothrop Street (overnight mail: 3708 Fifth Avenue), Pittsburgh, PA 15213; telephone: (412) 647-8212; fax: (412) 647-1244; e-mail: barnesbe@upmc.edu.

National publicity of the consequences of research misconduct has placed universities under intense pressure to demonstrate that scientific investigation is not only academically rigorous but also safe for the human participants and animal subjects involved.1 To assure that faculty, staff, and students are knowledgeable of ethical principles and regulatory requirements, research centers have long recognized the need for training programs in the responsible conduct of research (RCR).2,3 For many years, this type of education was carried out through the mentor–trainee model, in which senior investigators inculcated into their own apprentices not only the theory and methods of science but also the broader vision of values and responsibility. Through the last several decades, the viability of this approach has been threatened by the rapid expansion of research enterprises, increasing numbers of young scientists and students, multiple demands on senior researchers, and growing complexity of ethical issues associated with research involving human participants and animal subjects.4 Well-publicized cases of scientific misconduct have created a call to action for the scientific community to address variability in the quality of RCR training both within and across institutions.5

When the National Institutes of Health (NIH) announced in October 2000 a requirement that applicants for funding be able to provide a description of education completed in the protection of human research participants for all key personnel listed in NIH grant proposals,6 it became evident to many academic institutions that RCR training must be systematically deployed to assure consistency of the educational experience and to make information about who has completed the requirement readily available to a variety of stakeholders.7 Unfortunately, it is only in the last few years that reports have begun to appear in the literature describing how academic institutions can realistically implement such programs.8 As research organizations struggle to cope with limited financial resources and increasing demands on faculty and staff time,9 it is necessary to promulgate models that efficiently deliver training to large and often geographically diverse cohorts of individuals.10 With that in mind, we wrote this article to describe the University of Pittsburgh's experience in implementing a comprehensive system to support the RCR training needs of its research community.


Developing a Systematic Approach to Training

The University of Pittsburgh has a long-standing commitment to assuring the responsible conduct of research and, like many other academic institutions, has historically implemented a variety of educational programs to assure compliance with regulatory requirements and professional standards. In 1991, the university's provost issued a directive for each academic unit (e.g., department, division) to develop education in ethical research principles for all faculty, staff, and students engaged in research. This approach had several limitations inherent in decentralized models: content varied with respect to topic and depth, certification requirements were not uniform, and there was no mechanism for tracking compliance across the institution. To address some of these concerns, in 1998 the Office of Research, Health Sciences initiated a quarterly full-day clinical research workshop for faculty, staff, and students, with mixed results. Despite considerable institutional resource commitment, busy researchers found it difficult to commit the time required for live training on a rigid schedule; only 1,600 registrants attended the workshops over the 30-month period from 1998 to 2001. Also, because participation records were maintained in hard copy at the level of each department or division, it was difficult for the Office of Research (OOR, which is responsible for the university's preaward review), the institutional review board (IRB), and the Institutional Animal Care and Use Committee (IACUC) to know exactly who had completed the training. Offices within the university that conducted highly specialized training, such as the Radiation Safety Office and the IACUC, also maintained their own records. In addition, there was no uniform and accessible set of policies delineating who was required to undertake each type of training, how frequently faculty and staff would need to review core information from the training, or which unit within the university was responsible for approving and updating content.

Acknowledging these limitations and considering the implications of the impending NIH requirements, in 2000 the University of Pittsburgh's senior vice chancellor for the health sciences (ASL) and the members of the Research Conduct and Compliance Office (RCCO) embarked on a significantly more proactive, comprehensive, and scalable approach to RCR training, addressing both local needs and existing and anticipated external requirements. It was recognized that any training program must

  • be easily accessible to a geographically distributed group of individuals involved in research, including faculty, students, and staff, as well as collaborators at other institutions;
  • have mechanisms for readily adding and updating content to accommodate new policies and regulations; and
  • provide data on those individuals successfully completing the training to a variety of institutional entities (such as the OOR, IRB, IACUC, Conflict of Interest Office, Office of Technology Management, other specialized offices, and department research administrators).

It seemed clear that the only practical way to address these requirements was to exploit the almost universal accessibility of the Internet and the increasingly sophisticated Internet-based software to support education.11


Applying IAIMS Principles for Development and Implementation

When deciding to use technology to address institutional needs, it is common to buy or build applications on a problem-by-problem basis. This “silo” approach often results in effective solutions for each issue, but also requires users to have separate user-IDs and passwords for each system, to learn to use systems with very different interfaces, and to enter demographic information needed by each system even though much of the same data already exist in other systems. From an institutional perspective, silo systems yield no economy of scale. Each application is a completely new project, with upgrades and modifications to one system yielding no benefits for other systems.12

At the University of Pittsburgh, the imperative for enhanced RCR training coincided with an Integrated Advanced Information Management Systems (IAIMS) implementation initiative, supported by a NLM grant awarded to our university in 1998. According to IAIMS principles, technology most successfully addresses the information needs of health professionals and information management needs of organizations when deployed as an integrated architecture stemming from a systematic strategic planning process.13 Within the IAIMS paradigm, institutional priorities are addressed in a deliberate manner, bringing together diverse stakeholders and taking into consideration future as well as immediate needs.14

The University of Pittsburgh was able to use the IAIMS philosophy, organizational structure, and technical approaches to address the new RCR training requirements. Because many features of IAIMS were already in place, the university considered this approach well suited to putting an improved RCR training solution in place within a relatively short time frame. Although the immediate need in 2000 was to deploy an RCR core curriculum in response to the federal mandate, institutional leaders recognized that faculty and staff needed mandatory and supplemental training on a wide variety of issues related to the conduct of research. As a result, we dismissed the option of purchasing or building a stand-alone educational activity solely to meet the NIH mandate. Instead, we used an IAIMS-directed approach to develop a comprehensive solution integrated over time with other institutional information systems and procedures.

The organizational framework for this project, which came to be known as Research and Practice Fundamentals (RPF), was initially led by the principal investigator of the IAIMS grant (CPF). The RPF team consisted of an operations committee (“core group”) and three subcommittees. The core group was formed in January 2000 in anticipation of the announcement of the NIH requirements mentioned earlier and was composed of approximately 25 major stakeholders from the schools of the health sciences and general university administration, representing the research community, educators, and technical experts. The core group met monthly to provide oversight and management of the project and generate policies guiding system implementation and learner requirements. Subcommittees addressing content development, certification, and technical issues performed the day-to-day work of developing the RPF.

Although the development of an educational intervention on RCR may have been more expediently accomplished under the auspices of a single institutional unit, several factors drove us to implement a more inclusive and department-transcending organizational structure. First, and perhaps foremost, the IAIMS culture and framework established in the preceding years enabled key decisionmakers who already understood the value of integrated information technology (IT) architecture and collaborative, strategic policy development to quickly develop a shared vision of how the impending mandate might be immediately addressed and how this process could be transformed into a comprehensive training structure. Inclusion in the core group of key decisionmakers from various entities within the university also provided a broad-based perspective on policy issues, such as who would need to take the training and how the IRB and other university offices would enforce the requirements.


Defining Requirements of a Comprehensive Educational System

Before assembling content or selecting technical platforms, the core group explored the needs of four key constituencies:

  • Learners: Although it was difficult to determine the exact size of the cohort to be trained, the core group estimated that at least 3,000 individuals would access the system within the first few months of operation. It was recognized that these adult learners would be “task-oriented,” seeking to efficiently access content organized in logical sequences and manageable units.15 Learning modules would have to be accessible to users who were not employees or registered students, and consideration would have to be given to those who would be using the system through dial-up and other low bandwidth connections.
  • Content experts: Regardless of whether the educational materials themselves were developed internally or licensed from external sources, there would be a need to use local experts for the purposes of review and adaptation to university policies and procedures. In order to engage busy faculty members, institutional leaders would have to identify this project as an important strategic imperative, valuing the time and effort contributed by faculty. Also, appropriate staffing would need to be provided for editing and formatting of content to minimize faculty time commitment.
  • Administration and policy makers: Senior administrators in the health sciences supported this project with the understanding that it would evolve into a comprehensive system to support a variety of educational and training needs within the research community. It was expected that the mature platform would be cost-effective in terms of ongoing technical and staff support, able to support numerous modules on various topics, and capable of being upgraded to provide additional functionality as needed.
  • Research administrators and compliance officers: Analysis of our experience with prior research integrity training programs demonstrated that records of completion were needed by multiple stakeholders, including department administrators, research project coordinators, and the OOR, IRB, and IACUC offices. This information was traditionally maintained in individual departments and divisions, requiring staff from the OOR, IRB, IACUC, and other administrative units to contact multiple locations to find the proper information, leading to considerable human resource costs and delays in proposal processing and protocol approvals. The new system would have to include a single “certification database” for the entire institution, containing information about individuals who successfully completed the program and accessible by authorized individuals in diverse locations, extending to international locations such as a specialty facility in Palermo, Italy, that is staffed by medical school faculty. It was estimated that at least 100 individuals might require permissions to access this database.

Development of the Educational System

Synthesis of these diverse needs shaped our general concept of the RPF. Due to the aggressive time frame driven by the impending NIH mandate, the RPF team drew as much as possible on existing organizational resources. Development of the technical platform was accelerated through the use of a learning management system, WebCT©, already licensed by the University of Pittsburgh for another IAIMS activity that supported the education of medical students training in community locations. This platform was initially chosen based on ease of customization and integration with other applications. The existing WebCT© system had many of the components required for RCR training but required adaptations to make the testing function more rigorous; permit flexibility in content format; support a large, Web-accessible certification database; and offer users a single log-on to multiple applications.

Consistent with adult learning models, the RPF team decided that the system would feature Web-based content in easily reviewable, text-based, and modular formats that could be readily accessed by learners using widely deployed Web-browser software requiring only a dial-up Internet connection. Subject matter for the first two modules, addressing research integrity and human subjects research, was based on prior RCR training curricula used throughout the university, with the addition of updates reflecting current institutional policies and federal requirements. Members of the content subcommittee, as well as other university officials and representatives of the potential user population, reviewed draft content for completeness and appropriateness. In the interest of rapid deployment, we emphasized text over graphics, although we included tables and figures when needed for presentation and elucidation.

We decided to require members of the community to demonstrate their understanding of the material. Individuals who already felt familiar with the content of a particular module were given one attempt to pass a knowledge test, allowing those achieving a score of 80% or greater to “test out.” Those who chose to review the material or who did not pass this knowledge test were required to pass a series of multiple-choice quizzes with a score of at least 80%. Each quiz covered a specific section of the module but could be completed in any order. Users not passing a quiz could retake it as many times as necessary, with each quiz comprising a new, randomly assigned sample of questions from the relevant bank of questions. Users could immediately review their quiz results, and for each question, we provided a hyperlink to the most relevant instructional material, creating an efficient way for users who had answered incorrectly to understand in depth why their answer was incorrect.
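In outline, the test-out and retake logic described above can be sketched as follows. The 80% threshold and the fresh random sample of questions per attempt are taken from the text; the function names and data structures are illustrative and do not represent the actual WebCT implementation:

```python
import random

PASS_THRESHOLD = 0.80  # minimum passing score required by the RPF

def draw_quiz(question_bank, n_questions, rng=random):
    """Each attempt draws a fresh random sample from the question bank."""
    return rng.sample(question_bank, n_questions)

def score_quiz(quiz, answers):
    """Return the fraction of quiz questions answered correctly."""
    correct = sum(1 for q in quiz if answers.get(q["id"]) == q["answer"])
    return correct / len(quiz)

def passed(score, threshold=PASS_THRESHOLD):
    """Both the one-shot test-out and the per-section quizzes use this cutoff."""
    return score >= threshold
```

Resampling the question bank on every retake is a natural design choice here: it discourages simple memorization of answer keys while still letting users repeat a quiz as many times as necessary.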


Deployment of the Educational System

The senior vice chancellor for the health sciences, in collaboration with the core group, determined that, regardless of participation in previous training, all faculty and students engaged in research would be required to successfully complete the Research Integrity module, and that all those engaged in research involving human participants would also be required to complete Module 2. A memorandum explaining this policy was distributed to the health sciences community on January 31, 2001. The Research Integrity module was released in early February 2001, followed by the Human Subjects Research module in May 2001 and the Laboratory Animal Research module in August 2001.16 Of the 18 modules released to date, 17 were developed by content experts from the University of Pittsburgh; the remaining one (psychosocial research on human participants) was adapted from content licensed from the CITI program hosted at the University of Miami. Four of the modules were developed as part of extramurally funded projects. Continuing medical education (CME) credits as well as continuing education units are offered free of charge to RPF users.


How the system has been used

As demonstrated in Table 1, the RPF has been widely used by faculty, students, and trainees, with a total of 38,241 module certifications having been issued through July 1, 2004. Highest use has been observed for modules required of the majority of the health sciences community, such as Research Integrity and Human Subjects Research. Faculty participation in the HIPAA Privacy Training module has been relatively low because clinicians customarily complete this module through the University of Pittsburgh Medical Center rather than through the RPF system.

Table 1


There are currently more than 200 individuals with varying degrees of access to the certification database, including OOR, RCCO, IRB, and IACUC staff, department and division administrators, and RPF staff. Checking grant applications and IRB protocols for required certifications became a major use of the certification database almost immediately after its deployment. Processes for assuring completion of required training for individuals involved in human participants research are becoming more fully automated through the development of an interface between the RPF certification database and electronic protocol management systems; this interface will require that principal investigators and key personnel complete appropriate training before proposals are processed by the OOR and protocol submissions are made to the IRB.
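The automated gating described above reduces to a simple check: a proposal proceeds only when every key person holds all required certifications. A minimal sketch follows, assuming a lookup from user ID to completed modules; the data model and function names are hypothetical, not the university's actual interface:

```python
def missing_certifications(personnel, required_modules, cert_db):
    """Map each key person to the required modules he or she has not completed.

    cert_db maps a user ID to the set of module names that user has certified.
    """
    gaps = {}
    for person in personnel:
        completed = cert_db.get(person, set())
        outstanding = [m for m in required_modules if m not in completed]
        if outstanding:
            gaps[person] = outstanding
    return gaps

def can_submit(personnel, required_modules, cert_db):
    """A proposal may proceed only when no key person has outstanding modules."""
    return not missing_certifications(personnel, required_modules, cert_db)
```

Centralizing this lookup in one certification database is what replaces the earlier practice of contacting individual departments for hard-copy records.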

As discussed earlier, it was difficult to predict the number of individuals who would use the RPF system during initial deployment in 2001. By November of that year, use of the system was approximately double the anticipated level, leading to the deployment of a much more robust learning management system in 2002. Over time, the RPF evolved into one component of a “family” of health sciences information technologies developed under the auspices of the Pittsburgh IAIMS effort, including a Web-based conflict-of-interest “superform” used by faculty and staff from both the university and the medical center, a faculty research interest database, and a health sciences Web portal. As such, the RPF was gradually integrated into the architecture of this family. For example, the user accounts already in the RPF were incorporated with user accounts contained in other IAIMS applications to create a centralized system known as “HS Connect.” HS Connect became the source of account creation and authentication for all users of the family of applications, including the RPF.


Resources required for initial development and ongoing operation

During the planning and deployment phases, the project benefited from the ability to mobilize resources from across the institution, shortening development times and reducing project costs. The creation of the initial administrative structure drew heavily on the knowledge of IAIMS committee members, who participated on a voluntary basis. Technical staff and administrative support were provided by the Center for Biomedical Informatics, which was home to the IAIMS implementation grant. Start-up costs were further reduced by the use of existing servers and the WebCT learning management system. One additional full-time equivalent staff member was engaged to perform required modifications to the learning management system and develop the certification database. An undergraduate student was employed on a part-time basis to assist with Web programming. Because this project was strongly endorsed by institutional leadership, faculty and staff viewed participation in the RPF as integral to their existing responsibilities, making it difficult to accurately assess the incremental effort expended on the project. The time required for content development was shortened due to reliance on existing training materials. Modules 1 and 2 each required six weeks of part-time effort to convert the content from a working draft to Web-based format. Alpha and beta testing followed, resulting in a six-month period from working draft to the launch of the first two modules.

When the demands on the hardware and software caused by high levels of use required that the technical infrastructure be streamlined and strengthened, the institution invested approximately $35,000 for the purchase of dedicated servers and more powerful learning management software. The full-time equivalent for technical support continued to dedicate 100% effort during this phase.

As the RPF project matured, the processes for maintaining existing modules and for developing new modules became more routine and the project moved to a leaner administrative structure. Responsibility for program operations was transferred from the Center for Biomedical Informatics to the office of the senior vice chancellor for the health sciences, which financially supports the project. The core group has been replaced by a smaller oversight committee consisting of the university's vice chancellor for research conduct and compliance, associate vice chancellor for basic biomedical research, associate vice chancellor for clinical research, assistant vice chancellor for continuing education, and the director of planning and management. This group is responsible for policy development and implementation as well as approval of new modules. A process has been implemented for annual review and updating of all existing modules.

In this steady state, staffing consists of approximately 50% effort of a programmer/database manager as well as a pool of individuals who provide user support for this and a number of other applications. Expert faculty and staff continue to produce content for most modules, with development times being dependent upon the subject matter, ability to draw on existing content, and institutional priority for the module. Time from working draft to module deployment has been significantly reduced and currently requires six weeks to three months of part-time effort, depending on the length of the module.


Implementation Experience and Evaluation

Following the implementation of the first RPF module in February 2001, we collected a range of data enabling us to track the progress of the project and to gauge the reaction of the user community. We programmed the delivery software to maintain comprehensive, confidential log files of the users' RPF activities. We logged each interaction with the RPF with respect to module accessed and date, as well as each test and quiz taken and the user's performance. In addition, two questionnaires addressing learners' perceptions of quality, educational effectiveness, and appropriateness of content were embedded into the educational package. The data from log files and questionnaires enabled us to address several aspects of the project experience. Although the findings reported herein are from data through 2004, our continued observations of the RPF system indicate that these results reflect ongoing trends.


Extent of RPF use and penetration of the target population

As noted earlier, this project was implemented on a very substantial scale. By July 2004, 17,128 individuals had established user accounts for the RPF and 38,234 module certifications had been issued, greatly exceeding our expectations. Table 1 lists modules that have been deployed or are due to be deployed in the near future. For those modules deployed prior to July 2004, the number of users completing each module is broken out by users' organizational status (faculty, staff, fellow, and "other"). Module 1 (Research Integrity), a requirement for the entire health sciences community as well as for all faculty conducting research involving human participants or animals, had the highest number of certifications (11,980), closely followed by Module 2 (Human Subjects Research, 9,757), which was required for all members of the community involved in research involving human participants. Modules 6, 7, and 8 (see Table 1), which were designed to meet the HIPAA privacy requirements, were deployed in February 2003 and have recorded a total of 11,006 certifications.

Determining “penetration,” or the fraction of the target population for each module that was actually certified, was complicated by the uncertain nature of the target populations themselves. As described earlier, the RPF core group, in consultation with senior university officials, identified a target population for each module. In many cases, this definition was based on an individual's current professional activities (“human subjects researcher” or “stem cell researcher”), which can fluctuate over time, rather than formal organizational status (academic rank or job classification), which tends to be more persistent. For modules designed for subsets of individuals within the university not clearly defined by formal institutional categories, the sizes of these target populations were very difficult to estimate. However, for those such as Module 1 that were required for all health sciences researchers, it was feasible for us to more accurately estimate the RPF's penetration. The number of faculty completing RPF modules (the numerator) was determined by those who declared themselves as full-time faculty members in one of the health sciences schools at the time of registration into the RPF system. The size of the total target audience (the denominator) was established by examination of Faculty Research Interests Project (FRIP)17 records containing numbers of faculty members in each school identified as active researchers. These data, portrayed in Table 2 along with Module 1 completion rates, reveal an overall penetration rate for health sciences faculty of 92%. For some schools, the numbers of users who completed Module 1 and self-reported as faculty members exceeded the numbers of faculty members in the FRIP database. In these cases we presumed the penetration rate to approximate 100%. 
It is much more difficult to determine the number of individuals outside of the health sciences conducting research involving human participants or animals, and, as a result, we are uncertain about the penetration rate of the RPF in this population.
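The penetration estimate described above reduces to a ratio of self-reported module completions to the FRIP count of active researchers, presumed to approximate 100% where the numerator exceeds the denominator. A minimal sketch, with an illustrative function name:

```python
def penetration_rate(completions, frip_count):
    """Fraction of a school's active researchers (per FRIP) completing a module.

    Self-reported faculty completions can exceed the FRIP denominator, in which
    case penetration is presumed to approximate 100%, as in Table 2.
    """
    if frip_count <= 0:
        return None  # target population unknown; rate cannot be estimated
    return min(completions / frip_count, 1.0)
```

Returning None when the denominator is unknown mirrors the situation for researchers outside the health sciences, whose target population could not be estimated.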

Table 2



User satisfaction with the RPF

The two questionnaires included in the RPF modules were complementary: the first was specific to the RPF but was not mandatory; the second contained more generic questions but was mandatory for the physician subpopulation seeking CME credit. The former included nine items focused on users' perceived clarity and ease of use of the RPF modules, level of interest in the content presented, and experience with the tests and quizzes. The RPF participants were given the opportunity to complete this questionnaire at the end of each module. As of July 2004, 7,003 RPF questionnaires had been completed. Although this translates into a relatively low return rate (18%), given the overall scale of the project it encompasses the views of a large number of people.

Table 3 reports the RPF questionnaire results for all modules, then separately for Module 1, Module 2, and all other modules combined. Responses were generally very positive, with little variation among different modules. Approximately 90% of respondents found the modules clearly written all or most of the time, and 70% found the material interesting all or most of the time; 70% would recommend the Web site to others. The 30% who would not or were doubtful about recommending it may have preferred a more traditional format for this material or perhaps felt that they should not have been required to complete the modules at all. Less than 10% found the navigation of the Web site difficult. Reported times to complete the modules were in line with the designers' expectations, as were the times to complete quizzes and tests. Over 80% of respondents felt that the quiz and test questions were clear all or most of the time, and 80% found the amount of material in the modules to be “about right.”

Table 3

The second user evaluation, summarized in Table 4, was a five-item form employed by the university's Center for Continuing Education in the Health Sciences that emphasized overall quality, attainment of the user's objectives, freedom from commercial bias, and appropriateness of the instructional modality (in this case, Web-based instruction). As discussed above, the response rate for the continuing education questionnaire was 100% of the users seeking CME credit, since credit was not issued until the questionnaire was completed. As of July 2004, 3,984 of these questionnaires had been completed. Table 4 reports the CME questionnaire results, item by item, for all modules, then separately for Module 1, Module 2, and all other modules combined. These results suggest a positive response to the RPF among this group of users, with the modal response being “above average.” Fluctuations from module to module were small. Over 95% of respondents rated the overall quality, comparison with other CME activities, degree to which their learning objectives were met, and use of the Web-based modality to be good or excellent.

Table 4


Experience with the adult learning model

From log files, we examined the fraction of certified RPF users who attempted initially to obtain certification by taking the module test without first reviewing the content, and the success rate of these attempts. Table 5 reports these results for the four modules in which the “test out” option was offered. The users attempting to certify via test presumably included those who believed they knew enough about the topic to score 80% or better before reviewing the material in detail, as well as those who believed they had little to lose by taking the test. The results were somewhat surprising in that, overall, only 15.4% of users attempted a module test, with only 4.9% ultimately certified on a module through the “test out” mechanism. The success rate of persons taking the module tests was 31.5%. Users of the Conflict of Interest module were more likely to attempt the module test, and less likely to pass it.
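The three percentages are simple ratios over the same log data. As a minimal sketch with hypothetical tallies (not the actual counts behind Table 5), the attempt, test-out certification, and test success rates would be derived as follows:

```python
# Hypothetical log-file tallies for one module -- illustrative only,
# not the actual counts underlying Table 5.
certified_users = 1000      # all users ultimately certified on the module
test_attempts = 154         # users who took the test before reviewing content
test_passes = 49            # attempts that scored 80% or better

attempt_rate = test_attempts / certified_users   # fraction who tried to "test out"
testout_rate = test_passes / certified_users     # fraction certified via "test out"
success_rate = test_passes / test_attempts       # pass rate among attempters
```

Note that the first two rates share the certified-user denominator, while the success rate is conditioned on having attempted the test, which is why it can be far larger than the test-out certification rate.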

Table 5


Accomplishments and Continuing Concerns

Through the use of IAIMS principles and infrastructure, the University of Pittsburgh has been able to efficiently educate its large and diverse research community in the principles and practice of RCR, in addition to streamlining administrative processes for assuring compliance with institutional policies and regulatory guidelines. Among learners, the Web-based format has been well accepted, leading us to believe that researchers now view mandatory compliance activities as a “necessary good” rather than a “necessary evil.” The new format has allowed educators within the compliance offices to divest themselves of responsibility for basic training and management of registration records in order to pursue higher-level programming focused on remediation and enhancement of specific competencies. While we have not performed a formal economic analysis, we also believe that the RPF has been cost-effective for our university, not only in terms of centralization of IT infrastructure and personnel, but also through enhanced productivity for learners, who can complete the modules in a place and at a time of their own choosing, and for administrators, who have ready access to certification data.

Despite our overall enthusiasm for the RPF, there are some limitations and concerns that we continue to address. The ability to develop content using resources solely from within our institution has become an increasing concern. For the most part, we have been fortunate to identify internal experts to draft and update content. However, we have encountered several instances in which our own faculty were either unable or unwilling to author additional modules, and some modules that we had envisioned have not yet been deployed. To address these issues, we are beginning to license content from other institutions. However, even with the use of external content, we recognize the need for our own faculty and staff to assure that modules are up to date and that the information provided is consistent with our own policies and procedures. Ongoing affirmation by senior leadership of the RPF's importance to our university has, so far, encouraged local experts to continue to participate in the project as authors, reviewers, and administrators. We are guardedly optimistic that this level of participation and enthusiasm will continue.

The use of text-based formats has enabled us to quickly deploy and easily edit the modules and has made the system very affordable. We are now realizing that some training, such as that related to animal care, may be enriched with the use of more complex media. Because the RPF system is financed completely with internal resources, we must closely weigh the costs and benefits of additional functionality and educational interactivity. Similarly, as the number of modules continues to increase, we must recognize the costs of maintaining the system, including review and updating of modules, hardware and software enhancements, and support personnel. The senior vice chancellor continues to generously assure the financing of the RPF, and we hope that ongoing demonstration of the system's effectiveness and efficiency will justify future support required to maintain and enhance the system.

We have not yet fully exploited opportunities to integrate education and training throughout our university and medical center. Although the RPF is our enterprise-wide vehicle for Web-based RCR training, a number of administrative units such as human resources departments and compliance offices use other learning management systems to deliver content to faculty and staff and to maintain certification records. Our ability to combine all of these systems into one cohesive and coordinated structure is limited by the size and complexity of our organizational structure (given that the university and medical center are separate entities), differing preferences for the media and formats in which content is delivered, and variation in the type of certification information required by both internal and external stakeholders. We have made progress in providing access to diverse training programs through Web portals and continue to strive for greater integration of diverse learning management systems.


The Value of a Comprehensive Approach

We live in an era of increasing public expectations and diminishing resources. As regulatory requirements proliferate, many institutions will probably continue to respond to each mandate on a case-by-case basis. The RPF system demonstrates that there is a better approach. By implementing a comprehensive technological and administrative structure for training, we have efficiently delivered content to a large and diverse community of learners, enhanced the scope of extramurally funded projects, and provided certification data to a variety of administrative units. By employing IAIMS principles, our system has been affordable, scalable, and sustainable. We hope that our experience will be a model for other institutions as they seek to fulfill their commitment to the safe and ethical conduct of research.


Acknowledgments

The authors would like to thank Dr. Michelle Broido, Ms. Ratna Jain, Dr. Randy Juhl, Ms. Stephanie Lunsford, Mr. William Madden, Dr. Steven Reis, and countless other individuals for contributing to the RPF project.

This project was supported in part by the University of Pittsburgh IAIMS implementation grant from the National Library of Medicine (5 G08 LM06625) as well as the NIH Office of Research Integrity Human Subjects Research Enhancement Program (2 S07 RR18239-02). The Responsible Literature Searching module was supported by a cooperative agreement between the Association of American Medical Colleges and the Office of Research Integrity, United States Department of Health and Human Services; the Health Insurance Portability and Accountability Act (HIPAA) Privacy for Researchers, IRB Member Education, and Research Involving Children modules were supported by the NIH Office of Research Integrity Human Subjects Research Enhancement Program (NIH Grant Number 2 S07 RR18239-02).

The protocol for the data analysis described in this paper was reviewed by the University of Pittsburgh Institutional Review Board and was determined to meet all criteria for exemption under 45 CFR 46.101(b)(4).


References

1 Office of Research Integrity Newsletter. 2005;13(2): 1,6. 〈http://ori.hhs.gov/documents/newsletters/vol13_no2.pdf〉. Accessed 26 October 2005.
2 National Bioethics Advisory Commission (NBAC). Ethical and Policy Issues in Research Involving Human Participants. 2 vols. Rockville, MD: US Government Printing Office, 2001.
3 Institute of Medicine, National Research Council of the National Academies. Integrity in Scientific Research. Creating an Environment that Promotes Responsible Conduct. Washington, DC: National Academy Press, 2002.
4 Emanuel EJ, Wood A, Fleischman A, et al. Oversight of human participants research: Identifying problems to evaluate reform proposals. Ann Intern Med. 2004;141:282–91.
5 Blumenthal D. Academic-industrial relationships in the life sciences. N Engl J Med. 2003;349:2452–59.
6 Required Education in the Protection of Human Research Participants. NIH Notice: OD-00-039. Bethesda, MD: NIH, June 5, 2000.
7 Gunsalus CK. Institutional structure to ensure research integrity. Acad Med. 1993;68(9 suppl):S33–S38.
8 Mastroianni AC, Kahn JP. The importance of expanding current training in the responsible conduct of research. Acad Med. 1998;73:1249–54.
9 Sugarman J, Getz K, Speckman JL, Byrne MM, Gerson J, Emanuel EJ. The cost of institutional review boards in academic medical centers. N Engl J Med. 2005;352:1825–27.
10 Cohen JJ. Trust us to make a difference. Acad Med. 2001;76:209–14.
11 Barnes BE, Friedman CP. Educational Technology. In Davis D, Barnes BE, Fox R (eds). The Continuing Professional Development of Physicians. Chicago: AMA Press, 2003:205–20.
12 Hripcsak G. IAIMS architecture. J Am Med Inform Assoc. 1997;4(2 suppl):S20–S30.
13 Stead WW. The evolution of the IAIMS: Lessons for the next decade. J Am Med Inform Assoc. 1997;4(2 suppl):S4–S9.
14 Lindberg DA, West RT, Corn M. IAIMS: an overview from the National Library of Medicine. Bull Med Libr Assoc. 1992;80:244–46.
15 Knowles MS. Andragogy in Action. San Francisco: Jossey-Bass, 1984:1–15.
16 University of Pittsburgh Education and Certification Program in Research and Practice Fundamentals 〈http://rpf.health.pitt.edu/rpf/〉. Accessed 26 October 2005.
17 Friedman PW, Winnick BL, Friedman CP, Mickelson PC. Development of a MeSH-based index of faculty research interests. Proceedings of the American Medical Informatics Association Symposium. Los Angeles, CA: 2000:265–69.
© 2006 Association of American Medical Colleges