Improving Oversight of the Graduate Medical Education Enterprise: One Institution’s Strategies and Tools

Afrin, Lawrence B. MD; Arana, George W. MD; Medio, Franklin J. PhD; Ybarra, Angela F. N. MHA; Clarke, Harry S. Jr MD, PhD

doi: 10.1097/01.ACM.0000222258.55266.6a
Residents’ Education

Accreditation organizations, financial stakeholders, legal systems, and regulatory agencies have increased the need for accountability in the educational processes and curricular outcomes of graduate medical education. This demand for greater programmatic monitoring has pressured institutions hosting graduate medical education (GME) programs to strengthen their oversight of those programs. Meeting these challenges requires new GME management strategies and tools that allow institutional GME administrators to scrutinize programs while still allowing the programs the autonomy to develop and implement educational methods that meet their unique training needs. At the Medical University of South Carolina (MUSC), senior administrators in the college of medicine concluded that electronic information management was critical to success and carefully selected an electronic residency management system (ERMS) to provide functionality for both individual programs and the GME enterprise as a whole. Initial plans in 2002 for a phased deployment had to be changed to a much more rapid deployment because of regulatory issues. Extensive communication and cooperation among MUSC’s GME leaders resulted in a successful deployment in 2003. Evaluation completion rates have substantially improved, duty hours are carefully monitored, patient safety has improved through closer oversight of residents’ procedural privileges, regulators have been pleased, and central GME administrative visibility of program performance has dramatically improved. The system is now being expanded to MUSC’s medical school and other health professions colleges. The authors discuss lessons learned and the opportunities and challenges ahead, which include improving the tracking of procedural competency development, establishing and monitoring program performance standards, and integrating the ERMS with GME reimbursement systems.

Dr. Afrin is director of information technology, Office of Graduate Medical Education, the Medical University of South Carolina (MUSC), Charleston, South Carolina. He also is director of information technology at MUSC’s Hollings Cancer Center and the director of MUSC’s Hematology/Oncology Fellowship Training Program.

At the time the described system was initially deployed, Dr. Arana was the designated institutional official and senior associate dean for graduate medical education, MUSC, Charleston, South Carolina; he is now on sabbatical pursuing advanced executive management training.

Dr. Medio is associate dean for graduate medical education and director, Office of Graduate Medical Education, MUSC, Charleston, South Carolina.

Ms. Ybarra is manager, Office of Graduate Medical Education, MUSC, Charleston, South Carolina.

Dr. Clarke is chairman of the Graduate Medical Education Committee and associate dean for graduate medical education, MUSC, Charleston, South Carolina, and director of MUSC’s Urology Residency Training Program.


Correspondence should be addressed to Dr. Afrin, Division of Hematology/Oncology, CSB903, P.O. Box 250635, Medical University of South Carolina, 96 Jonathan Lucas Street, Charleston, SC 29425; telephone: (843) 792-4271; e-mail: 〈afrinl@musc.edu〉.

Since the 1990s, graduate medical education (GME) in the United States has undergone a host of changes and will continue to do so as stakeholders seek improved methods for documentation of the educational process and its outcomes. GME program objectives have been widened from the traditional development of specialty-specific knowledge and skills to the current six broad areas of competency defined by the Accreditation Council for Graduate Medical Education (ACGME).1 Curricula that were once loosely defined are becoming more specific,2–4 expectations of close supervision are increasing,5–10 and evaluation and remediation processes are dramatically expanding in scope and regimentation.11–13 In addition, technical competence in the clinical and surgical areas is being monitored more carefully across all specialties,14 and interest in improving patient safety has led to new constraints on resident duty hours and moonlighting.15 Finally, resident credentials are being reviewed more carefully.16–19

These changes collectively have created two large-scale problems: (1) the need for improved coordination of activities across multiple GME programs within an institutional GME enterprise, and (2) the generation of, and need to monitor and analyze, a far greater quantity of information about the GME programs and their respective operations. Addressing the first problem requires new strategies and policies, which may seem intrusive and unnecessarily burdensome to programs accustomed to greater autonomy. Addressing the second problem requires new GME-enterprise-wide information management tools to provide central GME administration with access to both program-specific and cross-program information and analyses and to help programs achieve improved operating efficiencies.

With these problems in mind, in 2001, senior GME administrators (including GWA and FJM) in the College of Medicine at the Medical University of South Carolina (MUSC) led an institution-wide strategic planning initiative that outlined an approach to address the coming national changes. In this article, we describe the journey taken to date and opportunities ahead.

Identifying the Problems

In the summer of 2001, the MUSC GME Committee (GMEC), composed principally of MUSC GME program directors, held a two-day off-site retreat to identify and characterize global GME problems at MUSC and to chart potential paths to improvement. One major problem identified was the GME enterprise’s lack of an overarching set of objectives and a plan for achieving them. Other, more specific major problems related to evaluations: neither faculty nor trainees were reliably completing them, some programs were not adequately monitoring their evaluation processes, and central GME administration had no mechanism to be informed of, or to discover, these problems. Furthermore, with the approaching requirements for more rigorous evaluation methods and documentation, it was clear the problems would steadily grow if existing practices were not substantively changed.

The GMEC decided that a strategic planning effort was necessary but that the multiple problems with evaluation management were so pervasive and significant that a solution in that area should be sought immediately. The GMEC concluded that an enterprise-wide electronic system was needed to manage the evaluation processes and directed the GME office to pursue the matter.

Pursuing an Electronic Evaluation System

In the winter of 2001–02, the GME office canvassed the electronic residency management system (ERMS) market and identified several candidate products. In-house demonstrations of four promising products were scheduled and took place in the spring of 2002. All GME program directors, coordinators, trainees, and other GME-affiliated personnel at MUSC were invited to attend the demonstrations and were requested to assess the usefulness of the products. One product clearly emerged as the group’s top choice.

Next, the GME office recruited a program director (LBA) with extensive information technology experience to serve as project director. Assisted by the MUSC legal department, the project director entered contract negotiations with the vendor, securing a fee-free pilot of the product and arranging several contract renewal options to provide a predictable cost structure for the first several years. Although the product provided a wide range of tools for managing GME operations, the institution’s capacity to absorb change productively was judged to be limited, so the initial deployment plan called for putting only the evaluation management tools into use. The plan anticipated taking two or three years to perform a phased deployment across all 50 of MUSC’s GME programs. Other tools in the system would be phased into use in the ensuing years, although individual programs would be free to use them earlier if desired.

Funding for an ERMS was discussed with senior management of the university hospital at MUSC; the discussion highlighted the importance of programmatic oversight across all residency programs given the increasing regulatory demands. The new potential for suffering enterprise-scale sanctions for the misdeeds of a single program was noted. Loss of program accreditation would lead to a proportionate loss of Medicare reimbursement for indirect medical education expenses, and large-scale sanctions could affect the institution’s ability to provide indigent care, thereby potentially diminishing disproportionate share reimbursement. Also noted, however, was the potential for the ERMS to pay for itself not merely by preventing cuts but by increasing income: consultants to the MUSC GME reimbursement office had recently identified that inaccurately low levels of resident effort were being captured across most programs, and an ERMS would provide an opportunity for more accurate capture, leading to Medicare cost report filings for greater reimbursement.

As the contract was being finalized, the project director solicited the participation of a wide range of residency programs (e.g., general surgery, obstetrics–gynecology, psychiatry, and hematology–oncology) in a four-month pilot to study the feasibility and utility of the candidate ERMS.

On December 1, 2002, the pilot “went live.” The vendor trained each program’s coordinator to serve as the program’s “superuser,” and the coordinator in turn was responsible for training the users in the program. Some of the pilot program coordinators provided their faculty and trainee users with specific training; others felt the system was easy enough to use that e-mailing users some basic access instructions and codes sufficed.

Most programs configured the system to generate, for each educational activity or rotation, three evaluations: an evaluation for the supervisor to fill out about the trainee, an evaluation for the trainee to fill out about the supervisor, and an evaluation for the trainee to fill out about the activity. Some programs also established nurses and other personnel as users and asked them to begin moving toward their “360 evaluation” goals. Users received evaluation notices from the system via e-mail. Each notice contained a link that launched the user’s Web browser, automatically logged the user into the system, and automatically displayed the appropriate evaluation form. A form sent to a trainee’s supervisor several days in advance of the end of the rotation was fully or partially completed and saved for later completion or discussion with the trainee (typically on the last day of the rotation) before final submission; forms sent to trainees for evaluation of supervisors and rotations more typically were completed and submitted in a single session at or shortly after the end of the rotation. Once submitted, a form could be reviewed by evaluator and evaluee (and program and central GME administrators) but not changed.
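
To illustrate the notification mechanism just described, the following Python sketch shows one way such self-authenticating evaluation links can be generated and verified. It is a minimal illustration only; the signed-token scheme, URL layout, and secret-key handling are our assumptions, not the vendor’s actual implementation.

```python
# Minimal sketch (not the vendor's implementation) of an e-mail notification link
# that both authenticates the recipient and opens the right evaluation form.
import hashlib
import hmac
import time

SECRET_KEY = b"replace-with-an-institutional-secret"   # hypothetical shared secret

def make_evaluation_link(user_id, evaluation_id,
                         base_url="https://erms.example.edu/eval"):
    """Build a link that both logs the user in and displays one evaluation form."""
    expires = int(time.time()) + 14 * 24 * 3600          # link valid about two weeks
    payload = f"{user_id}:{evaluation_id}:{expires}"
    signature = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{base_url}?u={user_id}&e={evaluation_id}&exp={expires}&sig={signature}"

def verify_evaluation_link(user_id, evaluation_id, expires, signature):
    """Server-side check performed before auto-logging the user in."""
    if int(expires) < time.time():
        return False                                      # link has expired
    payload = f"{user_id}:{evaluation_id}:{expires}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```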

Scope creep

Shortly after the pilot project started, hospital administration contacted the project director regarding another regulatory issue. A Joint Commission on Accreditation of Healthcare Organizations (JCAHO) survey of the Medical University Hospital Authority (MUHA) in 2000 had yielded a Type 1 recommendation that MUHA develop a centralized system for managing GME trainees’ privileges for clinical and surgical procedures performed on the inpatient and outpatient services. Such a system was intended to improve patient safety by giving nursing personnel immediate access to a method for determining the level of physician supervision necessary for a resident to perform a given procedure. This “point of care” assessment of a resident’s capability to perform a procedure with or without a physician’s presence was considered a critical factor in promoting safe, effective patient care. By late 2002, with the next JCAHO survey due in the fall of 2003, MUHA administration asked whether the ERMS could help with this patient-safety issue.

Fortunately, this capability already existed in the selected system (it was one of the tools whose use had been planned for a later deployment phase), but using the system to solve this problem required a nearly immediate judgment about the system’s overall adequacy for evaluation management, rather than waiting out the originally planned four-month trial. If the system were judged inadequate for evaluation management, it would not make sense to use it for procedure privilege management; and if it were not to be used for procedure privilege management, then as much time as possible would be needed to find an alternative solution before the next JCAHO survey. Thus, the pressure to rapidly assess the system’s suitability for evaluation management increased significantly.

Shortly after the turn of the year, however, this pressure increased further. It became evident that the new resident duty hour requirements would go into effect in July 2003. A task force of hospital and GME administrators considered implementation options and concluded that resident self-reporting of duty hours into an electronic system, combined with appropriate reporting and monitoring tools in that system, was the best path to comply with these new regulations. The project director, a member of this task force, discussed the matter with the ERMS vendor. The vendor was beginning to design a new duty hour management module and agreed to include MUSC in the design and pilot testing efforts.

The duty hour management issue, too, required rapid assessment of the adequacy of the system’s evaluation tools. In late January, the project director canvassed key personnel in the pilot programs and found a very favorable consensus about the system’s evaluation management capabilities; the decision was therefore made to migrate from pilot to production status.

“Big bang” time

The new pressures regarding procedure privilege management and duty hour management meant that the initial deployment plan—to deploy only the evaluation management tools in a phased approach to all 50 GME programs over two or three years—had to be radically redesigned. All 50 programs had to “go live” with the system’s evaluation management and duty hour management modules by July 1; all 50 programs would need to be ready to use the system’s procedure privilege management module by October 1. Information about 50 programs would have to be loaded into the system. Approximately 700 GME faculty, 500 residents, and 50 program coordinators would have to be trained by July 1, and roughly 2,000 nurses would have to be trained by October 1, a tall order for any health care system. A phased approach to implementation was now out of the question; only a “big bang” deployment could accomplish the goals. The system vendor was very cooperative with this major change in the implementation plan.

To achieve this, the project director developed a detailed implementation plan with specific milestones to be met by certain dates. The plan was endorsed by the GME office, the designated institutional official (DIO), the associate dean for GME, the executive medical director, and the dean of the college of medicine. It subsequently was distributed to all program directors and coordinators. Following a mandatory kick-off meeting (attended by each program director and/or program coordinator) at which vendor representatives explained the setup process, program coordinators began filling in vendor-supplied electronic forms with a variety of program information, including faculty and trainee demographics and educational activity information. Because of the volume of work and the short amount of time available, coordinators were allowed to omit optional data elements and focus exclusively on the fairly small core set of required data elements.

As each coordinator completed the electronic forms, the GME office received confirmation. As the March deadline for this milestone approached, programs lagging behind were reported to the DIO and the associate dean for GME, who contacted program directors, department chairs, and division directors as needed, typically by e-mail (with a copy to the dean) but also by phone for particularly recalcitrant cases. Ultimately, all programs complied with the deadline.

The vendor came to MUSC for a week in April to review and validate the program information with each coordinator. Because only a limited complement of vendor staff was available for a limited time, these validation sessions were carefully orchestrated to ensure all 50 were completed within the allotted week.

In the following three weeks, the vendor loaded all the program information into the system and verified each program’s initial desired system configuration options. The vendor returned for two days in May to conduct mass “superuser” training sessions for all the program coordinators. As in the pilot, it was left to the program coordinators to determine how best to provide training to their respective faculty and residents. Programs were encouraged to begin using the system immediately following superuser training, though most planned to go live on July 1.

While this setup work was in progress, the project director

  • worked with the vendor to design the product’s duty hour management module, including tools for duty hour logging, reporting, and monitoring (a sketch of the kind of compliance check such monitoring involves appears after this list). The project director, who was also a residency director, had his residents and faculty use the new module in May, which enabled him to identify problems and gave the vendor time to fix them before the July 1 institution-wide start date.
  • worked on a second, parallel track with MUSC’s computer skills trainers to incorporate ERMS training into the computer training sessions provided at the new resident orientation in late June for the new residents matriculating on July 1.
  • worked on a third track with MUSC’s GME program directors to develop their lists of procedures, which would be tracked in the procedure privilege management module.
  • worked on a fourth track with Medical University Hospital (MUH) nursing management to develop training for nurses, who would need to access resident procedure privileges in the system and would also need to use the system to file either scheduled or “on-the-fly” evaluations.
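
The following Python sketch illustrates the kind of monitoring report referred to in the first item above: flagging residents whose self-reported hours exceed the ACGME limit of 80 hours per week averaged over four weeks. The data layout is assumed for illustration; it does not depict the vendor’s module.

```python
# Illustrative duty hour compliance check: flag residents whose self-reported
# hours, averaged over any four consecutive weeks, exceed the 80-hour limit.
from collections import defaultdict
from datetime import date

def weekly_hours(entries):
    """Sum self-reported duty hours into ISO-week buckets.

    entries: iterable of (resident_id, work_date, hours) tuples.
    Returns {resident_id: {(iso_year, iso_week): total_hours}}.
    """
    totals = defaultdict(lambda: defaultdict(float))
    for resident_id, work_date, hours in entries:
        iso_year, iso_week, _ = work_date.isocalendar()
        totals[resident_id][(iso_year, iso_week)] += hours
    return totals

def flag_80_hour_violations(entries, window=4, limit=80.0):
    """Flag residents whose reported hours, averaged over any `window`
    consecutive reporting weeks, exceed `limit` (the 80-hour rule)."""
    flagged = []
    for resident_id, weeks in weekly_hours(entries).items():
        ordered = [weeks[k] for k in sorted(weeks)]
        for i in range(len(ordered) - window + 1):
            avg = sum(ordered[i:i + window]) / window
            if avg > limit:
                flagged.append((resident_id, round(avg, 1)))
                break
    return flagged

# Example: 12 hours logged every day across July 2003 averages above 80 per week.
demo = [("R1", date(2003, 7, d), 12) for d in range(1, 29)]
print(flag_80_hour_violations(demo))      # [('R1', 81.0)]
```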

The system officially went into production at MUSC on July 1, 2003. GME faculty, residents, and program coordinators were the first to use it, followed in September by the nurses, who received training in regular nursing meetings in July and August. It was felt that deployment to all groups in July would be too much change at once. A September deployment for the nurses would provide sufficient deployment “decompression” but still allow them sufficient time to gain experience with the system before the JCAHO site visit in November.

The First Three Years

The first year

Reaction by faculty and residents to the system was mixed, but for predictable and manageable reasons. Although the system was nearly universally praised for its ease of use, faculty realized that they would spend more time completing evaluations, simply because continued avoidance of evaluation work would now be far more visible. Trainees were pleased with the increased ease of tracking and accessing the faculty’s evaluations of their performance and appreciated the improved ease of evaluating their supervisors and their educational activities. However, convincing some residents in virtually every program that duty hour logging was now mandatory was a challenge. Automated reports to the program directors listing residents deficient in their duty hour logging were, and still are, helpful in this aspect of compliance education. To underscore the importance of duty hour reporting and compliance, the GMEC modified the resident agreement (i.e., contract) to include a paragraph clearly stating the expectations for residents and faculty with regard to the new duty hour requirements.

The nurses universally favored the new system, appreciating not only its contribution to the hospital’s developing culture of patient safety but also the “direct line” it gave them to program directors for reporting positive and negative aspects of the performance of residents and GME faculty alike.

Beginning with “go live” in July 2003, the project director began holding monthly meetings with key GME administrators to review use of the system and identify problems and opportunities. The project director also hosted “postmortem” sessions in October for all program directors and coordinators in an effort to bring to light any lingering problems. Attendance was modest, and no significant problems were identified. One suggestion was to load resident and faculty photographs into the system so that the evaluee’s face would always appear on any form for evaluating that individual; the system already had the capacity to do this. The project director met with the administrator of the university’s identification badge management system used by campus security and programmed an extraction of all resident and faculty photographs from that system’s database. The extraction was securely conveyed to the vendor for batch loading into the system.
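
The extraction itself was a modest scripting exercise. The sketch below shows the general shape of such a job in Python; the badge-system schema, table and column names, and the manifest format are hypothetical and do not describe MUSC’s actual systems.

```python
# Hypothetical sketch: pull badge photographs for a list of people and package
# them, with a manifest, for batch loading by the ERMS vendor.
import sqlite3       # stand-in for whatever database the badge system actually uses
import zipfile

def extract_photos(db_path, person_ids, out_zip):
    """Pull badge photos for the given people and package them for batch loading."""
    conn = sqlite3.connect(db_path)
    manifest_rows = ["person_id,file"]
    with zipfile.ZipFile(out_zip, "w") as archive:
        for person_id in person_ids:
            row = conn.execute(
                "SELECT photo_jpeg FROM badge_photos WHERE person_id = ?",
                (person_id,),
            ).fetchone()
            if row is None:
                continue                        # no badge photo on file; skip
            filename = f"{person_id}.jpg"
            archive.writestr(filename, row[0])  # photo stored as a JPEG blob
            manifest_rows.append(f"{person_id},{filename}")
        # A manifest lets the vendor match each photo to the right ERMS account.
        archive.writestr("manifest.csv", "\n".join(manifest_rows) + "\n")
    conn.close()
```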

The question of whether the system would satisfy regulators was settled three times over in November, when representatives of the JCAHO came to do the hospital’s site survey; representatives of the ACGME visited for a review of the internal medicine residency program, all the medicine subspecialty fellowship programs, and several other residency programs; and representatives of the American Board of Internal Medicine (ABIM) also visited for a review of MUSC’s internal medicine residency program and all the medicine subspecialty fellowship programs. JCAHO personnel expressed satisfaction with MUSC’s response to the prior Type 1 citation on procedure privilege management. Similarly, ACGME and ABIM personnel were impressed with the resident evaluation management and duty hour management.

In December, cross-program reporting was activated for central GME administration, and program transparency became a reality. With a few clicks, central administration could identify residents consistently receiving poor evaluations. Reports comparing different programs’ compliance with the new duty hour regulations became readily available. Virtually every report available at the program level also became available at the cross-program level. Directors of underperforming programs became acutely aware of the new visibility gained by central GME administration. It also became clear that some programs were not using the system to manage procedure privileges. The GMEC appointed a task force to address this issue.
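
As one concrete example of the new cross-program reporting, the sketch below aggregates completed evaluations across all programs to flag residents with consistently low scores. The record layout, rating scale, and thresholds are illustrative assumptions, not the report our vendor actually provides.

```python
# Illustrative cross-program query: residents with consistently poor evaluations.
from collections import defaultdict

def consistently_low_scorers(evaluations, threshold=2.0, min_evals=3):
    """evaluations: iterable of dicts with keys 'program', 'resident', and
    'overall_score' (assume a 1-5 scale). Returns residents whose mean score
    falls below `threshold` across at least `min_evals` completed evaluations."""
    scores = defaultdict(list)
    for ev in evaluations:
        scores[(ev["program"], ev["resident"])].append(ev["overall_score"])
    report = []
    for (program, resident), values in scores.items():
        mean = sum(values) / len(values)
        if len(values) >= min_evals and mean < threshold:
            report.append({"program": program, "resident": resident,
                           "n_evaluations": len(values),
                           "mean_score": round(mean, 2)})
    return sorted(report, key=lambda row: row["mean_score"])
```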

Also in December, the dean’s office in the college of medicine contacted the project director about expanding the system to serve the evaluation management needs of the third- and fourth-year medical students in their clinical rotations. Because the vendor had experience with the use of its system for medical students, this additional use was implemented quickly. A senior medical school administrator was identified as project director for that effort, a separate contract was negotiated, and an implementation plan was developed to bring the fourth-year students on board in July 2004 and the third-year students a year later.

In February, the project director began planning the computer systems orientation for new residents. Lessons learned from the prior year were applied to the procedures for training the 2004 matriculants.

Retraining sessions were offered to program coordinators in the spring to prepare for the necessary updates in the system.

The second year

Early in the second year of operation (July 2004 to June 2005), the MUSC School of Medicine “went live” with the evaluation management component of the system for its senior medical students as planned and without incident. Subsequently the system was deployed to the junior students ahead of schedule, again without incident.

Regarding residents, the GMEC’s task force on procedure privilege management completed development of a “core procedures” list to be used as the minimum set of procedures for which a program would have to establish privilege levels for its residents. These were basic procedures typically performed on the ward by a resident whose supervisor was readily accessible but not necessarily at the bedside; operating room procedures were omitted, as MUH policy already required an attending surgeon to be at the operating table throughout every surgery. A new cross-program report was developed by the vendor to identify programs deficient in establishing privileges for their residents for these procedures.

Positive experiences with the ERMS also led to a midyear decision to migrate the GME office’s use of a third-party credential tracking program for residents into the ERMS. The project director developed a plan for migrating data from the third-party program into the ERMS; in the spring of 2005 this conversion took place, and the third-party program was retired. This conversion also permitted the development of more efficient matriculation-related procedures for entering new residents’ credentialing information, which reduced workloads for program coordinators and the GME office.

The third year

At the time of this writing (September 2005), the GME office, after consultation with university counsel, has embarked on a project to “backload” key information from all of its archived resident files (dating back to the 1940s) into the ERMS. The files will then be destroyed so that the extensive amount of office space currently used to store those files can be used for other purposes.

Our ERMS is flexible, and our program directors and coordinators have exhibited creativity in their approaches to common tasks such as promotion/graduation and duty hour monitoring. Differing methods of data entry have created difficulties in efforts to aggregate data at the enterprise level, and many program directors and coordinators have not yet realized the significant impact that seemingly minor data management decisions, once autonomously made with impunity, now have on enterprise-level analytic projects. Standardized methods for managing common tasks are needed. The project director, in consultation with program coordinators and directors across the enterprise, is drafting standard operating procedures (SOPs) that will be required of all programs by the GMEC.

Relatedly, a need for on-demand training for program directors and coordinators has also become apparent. Our GME enterprise experiences approximately 5% to 10% turnover among its program directors per year and approximately twice that rate among its program coordinators. Because all our ERMS administrators must adhere continuously to the enterprise’s SOPs, new program directors and coordinators must quickly achieve competence in SOP-compliant use of the system. Toward this end, the project director is developing both a “boot camp” for new administrators and a “clinic” for existing administrators who feel a need for one-on-one refresher training.

Remaining Opportunities

Tracking procedural competency development

Our ERMS provides a tool whereby residents can log the procedures they perform. The logging of a procedure automatically generates a short evaluation form for the supervisor to use to rate and comment upon the resident’s performance. Once a resident has achieved a program-defined threshold number of satisfactory (or better) evaluations for a given procedure, the system stops generating evaluation forms when that resident again logs a performance of that procedure.
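
The underlying logic is simple; the following sketch captures it in Python. The class, the rating scale, and the satisfactory cutoff are illustrative only and are not the vendor’s data model.

```python
# Sketch of the threshold logic described above: each logged procedure triggers a
# supervisor evaluation until enough satisfactory ratings have accumulated.
from dataclasses import dataclass

SATISFACTORY = 3   # assumed minimum rating (on a 1-5 scale) counting toward the threshold

@dataclass
class ProcedureCompetencyTracker:
    threshold: int              # program-defined number of satisfactory evaluations
    satisfactory_count: int = 0

    def log_procedure(self) -> bool:
        """Return True if logging this procedure should generate an evaluation form."""
        return self.satisfactory_count < self.threshold

    def record_evaluation(self, rating: int) -> None:
        """Called when the supervisor submits the generated evaluation form."""
        if rating >= SATISFACTORY:
            self.satisfactory_count += 1

# Example: with a threshold of 5 satisfactory ratings, the sixth logged
# performance of the procedure no longer generates an evaluation form.
tracker = ProcedureCompetencyTracker(threshold=5)
for rating in [4, 3, 5, 4, 4, 5]:
    if tracker.log_procedure():
        tracker.record_evaluation(rating)
print(tracker.log_procedure())    # False: the threshold has been reached
```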

Our selected ERMS integrates the logging of procedures with performance evaluation of those procedures, making it possible to determine not only how many procedures a resident completed but also how well the resident performed them. The ACGME-owned Resident Case Log System (RCLS), which trainees in most surgical programs are required to use, tracks only procedural volume, not an assessment of performance. We believe our system’s approach is better. Thus, in ideal circumstances, we would require that our residents log all their procedures into our ERMS to gain the competency-tracking benefit, and we would automatically upload procedures from our system to RCLS to comply with the ACGME requirement for logging procedures in its system, too. Unfortunately, current ACGME policy precludes automated interfaces to its RCLS. This means that at institutions wishing to track the development of resident procedural competency, residents must log operative procedures into RCLS and then double-log those procedures into the local institutional ERMS. The situation is worse still in programs whose home departments also mandate logging all procedures into a custom departmental database (typically maintained for quality assurance and/or research purposes).

Our programs facing this conundrum have made the difficult choice of protecting their residents’ time by not requiring them to double-log. Thus, a measure of educational oversight (gained from procedural competency tracking in our ERMS) has been sacrificed for regulatory compliance. Clearly, strides must be made toward achieving interoperability of procedure logging systems, a goal that is consistent not only with national health care information technology goals,20 but also with the ACGME’s own goals for fostering innovation in GME and for streamlining accreditation.21 However, interoperability can be technically challenging, and an alternative approach would be to transform the specific requirement to use the ACGME’s procedure logging system into a set of technology-agnostic minimum procedure-logging requirements. That is, the requirements should specify the data items to be collected and the methods by which any procedure-tracking computer system can exchange these data with any other, but there should otherwise be no requirement to use any specific computer system. Such an alternative set of requirements, together with an enhanced RCLS that could receive compliant data feeds from third-party systems, would allow programs to choose between logging procedures directly into RCLS and using superior alternative systems that supplement basic logging with mechanisms for formal competency tracking plus automatic uploading to RCLS. Allowing programs to choose systems that best fit their local needs and preferences (while still satisfying reasonable minimum standards) would foster competition in this sector of information technology, leading to improvements sooner than would be expected with the mandated use of a monopolistic centralized system.
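
To make the proposal concrete, the sketch below shows what a minimal, technology-agnostic procedure-log record might look like, expressed in Python and serialized as JSON for exchange between any two compliant systems. The field set, example identifiers, and encoding are our illustration only; no such ACGME standard currently exists.

```python
# Illustration of a minimal, system-neutral procedure-log record for exchange.
import json
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class ProcedureLogRecord:
    resident_id: str          # institution- or ACGME-assigned trainee identifier
    program_id: str           # accredited program identifier
    procedure_code: str       # e.g., a CPT or specialty case-log code
    performed_on: str         # ISO 8601 date the procedure was performed
    role: str                 # e.g., "primary" or "assistant"
    supervisor_id: str = ""   # optional: supervising physician
    assessment: str = ""      # optional: competency rating, if the source system tracks one

def export_records(records):
    """Serialize records for exchange with any other compliant logging system."""
    return json.dumps([asdict(r) for r in records], indent=2)

example = ProcedureLogRecord(
    resident_id="R-00123", program_id="1404521099",
    procedure_code="36556", performed_on=date(2005, 9, 1).isoformat(),
    role="primary", assessment="satisfactory")
print(export_records([example]))
```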

Internal interfacing

Interoperability is just as important within the institution as it is between institutions. Currently our selected ERMS requires users to log in using a user ID and password unique to that system. We are working with the vendor to extend the system so that our users can use their existing “single sign-on” institutional user IDs and passwords to also log in to the ERMS. The vendor has other clients pursuing a similar solution.
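
Although the vendor’s implementation details are theirs to determine, the general pattern is straightforward: rather than checking a password stored in the ERMS, the system defers the check to the institutional directory. The sketch below assumes an LDAP directory and uses the ldap3 Python package; the server address and DN template are hypothetical, and MUSC’s actual sign-on infrastructure may differ.

```python
# One common single sign-on pattern (assumed, not the vendor's design):
# validate the user's institutional credentials with an LDAP bind.
from ldap3 import Server, Connection

LDAP_URL = "ldaps://directory.example.edu"                   # hypothetical directory host
USER_DN_TEMPLATE = "uid={username},ou=people,dc=example,dc=edu"

def institutional_login(username, password):
    """Return True if the institutional directory accepts these credentials."""
    if not password:
        return False            # guard against anonymous-bind false positives
    server = Server(LDAP_URL)
    conn = Connection(server,
                      user=USER_DN_TEMPLATE.format(username=username),
                      password=password)
    try:
        return conn.bind()      # successful bind means valid institutional credentials
    finally:
        conn.unbind()
```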

In a similar fashion, we are working with the vendor to extend the system in the area of conference attendance tracking. Our GME programs, in accordance with ACGME requirements, track faculty and resident attendance at program conferences, and some of these programs have recently enhanced their attendance tracking methods with the implementation of automated attendance recording technologies. One of our programs developed a biometrically based attendance recording and tracking system now used by several programs,22 whereas some other programs have begun using a badge-swipe system devised by our Office of Continuing Medical Education. Our ERMS has a more comprehensive conference management module that integrates conference scheduling, attendance tracking, and evaluation features, but the attendance tracking feature still requires taking attendance in the traditional fashion and then transcribing the attendance record into the system. We are working with the vendor to extend the ERMS so that any automated attendance recording system can feed its attendance records to the ERMS.
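
One way to frame that extension is a single, minimal attendance record that every recording method (biometric reader, badge swipe, or a transcribed sign-in sheet) emits in the same form for the ERMS to ingest against its conference schedule. The sketch below is illustrative; the field names and CSV layout are our assumptions, not an interface the vendor has committed to.

```python
# Illustrative normalized attendance feed that any recording system could emit.
import csv
import io

ATTENDANCE_FIELDS = ["conference_id", "attendee_id", "recorded_at", "source"]

def to_attendance_feed(rows):
    """rows: iterable of dicts with the fields above, e.g.
       {"conference_id": "IM-GRANDROUNDS-2005-09-14",
        "attendee_id": "R-00123",
        "recorded_at": "2005-09-14T12:05:00",
        "source": "badge-swipe"}
    Returns CSV text ready for ingestion by the ERMS conference module."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=ATTENDANCE_FIELDS)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()
```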

Program performance standards

We have achieved our initial goal of gaining central visibility of the activities in the GME programs at our institution. Now that visibility has been achieved, what should be done with it? Should minimum program performance standards be set? What should those standards be? Should the annual evaluations of program directors and even department chairs and division directors be tied in part to certain outcomes reportable from our ERMS? What policies and procedures for corrective action should be developed? Our GMEC is beginning to contemplate these major issues. Task forces likely will be appointed to develop preliminary proposals to be reviewed and approved by our GMEC.

Auditing

The cross-program reporting tools in our ERMS indicate that our residents’ reported duty hours reflect only infrequent, minor violations of the duty hour regulations. Selective auditing of reported duty hours would help assess compliance and would demonstrate to regulators and residents the institution’s commitment to enforcing these regulations.
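
A simple, defensible way to choose what to audit is a reproducible random sample of resident-weeks, whose self-reported hours would then be compared against an independent source such as call schedules or sign-in records. The sketch below illustrates only the sampling step; the sampling rate and seed handling are arbitrary choices for illustration.

```python
# Illustrative selection of a reproducible audit sample of resident-weeks.
import random

def select_audit_sample(resident_weeks, rate=0.05, seed=2005):
    """resident_weeks: list of (resident_id, week_start_date) tuples covering the
    audit period. Returns a reproducible sample (about `rate` of the list) whose
    self-reported hours would then be manually audited."""
    if not resident_weeks:
        return []
    rng = random.Random(seed)           # fixed seed so the sample can be re-derived later
    k = max(1, round(len(resident_weeks) * rate))
    return sorted(rng.sample(resident_weeks, k))
```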

Further expansion of the system

Given the positive experiences the medical school had in deploying the system to its senior and junior students, the school is now considering expanding its use of the system to serve its freshmen and sophomores as well. Additionally, the university is considering expanding the system to serve its other health professions colleges.

Reimbursement

The MUHA’s GME reimbursement office is responsible for preparing and submitting the institution’s “Medicare cost report” to the Centers for Medicare and Medicaid Services (CMS). That office uses an older software package that requires staff to manually enter resident demographic and effort information. Our ERMS vendor has developed a corresponding module in its system, and a transition to that module might not only spare the GME reimbursement office staff a great deal of data entry (since demographic and educational activity information would already be in the system) but could also result in greater reimbursement by capturing resident effort in more detail.
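
In essence, such a module would aggregate rotation assignments already recorded in the ERMS into resident full-time-equivalent (FTE) counts for each cost-reporting period. The sketch below shows that aggregation in its simplest form; the record layout and straight day-count proration are our assumptions, and actual Medicare FTE counting rules involve many conditions not shown here.

```python
# Illustrative aggregation of rotation assignments into FTE counts by site.
from collections import defaultdict
from datetime import date

def fte_by_site(assignments, period_start, period_end):
    """assignments: iterable of dicts with 'resident', 'site', 'start', 'end'
    (dates), and 'effort' (fraction of the resident's time spent at that site).
    Returns {site: FTE count} for the cost-reporting period."""
    period_days = (period_end - period_start).days + 1
    totals = defaultdict(float)
    for a in assignments:
        overlap_start = max(a["start"], period_start)
        overlap_end = min(a["end"], period_end)
        overlap_days = (overlap_end - overlap_start).days + 1
        if overlap_days <= 0:
            continue                      # assignment falls outside the period
        totals[a["site"]] += a["effort"] * overlap_days / period_days
    return dict(totals)

# Example: one resident spending all of July-December 2005 at the main hospital
# contributes roughly half an FTE to a calendar-year reporting period.
example = [{"resident": "R-00123", "site": "MUH", "effort": 1.0,
            "start": date(2005, 7, 1), "end": date(2005, 12, 31)}]
print(fte_by_site(example, date(2005, 1, 1), date(2005, 12, 31)))
```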

Lessons Learned

There were many keys to the success of our project: a carefully chosen project director; a carefully chosen product and vendor; institutional buy-in at the highest levels; establishment of clear, specific goals and a timetable to achieve them, clear and specific methods for achieving them, and substantive penalties for failure to meet the timetable; and, last but most important, an abundance of communication, not only during the initial deployment but also on an ongoing basis. These elements, of course, are keys to success for most large projects, but we believe our report documents the first demonstration that an institution-wide GME data management system could be successfully deployed in a relatively short period of time. Since most institutions hosting GME programs likely will need to pursue a similar GME oversight strategy in the near future, our example may help inform those efforts and reduce deployment problems and failures.

In particular, we emphasize the importance to this type of effort of high-level institutional buy-in, clear and collaborative goal-setting, and abundant communication. Although we believe our selected product and vendor have served us well, it is likely that success can be achieved with alternative products and vendors as long as the internal institutional milieu is primed for success by achieving the three goals just stated.

Future work on our project will include (1) expanding management capabilities across more elements of our GME enterprise and its component programs; (2) establishing residency program performance benchmarks for program directors and departments; (3) improving the point-of-care verification of resident procedural competence; and (4) improving the system’s interoperability with other systems in the institution and interoperability with the ACGME and the CMS.

Disclaimer

The authors report no conflicts of interest with any commercial entities discussed in this article.

References

1 Accreditation Council for Graduate Medical Education. ACGME General Competencies Version 1.3 〈http://www.acgme.org/outcome/comp/compFull.asp〉. Accessed 22 January 2006.
2 Jacob J. Introducing the Six General Competencies at the Mayo Clinic in Scottsdale 〈http://www.acgme.org/acWebsite/bulletin-e/e-bulletin10_04.pdf〉. Accessed 22 January 2006. ACGME Bulletin; October 2004:1–2.
3 Jacob J. Redesigning the Ambulatory Care Curriculum at Lenox Hill Hospital 〈http://www.acgme.org/acWebsite/bulletin-e/e-bulletin04_04.pdf〉. Accessed 22 January 2006. ACGME Bulletin; April 2004:4.
4 Fabri PJ. Lions and Tigers and Bears 〈http://www.acgme.org/acWebsite/bulletin-e/e-bulletin06_05.pdf〉. ACGME Bulletin; June 2005:3–4.
5 Centers for Medicare and Medicaid Services. Medicare Carriers Manual, Part 3 – Claims Process, Transmittal 1780 〈http://www.cms.hhs.gov/manuals/pm_trans/R1780B3.pdf〉. Accessed 22 January 2006. November 22, 2002:15-9–15-12.5.
6 Accreditors tighten up accountability for graduate medical education—what’s it all about? 〈http://www.jcrinc.com/!pubs/pdfs/BM/Benchmark9-01.pdf〉. Accessed 22 January 2006. Joint Commission Benchmark. 2001 Sep;3(9):1–2,10.
7 Leach DC. Supervision: Nine Helpful Principles and a Story 〈http://www.acgme.org/acWebsite/bulletin/bulletin09_05.pdf〉. Accessed 22 January 2006. ACGME Bulletin; September 2005:2–3.
8 Chang BK. Resident Supervision in VA Teaching Hospitals 〈http://www.acgme.org/acWebsite/bulletin/bulletin09_05.pdf〉. Accessed 22 January 2006. ACGME Bulletin; September 2004:12–13.
9 Fischer KS. Supervision from a Payment Perspective: The Medicare Part B Requirements for Teaching Physicians 〈http://www.acgme.org/acWebsite/bulletin/bulletin09_05.pdf〉. Accessed 22 January 2006. ACGME Bulletin; September 2005:14.
10 Flynn T. Resident Supervision 〈http://www.acgme.org/acWebsite/bulletin/bulletin09_05.pdf〉. Accessed 22 January 2006. ACGME Bulletin; September 2005:15–17.
11 Shorey JM, Salazar WH. Observation and Feedback: Core Faculty Skills that Cross-Cut the General Competencies 〈http://www.acgme.org/acWebsite/bulletin/bulletin09_05.pdf〉. Accessed 22 January 2006. ACGME Bulletin; September 2005:4–6.
12 Epstein RM. Steps Toward Assessing the Professional Competence of Residents 〈http://www.acgme.org/acWebsite/bulletin/bulletin0302.pdf〉. Accessed 22 January 2006. ACGME Bulletin; March 2002:1–4.
13 Academic Medicine Responds to Education and Assessment Using the General Competencies: A Collection of Outcome-Related Products and Tools 〈http://www.acgme.org/acWebsite/bulletin/bulletin0302.pdf〉. Accessed 22 January 2006. ACGME Bulletin; March 2002:14–15.
14 Lynch DC. Competencies in the Press: Direct Observation. 〈http://www.acgme.org/acWebsite/bulletin/bulletin09_05.pdf〉. Accessed 22 January 2006. ACGME Bulletin; September 2005:7–8.
15 Accreditation Council for Graduate Medical Education. Resident Duty Hours and the Working Environment 〈http://www.acgme.org/acWebsite/dutyHours/dh_Lang703.pdf〉. Accessed 22 January 2006.
16 Jarecki A, Braatz C. The Importance of Established Policy for Physician Background Checks 〈http://www.nejmjobs.org/rpt/rpt_article_10.asp〉. Accessed 22 January 2006. New England Journal of Medicine CareerCenter for Employers.
17 Stoever J. Applying for Residency? Here’s Help 〈http://www.aafp.org/x36558.xml〉. Accessed 22 January 2006. AAFP News Now, July 29, 2005.
18 GME Policies: Drug-Free Workplace. Duke University Hospital Housestaff Manual 〈http://www.gme.duke.edu/hsmanual/gmepolicies.htm〉. Accessed 22 January 2006.
19 Texas Tech University System Board of Regents. Criminal Background Check: Notice to Students/Trainees and Residents 〈http://www.ttuhsc.edu/HSC/OP/OP10/op1020a.pdf〉. Accessed 22 January 2006.
20 Office of the National Coordinator for Healthcare Information Technology. Mission Statement 〈http://www.os.dhhs.gov/healthit/mission.html〉. Accessed 22 January 2006.
21 Philibert I. What is New at the ACGME: Selected New Directives and Initiatives. An Update 〈http://www.ahme.org/events/spring2005/handouts/Philibert_Presentation3.pdf, slides 32–34〉. Accessed 22 January 2006. Presented at the AHME Spring Educational Institute, Tucson, Arizona, May 14, 2005.
22 Afrin LB. Tracking Residents’ Conference Attendance: A Reliable Solution At Last? 〈http://www.amia.org/meetings/f02/2002online/S32.htm#D020001747〉. Accessed 22 January 2006. In: Proceedings of the American Medical Informatics Association 2002 Annual Symposium.
© 2006 Association of American Medical Colleges