Academic Medicine: May 2012 - Volume 87 - Issue 5
doi: 10.1097/ACM.0b013e31824d4b7c
Accreditation Issues

The Unintended Consequences of Clarity: Reviewing the Actions of the Liaison Committee on Medical Education Before and After the Reformatting of Accreditation Standards

Hunt, Dan MD, MBA; Migdal, Michael PhD; Eaglen, Robert PhD; Barzansky, Barbara PhD, MHPE; Sabalis, Robert PhD


Author Information

Dr. Hunt is cosecretary for the LCME and senior director for accreditation services, Association of American Medical Colleges, Washington, DC.

Dr. Migdal is senior research analyst for the LCME, Association of American Medical Colleges, Washington, DC.

Dr. Eaglen is executive director for faculty and learning services, Northeast Ohio Medical University, Rootstown, Ohio.

Dr. Barzansky is cosecretary for the LCME and director of the division of undergraduate medical education, American Medical Association, Chicago, Illinois.

Dr. Sabalis is assistant secretary for the LCME and director of LCME team surveys and team training, Association of American Medical Colleges, Washington, DC.

Correspondence should be addressed to Dr. Hunt, Association of American Medical Colleges, 2450 N St., NW, Washington, DC 20037; telephone: (202) 828-0596; fax: (202) 828-1125; e-mail: dhunt@aamc.org.


Abstract

Purpose: To determine the frequency of severe action decisions made by the Liaison Committee on Medical Education (LCME) in two time periods and to speculate about contributing factors for any change.

Method: Two study periods were reviewed. Study Period 1 (1996–2000) was before a 2002 reformatting of the standards; Study Period 2 (2004–2009) was after that reformatting. The frequency of severe action decisions and patterns of noncompliance leading to those decisions in both periods were analyzed.

Results: There were more severe action decisions during Study Period 2 than Study Period 1, with a notable increase in the number of recommendations for probation. Study Period 1 had substantially more noncompliance with standards within the Institutional Setting and Educational Resource categories, whereas Study Period 2 had substantially more noncompliance within the Educational Program and Medical Student categories.

Conclusions: The 2002 reformatting of the standards enhanced the clarity of each standard and connected previously existing annotations to their standards. As a result of the reformatting, all documents and communications to schools were directly tied to specific standards. This has allowed the LCME to more easily identify areas of chronic noncompliance and to improve survey team training. The shift in the patterns of standards out of compliance in the more recent time period is consistent with the effect of the reformatting. There may be other factors contributing to the increase in severe action decisions, but it is clear that the reformatting of the standards has improved the LCME’s ability to monitor medical education programs.

The Council for Higher Education Accreditation1 defines accreditation in the United States as both a process and a status. It is a process in that it is intended to ensure and improve the quality of higher education, assisting institutions and programs through a set of standards developed by peers. It is a status in that the result of the process is the awarding or denial of accredited status. All accrediting organizations create and use specific standards to ensure that institutions and programs meet at least threshold expectations of quality and that they improve over time. These standards address key areas such as faculty, curricula and student learning outcomes, student support services, finances, and facilities. Most accrediting organizations use common practices, including a self-study by the institution or program to determine whether standards are being met, an on-site visit by an evaluation team of peer experts, and a subsequent review and decision by the accrediting body about accreditation status. Depending on the accrediting body, this review typically must occur every 3 to 10 years for the institution or program to maintain its accreditation.

The Liaison Committee on Medical Education (LCME) is the organization that accredits medical education programs leading to the MD degree in the United States. The LCME works in partnership with the Committee on Accreditation of Canadian Medical Schools (CACMS) to accredit medical education programs leading to the MD degree in Canada. The “liaison” in the LCME’s name describes the relationship that resulted from a 1942 event that brought together medical education review processes that had previously been conducted separately by the American Medical Association and the Association of American Medical Colleges.2,3 This liaison relationship between the two organizations is also represented in the current composition of the 19-member committee. The 7 professional members and 1 student member appointed by each sponsor make up 16 of the LCME members. The 2 public members (who are selected by the LCME) and the chair of the CACMS are the other 3 LCME members. Each professional appointment is for a three-year term that can be renewed once, whereas the students, who are in their final year of medical school, are each appointed to a one-year, nonrenewable term.


Reformatting the LCME Standards

Before 2002, the LCME accreditation standards were contained in an eight-page document written in prose format. The document’s sentences served as standards that guided medical education programs in preparing for accreditation survey visits and that shaped the decisions of the LCME. Annotations for many standards existed at that time, but, because the standards were in prose format, the annotations were listed at the end of the document. As a result, many people did not realize that the annotations existed. The absence of numbering or structural organization of the standards also meant that survey teams sometimes struggled to find relevant standards to link to their concerns about a medical education program. In fact, the LCME accreditation letters sent to medical education programs at that time listed findings as “areas of concern,” which tended to be descriptions of general phenomena that did not always relate directly back to a specific standard.

In 2002, the LCME reformatted the standards to reduce subjectivity in their interpretation and application. During the initial stages of that reformatting, it became clear that the prose format had led to redundancy because, over time, sentences had been added to different paragraphs of the original eight-page document. To state the standards more clearly, the LCME reformatted them as a numbered list, which allowed each existing explanatory annotation to be published immediately below its corresponding standard.4

The standards themselves remained largely unchanged in this reformatting, except that the new, numbered format allowed for consolidation and the elimination of redundancy. In addition, for the first time, the questions that medical education programs responded to as they prepared for a full survey (in a document known then as the “database”) were directly associated with the numbered standards. Linking the standards to specific questions also helped to clarify the meaning of each standard by stipulating the information needed to demonstrate compliance, thus adding transparency to the rationales for LCME actions. Furthermore, the sections of the accreditation survey report could now be framed around the relevant standards, which made the report more clearly evidence based because the documentation addressed each standard. These clarifications also led to refinements in survey team training, with more case-based sessions offered to team members; the sessions were grounded in the difficulties that programs had with specific standards. These changes also improved the LCME’s ability to track over time those areas in which a medical education program was out of compliance.

Since the standards reformatting, the number of “severe action decisions” made by the LCME about medical education programs appeared to increase, which led the LCME’s professional staff to consider the possibility that the two phenomena might be related. For example, the reformatting may have increased the ability to delineate clearly the severity or frequency of noncompliance issues. In general, the LCME Secretariat defines a “severe action decision” as any LCME requirement for a follow-up action more critical than a status report (such as an interim visit to a medical school before the conclusion of the standard eight-year accreditation period) or for a decision such as a shortening of the accreditation period, the imposition of probation, or the denial of accreditation to an applicant program.

We carried out the research reported here to determine whether the frequency of severe action decisions by the LCME has increased since the standards reformatting in 2002 and, if so, to speculate about contributing factors.


Method

Background

Although LCME decisions about accreditation are fundamentally binary (accredit or do not accredit), there is a broad range of conditions that may lead to those decisions and a variety of actions that may be taken for particular circumstances. At each of its meetings, the LCME evaluates reports from the survey teams that have visited schools, reviews any updates requested from schools, and determines whether each accreditation standard has been met. For each medical education program, the LCME then considers the entire gestalt: the areas that are out of compliance, the length of time the program has been out of compliance with each standard, and the potential impact on students and/or the public if the program continues to be out of compliance with any standards. Taking all of these factors into consideration, the LCME decides on an appropriate action to take for a program and communicates that action decision to the program in writing. The following list describes the formal actions or decisions available to the LCME. The list is numbered to reflect increasing levels of LCME concern about the quality, or the sustainability of quality, of a medical education program.

1. Continue accreditation for the full seven- or eight-year term, with no additional information requested from the program. The length of accreditation changed from seven to eight years in 2002 for reasons unrelated to the reformatting of the standards.

2. Continue accreditation for the full eight-year term, with a requirement that the program submit one or more status reports detailing corrective actions to address any areas of noncompliance and/or areas in transition. These status reports are due to the LCME within two years of the date of the LCME’s review of the survey report. The due date of a status report is based on the length of time needed to achieve compliance with standards and how urgent it is to resolve the issue(s).

3. Continue accreditation but either not specify an accreditation term (referred to as “an undetermined accreditation period”) or shorten the term and require the program to undergo a follow-up visit by the LCME Secretariat (referred to as a “Secretariat fact-finding visit”) or a limited survey team (referred to as a “limited survey visit”) that focuses only on the specific areas of noncompliance and transition.

4. Continue accreditation, but place the program on “warning of probation” status.* This action, available to the LCME since June 2008, remains confidential and requires the program to submit a plan of action for prompt resolution of its accreditation challenges. Before the standards were reformatted, an informal statement in LCME accreditation letters to the effect that “probation was considered” by the LCME represented an approximation of “warning of probation” status. Both the earlier phrasing and the current formal action of “warning of probation” were always followed by a Secretariat fact-finding or limited survey visit and are identified in the related data by the visit type rather than by the status.

5. Continue accreditation, but place the program on probationary status. This action is announced publicly, and program leaders must inform all faculty, students, and applicants of this status. Before being finalized and publicized, the probation status can be reconsidered if requested by the program’s dean, and the recommendation can be reversed depending on the outcome of the LCME’s reconsideration. A reconsideration (in the past referred to as an “appeal”) generally occurs at the next LCME meeting following the meeting at which the decision was made to place the program on probationary status.

6. Withdraw or deny accreditation. This action is subject to appeal.

For the purposes of this study, a severe action decision includes all decisions or actions noted above in categories 3 to 6.

The research process

We examined the LCME’s accreditation decisions over two time periods: the period before the 2002 reformatting of the standards (Study Period 1) and the period after the reformatting (Study Period 2). In Study Period 1 (1996–2000), 108 full surveys were completed, representing 86% of all accredited medical education programs. Study Period 1 concludes in 2000 rather than 2002 because the two years leading up to the complete implementation of the changes were used as a pilot phase; during that phase, some LCME members were involved in decisions both before and after the standards format change, which reduced the potential influence of LCME member composition as a confounding variable. In Study Period 2 (2004–2009), 107 full surveys were completed, representing 85% of all accredited medical education programs. The decision to begin Study Period 2 in 2004, rather than immediately after the reformatting in 2002, was made to ensure that the full effect of the change had been felt and to capture as much as possible of the more recent time period while maintaining a comparable number of full surveys in each study period. Although full surveys were used as a measure of the comparability of workload during the two study periods, LCME action decisions were considered regardless of the type of report (e.g., full survey, limited survey, status) from which they emanated. As described earlier, a new accreditation status, “warning of probation,” was created in 2008 as a more formal way to confidentially alert a program that the LCME had serious concerns; this action leads to a Secretariat fact-finding or limited survey team being sent to the school. In the earlier period, its rough equivalent was the statement in an LCME accreditation letter informing a school that the committee had “considered” placing the school’s medical education program on probation, which also had often been associated with a Secretariat fact-finding or limited survey team being sent to the school.

One of us (M.M.) reviewed LCME meeting minutes to identify each severe action decision. In instances when the minutes were unclear, the accreditation letter sent by the LCME to the medical education program was reviewed. In some cases, a decision about a specific program entailed more than one severe action. For example, a program with several areas of concern might have been simultaneously placed on probationary status, been given an undetermined accreditation period, and been scheduled for a limited survey visit. For the purposes of this study, such simultaneous severe action decisions were counted as one severe action decision and were recorded in the category of the most severe action (in this case, the action of placing the medical education program on probationary status).
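This counting rule can be expressed as a short sketch (a hypothetical illustration in Python; the category numbers follow the 1–6 list given earlier, and the example data are invented, not drawn from actual LCME records):

    # Hypothetical sketch of the counting rule described above: when a single
    # LCME decision about a program involved several severe actions at once,
    # it was recorded as one severe action decision, in the category of the
    # most severe action. Categories 3-6 count as "severe" for this study.

    SEVERE_CATEGORIES = {3, 4, 5, 6}

    def record_decision(action_categories):
        """Collapse the actions taken in one LCME decision into at most one
        severe action decision, recorded at the most severe category.
        Returns the category to record, or None if no severe action was taken."""
        severe = [a for a in action_categories if a in SEVERE_CATEGORIES]
        return max(severe) if severe else None

    # Example: a program simultaneously placed on probation (5), given an
    # undetermined accreditation period (3), and scheduled for a limited
    # survey visit (3) is counted once, under probation (category 5).
    assert record_decision([5, 3, 3]) == 5
    # A decision requiring only a status report (category 2) is not severe.
    assert record_decision([2]) is None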

A secondary analysis of LCME severe action decisions entailed a review of the specific standards with which each medical education program was out of compliance for each of the decisions. For Study Period 2, all meeting minutes identified the specific standards by number. In Study Period 1, no reference to standard numbers was possible because the study period predated the reformatting and numbering of standards. Because the standards reformatting improved the clarity of the standards but left their content unchanged, it was a relatively easy task to identify which of the current numbered standards corresponded to each of the prose standards in Study Period 1. One of us (D.H.) made the initial assignment of standard numbers to prose standards cited in Study Period 1; another of us (R.E.), who was involved with the 2002 standards reformatting process, made the final determination in instances when the choice of numbered standard was not immediately obvious.


Results

Figure 1 shows the three groupings of severe action decisions made by the LCME during the two study periods. Excluded were those limited and fact-finding visits requested by the LCME as a result of programmatic challenges due to natural disasters (e.g., hurricanes) or LCME concerns resulting from class size increases. The data presented in Figure 1 indicate that the LCME made more severe action decisions during Study Period 2 than during Study Period 1. Particularly striking is the increase in the number of LCME recommendations for probation. Table 1 compares the standards that the LCME determined to be out of compliance in the severe action decisions made during the two study periods. The standards are grouped within the five categories that serve as the framework for the LCME’s Functions and Structure of a Medical School document4: Institutional Setting (IS), Educational Program for the MD Degree (ED), Medical Students (MS), Faculty (FA), and Educational Resources (ER). The mean and median numbers of noncompliance citations per program are higher in Study Period 2 than in Study Period 1. The programs reviewed in Study Period 1 had substantially more standards out of compliance within the IS and ER categories, whereas programs reviewed in Study Period 2 had substantially more standards out of compliance within the ED and MS categories. The number of standards out of compliance in the FA category is similar between the two study periods.


Tables 2 and 3 compare the frequency of noncompliance citations for some of the individual standards that contributed to the overall changes described in Table 1. Table 2 shows the standards that were cited relatively frequently in both study periods and the standards that were cited less frequently in Study Period 2 than in Study Period 1, and Table 3 shows the standards that were cited more frequently in Study Period 2 than in Study Period 1.


Discussion

There are several possible explanations for why the LCME made more severe action decisions during Study Period 2 than during Study Period 1, including the reformatting of the LCME standards and changes in the academic medicine environment. It is beyond the scope of this report to explore the latter changes; other studies will be better suited to examine those possible influences. What is clear from this study is that the reformatting of the standards is at least part of the answer.

Before the standards reformatting, survey team findings were not directly tied to standards. Team reports in the earlier study period often used broad and general statements that did not lend themselves to corrective action or linkage to a specific standard, as in the following excerpt from a report written during Study Period 1:

Despite the good progress that has been made in curriculum reform, weaknesses and vulnerabilities remain. Elements of the new curriculum are fragile, requiring continued attention by faculty, staff, and administration. Competing demands on faculty and institutional resources may impede the ongoing evolution of the curriculum. Some themes are not well developed and integrated.

The statement could easily be construed to refer to issues relating to accreditation standards about curriculum management, adequacy of resources to support the curriculum, sufficiency of content coverage in the curriculum, or any combination of those issues. In Study Period 2, by contrast, the team report would require citation of the specific standard or standards that were not being met, and a summary of the evidence supporting noncompliance for each standard that was identified. In most cases, that evidence would be extracted from the information provided by the program and from interactions during the survey visit that are specifically linked to the standard or standards involved.

Before the reformatting of the standards, it was difficult for the LCME to track programmatic improvements in meeting accreditation standards and to identify compliance problems over time. This was a result of the lack of precision in the reports themselves when they were not oriented directly to the standards. In Study Period 2, each standard with which a program was not in compliance was identified in a follow-up report and tracked for the degree of resolution. The greater clarity of accreditation findings that resulted from the standards reformatting would also make it much easier for the staff of medical education programs to understand the basis for such findings and to determine the actions needed to address them appropriately.

Arraying the annotations with their respective standards helped to express the intent of each standard more clearly, which likely improved understanding and, in turn, may have led to more frequent findings of noncompliance and more severe action decisions. For example, in Table 3, the number of citations related to standard ED-30 (formative/summative feedback) increased from 4 in Study Period 1 to 20 in Study Period 2. This change can be largely attributed to a change in the annotation that explicitly defined the “timely” return of clerkship grades to mean return to the student within four to six weeks.

It is clear that there were many downstream effects of the reformatting of the LCME standards, some anticipated and some not. It is also likely that other factors have contributed to the increased number of severe action decisions made by the LCME in Study Period 2. For example, approximately 10 years ago, the U.S. Department of Education instituted the “two-year rule,” which mandates that, absent exceptional circumstances, accrediting organizations require a program to come into compliance with a standard within two years of the determination that it was out of compliance. This rule, in concert with the improved ability to track a program’s progress in achieving compliance with individual standards, has likely contributed to the increased number of LCME severe action decisions.

In addition, the clarification of the LCME standards that resulted from their reformatting allowed for more extensive training of survey team members than had previously been possible. Sessions devoted to survey team training began in approximately 1999–2000. The enhanced clarity of standards and annotations made it easier to create cases or scenarios that would help lead to greater consistency in survey teams’ judgments. The effect of improved survey team training is difficult to assess because the teams were made up of different individuals in the two study periods.

The increase in citations in the ED and MS categories noted in Tables 2 and 3 lends weight to the argument that the reformatting of the LCME standards has had a significant impact. Medical education programs were found to be out of compliance with ED and MS standards more frequently in Study Period 2 than in Study Period 1, and these are the categories of standards whose clarity most likely increased the most as a result of the reformatting: More annotations were developed for these standards than for others, and these standards have been targeted more frequently for surveyor training. These standards are, on the whole, more process focused, and the greater clarity achieved in the reformatting provided a clearer explication of the requirements for compliance. The higher numbers for IS and ER standards in Study Period 1 (compared with Study Period 2) are likely due to these standards being somewhat easier to judge with less information at hand. For example, in the absence of focused data from a program, it would have been easier to determine whether the program had a strategic plan, a building, or sufficient funds than to assess the systems that need to be in place for compliance with the ED and MS standards. Additional factors that may have contributed to these observed differences are that, in Study Period 1, there were greater concerns about medical school financing due to fears of managed care, concerns about the changing nature of hospitalized patients, and more tension between the roles of medical school deans and the vice presidents for health affairs to whom they reported.

The data presented in Tables 2 and 3 can be viewed as markers that document the shift in the focus of accreditation and in perceptions of what constitutes educational quality. In Study Period 1, programs were not consistently using the variety of program evaluation information available to them to improve the quality of their educational programs (ED-46): Nineteen medical education programs were cited as noncompliant with that standard. In Study Period 2, the number of programs cited for noncompliance with standard ED-46 dropped to three. It is disappointing to see that citations of noncompliance for central management (ED-33) have changed so little across the two study periods, with 21 programs cited in Study Period 1 and 17 cited in Study Period 2. Kassebaum et al5 commented that, on entering the 1990s, the “LCME toughened the standards” for the design and management of the medical curriculum, and they noted the importance of this standard. It is a complex standard, and it is quite possible that the issues that resulted in a program’s being out of compliance with it in Study Period 2 were different from those in Study Period 1. Even so, meeting this central and integrating standard remains elusive for too many medical education programs.

What remains unclear is whether the improved precision of the standards has also led to improvements in medical education and, ultimately, to improvements in the performance of MD physicians who practice in the United States and Canada. Limited data are available to answer these important questions. With the reformatted standards, however, it is now more feasible to link databases containing data on physician and resident performance to databases containing accreditation data about the programs that educated those physicians and residents.

The increased number of medical education programs being placed on probation during Study Period 2 is not an anomaly limited to that time period. When Study Period 2 ended in October 2009, the LCME Web site showed that the medical education programs at three schools were on probation. In May 2012, when this article was published, the site showed five programs on probation, so the upward trend is, unfortunately, continuing. Whatever the reasons for the increase in the number of severe action decisions made by the LCME, the reformatting of its standards has unquestionably improved their clarity and precision. As a result, the LCME can provide more precise feedback to a medical education program and is now much better able to track whether, and to what degree, a program has made a correction. Because of the eight-year accreditation cycle, programs that went through a full accreditation survey in the early to mid-2000s are now becoming due for another full accreditation survey. The staff of these programs may not have been monitoring these changes to the standards and, as a result, may be puzzled as to why they are being cited for issues that, in the past, went unnoticed. It may be two or three more years before the word gets out that, to paraphrase the old Oldsmobile television commercial, “This is not your father’s LCME.”

Funding/Support: None.

Other disclosures: None.

Ethical approval: Not applicable.

* Because a warning by the LCME might precede a future action other than probation (e.g., withdrawal of accreditation), the LCME replaced the term “warning of probation” with the term “warning” in February 2012, after this article went to press.


References

1. Council for Higher Education Accreditation Specialized/National Advisory Panel; Council for Regional Accrediting Commissions. The Value of Accreditation. Washington, DC: Council for Higher Education Accreditation; February 2010.

2. Proceedings of the conference between the American Medical Association and the Association of American Medical Colleges; February 18, 1942; Chicago, Illinois. Archives of the Association of American Medical Colleges.

3. Kassebaum DG. Origin of the LCME, the AAMC-AMA partnership for accreditation. Acad Med. 1992;67:85–87.

4. Liaison Committee on Medical Education. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the M.D. Degree. http://www.lcme.org/standard.htm. Accessed January 13, 2012.

5. Kassebaum DG, Cutler ER, Eaglen RH. The influence of accreditation on educational change in U.S. medical schools. Acad Med. 1997;72:1128–1133.

© 2012 Association of American Medical Colleges
