Michigan's extensive on-site review process allows participants to exchange key information, share best practices, ask questions, and seek clarification. Michigan conducts site visits every 3 years, sending 15 reviewers per weeklong visit. The state is in its third cycle, having conducted more than 100 site visits of its 45 LHDs. To strengthen its process, Michigan is developing a model that incorporates the NACCHO Operational Definition of a functional LHD into the organizational capacity review.12 The model will increase accountability for agencies' capacity to deliver the core public health functions and the essential public health services.14,15 To demonstrate accountability, agencies will have to meet measurable standards and show improved performance. Recognizing LHDs' strong interest in high performance and continuous quality improvement,2,11 the Shewhart Cycle of Learning and Improvement (Plan, Do, Check, Act) will be used as an integral part of the model.16,17
Missouri's voluntary system has three accreditation levels: primary, advanced, and comprehensive. On the basis of its capacity, the LHD selects one of these levels when it begins the self-assessment process, which may last up to a year. Each agency does a process and impact evaluation, and the reviewers complete a process review with each visit. Site review teams are made up of three reviewers per team (there are 16 in all); reviewers typically hold advanced degrees and have held positions in local public health agencies. They receive an honorarium for their reviews. An independent evaluator compiles the results and submits a report to the Accreditation Council, which then revises processes as needed. Missouri uses an evaluation process to maintain the integrity of the site review, the reviewers, and the participating agencies.
The Missouri Institute for Community Health continually refines its standards and review process to eliminate duplication and to become more customer-centered. The first 3-year cycle was completed in summer 2006; standards were revised and streamlined for the second cycle.6
North Carolina's LHDs include rural and urban areas, large and small health departments, public health authorities, a community health alliance, and district health departments that represent multiple counties. North Carolina's mandatory standards-based Local Health Department Accreditation (NCLHDA) program focuses on a set of minimum standards that LHDs must meet to ensure protection of the public's health, but it does not limit the services or activities an LHD may provide to address specific local needs. While the benchmarks being applied are similar to the NACCHO Operational Definition of a functional LHD12 and are drawn from work done in other states, the activities are specific to practices in North Carolina's LHDs. The NCLHDA does not create a new accountability system; rather, it links basic standards to current state statutes and administrative code and to the many Division of Public Health and Division of Environmental Health contractual and program monitoring requirements already in place.
The site review teams comprise no fewer than four individuals, selected by the independent accreditation administrator from a pool of trained peer volunteers. Reviewers receive customized, comprehensive training and participate in conference calls and team meetings before each site visit begins. They receive an honorarium for participating in the program.7
Since 2001, Washington State has conducted three cycles of assessment. Washington uses 23 standards to assess the performance of the public health system. These standards have been cross-walked with the Core Functions of Public Health, the Ten Essential Services, and the NACCHO Operational Definition, and are used for quality improvement and to augment program measures and outputs. This approach is predicated on the belief that standards and program measures are essential tools for improving outcomes.18
On-site reviews for each LHD and for most areas of the state health department are carried out by independent contract consultants and staff peer reviewers. Staff peer reviewers receive training, and all reviewers participate in interrater reliability testing to ensure that scoring is consistent. Before the site visit, the LHD completes a self-assessment. Site-visit activities include review of written documentation and entrance and exit interviews. Each LHD receives a written report detailing its performance on each measure, with recommendations for improvement.
The Washington program makes a special effort to make its site visits useful for gathering and sharing best practices systemwide. All on-site reviews are conducted during a 4- to 5-month period, allowing for a systematic assessment and snapshot of statewide performance. The system report identifies the specific results for each measure, comparisons between the state health department and LHDs, comparisons with previous years' results, recommendations about systemwide improvements, and improvements to the standards, measures, and processes. Documents that exemplify the work expected for each measure are collected and placed on a Web site, and agencies are encouraged to use these exemplars as they work to improve their individual results.8
In each of the MLC states that use on-site review, it is a central feature of the accreditation or performance assessment process. The processes differ: Michigan, Missouri, and North Carolina evaluate LHDs only and do so on a rotating schedule across multiyear cycles; Washington evaluates LHDs, selected state programs, and the state Board of Health, and conducts all reviews during a 5-month period every 3 years to capture system performance. The length of time on site ranges from one half-day to 5 days, and the number of reviewers from 2 to 15. All of these states emphasize assessing administrative capacity and the adequacy of the public health infrastructure.
Most of these states use self-assessment to help the agency prepare for the on-site review. Anecdotal reports and survey data suggest that the amount of time spent on the self-assessment, and the seriousness with which it is conducted, may positively affect the accreditation or performance assessment outcome, both in how quickly the outcome is reached and in how much correction is needed to achieve conformance. In a Michigan survey of agencies, 86 percent of respondents agreed that the self-assessment is useful for identifying areas needing improvement.11 In the same survey, 84 percent of respondents reported that the self-assessment is useful for preparing for the on-site review, and 76 percent reported that it is a catalyst for prereview consultation.11 Michigan has used these data to validate the self-assessment as a tool for preparing for the on-site review and for identifying areas needing improvement.
While the self-assessment gauges the LHD's readiness, the on-site review serves as the springboard for quality improvement and provides the venue to evaluate conformance with standards and measures. The on-site review is used to conduct interviews and examine agency documents and policies. It also provides a forum for discussion of conformance and best practices.
No state holds site visits more often than every 3 years. All states evaluate the on-site review process and are concerned with the challenge of reviewer consistency in evaluating conformance with standards. To address this challenge, increase consistency, and improve the review process, all of these states provide ongoing training for reviewers and for the agency undergoing review. Missouri uses a site-visit manager to ensure consistency. Washington uses an established interrater reliability process with mock evaluations, and Michigan and North Carolina use various interrater reliability processes, such as reviewer observation or job shadowing, depending upon the need.
Coordinating the on-site review process, including scheduling and making logistical arrangements, is labor intensive. In most states, a third party coordinates the review process and handles related activity. This third party is typically a public health institute or nongovernmental entity, such as a private contractor. Reasons for this vary, but include aspects of neutrality and technical expertise.
It appears that both the voluntary and mandatory approaches have the potential for full participation, which may have positive implications for a national voluntary system of accreditation for local and state public health entities. Michigan and North Carolina require participation in accreditation or performance assessment and the associated on-site review process; in Missouri and Washington, it is voluntary. Both Michigan and Washington have achieved 100 percent participation in their programs, although one is mandatory and the other voluntary. Missouri and North Carolina expect to achieve full participation. Of these states, none charges a fee specifically for the on-site review, although Missouri does charge an application fee for its voluntary accreditation program.
An important consideration for any accreditation or performance assessment program is reviewers' knowledge, experience, skills, and organizational affiliation. Reviewers are selected by various methods, and their organizational affiliations and educational backgrounds vary. Michigan uses only state department reviewers, whereas Missouri specifically excludes state employees from participation. Washington uses private contractors, and North Carolina uses trained peer volunteers selected by an independent accreditation administrator. All of the states have used peer reviewers at some point in their processes. Reviewers typically hold advanced degrees, have extensive state or local public health experience, and are specialists in their field.
Other important components of the on-site review process in all of these states are the exit interview and the on-site review findings reports. Exit interviews are mandatory and serve as forums where both strengths and weaknesses are discussed. Written reports of the on-site review findings are provided to all reviewed agencies. If standards are not met, three states require the reviewed agency to submit a corrective plan of action and be re-reviewed for conformance.
All states evaluate their on-site review process. Their ongoing evaluation activities continue to demonstrate the critical importance of the face-to-face exchange between reviewers and participants. Michigan's survey data suggest that success of the face-to-face exchange (as viewed by LHDs) depends largely upon reviewer skills, knowledge, attitude, beliefs, and behavior.11 The survey results found marked differences between reviewer and LHD perception with regard to how well the face-to-face exchanges were occurring. For example, when asked, “Do program reviewers have a good understanding of accreditation standards?” 50 percent of LHD respondents indicated “no” while 100 percent of reviewer respondents indicated “yes.”11 When asked, “Do reviewers apply accreditation standards the same way?” 80 percent of LHD respondents indicated “no” while 70 percent of reviewer respondents indicated “yes.”11 In addition, when asked, “Do program reviewers apply accreditation standards the same way at each LHD?” 40 percent of LHD respondents indicated “no” while 90 percent of reviewer respondents indicated “yes.”11 In spite of these differences, when asked the question “Is the on-site review an opportunity for constructive program-related dialogue?” 70 percent of LHD respondents indicated “yes” and 90 percent of reviewer respondents also indicated “yes.”11 The data suggest that differing perceptions exist depending on whether one is a reviewer or an LHD and that successful on-site review processes hinge on positive, productive, and appropriate interaction between reviewers and state/local participants.
All states in the MLC conduct accreditation/performance assessments to measure LHD organizational capacity and performance. In addition, these states also are conducting or exploring the use of accreditation or performance assessments to measure state health agency performance and capacity. Specifically, Washington has been conducting a state performance capacity assessment since 2002; Missouri completed an analysis of the state system in 2005, using the National Public Health Performance Standards and is exploring next steps; North Carolina initiated a state assessment in 2006; and Michigan began to explore development of a state process in the fall of 2006. Each state uses or plans to incorporate an on-site review component with respect to measuring state agency performance.
Illinois, although an MLC state, was not included in this study because it does not conduct on-site reviews for LHD certification. In the past, Illinois Department of Public Health Regional Health Officers conducted on-site reviews with certified LHDs for the purpose of performing midterm reviews of LHD compliance with Illinois certification standards. Illinois is exploring the future function and structure of on-site reviews. Future research into this topic should take into account the Illinois experience as it unfolds.
The examination of accreditation/performance assessment programs in Michigan, Missouri, North Carolina, and Washington has implications for other states and an evolving national accreditation model. When constructing the framework for state/local and/or national accreditation or performance assessment initiatives, several key concepts should be considered.
The MLC states in this study have found including on-site review and self-assessment components vital to their ability to assess accountability and performance of LHDs. According to their experiences, on-site review provides the venue to assess, observe, interview, review, evaluate, and/or survey a local/state agency or program regarding its ability to meet a set of public health standards. This process opens up a dialogue that allows LHDs to amplify, clarify, and verify their activities. It is expected that over time this dialogue will build trust between LHDs and their reviewers. That potential hinges on positive, productive, and appropriate interaction between public health professionals. Therefore, to maximize buy-in to an assessment or accreditation program, state reviewing entities should address reviewer recruitment, selection, skill set development, training, and interrater reliability, and ensure that LHD perceptions of reviewers are in line with reviewers' own perceptions of their roles and abilities.
Architects of new accreditation or assessment initiatives should consider establishing accreditation on-site reviews at both the state and local levels to promote support and collaboration between state and local public health agencies in each state. Advancing public health for all citizens requires a partnership, a systems approach that requires assessment of both state and local performance, focusing on improved public health outcomes.
Other essential activities related to the on-site review process should include a focus on continuous quality improvement of the review process itself, including the reviewers' and the LHD's experiences during the review. Accreditation and performance assessment are living processes. All MLC states have learned that accreditation/performance assessment programs—including the site visit component of such programs—must evolve and continuously improve pursuant to evaluation results, new information, and changes in public health practice or priorities.
On the basis of their experiences, the states in this study found that standards used for on-site reviews are best based on the three core functions of public health (assessment, policy development, and assurance), the Ten Essential Services,15 and, if appropriate, the Operational Definition of a Functional Local Health Department.4 The MLC states used these resources to varying degrees, in alignment with national efforts to measure public health performance.
A primary purpose of on-site review, and an added value in several of the states, is that the entities responsible for accreditation or assessment can identify, collect, and disseminate effective or model practices. Sharing effective practices can aid efforts to improve the health status of communities and service areas and to improve processes within state and/or local health departments. In some cases, as in North Carolina, using peer reviewers served a dual function: providing a competent, experienced review process and allowing the reviewers themselves to learn from reviewing other LHDs and to bring lessons back to their home agencies.
An objective accreditation or assessment process to evaluate state and local public health performance is critical to improving protection of the public's health, assuring governmental sector accountability, and building and maintaining public health capacity across the nation. According to the states forging the way on this issue, on-site review is a valuable component of this process and should be considered by all states planning to develop accreditation programs.
1. Institute of Medicine. The Future of Public Health. Washington, DC: National Academy Press; 1988.
2. A proposed model for a voluntary national accreditation program. Exploring Accreditation Project Web site. Published May 19, 2006. Accessed June 21, 2006.
3. Beitsch LM, Thielen L, Mays G, et al. The Multistate Learning Collaborative, states as laboratories: informing the national public health accreditation dialogue. J Public Health Manag Pract.
4. Mays G. Can accreditation work in public health? Lessons from other service industries. Working paper prepared for the Robert Wood Johnson Foundation. Published November 30, 2004. Accessed May 10, 2007.
5. Michigan Local Public Health Accreditation Program Web site. Accessed June 21, 2006.
6. LPHA Accreditation Program. Missouri Institute for Community Health Web site. Accessed June 21, 2006.
7. North Carolina Local Health Department Accreditation. Overview. Accessed June 21, 2006.
8. Public Health Improvement Partnership—Performance Management. Published November 2006. Accessed July 12, 2006.
9. Michigan Department of Community Health. Michigan Local Public Health Accreditation Program Tool: Developing and Implementing the Accreditation Program. Lansing, MI: Michigan Department of Community Health; 1999.
10. Pyron TS, Cline G, Tews DS, Parker MD. The Michigan Local Public Health Accreditation Program: many partners—one vision. J Public Health Manag Pract.
11. Michigan Department of Community Health. Accreditation Quality Improvement Survey: Executive Summary and Analysis. Lansing, MI: Center for Collaborative Research in Health Outcomes and Policy; 2003.
12. National Association of County and City Health Officials. Operational Definition of a Functional Local Health Department. Washington, DC: National Association of County and City Health Officials; 2005.
13. National Public Health Performance Standards Program. US Department of Health and Human Services, Centers for Disease Control and Prevention Web site. Accessed July 12, 2006.
14. Institute of Medicine. The Future of the Public's Health in the 21st Century. Washington, DC: National Academies Press; 2003.
15. Centers for Disease Control and Prevention, Public Health Functions Steering Committee. The essential public health services. Accessed May 10, 2007.
16. HCi Professional Services. PDCA cycle. Accessed June 21, 2006.
17. Turning Point: Collaborating for a New Century in Public Health. From silos to systems: using performance management to improve the public's health. Accessed June 22, 2006.
18. Mauer BJ, Mason M, Brown B. Application of quality measurement and performance standards to public health systems: Washington State's approach. J Public Health Manag Pract.
Keywords: accreditation; essential services; local public health departments; Multistate Learning Collaborative; NACCHO Operational Definition; on-site review; performance assessment; public health standards; quality improvement

© 2007 Lippincott Williams & Wilkins, Inc.