Lenihan, Patrick PhD, MUPP; Welter, Christina MPH; Chang, Carol MPA, MPH; Gorenflo, Grace MPH, RN
The 1988 Institute of Medicine (IOM) landmark report, The Future of Public Health, triggered a quest by the public health community to define itself in a way that would resonate both within the profession and with a broader audience.1 Efforts at definition continued in the ensuing years, with several attempts to expand on the IOM Core Functions of assessment, assurance, and policy development, but consensus was slow to develop about how to put the core functions into practice.2
A 1994 article by Edward Baker and James Harrell communicated to a broad audience a functional definition of public health at a time when the role and relevance of public health were being questioned.3 This article presented the Ten Essential Services as a consensus functional definition of public health, which was developed by the Essential Services Work Group composed of representatives from the major national public health leadership organizations.* Attempting to address the invisibility of public health, Harrell and Baker described in some detail what public health is and how public health serves each community through the Essential Services. They urged that these essential services be included as part of health systems reform.
Since then, many local public health agencies have adopted the Essential Services, in principle, as a framework for defining public health. However, this framework is very broad and describes public health writ large. Even local health departments that embrace this framework do not accept it as a definition in practice that applies directly to them. Furthermore, for years local health departments have prided themselves on their individuality, proclaiming that "if you have seen one health department, you have seen one health department." This pride reflects a failure to recognize that, in an era when the public expects greater government accountability, a murky identity, coupled with competition from other agencies whose functions may be more uniform and understandable, is unlikely to generate much public support, especially in the form of increased funding.4
The Operational Definition of a Functional Local Health Department
The National Association of County and City Health Officials (NACCHO), as the national organization of local public health agencies, recognized this dilemma and in 2002 triggered an identity clarification process for public health agencies, much as earlier efforts and the Essential Services had more broadly defined public health functions. The product of this effort, entitled Operational Definition of a Functional Local Health Department, represents a major step in developing a shared understanding of what everyone, regardless of where he or she lives, can reasonably expect from his or her local health department. It emphasized that this understanding needs to be shared by public health professionals and elected officials at all levels of government. Furthermore, communities and governing bodies need to know what public health agencies should do and be able to hold them accountable.5
The Operational Definition represents the next chapter in this 20-year quest for public health identity and relevance. Based on the Ten Essential Services, the Operational Definition includes 45 standards to help local public health agencies both define themselves along common lines and demonstrate in concrete terms what they can do to improve community health. The Operational Definition standards development was spearheaded by NACCHO and overseen by a series of task forces. Local health officials, local board of health members, state health officials, federal public health professionals, county commissioners, mayors, governors' health policy advisors, and state legislators were successively consulted, and the draft standards were vetted through these groups over a 2-year period. The strength of the final product, released in November 2005, lies in the iterative approach used and the incorporation of the various stakeholder perspectives.6
Operational Definition Indicator Development
Following the release of the standards, a systematic and evidence-based approach involving NACCHO, consultants, and local health departments was employed for Operational Definition indicator development. Charged with ensuring that indicators could both reflect and measure actual activities of local health departments, the development team felt it was important that Operational Definition indicators match the practice of public health at the local level. Several primary methodological decisions helped ensure that indicators closely mirrored local health department activities. The following provides an overview of the Operational Definition Prototype Metrics development methodology and process, which are also depicted in Figure 1.
Collection of base indicator set
State and local health department performance indicators currently in use provided the primary source of Operational Definition Prototype Metrics. In 2005, the Robert Wood Johnson Foundation funded the development of a Multistate Learning Collaborative (MLC) to advance performance assessment or accreditation of public health departments. Applicants to the project were required to submit their state program assessment indicators as part of their application. The indicators were extracted from the MLC applications. Of the MLC sites available (N = 18), 11 sites had indicators, providing the development team with hundreds of sample indicators from across the nation. Following the qualitative analysis concept of "saturation" (the point at which no new information or themes are observed in the data), MLC state indicators were reviewed multiple times throughout the Operational Definition indicator development process in an attempt to glean a comprehensive picture of local health department activities.7 In total, nearly half of Operational Definition indicators were derived from this initial base indicator set, and reflect indicators used in practice at the health department level today.
The matching process
MLC indicators were then thematically matched to each Operational Definition standard. The great variety of MLC indicators, especially in the wording used, complicated efforts to directly match MLC indicators to Operational Definition standards. An intermediate translation key was needed to ensure some degree of precision and consistency in interpreting the MLC source indicators. The National Public Health Performance Standards (NPHPS) served as a translation key to help thematically match state and local health department indicators to the Operational Definition Standards. As an established set of public health standards that is often used at the state and local levels either directly or in reference, the NPHPS confirmed that the match made from MLC to Operational Definition standards was based on a clear and consistent meaning.
The matching process was completed in two stages, as presented in Figure 2. First, MLC indicators were matched to the 112 “stem” NPHPS questions. Second, the development team matched NPHPS to the Operational Definition standards. It is important to note that this second stage required the development of a “focus statement” that helped clarify the practice-based application of the Operational Definition. The “focus” statement can be found in the final set of indicators located in the far left column beneath each Operational Definition standard in Figure 3, described below.
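The two-stage translation described above can be pictured as composing two mappings: MLC indicators are first linked to NPHPS "stem" questions, and those stems are then linked to Operational Definition standards. The sketch below illustrates this composition; all indicator identifiers and mappings are invented for illustration and do not come from the actual instruments.

```python
# Hypothetical sketch of the two-stage matching process. The IDs and
# mappings below are invented for illustration only.

# Stage 1: MLC indicators matched to NPHPS "stem" questions.
mlc_to_nphps = {
    "MLC-001": "NPHPS-2.1",
    "MLC-002": "NPHPS-2.1",
    "MLC-003": "NPHPS-7.4",
}

# Stage 2: NPHPS stem questions matched to Operational Definition standards.
nphps_to_opdef = {
    "NPHPS-2.1": "OD-Std-1a",
    "NPHPS-7.4": "OD-Std-6c",
}

def match_mlc_to_opdef(mlc_id):
    """Translate an MLC indicator to an Operational Definition standard
    via the NPHPS translation key; return None if no match exists."""
    stem = mlc_to_nphps.get(mlc_id)
    return nphps_to_opdef.get(stem) if stem else None

for mlc_id in sorted(mlc_to_nphps):
    print(mlc_id, "->", match_mlc_to_opdef(mlc_id))
```

Using the NPHPS as the intermediate key, rather than matching MLC indicators to the standards directly, gives every match a shared point of reference, which is the consistency benefit the development team describes.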
Final metric development
After matching MLC indicators to Operational Definition standards, the preliminary Operational Definition indicators were grouped and consolidated. Recognizing that performance can occur at several levels, the Handler et al systems model of public health work was used to categorize each indicator as having a capacity, process, or output orientation.8 Capacity, or inputs, included measures such as human resources, fiscal and physical resources, information resources, and system organizational resources necessary to carry out the core functions of public health. Process indicators were described as steps in a program logically required for the program to be successful. Finally, outputs were health programs and services intended to prevent death, disease, and disability, and to promote quality of life. These categories comprise the public health infrastructure that should exist within each health department, suggesting that to fully describe the agency's performance, indicators needed to be available for each category.
Once they were categorized, an extensive review of the quality, diversity, and quantity of the indicators began. Duplicate and redundant indicators were revised or eliminated. For some Operational Definition standards, few MLC site indicators were available. When no MLC state and local indicators were available to match the Operational Definition standard, NPHPS was used as a placeholder. The indicator chosen was either an actual NPHPS indicator or one that the development team created from practice-based experience, and the final set of indicators was coded to document their origin (1 = exact measure from an MLC state; 2 = measure had the same meaning as a state measure but was slightly reworded; or 3 = developed from practice experience or NPHPS was used).
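The source-coding and categorization scheme just described amounts to tagging each indicator with an orientation (capacity, process, or output) and an origin code (1, 2, or 3), which can then be tallied to document where the final set came from. The sketch below illustrates this bookkeeping; the sample indicators are hypothetical, not drawn from the actual metrics document.

```python
# Illustrative sketch of the indicator source-coding scheme described
# above. The sample indicators below are hypothetical.
from collections import Counter

SOURCE_CODES = {
    1: "exact measure from an MLC state",
    2: "same meaning as a state measure, slightly reworded",
    3: "developed from practice experience or NPHPS",
}

indicators = [
    {"id": "OD-1a-C1", "orientation": "capacity", "source": 1},
    {"id": "OD-1a-P1", "orientation": "process",  "source": 2},
    {"id": "OD-1a-O1", "orientation": "output",   "source": 3},
    {"id": "OD-2b-C1", "orientation": "capacity", "source": 1},
]

# Tally how many indicators carry each origin code.
by_source = Counter(ind["source"] for ind in indicators)
for code in sorted(by_source):
    print(f"code {code} ({SOURCE_CODES[code]}): {by_source[code]}")
```

A tally like this is what supports the article's observation that nearly half of the final indicators trace back to the MLC base set (code 1).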
Development of illustrative evidence
A final step in the indicator development process was to identify how it might be determined that a standard and its associated indicator were met. Evidence used by existing performance improvement and certification programs, along with that from the MLC sites, was reviewed to provide a list of proposed evidence that matched the final set of Operational Definition metrics. One piece of evidence could apply to multiple indicators, so evidence was proposed for all indicators associated with a given Operational Definition standard rather than attempting to create a one-to-one match between each indicator and a piece of evidence. This had the effect of reducing the evidence that would be needed to demonstrate that a standard and its indicators had been met, thereby reducing the effort required to apply the Operational Definition.
Validation and final product
A modified Delphi process was used to validate the Operational Definition indicators and evidence.9 Thirty-six local health department officials reviewed and provided feedback through both a survey and focus group phone calls on the validity of the developed indicators and the ability to demonstrate corresponding evidence.
The final set of Operational Definition Prototype Metrics includes 250 indicators that were developed over a 10-month process and can be found on NACCHO's Web site: http://www.naccho.org/topics/infrastructure/operationaldefinition.cfm. Figure 3 presents a sample page from the indicator document. The source coding for each indicator remains at the top of the first page. The left-hand column includes the Operational Definition standard along with the “focus statement” developed to help translate the NPHPS and MLC indicators to the Operational Definition. The middle column represents the actual indicators, grouped by Capacity, Process, and Output, and coded for their original source. Finally, the far right-hand column presents the illustrative evidence.
Validity of the Tool
How well Operational Definition indicators actually capture local public health agency activities and how easily the tool can be used are important considerations in judging diffusion in the field. Researchers generally employ several related concepts in evaluating the validity of a set of measures to capture phenomena they are trying to understand and predict.10 The research concepts of internal validity and external validity are relevant to this discussion. Internal validity examines the degree to which a measure captures the elements it is meant to measure, in essence asking the question, “Are you measuring what you think you are measuring?” The external validity perspective looks at whether the metrics as a whole are sufficiently broad to have general application in the field. For practice purposes, these concepts are important for knowing how well the indicators measure what is actually going on in local health departments today (ie, internal validity) and how well the Operational Definition Prototype Metrics represent a comprehensive picture of a local health department (ie, external validity).
Operational Definition Prototype Metrics' validity can, in part, be demonstrated by revisiting how the metrics were developed. The Delphi review process provided an assessment of internal validity, as the Delphi participants served as an expert review panel that rated each indicator on a 5-level Likert-type scale. The Delphi panel rated fewer than 5 percent of all indicators as poor. These indicators were revised in the second round of Delphi reviews, suggesting that the current set of indicators represents a good measure of each local health department function considered.
The extent to which the Metrics represent a robust portrayal of a typical local health department is less easy to demonstrate. Two factors are useful in assessing generalizability: indicator quantity and goodness of fit. Interpreting quantity is straightforward—a larger number of indicators will better capture the practice diversity of a given standard. For at least one half of the Operational Definition indicators, there was a large variety of MLC practice-based indicators. However, for some Operational Definition standards, there were too few MLC indicators available for use, necessitating the use of NPHPS or expert judgment. For these standards, generalizability must be determined through future application. Goodness of fit refers to how closely a set of indicators clusters around a theme. While the MLC indicators were diverse in wording, their content tended to cluster consistently around Operational Definition standards. In summary, we will not know whether the entire set presents a truly comprehensive picture of a local health department until the metrics tool is widely field tested, but this preliminary assessment is promising.
Applications of the Operational Definition for Local Public Health Practice
This initial version of the Operational Definition indicators is the first of its kind to define and describe local health department activities from a performance perspective. As a first attempt, the Operational Definition indicators should be considered in the formative stage of development. While a systematic development process has grounded the indicators in both national public health conceptual frameworks and public health practice, this process has derived only a pilot set of indicators that appear to have a good potential of being applied in practice. It is the practice application at the local level that will convert this promising pilot into a well-accepted tool for local public health agencies.
Indicators developed for Operational Definition Standards have a great potential for transforming public health practice. Exactly how that will unfold is still speculative, but several interrelated possibilities for the application of the Operational Definition seem apparent.
Defining and Marketing Local Public Health
Over half of all indicators in the MLC states matched Operational Definition standards, and more than 95 percent of Operational Definition indicators received favorable Delphi ratings; these findings suggest that the Operational Definition has already made significant progress in providing the foundation for a uniform performance-based definition for local public health departments. Individual health departments will always have unique characteristics reflecting population, geography, politics, history, and other environmental factors, but at a fundamental level, their functions should generally have some uniformity. Developing a common language and functional description at the agency level is vital to the future of local public health. The Operational Definition may play a key role in helping bridge surface differences and uncover those basic common characteristics. Even if health departments never align exactly with the Operational Definition, the tool becomes a touchstone to help practice initiatives move in a common direction.
Beyond communicating within the public health field, the use of the Operational Definition should enhance communication with the public and policy makers. Articulating local health department functions in a performance-based and standard way should increase awareness about the value of local public health and how local public health should work when performing well. Increased awareness can also help public health agencies build constituencies and partnerships for programs and services. This is particularly important in an era when public health is still not well understood or appreciated, or is defined by a narrow subset of functions (eg, emergency preparedness) that happen to be high on policy makers' and the media's list of concerns. To advance this understanding, NACCHO has developed a series of fact sheets about local health departments, based on the Operational Definition. Three different versions were targeted to elected officials, the general public, and health department employees (NACCHO Web site).
Basis for accreditation
Having a uniform set of performance metrics should help accelerate efforts toward a national accreditation program. The Robert Wood Johnson Foundation's Exploring Accreditation Initiative generated a set of recommendations for establishing a voluntary national accreditation program for state and local health departments. An essential ingredient in any accreditation scheme is the availability of standards and indicators by which to assess performance. The Operational Definition in its entirety is well positioned to serve as the initial set of standards, indicators, and evidence to be pilot-tested for use in a voluntary accreditation program.
Performance measurement, management, and quality improvement
An expected use of the Operational Definition involves performance measurement, management, and quality improvement. Much as the NPHPS have been used to set standards for the assessment of local public health systems, the Operational Definition sets standards and indicators to help assess the performance of local public health departments. But the Operational Definition indicators do not yet approach the comprehensiveness of NPHPS for measuring agency performance, and there are still significant gaps. Analysis of the currently available indicators from MLC states suggests that while some states have a performance improvement process in place, even these state indicators target specific essential services and not the performance of the health department as an organization.
While not yet complete, the Operational Definition may well provide an excellent starting point for performance measurement. This purpose could be expanded through a national program to collect the Operational Definition assessment data, which could be developed in conjunction with an accreditation system. Incorporating performance measurement and quality improvement into the public health accreditation system will add a more tangible value to accreditation, which may counter potential opposition based on claims that accreditation by itself adds little of value to public health department service delivery capacity.
Public health agency strategic management
The Operational Definition comes at a time when strategic management is becoming increasingly important to effective public health practice. Strategic management tools, such as MAPP, business process analysis, PACE EH, assessment of clinic service delivery options, and Project Public Health Ready, all require a more detailed understanding of local public health agency purpose and performance. The Operational Definition could be important in furthering strategic management efforts.* Most approaches to strategic management include a detailed analysis of an organization's strengths and weaknesses based on generally accepted performance indicators. In sectors other than public health, these indicators are available and can be used to benchmark performance relative to similar organizations in an industry or a sector. Even when a point of comparison is not available, having a set of generally accepted performance indicators provides the diagnostic basis to understand strengths and weaknesses and pursue realistic strategies. Once these strategies are in place, performance indicators then provide the basis for strategic management control, as they can be used to gauge progress and document success. Having such performance milestones is particularly important for public health agencies, whose success often hinges on distant and elusive goals aimed at improving community health.
Even in their preliminary form, the indicators are an improvement over what is usually used to assess local public health agency performance. Other tools are starting to become available that, coupled with the Operational Definition, provide a capacity to analyze local health department functions in a detailed and uniform manner similar to what has been done in other sectors. One example is the application of business process analysis and design to local health departments.11 As in other sectors that have adopted strategic management methods, sophistication in application and broad acceptance grow with use.
The Operational Definition can be viewed most simply as the next step in addressing the lack of understanding and appreciation of public health observed in the 1988 IOM report. It comes at a time of great opportunity and challenge. A clearer understanding of the role of local public health departments based on performance can lead to a profound new vision that can inspire a transformation of local public health.
1. Institute of Medicine. The Future of Public Health. Washington, DC: National Academy Press; 1988.
2. Turnock BJ, Handler AS. From measuring to improving public health practice. Annu Rev Public Health. 1997;18:261–282.
3. Harrell JA, Baker EL. The essential services of public health. Leadersh Public Health. 1994;3(3):27–31.
4. Tilson H, Berkowitz B. The public health enterprise: examining our twenty-first-century policy challenges. Health Aff. 2006;25(4):900–910.
5. National Association of County and City Health Officials. Operational Definition of a Functional Local Health Department. Washington, DC: National Association of County and City Health Officials; 2005.
7. Morse JM. The significance of saturation. Qual Health Res. 1995;5:147–149.
8. Handler A, Issel M, Turnock BA. Conceptual framework to measure performance of the public health system. Am J Public Health. 2001;91(8):1235–1239.
9. Delbecq AL, Van de Ven AH, Gustafson DH. Group Techniques for Program Planning: A Guide to Nominal Group and Delphi Processes. Glenview, IL: Scott Foresman; 1975.
10. Cook TD, Campbell DT. Quasi-Experimentation: Design & Analysis Issues for Field Settings. Boston, MA: Houghton Mifflin; 1979.
11. Public Health Informatics Institute. Taking Care of Business: A Collaboration to Define Local Health Department Business Processes. Decatur, GA: Public Health Informatics Institute; 2006.
*This workgroup included the Association of State and Territorial Health Officials, the National Association of County & City Health Officials (NACCHO), the Institute of Medicine (National Academy of Sciences), the Association of Schools of Public Health, the Public Health Foundation, the National Association of State Alcohol & Drug Abuse Directors, the National Association of State Mental Health Program Directors, and the US Public Health Service.3
*These tools available through NACCHO include the following:
* Mobilizing Action Through Planning and Partnerships
* Taking Care of Business (business process analysis)
* Making Strategic Decisions About Service Delivery: An Action Tool for Assessment and Transitioning
* Protocol for Assessing Community Excellence in Environmental Health
* Project Public Health Ready (emergency preparedness planning)
© 2007 Lippincott Williams & Wilkins, Inc.