Potter, Margaret A. JD, MS; Schuh, Russell G. EdD; Pomer, Bruce MPA; Stebbins, Samuel MD, MPH
Local health departments (LHDs) provide an array of essential services on a day-to-day basis. When emergencies occur, they must reallocate or augment resources such as personnel time, compromise working infrastructure such as program schedules, and, depending on severity and duration of the event, even suspend routine service outputs. These changes define an organization's “adaptive” response.
Previous studies in the private nonprofit sector had established that because all organizations develop in stages of size and complexity, measuring certain commonly held attributes can yield insights into their capacity and sustainability.1,2 That previous work suggested that similar measurements could describe organizational adaptations occurring during the course of an emergency. The common organizational attributes of public health departments—as for other organizations—include personnel or staffing, activities or outputs, and infrastructure (including both physical facilities and organizational structure).
This article reports on the collaboration between a number of California health departments and university-based preparedness researchers that led to development of the Adaptive Response Metric (ARM). Through repeated iterations of data gathering, analysis, and reliability testing, the collaboration identified patterns of change in these attributes during the course of emergency response. The result is a standardized rubric for stage-state scoring, described methodologically in a previous publication.3 Although originally intended as a tool for managing emergency response and recovery, the ARM's pilot testing with a group of LHDs in California suggests its utility also for preevent planning and after-action reporting. The H1N1 pandemic of 2009-2010 provided an opportunity to apply the ARM to a relatively long-duration emergency event.
This article presents the process of developing the ARM through interaction of the research team with participating LHDs. First, we describe the ARM development process and its central features. Second, we present the preliminary assessments by health department officials from 8 participating counties, based on application of the ARM method to a variety of emergency types. Third, we present the detailed results of using the ARM to capture and analyze H1N1 event data for 1 county. The article concludes with a discussion of how to further refine the ARM for planning, decision making, and after-action analytics, as well as its potential contribution of reliable data for research and policy development.
California Health Officers' Initiative
In 2005-2006, the Health Officers Association of California initiated the development of a planning and assessment tool for use in the preparedness efforts of its member health departments.4 Its director (B.P.) eventually invited researchers into the process. The collaboration depended on weekly teleconferences, monthly reports to the Health Officers Association of California Board and the state's Emergency Preparedness Office, and regular meetings with a special Advisory Committee composed of local and regional public health emergency preparedness coordinators, local health officers, and LHD executives, as well as representatives from the state.
Development of the ARM
Development of the ARM was based on methods of evaluation science that emphasize the interactivity of researchers and end users.5 The research team engaged 8 LHDs in data gathering and analysis. The health departments represented urban and rural areas; north, south, coastline, and border jurisdictions; as well as low-, medium-, and high-funding per capita levels. The university's institutional review board reviewed and approved the exempt study design but disallowed disclosure of the participating counties by name.
The health departments supplied data from official Web sites, internal documents, and after-action reports, as well as personnel who participated in semistructured interviews. Researchers considered these data in 4 categories:
- Structural data described each health department's relationships and accountability within the local government, organizational structure, functional units (ie, divisions or departments), budgets, and position descriptions.
- Personnel data included, for each functional unit, the number of individuals and full-time equivalents, exempt and nonexempt job classifications, and work hours.
- Infrastructure data captured each functional unit's processes, facilities, equipment, and supplies.
- Output data described the types and frequencies of all programs, services, products, and reports by functional unit.
The structural data were used to construct a matrix for presenting personnel, output, and infrastructure data over periods of time. As shown in Figure 1, the row assigned to each functional unit in the matrix corresponds in height to the proportion of that unit's overall contribution to the health department's effort. A separate row for each functional unit (not illustrated in Figure 1) could be used to represent personnel, infrastructure, or outputs, with rows varying in height relative to the size of each unit. Columns in the matrix correspond to time periods (ie, day, week, or month). Color coding for each cell in the matrix denotes each of the progressively adaptive stage states: 1, blue; 2, green; 3, yellow; 4, orange; 5, red. For example, changes over time in personnel activities (as found in time records, deployment records, and cost recovery records) are represented by changes in color across time periods; similarly, changes in infrastructure (such as extension of operating hours, addition of operating shifts or operating days, and outsourcing) might also be represented by changes in the color code. The color code associated with each stage thus provides a visual representation of change in resource consumption over the course of a response as the health department adapts from normal to disaster and eventually back to normal levels.
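The matrix and color-coding scheme described above can be sketched as a simple data structure. The unit names, time grain, and stage values below are illustrative placeholders, not data from the study:

```python
# Illustrative sketch of the ARM stage-state matrix (hypothetical data).
# Rows are functional units, columns are time periods (e.g., weeks);
# each cell holds a stage state from 1 (normal) to 5 (maximum adaptation).

STAGE_COLORS = {1: "blue", 2: "green", 3: "yellow", 4: "orange", 5: "red"}

# Hypothetical stage states per functional unit across 7 time periods.
matrix = {
    "Epidemiology":   [1, 2, 3, 4, 3, 2, 1],
    "Laboratory":     [1, 1, 2, 3, 2, 1, 1],
    "Administration": [2, 3, 4, 5, 4, 3, 2],
}

def to_colors(matrix):
    """Map each unit's stage states to the ARM color code."""
    return {unit: [STAGE_COLORS[s] for s in states]
            for unit, states in matrix.items()}

colored = to_colors(matrix)
print(colored["Administration"])
# ['green', 'yellow', 'orange', 'red', 'orange', 'yellow', 'green']
```

A production version would also weight each row's display height by the unit's share of total departmental effort, as the matrix in Figure 1 does.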
The Table shows a rubric for determining the stage state of emergency response burden in a public health department as measured by personnel, outputs, and/or infrastructure. For each data attribute at each stage, the Table shows a 2-part definition: first, a qualitative description of relevant observations, and, second, a quantitative measure derived from the data.
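The quantitative half of the rubric can be illustrated as a threshold function. The measure chosen here (fraction of a unit's personnel effort diverted to response) and the cut points are hypothetical stand-ins; the actual definitions appear in the Table:

```python
# Hypothetical stage-state scoring function. The threshold values
# below are illustrative placeholders, not the rubric's actual
# cut points, which are defined per attribute in the Table.

def stage_state(fraction_diverted: float) -> int:
    """Return an ARM stage (1-5) for the fraction of a unit's
    personnel effort diverted to emergency response."""
    if not 0.0 <= fraction_diverted <= 1.0:
        raise ValueError("fraction must be between 0 and 1")
    thresholds = [(0.05, 1), (0.20, 2), (0.40, 3), (0.70, 4)]
    for cutoff, stage in thresholds:
        if fraction_diverted <= cutoff:
            return stage
    return 5  # more than 70% of effort diverted

print(stage_state(0.10))  # 2
print(stage_state(0.85))  # 5
```

Pairing a function like this with the qualitative description for each stage lets two raters reach the same score from the same records, which is the reliability property the rubric is designed to provide.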
Over a period of years (2009-2011), the research team collected data from emergency response activity in participating LHDs and entered the data into spreadsheets. The spreadsheet data were then presented as color-coded visualizations of ARM stage-state scores. ARM scoring was prepared for the responses to a variety of public health emergencies, including a local measles outbreak, a series of summer wildfires, a natural gas explosion and fire, and the H1N1 pandemic. Local health department leaders from the participating counties provided feedback regarding usability, accuracy, and reliability of ARM visualizations. Their critiques led to several insights about the ARM's potential utility.
First, ARM visualizations revealed with specificity the functional units stressed by emergency response and those relatively unaffected by it. Some units could maintain normal or stage 1 category operations, whereas others were moving to stages 3, 4, and 5. Even the functional units most affected by response burden did not typically maintain stages 4 and 5 adaptations for the full duration of an emergency but rather returned gradually to normal functioning. Sometimes, regular weekend cycles of reduced activity punctuated an entire emergency period. Health officers observed that recognition of such peaks and cycles could influence the duration of active-duty shifts and thus avoid burnout among key personnel—a major stressor during prolonged responses.
Second, the practitioners found that the ARM gave a picture of emergency response more nuanced than subjective recollection alone could give. For example, when interviewed about response to a season of wildfires, a health department's staff had recalled exerting a maximum effort for the entire summer, but daily time logs as captured and presented by the ARM showed a response actually occurring in 3 distinct peaks separated by days or weeks. The peaks corresponded roughly with the occurrence of 3 waves of wildfires affecting different acreages of land. The ARM also revealed characteristics of response that other types of after-action reporting might mask or minimize. Using data from several counties, the ARM showed a particular stressor of the H1N1 pandemic: it had been severe enough to require diversion of resources from day-to-day outputs but not so severe as to call for official suspension of those outputs. The need to maintain both ongoing response activities and near-normal day-to-day operations caused significant stress on many health department employees, especially at senior management levels where fewer options for backup were available.
Finally, ARM visualizations provided a measure of adaptive response activities that was standardized across all hazard types. The type of hazard would determine which functional units experienced the greatest stresses. For example, the ARM showed that the Animal Control Unit was more heavily engaged than other units in one county health department during the occurrence of summer wildfires. Regardless of event type, the ARM measured the same attributes by consistently defined stage states. In this way, it allowed for the accumulation and generalization of insights for health officers' future planning and response management.
H1N1 Response as Recorded and Visualized by the ARM
During the 2009-2010 H1N1 pandemic, the research team used the ARM tool to capture data and present analytic outputs from several of the participating California counties. For one of these counties, the research team was also able to obtain corresponding outbreak data.
This county health department served a moderate-size population of 150 000 and had a staff of 162. During the H1N1 pandemic, it conducted its immunization efforts entirely with internal staff. Some of its specimen analyses were done by its own laboratory, and a portion was outsourced to larger LHDs; this was consistent with observations in other California counties during the pandemic. Outbreak data for the county had been recorded inconsistently from July 2009 to March 2010. Confirmed case data were available for some weekly or monthly periods, and hospitalization and/or death data were available for other periods—some overlapping with and some separate from the case data. Therefore, to estimate a weekly case count from the existing data for the entire period, the research team derived H1N1 case numbers from hospitalization and mortality rates reported at the state level and then normalized all counts to weekly periods. Personnel data were gathered from postevent review of personnel records and interviews with personnel. The data account for 92.6% of the health department's total personnel.
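The estimation and normalization steps described above can be sketched as follows. The hospitalization rate, counts, and even-spreading assumption are invented for illustration and are not the county's data or the team's exact method:

```python
# Hypothetical reconstruction of weekly case counts from reported
# hospitalizations, using an assumed state-level case-hospitalization
# ratio. All numeric values here are illustrative placeholders.

STATE_HOSP_RATE = 0.04  # assumed: 4% of cases were hospitalized

def cases_from_hospitalizations(weekly_hospitalizations):
    """Estimate weekly case counts from hospitalization counts,
    assuming a constant state-level case-hospitalization rate."""
    return [round(h / STATE_HOSP_RATE) for h in weekly_hospitalizations]

def monthly_to_weekly(monthly_count, weeks_in_month=4):
    """Spread a monthly total evenly over its weeks, to normalize
    mixed weekly/monthly records to a common weekly grain."""
    return [monthly_count / weeks_in_month] * weeks_in_month

print(cases_from_hospitalizations([2, 5, 3]))  # [50, 125, 75]
print(monthly_to_weekly(120))                  # [30.0, 30.0, 30.0, 30.0]
```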
As shown in Figure 2, the comparison of ARM visualizations with outbreak data illustrates how stress—as measured by personnel resource deployment—varied during the H1N1 response. The lower graphic in Figure 2 presents the county's confirmed H1N1 cases on a weekly basis for the 39-week period, and the upper graphic is the ARM color-coded representation of the proportion of personnel within each administrative unit who were deployed to the response on a daily basis during the same period. Each functional unit is represented by a horizontal bar: immunizations, HHS-funded bioterror, health office administration, epidemiology, public health laboratory, emergency medical services, and regional services. The stress levels among functional units varied. Most units experienced low stress as shown by stage 1 (blue) and stage 2 (green) throughout the pandemic period. However, a period of high stress began in week 17 (about mid-October) and was experienced most severely by 2 units: health office administration and regional services. The high stress period was relatively brief, declining substantially after weeks 20 and 21.
This stress period shown in the ARM graphic corresponds to the peak case incidence between weeks 17 and 22 as shown in the epidemic curve. As the pandemic waned, there were minor peaks at weeks 25 (mid-December) and 29 (mid-January), but these peaks appeared not to produce high stress levels for personnel resources in any of the health department's functional units.
Further Prototype Development
The 2008 report of an Institute of Medicine committee called for research to develop “criteria and metrics” to measure the nation's preparedness.6 The ARM prototype described in this article represents an effort to meet that need. The preliminary critiques and the application to H1N1 response, as reported earlier, suggest that the ARM can contribute to an understanding of emergency adaptation patterns in public health departments. Nevertheless, potential applications of the ARM prototype in the phases of planning, response, and after-action analytics warrant further exploration.
First, the ARM might assist during emergency planning by bringing focus to areas in need of continuity-of-operations planning, investments in technology, or exercising and training. In counties that participated in preliminary testing, health officials used ARM reports as the basis for improving response operations: they decided to activate their continuity-of-operations plans for critical routine functions to ensure maintenance throughout any emergency.
Second, if used in real time during an emergency response, ARM visualizations might show where resource allocations (such as personnel time) could be adjusted across functional units. For example, a reallocation of cross-trained personnel might relieve stress on those functional units most engaged in response.
Third, as an after-action analytic tool, the ARM might provide a visualized record of response data captured during an event. The ARM offers the potential for a standardized, validated, and quantitative data set to characterize response burden across jurisdictions and emergency types. Its visualized analytic outputs might assist practitioners in accumulating experience both specific to an event and generalized to broader practices and policies. The ARM allows for generalization in the sense that it measures common attributes—such as size, structure, and staffing—that vary widely among LHDs. The importance of the ARM lies in its potential for identifying developmental and behavioral patterns among LHDs. ARM visualizations may thus enhance the ability of practitioners to draw inferences and “lessons learned” from isolated, infrequent events, which is difficult using current observational methods.
In addition to these practical applications, the ARM might also contribute to research and policy development. Use of the ARM by many health departments across jurisdictions and emergency events could improve the availability of reliable data. Since the ARM captures organizational structure and budget information, its outputs could standardize the measurement and comparison of response burdens across jurisdictions. Modeling studies could be used to optimize the selection of variables most important to decision making in emergency response so that over time after-action reports could be more selective in the inclusion of data. Widespread and frequent use in a variety of emergency events could help refine and validate the scoring rubric.
To realize its potential as a tool for practical decision making, research, and policy development, the ARM prototype needs further development. In its present form, the ARM requires time-intensive data gathering, personnel trained in using the scoring rubric, and manual analysis and plotting of much data. For routine use by LHDs, the tool should be customized for particular organizational characteristics, built for real-time input of data, and automated for analysis and output visualizations. Field testing with trained users could produce concise instructional guidebooks.