Insights From Evaluation of Practice Experience
The 2009 H1N1 vaccination campaign was the largest such effort in US history. More than 124 million Americans, close to 41% of the population, were vaccinated against the pH1N1 virus in late 2009 and early 2010.1 Local health departments (LHDs) were responsible for administering vaccine to the public during the 2009 H1N1 campaign, but they had relatively little guidance or experience to inform their efforts, which were complicated by the campaign's urgency and uncertainty and, at the outset, by high public demand and a limited supply of vaccine. They could not depend on previous experience or established "best practices" to decide how to administer vaccines to the public most efficiently.2 As a result, vaccination processes varied extensively across localities, which suggests that there is room for improvement in the quality of the response across locations.3
The objective of this project was to learn from H1N1 in order to improve public health systems' responses to similar events in the future. We used a positive deviance approach,4 beginning by identifying LHDs that performed beyond expectations during the 2009 H1N1 vaccination campaign. We then used realist evaluation to learn about the combinations of context and mechanisms that led select LHDs to perform well. The realist perspective posits that interventions act as catalysts that change individuals' and organizations' reasoning processes, motivations, capacities, opportunities, and the like, to promote change.3 The "context" in which an organization operates (its resources, staff, funding, etc) makes a difference in the outcomes it achieves. Different contexts may enable or prevent certain "mechanisms" from being triggered. Thus, there is always an interaction between context and mechanisms that influences outcomes, represented as: Context + Mechanisms = Outcomes, or C + M = O. We focused on high-performing LHDs rather than comparing their experience with that of LHDs that performed poorly. This approach is innovative because it focuses on learning from those who performed well rather than learning what not to do. In addition, identifying LHDs that performed poorly is a challenge, as peers are unlikely to tell researchers who did not do well, and poorly performing LHDs are less likely to share their experience than those that did well.
We used process mapping to define the steps involved in implementing public vaccination clinics. To identify high performers, we reviewed the NACCHO Model Practices Database, After Action Reports, and more than 100 publications and reports, and we worked with public health partners. We then conducted in-depth interviews with the staff of high-performing LHDs, using the process maps as a guide, to learn about the context and mechanisms that led to successful public vaccination clinics. The 20 LHDs chosen for in-depth analysis represented a variety of geographic locations across the contiguous United States. The LHDs served populations in urban, suburban, and rural communities in Washington, Texas, California, Illinois, Pennsylvania, New York, Massachusetts, Louisiana, and Kansas. There were 5 LHDs serving communities with fewer than 100 000 people, 6 with populations between 100 000 and 500 000, 4 with populations between 500 000 and 1 million, and 5 serving communities with more than 1 million people. The number of employees working in LHDs during the vaccination campaign ranged from 2 to more than 1800 full-time employees.
We focused on successful practices in different contexts. Some LHDs had far more resources than others, but those with fewer resources found creative ways to leverage existing relationships to succeed. While resources may have played a role, our goal was to identify successful practices regardless of factors outside LHD control (ie, funding, staff, etc). We found that successful LHDs defined priority groups, communicated with the public, maintained adequate staffing, established community partnerships, and remained flexible. We also found specific contexts in which particular mechanisms led to successful outcomes. For example, small, rural LHDs depended more strongly on personal relationships and less formal partnerships than did larger, urban LHDs. This most likely reflects the size of local communities: in a rural community, a neighbor may also be the local police chief whose child attends the same school. Such personal relationships allow for less formality in working with partners, whereas large, urban LHDs may require formal memoranda of understanding because staff of partner organizations do not know one another.
The positive deviance and realist evaluation approach allowed us to identify specific mechanisms that, in certain contexts, led to successful public clinics. In this way, LHDs can learn from the experience of other LHDs in similar contexts to implement successful public vaccination clinics in the future.
3. Pawson R, Tilley N. Realistic Evaluation. London: Sage Publications Ltd; 1997.
4. Marsh DR, Schroeder DG, Dearden KA, Sternin J, Sternin M. The power of positive deviance. BMJ. 2004;329(7475):1177–1179. doi:10.1136/bmj.329.7475.1177.