Weill Cornell Medical College, New York.
Correspondence: Nathaniel Hupert, MD, MPH, Weill Cornell Medical College, 402 E. 67th St, LA-219, New York, NY 10065 (firstname.lastname@example.org).
This work was made possible by unrestricted support for the Cornell Institute for Disease and Disaster Preparedness by Weill Cornell Medical College and the New York-Presbyterian Hospital/Weill Cornell Center. The author thanks Prof John A. Muckstadt and Dr Richard Hatchett for their many contributions to the concepts discussed in this commentary.
Disclosure: The author has no relevant financial ties to disclose.
Hurricane Sandy, which hit the northeastern United States in late October 2012, has been called the most accurately modeled storm of all time: the National Hurricane Center correctly projected the storm's path, timing, and potential for record-breaking storm surge in New York Harbor and Long Island Sound well over 24 hours before landfall.1 What happened after Sandy hit is another story entirely: 2 of the 5 hospitals in New York City that evacuated patients because of the storm found themselves doing so in the dark, with no operable elevators or communication systems. Backup power systems and other critical infrastructure—which had been widely reinforced after the prior year's near-miss with Hurricane Irene—were torn apart by floodwaters that left large parts of the city's waterfront, along with many of the nursing homes situated there for the "ocean views," uninhabitable to this day. Half a year later, patient flows through the citywide health care network are still recovering from the consequences of these unanticipated closures.
The decision to evacuate a hospital in the face of a disaster is a complex risk assessment with life-or-death consequences, typically made with incomplete information and on an accelerated time schedule—ingredients that invite caution when critiquing outcomes. But in the run-up to Hurricane Sandy, it appears that many health system leaders in the New York region engaged in a classic error of judgment called “anchoring,” perhaps because they lacked the ability to dynamically recalibrate their understanding of the storm's consequences when faced with worsening tidal surge projections.2,3 By letting 2011's anticlimactic experience with Hurricane Irene weigh more heavily in their mental risk calculus than the actual meteorological evidence that Sandy had the size, atmospheric pressure, and angle of attack to cause extreme inundation in the East and Hudson rivers, these leaders chose not to preemptively transfer patients from hospitals that were shown days before landfall to be within the water's destructive path. It is perhaps a testament to the power of prior expectations that some of these health care leaders claimed even afterward that “all signs pointed against a storm emergency,” although Mayor Michael Bloomberg used the very same data to predict—to within a foot—the actual catastrophic level of storm surge more than a day prior to Sandy's arrival.
The articles in this special issue of Journal of Public Health Management and Practice discuss the wide variety of dynamic systems that affect preparedness and response to public health crises. It is instructive, in this context, to consider 2 of the many dynamic systems that played out in Hurricane Sandy—in the hurricane and at the hospital. Thanks to decades of federal investment in meteorological science and computer modeling, which, in turn, benefited from centuries of advances in physics, chemistry, and mathematics, scientists were able to make remarkably accurate predictions for the natural phenomena occurring within the storm's 1000-mile-wide wind field.4 For the city block-scale hospitals that were of wholly human design and operation, however, we could do nothing of the sort. Quantitative modeling of hospital activities has a much shorter history, with attention paid mainly to issues of scheduling and bed supply, largely in countries with nationalized health systems for which the efficiency of these systems is a priority. Many of these models aim to capture a very un-dynamic feature of operations, namely, long-term steady-state equilibria that are of interest to regional capacity planners. Predicting the likelihood of failure of critical hospital systems requires conceptually flipping this standard approach on its head—instead of expected values, it is the potential for extremes that matters for understanding the resilience of health care systems under stressful conditions.5
For the last decade or more, the standard approach to addressing this uncertainty has been a facility-specific hazard vulnerability analysis carried out by hospital preparedness planners. Hazard vulnerability analyses start with the question, "What risks do we face and what might be the consequences of those risks?" Working through these questions enables a strategic form of analysis that, ideally, leads to answers to questions such as: "What might be done to mitigate those risks? What is feasible to do now, and what changes need to be put in place to do more? And what will these steps cost in time, effort, and money?"
When a crisis occurs, however, the long-term strategic approach is the first thing to go out the window, as noted by Eisenhower in his famous comment that “in preparing for battle, plans are useless, but planning is indispensable.” In its place, emergency responders need tools to help maximize the effective use of existing resources in the face of uncertainty affecting both demand for and supply of services. When the dynamics of crisis response are considered in this light, tactical flexibility is key. Knowing the optimal order of evacuation of a critical care unit's occupants may not be of much use, for example, when the pre-event plan called for elevator use but the hospital is left without power. In such situations, human ingenuity and perseverance often save the day: the fact that not a single one of the hundreds of evacuated patients died during Sandy's aftermath was widely touted as a sign of success of the entire process. But the very fact that the evacuations had to rely on heroism can be seen as a sign of system failure: given the chance to relive the flood, what hospital administrator would have chosen to place the fate of those emergently evacuated in such tenuous circumstances?
Critics of this line of thought might counter that the particular system failures that happened during Sandy occurred because of a unique confluence of events that is unlikely to be repeated. They might argue, for example, that a probabilistic risk analysis would have predicted that a preemptive evacuation likely would have caused more harm than good. The proper course of action, under this line of reasoning, is to improve institutional resilience in the prestorm period and then essentially ride the storm out once the decision to remain open has been made. The driving notion here is, to paraphrase Tolstoy, that "every hospital crisis is a crisis in its own way," so attempts to project outcomes are doomed to fail.
But if what these critics say is true, that every crisis is sufficiently unique to preclude reliance on pre-event projections of possible outcomes as the basis for defensive actions, then their line of reasoning also provides a powerful argument for rethinking how we address the dynamics of crisis response, with special attention to the iterative role of data gathering and analysis in informing action. As shown in the Figure, effective crisis response is highly dependent on the combination of real-time data and the ability to transform those data into actionable information. The 2011 Tōhoku earthquake and tsunami provides a good example of the relationship between data and information in this context. Moments after the earthquake was sensed, wave-height data from sea-based buoys received by the Japanese Tsunami Center were used to predict potential tidal wave heights along the country's northeast coast.6 There were 2 problems, though: first, the initial magnitude of the quake was underreported as 7.9, leading to underestimates of maximal wave heights, and, second, when the magnitude was corrected upward to 9, it turned out that there were no prerun scenarios with such extreme inputs, so no immediate correction was sent to at-risk communities. In short, there was no way (or at least, no quantifiable way) to turn the dynamically changing data into similarly dynamic, actionable information in the minutes before the tsunami hit. In light of these failures, the Japanese National Research Institute for Earth Science and Disaster Prevention has begun work on an ambitious fleet of model-linked sensors off Japan's eastern seaboard, the purpose of which is to feed sensor data into real-time wave-height models.
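The failure mode described above—a warning system built on a fixed library of precomputed scenarios that simply has no answer when the observed input falls outside its range—can be illustrated with a minimal sketch. All magnitudes, wave heights, and function names below are hypothetical and chosen only to mirror the logic of a prerun-scenario lookup, not the actual Japanese system:

```python
# Hypothetical sketch of a prerun-scenario warning table.
# Magnitudes and wave heights are illustrative, not real Tohoku data.

PRERUN_SCENARIOS = {
    7.5: 3.0,   # earthquake magnitude -> predicted max coastal wave height (m)
    7.9: 4.0,
    8.4: 6.0,   # largest scenario ever prerun in this sketch
}

def predicted_wave_height(magnitude):
    """Return the prediction from the closest prerun scenario at or above
    the observed magnitude, or None if no scenario covers the event."""
    candidates = [m for m in PRERUN_SCENARIOS if m >= magnitude]
    if not candidates:
        return None  # observed event exceeds every precomputed scenario
    return PRERUN_SCENARIOS[min(candidates)]

# Initial (under)reported magnitude: a scenario exists, so a warning goes out,
# but it understates the threat.
print(predicted_wave_height(7.9))

# Corrected magnitude: no prerun scenario exists, so the table is silent
# exactly when an updated, stronger warning is most needed.
print(predicted_wave_height(9.0))
```

The point of the sketch is that a lookup table degrades abruptly at its edges; the model-linked sensor network described above replaces the static table with computation performed on live data, so the prediction can track the event rather than the planning assumptions.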
Although natural and human-caused disasters are a constant feature of modern life, each individual disaster can indeed be seen as a unique confluence of uncertainties. Sandy, Tōhoku, and other recent crises show that effective response under such variability requires 2 dynamic elements: (1) real-time situational awareness in the form of quantitative assessments of both threats to and functioning of critical systems; and (2) flexibility in marshaling resources to maintain continuity of operations of those critical systems, which, in turn, requires robust and tested collaborative information systems. These are the ways that modern health system disaster preparedness can aspire to be evidence based, by transforming real-time data into decision-specific information in the hands of the individuals who have to make those decisions in the midst of each new crisis. It is a tall task, but one that, as the articles in this issue amply demonstrate, is increasingly within our collective reach.
2. Hartocollis A, Bernstein A. At Bellevue, a desperate fight to ensure the patients' safety. New York Times. November 2, 2012.
3. Kahneman D, Slovic P, Tversky A, eds. Judgment Under Uncertainty: Heuristics and Biases. Cambridge, England: Cambridge University Press; 1982.
4. Silver N. The Signal and the Noise: Why Most Predictions Fail—But Some Don't. New York, NY: Penguin; 2012:131.
5. Abir M, Davis MM, Sankar P, Wong AC, Wang SC. Design of a model to predict surge capacity bottlenecks for burn mass casualties at a large academic medical center. Prehosp Disaster Med. 2013;28(1):1–10.
6. Monastersky R. Tsunami forecasting: the next wave. Nature. 2012;483:144–146.