The COVID-19 pandemic has directly caused more than 611,000 deaths and 34.6 million cases through July 2021 in the United States alone. Key sources of transmission include crowded and poorly ventilated indoor spaces, including indoor dining areas.1 Understanding which nonpharmaceutical interventions reduce transmission is crucial for balancing public health against the economic and social costs of the pandemic. Previous research suggests that mask mandates,2 sick leave policies,3 and shelter-in-place orders reduce COVID-19 spread,4 although the quality of studies causally evaluating nonpharmaceutical interventions is mixed.5 Analysis of indoor dining ban impacts has been more limited. A review of 20 studies found that reopening hospitality venues, including restaurants, was associated with a high risk of increases in COVID-19 rates.6 However, most of these studies did not differentiate between indoor and outdoor dining, leaving open questions about the association between indoor dining and COVID-19 incidence.
During the summer and fall of 2020, US state and local governments were left to negotiate closing and reopening policies without federal regulation or guidance related to activities such as indoor dining. In the early months of the pandemic, most cities and states limited indoor dining while allowing delivery, take-out, and pickup services and, sometimes, outdoor dining. As the pandemic progressed, some cities and states began reopening indoor dining while others kept it closed.7 Other cities attempted to keep indoor dining closed but were prohibited from doing so by their state and forced to reopen.
The federal system in the US delegates much of the authority to protect health to states under their police powers. States can then further delegate authority to local governments.8 The authority to prohibit or limit the power of a lower level of government to enact legislation is a legislative doctrine known as government preemption.8 States have preempted local governments on laws including minimum wage, bans on removing Confederate statues, antidiscrimination policies, and antigun legislation.8 Recently, predominantly white and conservative legislatures have adopted preemption laws to limit progressive policies in Democratic-leaning and majority-nonwhite cities.9
Some states have preempted cities from keeping indoor dining closed, forcing cities to reopen indoor dining during the pandemic.10 State preemption of city-level policies around indoor dining created substantial heterogeneity between cities in the scope of local governments’ authority to regulate indoor dining and provides an opportunity to test the association between keeping indoor dining closed and COVID-19 rates. The objective of this study is to examine the association between keeping indoor dining closed and COVID-19 rates, leveraging the variation in state governments’ use of power to prohibit local orders. This variation allows us to proxy a counterfactual contrast, which we exploit by comparing cities that kept indoor dining closed to cities that intended to keep indoor dining closed but were forced to reopen it by their respective state governments.
Following a policy trial emulation framework,11 we define units of analysis and exposure, causal contrasts, outcomes, and time zero. Table 1 and Figure 1 contain details on these design specifications. The study sample was composed of cities that expressed an intention to keep indoor dining closed. Cities were then divided into treatment and comparison groups, according to whether their respective state governments actually allowed them to continue keeping indoor dining closed. The treatment group includes cities that kept indoor dining closed while the state allowed indoor dining to reopen. The comparison group includes cities that intended to keep indoor dining closed but were preempted from doing so by the state and so reopened indoor dining. By identifying comparison cities that would have remained closed absent the preemption, we provide a stronger comparison group than designs that compare all geographies that remained closed with those that reopened, or that compare policy adoption between geographies over time.12 This is because cities that shared the intention to keep indoor dining closed are likely to share a number of characteristics related to case rates.
Table 1. Treatment and Comparison Group Definitions and Descriptive Statistics

| |Treatment group |Comparison group
|Definition |Cities allowed to reopen by the state, but that stayed closed |Cities that would have kept indoor dining closed but were preempted by their state and reopened indoor dining
|Postperiod start (time zero) |Date state allowed the city to reopen, but city stayed closed |Date the city/state reopened
|Cities |Milwaukee, Indianapolis, Philadelphia, San Francisco |Atlanta, Austin, Charleston, Dallas, Houston, Phoenix, San Antonio
|Number of cities (n) |4 |7
|Weeks indoor dining closed (before time zero), median (min–max) | |
|Case rate at time zero (per 100,000), median (min–max) | |
|Population size (millions)a | |
|% aged <18, median (min–max) | |
|% aged 18–64, median (min–max) | |
|% aged ≥65, median (min–max) | |
|% living in poverty, median (min–max) | |
|% college educated, median (min–max) | |
|% female, median (min–max) | |
|% overcrowded (>1 person/room), median (min–max) | |
|% Black (non-Hispanic), median (min–max) | |
|% Hispanic/Latino, median (min–max) | |
|% non-Hispanic white, median (min–max) | |
|% service workers, median (min–max) | |
|% foreign-born, median (min–max) | |
|% using public transit, median (min–max) | |
|% voted for Biden in 2020 | |
|Cities w/ Democratic mayors, # (%) | |
|% always or frequent mask wearing, median (min–max) | |
The study population includes cities that are members of the Big Cities Health Coalition (BCHC), an organization composed of 30 of America’s largest metropolitan health departments.13 Figure 1 shows a flowchart of treatment and comparison group selection. To identify cities that met inclusion criteria, we collected information on statewide and city or county orders by searching publicly available policy databases and state/city websites listing these orders, and by reviewing the orders themselves. We identified public statements by searching news articles, Twitter posts, and state/city websites. GC conducted an initial review of indoor dining and preemption policies across all Big Cities Health Coalition cities; GO repeated the process and added analysis of city intention to remain closed and additional cities. Any disagreement on inclusion was resolved through review by ASM, who conducted a final review of policies and public statements to ensure cities met inclusion criteria.
We examined the spring and summer period of reopenings after initial stay-at-home orders, and so limited the study sample to cities that reopened or were allowed to reopen between March 30, 2020, and October 1, 2020. We excluded cities that did not demonstrate intention to stay closed and then divided cities into treatment and comparison groups. eAppendix B; https://links.lww.com/EDE/B876 provides evidence that comparison cities would have stayed closed absent the preemption order, along with evidence on treatment cities’ decisions to remain closed and details on their subsequent reopenings. We also excluded from the treatment group cities that reopened dining less than 3 weeks after the state allowed them to reopen, to ensure the period of closure was long enough to differentiate from the comparison cities and to avoid any policy turning on and then off during the study period. To increase the number of cities, we included any US city that had >100,000 residents and well-publicized preemption-related challenges, and that otherwise met the inclusion criteria described above (opened or allowed to reopen March 30, 2020–October 1, 2020, expressed intention to remain closed, and, if in the treatment group, reopened more than 3 weeks after being allowed to do so); all cities in the United States with populations over 100,000 were included in the pool of eligible additional cities.
Preemption also helped us to identify time zero.14 For treatment cities, we define time zero as the date the city kept indoor dining closed when the state allowed reopening. For comparison cities, time zero is the date the state mandated (via preemption) that the city reopen indoor dining (i.e., when the city would have stayed closed but was preempted, so reopened) (see eTable A2; https://links.lww.com/EDE/B876 for city and state dates). Incorrectly identifying the time when the comparison group would have enacted the policy (remained closed) can bias estimates by inappropriately conditioning on or selecting on posttreatment variables, and the preemption allows for easy identification of time zero in our comparison group.11 To isolate the immediate impacts of keeping dining closed and limit the potential for time-varying treatment effects, we constrained the study period to 8 weeks total: the 2 weeks before and 6 weeks after time zero for all cities. This means the calendar date of time zero differs by city, but all cities contribute 8 weeks of observations. Throughout the article, “city” refers to the city itself, or the county in which the majority of the city resides, depending on data availability (see eTable A2; https://links.lww.com/EDE/B876).
We employ city-level (or county equivalent) daily new COVID-19 confirmed case and death counts from the Center for Systems Science and Engineering (CSSE) at Johns Hopkins University.15 We calculate daily confirmed case and death rates, and use city or county-level population denominators from the 1-year 2019 American Community Survey (ACS).16
We controlled for other time-varying city-level nonpharmaceutical interventions, specifically stay-at-home orders, mask mandates, and eviction moratoriums, derived from multiple policy trackers17,18 and review of state and city orders. We did not include covariates for restarted stay-at-home orders, school reopenings, or religious gathering reopenings, as no city restarted stay-at-home orders, no schools reopened during the study period, and no treatment city restricted religious gatherings because every city was in a state that designated religious gatherings as essential services. We included the nonpharmaceutical intervention variables as binary time-varying indicators (see eAppendix A; https://links.lww.com/EDE/B876).
We calculated city descriptive statistics using the ACS, the New York Times county-level mask survey,19 and Politico 2020 county-level election results.20 We employed a quasi-experimental approach with a difference-in-differences (DiD) analysis.21 We compared differences in COVID-19 cases before and after the potential reopening date (first difference) in cities that did and did not reopen (second difference). We assume that the change in the log(case rate) for the comparison group reflects the change that would have been observed in the treatment group had the treatment not occurred.22
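The two differences can be sketched numerically. The example below uses hypothetical weekly case rates, not the study's data (the pretreatment levels only loosely echo those reported in Table 1), to show how a DiD estimate on log rates exponentiates to a ratio of rate ratios, analogous to the IRRs reported from the count models:

```python
import math

# Hypothetical case rates per 100,000 (invented for illustration).
treat_pre, treat_post = 11.0, 6.0   # city that kept indoor dining closed
comp_pre, comp_post = 3.8, 5.5      # preempted city that reopened

# First difference: within-group change on the log scale.
d_treat = math.log(treat_post) - math.log(treat_pre)
d_comp = math.log(comp_post) - math.log(comp_pre)

# Second difference: the DiD estimate; exponentiating gives a ratio of
# rate ratios, interpretable like an incidence rate ratio (IRR).
did = d_treat - d_comp
irr = math.exp(did)
print(round(irr, 2))
```

An IRR below 1 here would indicate that case rates fell (or rose less) in the closed city relative to the reopened one, net of the shared trend.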
To examine whether prepolicy trends in COVID-19 case and death rates differed between the treatment and comparison cities, we plotted the log of average weekly confirmed case rates (and death rates) and the timing of state and city indoor dining reopenings for treatment and comparison cities. We also ran a negative binomial model interacting the treatment variable with time during the preintervention period. We measured our treatment as a binary indicator for open (=0)/closed (=1), with open defined as any level of indoor dining reopening, including with limitations (e.g., occupancy limits and special hygiene measures). We included 2-week lags to allow for the time between exposure and incubation, testing, and reporting. The pretreatment period includes outcomes from the 2 weeks before and after time zero, which, with the 2-week lag, represent exposures during the 4 weeks before time zero.
We ran negative binomial regression models, with city-day as the unit of analysis, city population as an offset, time-varying city-level nonpharmaceutical interventions in the adjusted model, and Huber–White robust standard errors clustered at the city and state levels, to address clustering. The main model specification is as follows, and detail on model interpretation is provided in eAppendix C; https://links.lww.com/EDE/B876:
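The display equation itself did not survive extraction; as a hedged reconstruction, a canonical DiD count-model specification consistent with the surrounding methods (population offset, treatment–post interaction, time-varying NPI covariates; the exact formula and its interpretation are in eAppendix C) would be:

```latex
\log \mathbb{E}[\mathrm{Cases}_{ct}]
  = \log(\mathrm{Pop}_{c})
  + \beta_0
  + \beta_1\,\mathrm{Closed}_{c}
  + \beta_2\,\mathrm{Post}_{ct}
  + \beta_3\,(\mathrm{Closed}_{c} \times \mathrm{Post}_{ct})
  + \boldsymbol{\gamma}'\,\mathrm{NPI}_{ct}
```

where $c$ indexes cities, $t$ indexes days, and $\exp(\beta_3)$ is the reported IRR comparing cities that kept indoor dining closed with those that reopened.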
Using this main model, we calculated the average daily cases prevented by keeping dining closed in the treatment cities, and then estimated the average marginal effect, calculated as the count of cases that would have been averted in all 11 cities had no city reopened over the 6-week period. In addition to the canonical DiD model, we also employed an event study model. We also used a two-way fixed effects approach, including the policy as a time-varying dummy variable that is initially set to 0 for all cities and changes to 1 for treatment cities in the postperiod, with both city and time fixed effects (sensitivity 1).
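The averted-case logic can be sketched with hypothetical inputs; the IRR below is the study's main estimate, but the daily case count is an invented placeholder, and this is not the study's actual average-marginal-effect computation:

```python
# Hypothetical sketch of the averted-cases calculation.
irr = 0.45                 # main-model IRR: kept closed vs. reopened
daily_cases_open = 240.0   # invented daily case count for a reopened city

# Predicted daily cases had the city kept indoor dining closed:
daily_cases_closed = daily_cases_open * irr

# Daily cases averted by staying closed, accumulated over the
# 6-week postperiod:
averted_daily = daily_cases_open - daily_cases_closed
averted_total = averted_daily * 7 * 6
print(round(averted_daily), round(averted_total))
```

The study's own marginal effects instead come from model predictions under counterfactual policy values, holding the other nonpharmaceutical interventions at their observed levels.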
We conducted six other categories of sensitivity analyses to test for robustness of our model to alternative specifications. Details on the sensitivity analyses are available in eAppendix D; https://links.lww.com/EDE/B876. In summary, we (1) included city fixed effects, ran a two-way fixed effects DiD specification (city and calendar-week fixed effects), and added a state fixed effect to our main model; (2) extended the preperiod to 4 weeks before time zero, extended the duration of pretreatment and follow-up to 12 weeks, and varied the prespecified 2-week lag; (3) excluded various cities, controlled for weeks of closure before the treatment, and controlled for potential demographic and protective behavioral differences between treatment and comparison cities (racial-ethnic composition, public transport use, and mask wearing); (4) repeated the main adjusted analysis using death rates instead of case rates; and (5) modeled the association using a linear model for log(rates) and applied wild bootstraps to estimate the potential bias due to a small number of clusters with robust standard errors. These analyses adjusted for posttreatment nonpharmaceutical interventions and included Huber–White robust standard errors clustered at the city and state levels, while we only included the city and calendar-week fixed effects in the two-way fixed effects and event study models. As a sixth sensitivity analysis, we explored an alternative modeling approach for staggered policy adoption and treatment heterogeneity using the Callaway and Sant’Anna approach.23
We conducted analyses in R 4.0.2 and STATA 15.1. Code for replication along with data is available here: https://github.com/alinasmahl1/covid_indoor_dining. This study was exempt from institutional review board review, as we used publicly available deidentified data.
We included 11 cities contributing 781 city-day observations and a total of 88,589 cases in the 10-week study period (4 weeks before and 6 weeks after time zero). Table 1 displays descriptive statistics on treatment and comparison cities: groups were comparable in terms of most characteristics, with the exception of % Hispanic and % non-Hispanic Black, which were higher and lower in comparison cities, respectively. Indoor dining was closed for an average of 13.1 and 5.7 weeks before cities were allowed to reopen dining (in treatment cities) and before having to open (in comparison cities), respectively. Treatment cities remained closed for a mean of 5.1 weeks after they were allowed to reopen. eFigure A1; https://links.lww.com/EDE/B876 shows the pattern of COVID-19 cases and timing of indoor dining closing and reopening from April to October; for treatment cities, the figure shows the date states allowed reopening and the date the city eventually reopened. In cities that reopened, we saw an increase in cases approximately 2 weeks after indoor dining reopened, while case rates decreased or continued to decrease in cities that maintained indoor dining closures.
On average, treatment and comparison cities had 11 and 3.8 daily cases per 100,000, respectively, before reopening indoor dining or keeping indoor dining closed (Table 1). However, as shown in Figure 2, pretreatment case rate trends were approximately parallel (exponentiated 95% confidence interval [CI] for the interaction between time and treatment = 0.84, 1.02). Additionally, in the adjusted event study model, the pretreatment trend estimates were small (exponentiated 95% CI for 4 weeks before treatment = 0.31, 0.91; for 3 weeks before treatment = 0.47, 1.38; for 2 weeks before treatment = 0.83, 1.2), suggesting parallel pretreatment trends after covariate adjustment.
Results from the models show a strong association between COVID-19 rates and keeping indoor dining closed. Keeping indoor dining closed was associated with a 55% decline in new COVID-19 case rates (IRR = 0.45; 95% CI = 0.21, 0.99) compared with cities that reopened indoor dining, in the fully adjusted model (Table 2). Based on the canonical DiD analysis, keeping indoor dining closed averted 130 (95% CI = –263, 2.0) daily cases in the average city. Had all cities kept indoor dining closed (holding all other nonpharmaceutical interventions at their actual levels), we estimate that approximately 39,661 (95% CI = –79,925, –602), or 189 per 100,000, cases would have been averted over the 6-week period. These estimates should be viewed cautiously and as general approximations. The event study specification shows similar results (eTable D2 and eFigure D3; https://links.lww.com/EDE/B876): COVID-19 case rates were higher in cities that reopened indoor dining starting 5 weeks after reopening, with 4.69 times higher case rates compared with the week before reopening (95% CI = 1.52, 14.44). The magnitude of these differences further increased over the follow-up period. Alternative specifications of the event study model showed no substantive changes to our inferences.
Table 2. Unadjusted, Adjusted, and Alternative Specifications: COVID-19 Incidence Rate Ratio (IRR)

|Model |IRR (95% CI)
|Unadjusted model |0.35 (0.15, 0.80)
|Adjusted model (main model) |0.45 (0.21, 0.99)
|City fixed effects |0.38 (0.20, 0.73)
|State fixed effects added to main model |0.16 (0.06, 0.29)
|Calendar-week and city fixed effects |0.33 (0.17, 0.61)
|Extend preperiod to 4 weeks pre “time zero” |0.46 (0.20, 1.0)
|Extend study period to 24 weeks (12 pre and post) |0.14 (0.06, 0.31)
|Decrease lag to 9 days |0.60 (0.30, 1.0)
|Increase lag to 3 weeks (21 days) |0.31 (0.13, 0.75)
|Increase lag to 4 weeks (28 days) |0.25 (0.11, 0.55)
|Remove the non-Big Cities Health Coalition cities |0.38 (0.15, 0.97)
|Remove SF (early fall reopening) |0.46 (0.17, 1.2)
|Adjusted for weeks open before treatment |0.49 (0.29, 0.82)
|Adjusted for demographic and behavioral factors |0.50 (0.31, 0.83)
|Callaway and Sant’Anna (simple average) |0.44 (0.24, 0.81)
|Callaway and Sant’Anna (weighted by exposure length) |0.42 (0.22, 0.80)
|Linear model w/ wild bootstraps |0.45 (0.13, 1.21)
|Adjusted model (with deaths) |0.39 (0.14, 1.12)
Results come from negative binomial models with city population as an offset, robust standard errors clustered at the city and state levels, and a 2-week case lag for treatment and NPIs. The adjusted model further adjusts for NPIs: mask mandates, stay-at-home orders, and eviction moratoriums. All sensitivity analyses, except for Callaway and Sant’Anna, adjusted for NPIs. The deaths model includes a 5-week rather than 2-week lag.
The changes to modeling strategy, study period, lags, city selection, and additional covariates showed no changes to our inferences (Table 2). Including city and week fixed effects showed slightly larger reductions in case rates in cities that kept dining closed compared with the main model, and state fixed effects showed even larger reductions. Extending the duration of the pretreatment and follow-up periods to 12 weeks each suggested a stronger association, as would be expected given exponential growth. Decreasing the lag period resulted in a weaker association, while increasing the lag resulted in stronger associations. Adjusting for pretreatment demographic and preventive behavior covariates or for weeks of closed dining before time zero did not change our inferences. Excluding San Francisco showed similar estimates, but with wider confidence intervals. Our sensitivity analysis using the Callaway and Sant’Anna specification shows substantively similar results to our main findings, suggesting that staggered treatment timing and heterogeneous treatment effects are not likely to be major sources of bias. The linear model with wild bootstraps showed estimates consistent in magnitude but with wider confidence intervals than the negative binomial model with cluster-robust standard errors (IRR = 0.45; 95% CI = 0.13, 1.2).
eFigure D3; https://links.lww.com/EDE/B876 shows that pretreatment log(death rates) for treatment and comparison groups appear approximately parallel (exponentiated 95% CI for the interaction between time and treatment: 0.96, 1.004), suggesting parallel pretreatment trends. Results from our sensitivity model estimating death rates show an association similar to that for case rates, albeit with wider confidence intervals (IRR = 0.39; 95% CI = 0.14, 1.12).
In this study of 11 large cities in the United States with a total of over 22 million inhabitants, we found that keeping indoor dining closed was associated with declines in new COVID-19 case rates as compared to reopening. We approached this question leveraging the heterogeneity in state and local policies, specifically comparing cities that kept indoor dining closed to cities that were preempted from doing so but would have kept indoor dining closed if allowed to. Our numerous sensitivity analyses all agreed in the direction of association, reinforcing the robustness of these findings. We estimate that, after adjusting for other nonpharmaceutical interventions, COVID-19 rates decreased by 55% over 6 weeks in cities that kept indoor dining closed compared with cities that reopened.
In a review of 20 articles using different methods, Bilal et al found that reopening hospitality venues (bars, restaurants, and nightclubs) posed a high risk for increases in COVID-19 incidence, that closing them was among the most effective COVID-19 mitigation strategies, and that these venues were frequent sites of superspreading.6 However, most reviewed studies did not differentiate between indoor and outdoor dining and could not isolate the impacts of indoor dining on COVID-19 rates. For example, a CDC quasi-experimental analysis found that reopening on-premises dining, which includes indoor and outdoor dining, was associated with an increase in cases 41–100 days after reopening,12 and another CDC analysis found that people who tested positive were twice as likely to have reported eating at a restaurant.24 Another US study found that compared with reopening other nonresidential locations, reopening full-service restaurants was associated with the largest predicted impact on infections.1 A quasi-experimental study found that restaurant dining area closures, and other nonessential venues, had impacts similar in magnitude to shelter-in-place orders.4 And, a contact tracing study from Hong Kong found that social venues, including restaurants, accounted for 33% of all traced transmissions, and presented an elevated risk for outbreaks.25
Our article uses a quasi-experimental design to provide further evidence of the impacts of indoor dining specifically, rather than examining general on-premises dining (outdoor and indoor). Our comparison group and matching on intention to stay closed satisfy several conditions suggesting a strong comparison, and our use of preemption improves upon prior analyses that could not control for differences influencing the likelihood that a geography implements a nonpharmaceutical intervention (selection into the comparison group). The analysis also highlights differences in state and local policy decision powers and authorities, a key pandemic policy question, as local governments often enacted, or attempted to enact, stricter nonpharmaceutical intervention policies than state governors or legislators, producing heterogeneous within-state COVID-19 patterns.26 Indoor dining has not only been associated with increased COVID-19 incidence but has also been shown to be potentially responsible for some of the stark racial-ethnic COVID-19 disparities.1 Higher death rates in Black and Hispanic populations are driven by higher rates of infection and exposure, thought to be due in large part to higher rates of occupational exposure.27 Analyses of occupation and COVID-19 deaths found that food preparation and service workers,28 and food and agricultural workers,29 had among the highest rates of COVID-19 deaths of all workers. Among food preparation and service workers, Hispanic workers had 8 times higher death rates than white workers. Given disproportionate representation among restaurant workers—Hispanics/Latinos account for 27% of food and restaurant workers but only 17% of the population30—reopening may increase exposure risk, particularly among Hispanic/Latino populations.
While our study could not estimate impacts of indoor dining on inequities in COVID-19, due to lack of longitudinal data on neighborhood level or race-specific COVID-19 rates across cities, or on specific occupational categories, future studies should extend this work to examine disparate impacts of indoor dining policies by race/ethnicity and occupation.
Restaurant groups and lobbyists advocated for restaurant reopening and in some cases sued (successfully and unsuccessfully) governors and mayors,31 citing the lack of research on the relationship between indoor dining and COVID-19 rates and superspreading events.32 Our study cannot speak to the impacts of indoor dining on superspreading events, but we do provide evidence that indoor dining bans are associated with reductions in cases. Mechanisms can include direct effects (reducing transmission linked to diners or restaurant workers) or indirect effects, with dining bans signaling the severity of the pandemic and incentivizing other behaviors that may reduce transmission (such as less mobility or more social distancing). Nearly all cities and states across the US reopened indoor dining in spring and summer 2021. These reopening decisions involve difficult tradeoffs between protecting public health and preventing further layoffs and restaurant closures, and they have been made more difficult by limited strong evidence on the association between indoor dining and COVID-19 spread.
Our study has several limitations. The cities are similar on multiple potential confounders that may affect new COVID-19 cases, implementation, and compliance, including political leaning, age structure, socioeconomic and housing factors, percentage of service workers, and mask compliance. However, only a small number of cities met study inclusion criteria, and treatment cities were smaller, had larger Black and smaller Hispanic populations, and used transit more frequently than comparison cities. Adjusting for these unbalanced factors did not meaningfully change the results. Moreover, these differences should not bias our difference-in-differences analysis unless they were changing differentially during the study period. However, our treatment and comparison groups may differ on other factors related to COVID-19 rates, and our approach may not fully capture time-invariant city-level confounders or unmeasured or unobservable time-varying confounders that may be associated with indoor dining reopening (e.g., other mitigation strategies, changes in mobility, or testing volume). We were unable to control for time-varying testing because longitudinal data are not available at the city/county level. This may result in an underestimation of the number of cases due to lack of testing, which would be problematic if testing changed differentially in the treatment and comparison groups after the treatment occurred. State preemption helps us to isolate associations with indoor dining from other policies implemented at the state and local level.
For example, in Texas all the study cities extended city/county stay-at-home orders after the Governor ended the state’s stay-at-home order, and the cities allowed reopening only of services explicitly preempted by the state order, including indoor dining, while keeping nonessential services such as bars, hair salons, and gyms closed.33 However, the timing of indoor dining reopening in some states coincided with other reopenings, such as museums, malls, and theaters. Although contact tracing data suggest that such activities pose relatively limited risk for COVID-19 transmission,34 if reopening other nonessential leisure activities or removing other mitigation measures also contributed to increased infections, then our results may overestimate the association attributable to indoor dining.
Importantly, and given the transmission dynamics of COVID-19, the difference between groups in pretrend levels of COVID-19 case rates may bias the association by affecting the subsequent rate of change,22 and people may have been less cautious in cities with lower baseline rates. Our model measures associations with indoor dining policies, not the act of engaging in indoor dining specifically, capturing cases due to policy-related behavior change: for example, reopening dining may signal reduced risk to residents, leading to more risky activities. In addition, we were not able to fully account for other features of disease transmission (city-specific time trends) after time zero that could differ between treatment and comparison cities for reasons unrelated to indoor dining. As a result, our results may overestimate the impact of keeping indoor dining closed. Finally, we only measured a binary indicator of reopening (open vs. closed), although different capacity levels or other restrictions may differentially impact COVID-19 spread,1 and future research should examine the impacts of these forms of policy implementation. The results may have differed had reopenings occurred during colder months, when outdoor gatherings are less common and people may be more likely to gather in homes if indoor dining is closed. Such factors are important determinants of COVID-19 spread and may limit the generalizability of our findings to other seasons and contexts. Similarly, our study only included cities that demonstrated an intention to keep indoor dining closed, so results may be most applicable to cities that aggressively enacted (or attempted to enact) mitigation measures.
Keeping indoor dining closed may help limit further spread of COVID-19. Cities and states have reopened indoor dining in the United States, but the Delta variant and other more infectious variants may drive up cases, hospitalizations, and deaths. Although vaccination rates are increasing in the United States, large vulnerable populations remain unvaccinated, and other countries have far lower vaccination coverage. This study suggests that keeping indoor dining closed may be one tool to help prevent thousands of COVID-19 cases.
We would like to acknowledge Dr. Kathryn M. Leifheit for methodologic assistance and Dr. Noah Weber for providing a methodologic review.
1. Chang S, Pierson E, Koh PW, et al. Mobility network models of COVID-19 explain inequities and inform reopening. Nature. 2021;589:82–87.
2. Lyu W, Wehby GL. Community use of face masks and COVID-19: evidence from a natural experiment of state mandates in the US: study examines impact on COVID-19 growth rates associated with state government mandates requiring face mask use in public. Health Affairs. 2020;39:1419–1425.
3. Pichler S, Wen K, Ziebarth NR. COVID-19 emergency sick leave has helped flatten the curve in the United States: study examines the impact of emergency sick leave on the spread of COVID-19. Health Aff (Millwood). 2020;39:2197–2204.
4. Courtemanche C, Garuccio J, Le A, et al. Strong social distancing measures in the United States reduced the COVID-19 growth rate: study evaluates the impact of social distancing measures on the growth rate of confirmed COVID-19 cases across the United States. Health Aff (Millwood). 2020;39:1237–1246.
5. Haber NA, Clarke-Deelder E, Feller A, et al. Problems with evidence assessment in COVID-19 health policy impact evaluation (PEACHPIE): a systematic strength of methods review [Preprint]. medRxiv. 2021. (doi: 10.1101/2021.01.21.2125024). Accessed 3 August 2021.
6. Bilal U, Gullón P, Padilla-Bernáldez J. Epidemiologic evidence on the role of hospitality venues in the transmission of COVID-19: a rapid review of the literature [published online ahead of print April 28, 2021]. Gac Sanit. doi: 10.1016/j.gaceta.2021.03.004.
7. MultiState. COVID-19 state and local policy dashboard. 2020. Available at: https://www.multistate.us/research/covid/public. Updated November 22, 2021. Accessed 4 August 2021.
8. Briffault R. The challenge of the new preemption. Stan L Rev. 2018;70:1995.
9. Carr D, Adler S, Winig BD, et al. Equity first: conceptualizing a normative framework to assess the role of preemption in public health. Milbank Q. 2020;98:131–149.
10. Connor G, Vaidya V, Kolker J, et al. Indoor dining and COVID-19: implications for reopening in 30 U.S. cities. In: Policy Brief. Urban Health Collaborative, Drexel Dornsife School of Public Health; 2020.
11. Ben-Michael E, Feller A, Stuart EA. A trial emulation approach for policy evaluations with group-level longitudinal data. Epidemiology. 2021;32:533–540.
12. Guy GP Jr, Lee FC, Sunshine G, et al. Association of state-issued mask mandates and allowing on-premises restaurant dining with county-level COVID-19 case and death growth rates—United States, March 1–December 31, 2020. MMWR Morb Mortal Wkly Rep. 2021;70:350.
13. National Association of County and City Health Officials (NACCHO). Big Cities Health Coalition-About Us. Available at: https://www.bigcitieshealth.org/about-us-big-cities-health-coalition-bchc. Accessed 5 May 2021.
14. Strumpf EC, Harper S, Kaufman JS. Fixed effects and difference in differences. In: Oakes JM, ed. Methods in Social Epidemiology. Jossey-Bass; 2017:342–368.
15. Dong E, Du H, Gardner L. An interactive web-based dashboard to track COVID-19 in real time. Lancet Infect Dis. 2020;20:533–534.
16. U.S. Census Bureau. 2019 American Community Survey 1-year Estimates. Bureau of the Census; 2020. Available at: https://www.census.gov/newsroom/press-kits/2020/acs-1year.html. Accessed 8 July 2021.
17. Raifman J, Nocka K, Jones D, et al. COVID-19 US State Policy Database. 2020. Available at: https://statepolicies.com/. Accessed 7 July 2021.
18. Hepburn P, Louis R, Desmond M. Eviction Tracking System: Version 1.0. Princeton University; 2020. Available at: www.evictionlab.org. Accessed 7 July 2021.
19. Katz J, Sanger-Katz M, Quealy K. A detailed map of who is wearing masks in the US. The New York Times. 2020. Available at: https://www.nytimes.com/interactive/2020/07/17/upshot/coronavirus-face-mask-map.html. Accessed 15 July 2021.
20. Vestal AJ, Briz A, Choi A, et al. 2020 Election Results. 2021. Available at: https://www.politico.com/2020-election/results/. Accessed 1 April 2021.
21. Goodman-Bacon A, Marcus J. Using difference-in-differences to identify causal effects of COVID-19 policies. Survey Res Methods. 2020;14:153–158.
22. Haber NA, Clarke-Deelder E, Salomon JA, et al. Impact evaluation of coronavirus disease 2019 policy: a guide to common design issues. Am J Epidemiol. 2021;190:2474–2486.
23. Callaway B, Sant’Anna PH. Difference-in-differences with multiple time periods. J Economet. 2021;225:200–230.
24. Fisher KA, Tenforde MW, Feldstein LR, et al. Community and close contact exposures associated with COVID-19 among symptomatic adults ≥18 years in 11 outpatient health care facilities—United States, July 2020. MMWR Morb Mortal Wkly Rep. 2020;69:1258.
25. Adam DC, Wu P, Wong JY, et al. Clustering and superspreading potential of SARS-CoV-2 infections in Hong Kong. Nat Med. 2020;26:1714–1719.
26. Schnake-Mahl A, Bilal U. They’re dying in the suburbs: COVID-19 cases and deaths by geography in Louisiana (USA) [Preprint]. medRxiv. 2020. (doi: 10.1101/2020.10.28.20221341). Accessed 27 July 2021.
27. Price-Haywood EG, Burton J, Fort D, et al. Hospitalization and mortality among black patients and white patients with Covid-19. N Engl J Med. 2020;382:2534–2543.
28. Hawkins D, Davis L, Kriebel D. COVID-19 deaths by occupation, Massachusetts, March 1-July 31, 2020. Am J Ind Med. 2021;64:238–244.
29. Chen YH, Glymour M, Riley A, et al. Excess mortality associated with the COVID-19 pandemic among Californians 18-65 years of age, by occupational sector and occupation: March through November 2020. PLoS One. 2021;16:e0252454.
30. Bureau of Labor Statistics (BLS). Labor Force Statistics from the Current Population Survey. 2021. Available at: https://www.bls.gov/cps/cpsaat18.htm. Accessed 8 July 2021.
31. Fantozzi J. California joins cities and states battling dining restrictions in court. But are the lawsuits having an impact? Nation’s Restaurant News. 2020. Available at: https://www.nrn.com/news/california-joins-cities-and-states-battling-dining-restrictions-court-are-lawsuits-having. Accessed 5 July 2021.
32. National Restaurant Association. Letter to the National Governors Association (The Honorable Andrew Cuomo, NGA Chair). 2020. Available at: https://www.restaurant.org/downloads/pdfs/advocacy/letter-from-tom-bene-to-nga-chair-andrew-cuomo_11. Accessed 10 August 2021.
33. The County Judge of Travis County. Stay Home Order Consistent With and Guidance Beyond Governor’s Executive Orders. County Judge Order 2020-8. Travis County, TX. 2020. Available at: https://www.austintexas.gov/sites/default/files/files/TC%20Stay%20Home-Work%20Safe%20Order%202020.05.08.pdf. Accessed 10 August 2021.
34. Pray IW. Trends in outbreak-associated cases of COVID-19—Wisconsin, March–November 2020. MMWR Morb Mortal Wkly Rep. 2021;70:114–117.