This analysis presents the data differently, showing the evolution of government response not in terms of calendar days but against the cumulative number of reported cases and deaths. The findings are revealing. Published in Agenda Pública.

The pandemic is waning in Europe. Epidemiological rates in the region have stabilised. Governments are currently concentrating on how to exit the restrictive lockdowns without triggering second outbreaks. Economic concerns are now the subject of public debate.
As daily rates of new cases and deaths continue to fall, it is worth taking a step back and evaluating the performance of policy responses retrospectively, especially in the early stages of the pandemic. As lockdowns are lifted, a post-mortem study of when and how they were introduced offers valuable lessons.
Most cross-country analyses of government responses have so far focused on describing the evolution of the pandemic in calendar days, quite often only for the period after the contagion reached a tipping point of confirmed cases or deaths, e.g. the day after the 100th case was recorded.
My analysis presents the data differently, showing the evolution of government response not in terms of calendar days but against the cumulative number of reported cases and deaths. This perspective has an advantage: it makes it possible to rebase the metric of government response to the same scale of reported cases/deaths, allowing a harmonised cross-country comparison.
Furthermore, the analysis focuses on the inception phase of the epidemic: the critical days before a tipping point is reached, typically before 1,000 confirmed cases or 100 deaths. In hindsight, it was in this period that early action proved most effective.
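To make the rebasing concrete, the minimal sketch below shows one way the Stringency Index could be plotted against cumulative confirmed cases rather than calendar days, truncated to the inception phase. The file and column names are assumptions for illustration, not the actual data pipeline behind the charts.

```python
# Illustrative sketch: plot the Stringency Index against cumulative confirmed
# cases instead of calendar days. Assumes a CSV with columns 'country', 'date',
# 'confirmed_cases' and 'stringency_index' (hypothetical names).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("stringency_and_cases.csv", parse_dates=["date"])

fig, ax = plt.subplots()
for country, g in df.sort_values("date").groupby("country"):
    g = g[g["confirmed_cases"] > 0]        # drop days before the first case
    ax.plot(g["confirmed_cases"], g["stringency_index"], label=country)

ax.set_xscale("log")                       # cumulative cases span several orders of magnitude
ax.set_xlim(1, 1_000)                      # focus on the inception phase (up to ~1,000 cases)
ax.set_xlabel("Cumulative confirmed cases (log scale)")
ax.set_ylabel("Stringency Index (0-100)")
ax.legend(fontsize="small")
plt.show()
```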
The data used to track the government response is the Stringency Index developed by the Blavatnik School of Government at the University of Oxford to track government responses during the pandemic. The index is a quantitative rating of the level of policy response, combining nine measures of government action:
- School closing
- Workplace closing
- Cancel public events
- Restrictions on gatherings
- Close public transport
- Stay at home requirements
- Restrictions on internal movement
- International travel controls
- Public information campaigns
As the authors indicate, the Stringency Index can help to “explore whether rising stringency of response affects the rate of infection and identify correlates of more or less stringent responses.” Moreover, because it combines several metrics, it provides a richer view than a narrower analysis focused only on stay-at-home requirements, i.e. the so-called lockdown.
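As a rough illustration of how such a composite index works, the sketch below averages the nine measures after rescaling each ordinal score to 0-100. The maximum levels shown are assumptions, and the official OxCGRT methodology is more detailed (it also accounts for the geographic scope of each measure), so this is only an approximation of the idea.

```python
# Illustrative sketch of a stringency-style composite index: each measure is an
# ordinal score rescaled to 0-100, and the index is the simple average of the
# nine sub-indices. Maximum levels are hypothetical; this is NOT the official
# OxCGRT formula, which also uses geographic-scope flags.
from typing import Dict

MAX_LEVEL: Dict[str, int] = {
    "school_closing": 3,
    "workplace_closing": 3,
    "cancel_public_events": 2,
    "restrictions_on_gatherings": 4,
    "close_public_transport": 2,
    "stay_at_home_requirements": 3,
    "restrictions_on_internal_movement": 2,
    "international_travel_controls": 4,
    "public_information_campaigns": 2,
}

def stringency_index(levels: Dict[str, int]) -> float:
    """Average of each measure rescaled to a 0-100 sub-index."""
    sub_indices = [100 * levels[m] / MAX_LEVEL[m] for m in MAX_LEVEL]
    return sum(sub_indices) / len(sub_indices)

# Example: only a public information campaign in place.
levels = {m: 0 for m in MAX_LEVEL}
levels["public_information_campaigns"] = 2
print(round(stringency_index(levels), 1))  # ~11, broadly in line with an "Index = 11" reading
```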
Because I want to focus on the cross-country comparison at the European level only, the country scope of the analysis is limited to Western and Central Europe, excluding microstates such as Monaco, San Marino or Andorra to avoid distortions. Even though the selected countries differ in demographic size, no adjustment for population has been made. In the early stages of an outbreak, population size matters least and is not strictly required to measure the speed of contagion, as cases are likely to be geographically concentrated in a few regions or cities before the virus spreads to the rest of the territory.
Chart 1.1 shows the evolution of the Stringency Index against confirmed contagion cases. The chart suggests that Spain was a notable laggard in raising its level of response to the surge of reported cases, consistently scoring below the simple average trajectory and only above the UK, Sweden and, partially, France. Spain’s first restrictive measures became effective on 9 March, lifting the index from 11 to 47, when the number of confirmed cases already exceeded 1,500. This contrasts with the responsiveness of other countries, most of which had a Stringency Index of over 60 by the time they reached 1,000 confirmed cases; Spain’s score was then 11, indicative of only one measure taken: public information campaigns. The so-called national lockdown (confinamiento in Spanish) became effective later, on 14 March, with over 7,500 reported cases; only then did Spain’s Stringency Index become aligned with the average.

Chart 1.2 plots the Stringency Index against the number of confirmed deaths. It shows a similar pattern for Spain: the government response to the increase in reported deaths was slow compared with the majority of other countries. The exceptions were again France, the UK and Sweden, with the latter two consciously choosing a strategy of no suppression (i.e. a low Stringency Index), which the UK later reversed but Sweden remains committed to.

For further insight, we can take a snapshot of the policy response and its components at specific benchmark points of the contagion. Charts 2.1 and 2.2 show the Stringency Index and its components at 10 and 1,000 contagion cases respectively on the y-axis, as well as the date on which these levels of contagion were reported on the x-axis. Chart 2.1 shows that the government response in most countries was still limited to public information campaigns when only 10 contagion cases had been reported. Chart 2.2 shows that the response at 1,000 cases, however, varied significantly: the UK and Spain still had only public information campaigns (Index = 11), whereas Portugal, Greece and Austria had deployed all measures (Index = 85-88).

Chart 2.3 offers an additional, more nuanced perspective. Grey bars mark the Stringency Index at both 10 and 1,000 contagion cases, and red dots show the number of calendar days it took for this 100-fold increase to materialise in each country. The chart indicates that most countries experienced this surge over a span of 14 to 17 days, i.e. roughly two weeks. In Spain, the spread was slightly more rapid: just 11 days. Italy, Slovakia, France, Germany and the UK, however, experienced a lengthier increase of around four weeks. Spain had the fewest days to react to the surge, whereas the UK and France had roughly three times as long.
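The threshold snapshots behind Charts 2.1-2.3 can be sketched as follows: for each country, take the Stringency Index on the day cumulative cases first reached 10 and then 1,000, and count the calendar days between the two. Again, the file and column names are assumptions for illustration.

```python
# Illustrative sketch: Stringency Index at the 10-case and 1,000-case thresholds,
# and the number of calendar days between them. Column names are hypothetical.
import pandas as pd

df = pd.read_csv("stringency_and_cases.csv", parse_dates=["date"])

def snapshot_at(group: pd.DataFrame, threshold: int) -> pd.Series:
    """First row on or after the day cumulative cases reached the threshold."""
    reached = group[group["confirmed_cases"] >= threshold].sort_values("date")
    return reached.iloc[0]

rows = []
for country, g in df.groupby("country"):
    if g["confirmed_cases"].max() < 1_000:
        continue                              # skip countries that never reached 1,000 cases
    at_10 = snapshot_at(g, 10)
    at_1000 = snapshot_at(g, 1_000)
    rows.append({
        "country": country,
        "index_at_10": at_10["stringency_index"],
        "index_at_1000": at_1000["stringency_index"],
        "days_10_to_1000": (at_1000["date"] - at_10["date"]).days,
    })

summary = pd.DataFrame(rows)
print(summary.sort_values("days_10_to_1000"))
```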
Chart 2.4 tests the correlation between early responsiveness to the outbreak (the change in the Stringency Index from 10 to 1,000 contagion cases) and the latest death count (deaths per million on 20 May). The red trend line suggests some level of correlation (R² = 0.5), which becomes more robust if we exclude Belgium and Italy as idiosyncratic tail cases. Italy was the first country in Europe to be struck, and Belgium has been hit particularly hard because of the high concentration of elderly deaths in care homes. Aside from these two countries, it is noticeable that the countries with the highest death rate per million are those with the most sluggish response in the early stage of the outbreak: Spain and the UK.
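A minimal sketch of this correlation test, assuming a hypothetical table with each country's index change and deaths per million, might look like this:

```python
# Illustrative sketch of the Chart 2.4 exercise: simple least-squares fit of
# deaths per million on the change in the Stringency Index between 10 and
# 1,000 cases, with and without Belgium and Italy. File and column names are
# assumptions for illustration.
import numpy as np
import pandas as pd

data = pd.read_csv("responsiveness_vs_deaths.csv")   # hypothetical input
data["delta_stringency"] = data["index_at_1000"] - data["index_at_10"]

def r_squared(x: pd.Series, y: pd.Series) -> float:
    """R^2 of a simple linear fit y = a*x + b."""
    a, b = np.polyfit(x, y, 1)
    residuals = y - (a * x + b)
    return 1 - residuals.pow(2).sum() / (y - y.mean()).pow(2).sum()

print("All countries:  R^2 =", round(r_squared(data["delta_stringency"], data["deaths_per_million"]), 2))

trimmed = data[~data["country"].isin(["Belgium", "Italy"])]
print("Excl. BE & IT:  R^2 =", round(r_squared(trimmed["delta_stringency"], trimmed["deaths_per_million"]), 2))
```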

More importantly, early action seems to have mattered more than other factors such as underlying vulnerabilities or the strength of the health care system, as the final two charts show.
The chart on the right compares the latest death count against the World Bank’s Vulnerability Index, constructed by principal components analysis from data on the aged population, demographic density, smoking and other indicators drawn from the World Development Indicators, the United Nations Population Division and the WHO Global Health Observatory. The country scope of the chart is OECD countries, excluding Latin American members. It shows no clear correlation between the two variables, which reinforces the hypothesis that early action has been the key factor in mitigating the virulence of the outbreak.
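For readers curious about the mechanics, the sketch below shows a generic way to build a PCA-based composite indicator of this kind; the input file, columns and scaling choices are assumptions, not the World Bank's actual procedure.

```python
# Illustrative sketch of a vulnerability-style composite indicator built with
# principal components analysis. Inputs and scaling are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

indicators = pd.read_csv("vulnerability_inputs.csv", index_col="country")
# e.g. columns: share_population_65_plus, population_density, smoking_prevalence, ...

scaled = StandardScaler().fit_transform(indicators)    # put variables on a common scale
first_component = PCA(n_components=1).fit_transform(scaled)[:, 0]

vulnerability = pd.Series(first_component, index=indicators.index, name="vulnerability_index")
print(vulnerability.sort_values(ascending=False).head())
```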
Similarly, the chart on the left compares the same metric of deaths per million against the 2019 Global Health Security Index (GHS Index), constructed by Johns Hopkins University in collaboration with the Economist Intelligence Unit to assess countries’ health security capabilities. Astonishingly, some of the countries with the highest GHS scores have been among the worst performers in this pandemic: the US, the Netherlands, the UK and Sweden (followed by France and Spain). Conversely, some countries with the lowest GHS scores performed well, with much lower reported deaths per million: Croatia, Greece, the Czech Republic, etc.

Interestingly, this final chart offers some insight into what could be five categories of country performance and policy responsiveness:
- Green: Asia-Pacific countries with dissimilar GHS scores but high responsiveness to natural disasters, not least because of historical exposure to previous pandemics.
- Blue: The European Southern and Eastern periphery of small, vulnerable countries, historically aware of their exposure to external threats (low GHS score) and therefore more prone to early action.
- Yellow: The confident and cautious European North, broadly speaking, plus Canada, with risk-averse attitudes and strong health security capabilities.
- Red: Incautious Western European countries with inferior health security capabilities: Italy, Belgium and Ireland.
- Pink: The US (who else?) plus the incautious, carefree core Europe with better health security capabilities than the Red group countries.
What are the valuable lessons, then? First and foremost, it is critical to take early action. This is particularly relevant given the effort being put into tracking and tracing the virus in this phase of lockdown easing: a second outbreak could be devastating if the same incautious, unresponsive government attitudes are repeated. The second lesson is that governments should not focus on stay-at-home requirements alone as the silver bullet to suppress the spread of the virus. Other less disruptive and socially less punitive measures are available and have also proved quite effective in staving off the outbreak.