Even though Anthony Fauci, the White House’s chief medical adviser, backed off his statement that the United States is “out of the pandemic phase,” elected officials and much of the public seem to think that he had it right the first time. But if the end of the COVID-19 emergency is at hand, the United States is reaching it with lower vaccination and higher per-capita death rates than other wealthy nations. The conventional wisdom is that the American political system failed at public health—by prioritizing individual rights over collective safety; sowing doubt about the benefits of vaccines, masks, and other protective measures; and, most important, failing to implement universal health care, paid sick leave, and other safety-net programs.
I agree with the conventional wisdom. But there's plenty of blame to go around. Public health also failed America. The two most important federal public-health agencies, the CDC and the FDA, have been widely criticized for giving the public muddled messaging and guidance on masks, vaccines, rapid tests, and other matters; those arguments need no rehashing here. Less well understood is that other sectors of our public-health system, including local agencies and prominent public-health academics, were unprepared for a nationwide infectious-disease emergency, particularly in a divided country with tight limits on government power.
As federal, state, and local health officials struggled in spring 2020 to obtain the basic funding, staff, lab supplies, and data systems to test, trace, and isolate cases, academics on Twitter and cable news became the face of public health—and they zeroed in on the many ways in which the U.S. response to COVID-19 fell short of a textbook approach to pandemic control. Public-health agencies were ill-prepared for this crisis, and academics were ill-prepared to speak on their behalf.
The U.S. has a highly decentralized public-health system that relies on thousands of state and local health agencies operating with considerable independence. These agencies originated to combat malaria, yellow fever, smallpox, syphilis, and other infectious diseases; their standard activities historically included controlling mosquitoes, improving water quality and sanitation, isolating and quarantining people during disease outbreaks, and directly providing prevention and treatment services.
By the mid-20th century, though, heart disease, lung disease, cancer, and other chronic conditions replaced infectious diseases as the leading causes of death. Over several decades, public-health agencies reduced their focus on environmental dangers, infectious disease, and clinical services. In the 2000s, these agencies dedicated more and more personnel and public communications to controlling tobacco, promoting physical activity and healthy diets, and screening for diabetes, heart disease, and cancer. The consensus of government and academic public-health experts was that the most effective way for these agencies to serve the public was to reduce illness and death from chronic disease. The key metric for judging the effectiveness of public-health agencies was life expectancy in the community they served, and—at least in the immediate pre-COVID era—promoting healthy lifestyles for all was more likely to avert premature deaths than infectious-disease control was.
In theory, public-health agencies could add chronic-disease-control activities without sacrificing their infectious-disease expertise. In reality, however, public-health departments have experienced a progressive decline in real spending power, particularly since the Great Recession, and as a result have chosen to cut infectious-disease programs. More than 75 percent of the nation’s larger health departments reported eliminating clinical services from 1997 to 2008.
I experienced this shift firsthand. When I began overseeing infectious diseases at New York City’s health department in 2011, I worked for one of the nation’s leading proponents of chronic-disease control: Mayor Michael Bloomberg. Because of budget cuts, we had to shrink our infectious-diseases programs; I had to close or reduce hours for our immunization, sexually transmitted disease, and tuberculosis clinics. I had to justify these decisions to appropriately disgruntled community groups and city council members by saying that the Affordable Care Act’s Medicaid expansion would pay to ensure that these services could be provided by the private sector—a claim that I based more on hope than evidence.
As local health agencies nationwide scaled back their clinics, they also lost their presence in the community. Clinics are an important way of building visibility and credibility, because most people do not understand what a public-health agency is or does. Residents see the good work you do, tell elected officials that your work matters, and then trust you during emergencies. Running clinics also builds logistical expertise.
Unfortunately, when health agencies were called on to run the largest, most rapid vaccination campaign in U.S. history, most lacked personnel qualified either to run these clinics themselves or to oversee contractors effectively. This resulted in debacles like the one in Philadelphia, where the health department let an untested start-up take on the job of running mass-vaccination clinics in the city. Public-health agencies lost an opportunity: One way to overcome vaccine hesitancy is to have trusted providers deliver information and services. Because they lacked a strong public presence administering vaccines directly before the pandemic, local health departments were also ill-equipped to reach communities deeply distrustful of a mass-vaccination campaign.
The U.S. is not as different from the rest of the world as Americans frequently think. After the 2002–04 SARS epidemic and the 2014–16 Ebola epidemic in West Africa, independent reviewers of the World Health Organization concluded that the agency had become too focused on providing high-level technical guidance and had failed to invest in staff and systems to respond quickly during emergencies. During the coronavirus crisis, the agency has been far more effective than during past crises in mobilizing personnel and supplies in all regions of the world for border screening, laboratory testing, and vaccination. The crucial lesson that the WHO learned from the earlier epidemics is that failure to rapidly and effectively solve urgent problems, such as infectious-disease outbreaks, destroys your credibility and prevents you from addressing long-term problems and leading causes of death. Imagine a fire department that was focused on reducing the frequency of kitchen burns and not on putting out infernos in high-rise buildings. That’s the situation that local public-health officials found themselves in.
For most Americans, however, local health officials have not been the face of public health during COVID-19. The most prominent voices, other than Anthony Fauci's, have been those of university professors proffering guidance on television, in print, and through social media. People who practice public health in government are expected to stick to their agency's talking points; the mayor or governor whom they serve keeps them from freely explaining their recommendations and decisions. Even when afforded the freedom to talk more openly, they have lacked the time to do so.
Into that void have stepped university-based physicians, epidemiologists, and virologists opining about what the government should do without fully understanding or communicating what is feasible, affordable, legal, and politically acceptable for public-health agencies. When I was advising New York City Mayor Bill de Blasio on how to respond to COVID in 2020 and 2021, the city faced terrible choices. As we attempted to return the country’s largest school district to in-person instruction in the fall of 2020 and then to keep classrooms open in the following months, I had to parse uncertain science while balancing the demands of staff unions, parents, and elected officials. Meanwhile, experts publicly faulted us for our limited ability to identify the source of infection for any given COVID case and for our failure to test every COVID-exposed student every day. Anyone who had actually implemented a large testing-and-contact-tracing program understood the impossibility of such demands. But most of the people with genuine technical expertise were busy practicing public health, not doing multiple cable-news hits a day.
Consumers of that commentary could easily conclude that the government was simply not trying hard enough to stop the virus. And yet state and local health agencies generally cannot remove the major legal, financial, and political constraints they face. For example, critics faulted the CDC and local health agencies for not releasing enough data but didn't acknowledge the strict, complex patchwork of federal and state regulations that limits what data public-health agencies can legally receive and report.
Every public-health practitioner I know understands that the U.S. can reduce its vulnerability to epidemics by improving data collection. Likewise, most of my colleagues believe that strengthening the social safety net—particularly through universal health care, paid sick leave, housing, and child care—will improve Americans’ ability to fend off COVID-19 and other threats. But enacting those measures is beyond the power of public-health officials, whose job it is to mitigate harm under real-world conditions. Those conditions include the underlying laws and systems that elected officials created.
Universities and government agencies respond to different imperatives. The economic model of public-health schools rewards professors who bring in research grants; tenure committees and research funders do not necessarily demand that professors have experience inside government public-health agencies. In reporting the comments of academic experts, news outlets routinely cite their university affiliations. But other credentials are more crucial: Have you ever run a large government health program? Have you ever led an official outbreak investigation? Academic experts can offer the public an idealized version of what public health could be, but during this pandemic they have also played a role in setting unrealistic expectations: for instance, that emergency measures could go on indefinitely, until society's underlying failures are addressed.
I don’t mean to be too critical of my academic colleagues. (Full disclosure: I now teach at a university too.) The greatest threats to public health right now come from elected officials who would gut public-health agencies’ budgets and legal authority on the grounds that they threaten individual liberties.
One way to avoid that fate is for health leaders to recognize that their daily work is largely invisible to the general public, and that the public expects health agencies to focus on threats that individuals cannot protect themselves from on their own. Public-health experts, both in academia and in government, rightly point out that the holes in social-welfare policies are the primary determinants of ill health. These experts also believe that, for instance, promoting healthy diets and encouraging people to wear masks are worthy goals of government policy. But most Americans, for better or worse, still place more weight on individual choice, and less on community protection, than those of us drawn to the public-health field do.
To rally voters’ support, agencies need to make themselves more visible in public life through direct clinical services and to ensure that they dedicate sufficient resources, even within constrained budgets, to public-health-emergency response. Meanwhile, interested voters should press elected officials on their plans for restoring infectious-disease and emergency-response services. Ultimately, the highest aspirations of the public-health profession must coexist with the realities of law and of where power actually resides in a democracy: with voters and their elected representatives.