The False Promise of Federalization
When the federal government has retreated in recent decades, disaster has not followed.
In 1996, the United States reformed its welfare system with the passage of the Personal Responsibility and Work Opportunity Reconciliation Act. The legislation replaced the longstanding Aid to Families with Dependent Children program with Temporary Assistance for Needy Families, shifting significant responsibility for welfare from the federal government to individual states. Critics warned that decentralization would lead to widespread economic insecurity and increased poverty. Peter Edelman, an assistant secretary at the Department of Health and Human Services, resigned in protest, calling the new law "a moral outrage" that would "plunge millions of children into deeper poverty." Many feared a "race to the bottom" in which the poorest citizens would suffer most as budget-strapped states cut benefits.
The outcomes told a different story. Rather than courting disaster, many states took the opportunity to innovate and tailor their welfare programs to better serve their populations. The "Wisconsin Works" program, under Republican Governor Tommy Thompson, became a model for other states, emphasizing work and personal responsibility while providing extensive support services such as childcare, transportation assistance, and job training. Wisconsin saw a dramatic reduction in welfare rolls and a significant increase in employment among former welfare recipients.
Between 1996 and 2000, the number of welfare recipients nationwide fell from 12.3 million to 5.8 million. Employment among single mothers, the group TANF most directly targeted, increased substantially, and child poverty rates declined during the late-1990s economic boom. The feared rise in extreme poverty did not materialize on the predicted scale, and many states used their new flexibility to create programs matched to local needs and priorities.
When the federal government retreated, disaster did not ensue. Yet as positive as the outcomes were across the union of states, they were beside the point. Deference to the states altered the political incentives. When welfare decisions were mostly federalized, politicians had to appeal to broad national constituencies, which could drive policy debates to the extremes. By dispersing power, welfare reform helped temper the political climate, giving state officials space to explore approaches that departed from national partisan orthodoxies.
After sixty years, more Americans than ever had proximate influence over, and personal investment in, how and under what conditions welfare was dispensed in their communities. The sky did not fall. So why does our entire political discourse presume the wisdom of federalization?
At least two intertwined reasons come to mind. First, the modern reputation of states’ rights, particularly since the 1960s, is deeply complicated by history. The concept dates back to the Articles of Confederation, which sought to preserve state autonomy. However, the defense of states’ rights became entangled with the preservation of slavery and, later, racial segregation. Confederates and Southern Democrats invoked states’ rights to justify oppressive practices, tarnishing both the principle of state autonomy and an institution designed to guard against the very kind of tyranny from which the country rebelled. Both before and after slavery’s abolition, the Ninth and Tenth Amendments were increasingly sidelined by capacious interpretations of the Commerce Clause. And while necessary federal intervention to secure civil rights created an alarming pretext for limitless federal policymaking elsewhere, Congress, the President, and the Supreme Court all contributed to eroding state autonomy for much of American history, independent of any civil rights-related intervention.
Second, there may be a related, more technocratic reason why federalization appeals to those adjacent to national power centers. The supposedly enlightened view — one with trans-ideological adoption — is that in a competitive global economy, where a nation’s prosperity is seen as contingent on maximizing efficiency and addressing inequality, gone are the days when nations could afford to disperse their best resources and trust individuals and private enterprise to produce desirable results. Achieving efficiency and equality means marginalizing the institutions designed to check these noble intentions, including subnational institutions erected to reflect the divergent preferences of many polities. Pining for quaint Revolutionary-era concepts like deliberation and deference to local and state assemblies is passé. This view condescends: Those striking democracies Tocqueville observed across the land, you see, naive child, are merely “patchworks” that today impede progress in the “national interest”.
But what if centralization itself poses a greater long-term risk to a sustainable form of pluralism? Centralization presupposes that only the center can establish the best incentives for the union’s many societies. Deference to a subnational institution is presumed incapable of producing the incentives or outcomes the federal institution intends, out of fear that deference will leave a polity vulnerable to externalities. This strict paternalism severely discounts personal agency and the real-world, ground-level incentives that shape decision-making among local governments and private actors.
In 2017, when the president reduced the size of protected land in Utah, specifically the Bears Ears and Grand Staircase-Escalante National Monuments, there were widespread fears that energy companies would aggressively exploit these lands for oil, gas, and mining operations. The outdoor apparel company Patagonia protested the executive action, declaring that “The President Stole Your Land” — the precise opposite of what happened. Environmentalists warned that the rollback of protections would lead to significant environmental degradation and the destruction of sacred sites. None of it materialized.
That same year, when the FCC announced the repeal of net neutrality regulations, critics warned that the internet would collapse. Upon the news, CNN’s front page read “The End Of The Internet As We Know It”. Yet, if you’re reading this, you know the internet did not end. In fact, broadband speeds and online services continued to improve, from mobile broadband for homes to bandwidth options carrying vastly larger amounts of streaming data.
In 2013, the Supreme Court's decision in Shelby County v. Holder, which struck down the preclearance coverage formula of the 1965 Voting Rights Act, was met with alarm. Critics feared that states with histories of racial discrimination would quickly implement new voter suppression laws, especially voter ID laws. States did pass voter identification laws, but they proved popular with minority voters, and multiple studies found no significant adverse impact on Black voter turnout. Another study, out of the University of Oregon, concluded that “the Shelby decision did not widen the black-white turnout gap in states subject to the ruling.” The federal retreat on this matter did not shake democracy to its core.
During the COVID-19 pandemic, the federal government implemented an eviction moratorium to prevent millions of Americans from losing their homes during the economic downturn. When the Supreme Court ruled against the extension of the moratorium in 2021, fears of widespread evictions did not materialize; one Aspen Institute analysis had forecast that 30 to 40 million Americans were at risk of eviction. Instead, emergency rental assistance programs, state and local protections, and negotiated agreements between landlords and tenants prevented a homelessness crisis. In one study of 30 cities, all but one recorded fewer eviction filings than the historical average. On the whole, evictions declined.
Then there was the federal public health response to the pandemic. In the early weeks, the White House press corps badgered the president — the man who had been fearfully likened to an autocrat since he took office — as to whether he would order a “national lockdown”. Many journalists were more interested in whether such a lockdown would occur than in explaining why it would have been flagrantly illegal, much less how enforcement would play out. The lockdown never happened, but pandemic hawks emerged to make the case that a brute national response was necessary because viruses transcend state borders. Yet the climate of constantly evolving information, along with the misrepresentation of key data points by federal public health officials, was hardly indicative of the competency required to communicate effectively, let alone execute a more robust federal response — never mind the lack of any strategy for noncompliant citizens on a national scale. More importantly, what we learned during the pandemic was that the coronavirus was highly influenced by distinct social and geographic factors, independent of preventative measures like masking and distancing. States and localities, along with private businesses aiming to avoid liability, worked within a broader democratic process attuned to which policies were not just necessary but practical given ever-changing information and the geographic ebb and flow of the virus.
Then came inflation. During the pandemic, Congress and the president initially spent $2.2 trillion through the CARES Act to prop up the economy, followed by another $900 billion in December. Under pressure from his party to transcend his reputation as a moderate, the new president and Congress approved an additional $1.9 trillion in the American Rescue Plan just three months later. Many dismissed inflation concerns as exaggerated, especially since inflation had been dormant since the 1980s. However, the federal response significantly exacerbated the inflation other countries also witnessed, with the U.S. experiencing more persistent inflation than any other Western country. While the ARP accelerated recovery, it also risked "abrupt and painful" adjustments, according to a Brookings projection published before the law was passed. That study found that the U.S. would have recovered, albeit at a moderately slower clip, without any additional aid. The ARP was a political decision by a party eager to blow its newly earned capital, and the cost was transferred to the very people it intended to help.
Then the Dobbs decision was handed down. After a fifty-year public opinion stalemate on abortion, the U.S. Supreme Court deferred to legislative bodies to write their own laws regulating the practice. The result? An increase in support for generally worded “legal abortion” amid a remarkable example of the rancorous democratic process successfully functioning on a major, hot-button issue. While voters in most states moved to limit regulations or halt restrictions, those in a small minority of states chose to re-elect leaders who had voted to continue restricting access. Dobbs ventilated an overheated issue and dispersed the political incentives, putting Republicans on the defensive and prompting the party to soften its stance at the Republican National Convention. And despite the bans, abortions have increased since the high court’s decision.
A key reason the federal government has become the hammer for which everything is a nail is the overlooking of Americans' personal agency. We see this especially in political reporting, which applies a double standard to agency: the influence of powerful figures, like a president or governor, over things they cannot control, such as the economy, is often overstated, while the responsibility of ordinary people for actions within their control, like voting, is minimized. When it came to the pandemic and the economic recovery, policy experts and public health officials vastly underestimated the ability of local municipalities and the American workforce — including the women who were predicted to suffer from a looming “she-cession” that never happened — to dig themselves out of a ditch well after pandemic aid had lapsed.
Decentralization isn't a panacea for the issues inherent in central power, like inefficiency or cronyism, but it offers a critical advantage: the dispersion of vulnerabilities through democratic federalism. Centralized systems tend to impose uniformity in response to volatility, which can lead to long-term instability. The preferred route is to distribute risk (uneven as that distribution may be) such that volatility can be endured. This applies across disciplines. In finance, decentralized models like blockchain reduce the risk of catastrophic failure by eliminating single points of control. In education, local tailoring of curricula meets specific community needs better than one-size-fits-all mandates. Architecture shows that decentralized urban planning creates more adaptable and resilient cities. Even in thermodynamics, decentralized processes distribute energy more evenly, preventing system collapse. Across these disciplines, decentralization enhances stability, adaptability, and resilience, managing risks by containing them rather than allowing them to become systemic.
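To put numbers on that intuition about containment, here is a minimal back-of-the-envelope sketch in Python. Everything in it is an illustrative assumption rather than data from any study: it simply compares the probability of total systemic collapse when a single control point governs everything against the probability that many independent units all fail at once.

```python
# Toy model: a centralized system collapses whenever its single control
# point fails; a federated one collapses only if every unit fails at once.
def systemic_failure_prob(units: int, unit_failure_prob: float) -> float:
    """Probability that all independent units fail simultaneously."""
    # Independence is an idealization; correlated shocks (a recession,
    # a pandemic) would raise these numbers, but the ordering holds.
    return unit_failure_prob ** units

p = 0.10  # assumed chance that any one unit fails in a given crisis
print(f"centralized, 1 control point: {systemic_failure_prob(1, p):.5f}")   # 0.10000
print(f"federated, 5 regions:         {systemic_failure_prob(5, p):.5f}")   # 0.00001
print(f"federated, 50 states:         {systemic_failure_prob(50, p):.2e}")  # 1.00e-50
```

Units in the federated case still fail at the same assumed rate; the difference is that those failures stay contained instead of cascading into a single system-wide collapse, which is what it means to endure volatility rather than suppress it.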
Critics may argue that the United States’ decentralized structure was never fully intended, that it was merely the product of begrudging compromise. This tellingly reinforces the case for decentralization. There was no viable alternative to compromise, and no compromise but a framework that disperses power among differing principles. Even if the Founders had adopted Hamilton’s top-heavy structure, after more than two centuries it would be more constructive to ask whether these oft-deified men got that part right, as we would with any other aspect of the order they established.
The last decade has yielded widespread interrogation of the established order across a number of domains — capitalism, liberalism, expertise, identity, and national security. And yet, despite a wave of foreign and domestic events demonstrating the risk of power consolidation, neither the political-media establishment nor its populist critics has confronted the most entrenched axiom in American politics: that what we call democracy, and what we seek in a “more perfect union”, can or should be pursued primarily through the federal government.
Federalization doesn't make the United States more inclusive, functional, resilient, or truly democratic. Instead, it has placed more cultural influence and decision-making power into the hands of fewer people, and it has made so many of us miserable and unfulfilled in civic life as a result. As political observers convince themselves that the problem with our politics is polarization, language, “the volume”, or “the temperature”, seldom is it acknowledged that the arrangement of power — where it is concentrated and how little recourse there is from it — runs contrary to the skepticism of authority on which the union was founded. Decentralization may never come, or it may come too late, but it is the only path to sustaining a large and diverse democracy for future generations.
In a political media dominated by national narratives, That Patchwork is the only newsletter covering democracy from a decentralist angle. Preserving democratic pluralism means challenging the primacy of narratives that presume central power knows best.