The Limits of Private Action: What the Past 40 Years Taught Us About the Perils of Unregulated Markets

The two big ideas that have animated American public policy since the end of World War II, employer-sponsored social benefits and neoliberalism, are failures. We need better options.

Since the end of World War II, two big ideas have animated American public policy. Employer-sponsored social benefits dominated from the late 1940s through the 1970s. Neoliberalism, prominent from the 1980s through the 2000s, turned to markets and private actors to achieve social goods.

Both of these models have now failed, albeit for different reasons. Sea changes in work and family life mean that virtually no one anymore has a lifetime job with union protections and secure benefits. And the financial turmoil of the early 2000s, culminating in the Great Recession, taught us hard lessons about the dangers of relying on unregulated markets. The financial crash triggered mass unemployment and mass foreclosures that ruined lives and still echo today, a decade later. No longer can anyone suppose that the unregulated market will provide a secure platform on which ordinary people can build their lives.

Knowing something about the intentions behind both efforts, and how and why they came undone, is essential before we turn to the promise of public options.

The End of the Treaty of Detroit

In the spring of 1950, Walter Reuther, the head of the United Auto Workers, and Charlie Wilson, the head of General Motors (GM), came to a historic agreement. GM would give its workers a monthly $125 pension, cover half the cost of health care, and increase pay in step with increases in the cost of living and rising productivity. Businessweek called it a moment of “industrial statesmanship.” Fortune dubbed the agreement the “Treaty of Detroit.”

The Treaty of Detroit established the paradigm for American social policy in the post-World War II era. Many people incorrectly think that the mid-twentieth century was defined by “big-government liberalism,” a system in which the federal government provides a robust welfare state to the American people. But this was not the model for public policy in mid-twentieth-century America.

In fact, American public policy rarely took on the mantle of government-provided social goods. There were a few big-government programs—most notably Social Security and Medicare—but these were exceptions, not the rule. As proof, just compare our efforts to those of European countries. European governments provided socialized health care, free college education, and robust social welfare benefits to their citizens regardless of their wealth. The American government did none of those things. Under the Treaty of Detroit, it was General Motors, not Uncle Sam, that provided workers with a pension and health care. In the decades after 1950 this model spread, and it became the norm for much of the industrial and corporate sectors, whether or not they were unionized.

We want to be clear that the employer-based approach was a model, a framework. Not everyone was included in it, and it didn’t fully succeed even on its own terms. But it served as a powerful vision for how social policy was supposed to work.

The employer-benefits approach was based on a number of assumptions. The first was that people would have the same employer for their entire working life, starting at age eighteen or twenty-two and continuing until retirement at age sixty-five. Employees were expected to be hardworking and responsible; in return for their loyalty, employers paid them a decent wage, and their jobs were secure. Retirement meant a big party with friends and family and a gold watch from the employer, as a token of appreciation.

The second assumption was that employers would provide workers with critical social benefits. Employees would get health care and pensions, and eventually even educational, childcare, housing, transportation, and other benefits, from their employers, not from the government. Employer-based benefits had been around since the late nineteenth century, but they took off during the 1940s. Between 1940 and 1945, health insurance coverage expanded from 1.3 million American workers to 32 million workers.

After the war, the country continued along the employer-based benefits path instead of creating national programs, as many European countries did. In America, government programs existed largely as a safety net for those who fell through the cracks of the employer-based system. The extremely poor received health care through Medicaid. The elderly and retirees got their health care through Medicare and a minimal pension through Social Security’s old age provisions. The disabled, orphans, and widows received a basic income through Social Security. But these were exceptions. As a general approach to public policy, ordinary Americans would not get their benefits from the government.

The third assumption underlying the Treaty of Detroit was that employers would provide a “family wage.” Families were thought of as a male breadwinner, a stay-at-home wife, and children. When Reuther and Wilson negotiated the Treaty of Detroit, the idea of wages rising with the cost of living and with productivity improvements was partly a function of the need for workers to provide for their families. Indeed, between 1948 and 1978, the median man’s income kept pace pretty well with productivity. Productivity rose 108 percent during this period; hourly compensation increased 96 percent.

Starting in the 1970s, however, companies increasingly began withdrawing from the Treaty of Detroit. Oil shocks, stagflation, increasing competition from abroad, and an ideological shift that emphasized shareholder profits instead of stakeholder well-being all contributed to corporations shedding the view that they were “in it together” with their workers. Employers instead increasingly shifted risk onto their workers.

One of the best examples of what Jacob Hacker calls the “great risk shift” was the move from providing defined-benefit pension plans to offering defined-contribution pension plans. Defined-benefit plans give retirees a fixed amount in their yearly pension, which means the risk of market fluctuations is borne by the employer sponsoring the pension. Defined-contribution plans, in contrast, involve employees putting money into pension funds and investing that money; the retiree, not the company, bears the risks of market fluctuations. In 1980, there were 30 million defined-benefit plan participants, compared to only 19 million defined-contribution participants. By 2006, there were only 20 million defined-benefit participants, compared to 66 million defined-contribution participants. Today companies rarely, if ever, offer new defined-benefit pension plans to their employees. Workers bear the risks of the market.
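The structure of the risk shift can be made concrete with a small sketch. This is purely illustrative; the dollar amounts and return figures below are hypothetical, not drawn from the book:

```python
# Illustrative only: hypothetical numbers showing who bears market risk
# under defined-benefit vs. defined-contribution pensions.

def defined_benefit_payout(promised_annual: float, market_return: float) -> float:
    """A defined-benefit retiree receives a fixed pension; the sponsoring
    employer absorbs whatever the market does."""
    return promised_annual  # independent of market_return

def defined_contribution_balance(balance: float, market_return: float) -> float:
    """A defined-contribution account rises or falls with the market,
    so the worker absorbs the fluctuation."""
    return balance * (1 + market_return)

# Compare a good year (+10%) and a bad year (-30%) for a hypothetical
# $15,000-per-year promised pension and a hypothetical $500,000 account.
for r in (0.10, -0.30):
    db = defined_benefit_payout(15_000, r)
    dc = defined_contribution_balance(500_000, r)
    print(f"market return {r:+.0%}: DB pension ${db:,.0f}, DC balance ${dc:,.0f}")
```

The point of the sketch is structural, not numerical: the defined-benefit payout is a constant function of market returns, while the defined-contribution balance is not, which is exactly the sense in which the risk of market fluctuations moved from employer to worker.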

The trouble with the Treaty of Detroit model is that the assumptions undergirding the system of employer-based benefits are no longer reasonable, if they ever were. Americans don’t stay in the same job or career for decades. According to a 2016 Gallup study, 21 percent of millennials reported changing jobs within the past year. This isn’t unique to millennials. Even past generations of workers went through multiple job transitions throughout their lifetimes.

When people do work for companies, they increasingly are not actually employees of the company. The most prominent example is Uber and the “gig economy.” Uber claims that drivers are simply using its mobile platform and are therefore not employees of the tech company. On this understanding of the gig economy, because workers are not employees, the company doesn’t have to provide benefits—no retirement package, no health insurance, no unemployment insurance.

The gig economy might get the most attention, but technology companies aren’t the only—or even the most common—cases of this phenomenon. Consider a housekeeper working at the San Francisco Marriott. Even though its name is prominently displayed outside, Marriott doesn’t own the property—a company called Host Hotels and Resorts does. Nor does Marriott employ the housekeeper or manage her hours or payroll—that’s the job of Crestline Hotels and Resorts, a hotel management company. Marriott does establish guidelines for how the housekeeping staff is supposed to maintain the hotel rooms. But this is a far cry from the old days, in which Marriott would have owned the property, managed the hotel, and employed the people working there.

The above example, taken from David Weil’s The Fissured Workplace, shows just how far the employer-employee relationship has changed in the last generation. Today, Weil argues, workplaces across a variety of sectors have been fractured, leading to complex employment relationships. Employees are now often working for contractors or subcontractors. Sometimes they are independent contractors who have no direct relationship with a company at all. Whether you call it the “gig economy,” the “1099 economy,” or the “patchwork economy,” the common theme is that workers are increasingly cobbling together different forms of employment—part-time, gig, independent contracting, and self-employment. They aren’t working for a single employer.

The decline of the lifetime employer idea and the fissuring of American workplaces strike at the very core of the Treaty of Detroit’s employer-based benefits model. It makes little sense for employers to pay expansive pensions to a worker whose tenure is only a few years. And for workers who switch jobs frequently, it is a hassle to change retirement plans and health care providers every time they move to a new company. When companies use independent contractors or gig workers, the situation is even more problematic. These individuals don’t get any benefits. The legacy of the Treaty of Detroit leaves them out in the cold.

The basic idea behind the family wage hasn’t fared well, either. Almost every assumption behind the imagined ideal of a 1950s-style family, where Dad comes home from work to a stay-at-home mom and a couple of kids, is out of date. While wages and productivity grew together in the Treaty of Detroit era, the ordinary male worker’s wages have been stagnant since the 1970s. True, household incomes continued to rise through the 1970s and 1980s until about 2000. But this was partly because women entered the workforce in increasing numbers. In May 1950, when the Treaty of Detroit was signed, 33 percent of women were in the workforce. Fifty years later, in May 2000, that number was 60 percent. Women’s income buoyed households that were already struggling under the model of breadwinner liberalism.

Whether or not the Treaty of Detroit paradigm suited the age of managerial corporations, it is certainly out of sync with employment and families today. Adhering to the approach of employer-based benefits makes little sense given contemporary conditions. While some people might be nostalgic for the America of the 1950s, nostalgia is rarely a good guide to public policy.

The Perils of Neoliberalism

As companies withdrew from the Treaty of Detroit, a second paradigm for public policy took hold. Neoliberalism was defined by its desire to use markets as the means to achieve social goals and by its view that markets were an end in themselves. Ideologically, neoliberalism grew out of a conservative intellectual movement that was powered by anti-government Austrian economic theorists who fled Europe in the mid-twentieth century, and it was buoyed by the Cold War and the fear of communism. It gained political dominance with the elections of Ronald Reagan in the United States and Margaret Thatcher in the United Kingdom and became virtually universal when liberals turned to neoliberal policies in the 1990s and 2000s.

When it came to public policy, neoliberals thought that markets—rooted in strong private property and contract rights—were the best way to achieve the well-being of society and to preserve and advance freedom for all. On their theory, government would do little more than enforce contract and property rights. If government was going to act to advance broader social goals (like health or education), it would do so through private markets. Privatization involved outsourcing government functions to profit-seeking companies. Deregulation became central to neoliberalism because it freed markets to act without constraints. Vouchers, state-provided coupons to purchase private services, became the dominant approach for how government could advance social goals by market means.

When we look back from beyond the 2008 crash, neoliberalism seems woefully naive. But Reagan-era policy makers were facing their own age of anxiety and were searching for an alternative to old institutions that no longer seemed viable. The economic chaos of the late 1970s, with high inflation and economic stagnation, called for desperate measures. The old Treaty of Detroit model was already moribund. Modern economics had resurrected Adam Smith and the promise that an unregulated, laissez-faire market could do a far better job than any command-and-control government intervention.

And we cannot forget the role of the Cold War. Neoliberalism equated markets with freedom and big government with Soviet-style planned economies. Unfettered capitalism, the Reaganites believed, could pull the United States out of the economic doldrums of the late 1970s and provide the fuel needed to win a decisive economic, political, and ideological victory against the Soviets.

But in retrospect, we know that neoliberalism had a number of failings, including an almost religious faith in markets even in the face of empirical evidence of their flaws. Unquestioning acceptance of market efficiency often led proponents to ignore fundamental problems with the neoliberal approach: that profit-seeking actors would try to use fraud, force, and deception to cheat people for their own gain; that markets can lead to the concentration of economic power and, as a result, undermine the very functioning of competitive markets; and that profit-seeking actors would seek to use government to benefit themselves, at the expense of taxpayers, consumers, and a competitive market. All of these neoliberal problems came home to roost at one time or another, and sometimes all at the same time.

Consider the neoliberal turn in higher education: for-profit colleges. Instead of the government funding public universities and community colleges directly and allowing people to attend private nonprofit colleges if they wished, the neoliberal approach urged two things. First, private, profit-seeking actors should run institutions of higher education; second, the government should give students a voucher to go to whatever school they wanted. The vouchers in this case are Pell Grants, federal grants given to students directly for use at any college or university, and federal student loans, some of which the government subsidizes by not requiring students to pay interest while in school. On the neoliberal model, students can use Pell Grants and federal student loans at any school in the market—public, nonprofit, or for-profit. The hope was that the market would create new and better options for students.

So how did the experiment turn out? The neoliberal higher education model turned into a way for shareholders and CEOs to make boatloads of money off taxpayers by deceiving prospective students, saddling them with unconscionable levels of debt, and leaving them without a decent education or any reasonable job prospects.

According to a report from the US Senate Committee on Health, Education, Labor, and Pensions, in 2009, 86 percent of the revenues for the fifteen publicly traded for-profit colleges came from taxpayer dollars. Think about what that means. For every $100 of revenue that for-profits earned, $86 came from taxpayers and only $14 came from other sources. And the absolute dollars at stake are large. In the 2009–2010 school year alone, the US government invested $32 billion in for-profits, or a whopping 25 percent of the Department of Education’s budget for student aid programs. Pell Grants alone amounted to $7.5 billion (up from only $1.1 billion in 2000–2001).

Neoliberalism’s faults are sometimes more subtle. One example is basic banking services. Most of the time we don’t notice that dollar bills are marked “legal tender.” What that means is that the government has mandated that bills can be used to pay for goods and services. For a variety of reasons, including security and technology, most employers don’t pay their employees in legal tender—in cash. Instead, they pay either with a check or with direct deposit. For many people, that isn’t a big deal; it’s convenient. You just go to the bank and deposit the check, or if you have direct deposit, the money automatically appears in your account. But 7 percent of American households—15.6 million adults and 7.6 million children—don’t have a bank account. As a result, just to get access to the money that they are owed for their work, they have to use check-cashing establishments that charge exorbitant fees.

Neoliberalism’s faith in markets also too often ignores the consequences of concentrated economic power. In this case, a lesson from history went ignored. In the late nineteenth and early twentieth centuries, industrialization led to massive consolidation of industry. During the “great merger movement” of 1895 to 1904, more than 1,800 firms disappeared. By 1904, 40 percent of American industry was under the control of only 300 corporations.

Concentrated economic power had a variety of detrimental consequences. It meant that employers had more power over their workers, it meant that monopolies could raise prices on customers, and it meant that corporations and their wealthy leadership had more resources to influence public policy. In response, Progressive Era reformers passed antitrust laws and public utilities regulations in order to break up or regulate concentrated economic power. They did so not just for economic reasons of efficiency but also for constitutional reasons. They understood that America could not remain a republic—it could not have political freedom—if power was concentrated in a small number of corporations that would try to rig the political system to serve their interests.

Three generations later, during the ascendancy of neoliberalism, Robert Bork, then a professor and later a judge, penned a famous tract, The Antitrust Paradox, arguing that antitrust policy should not take into account political factors, the market ecosystem, or anything except economic efficiency. As Bork’s ideas took hold, antitrust prosecutions languished. Today we are in the midst of what The Economist calls the “great merger wave.” Looking at 900 sectors of the economy, the magazine found that two-thirds had become more concentrated between 1997 and 2012. Concentrated power once again means less competition, higher prices, and the increasing political power of big monopolies that control virtually every aspect of Americans’ lives.

Of course, the 2008 economic crash was one of the consequences of the long neoliberal moment. Deregulation meant deceptive and fraudulent practices in a variety of sectors, including the housing and financial markets. The neoliberal preference to facilitate social goals through government-supported vouchers, subsidies, and incentives only deepened this pathology. Federally supported housing finance allowed financial institutions to capture the benefits of their bets without bearing the risks when those bets went bad. And consolidation and concentration in the financial sector produced Wall Street banks that were “too big to fail,” leading to government bailouts.

Despite its many problems, fondness for (if not necessarily robust faith in) neoliberalism remains strong. Many people remain nostalgic for this easy-to-understand approach to public policy. But the financial debacle of 2008–2009 and its continuing aftershocks should leave Americans in no doubt. Nostalgia for neoliberalism is perilous. We need better options.

Excerpted from The Public Option: How to Expand Freedom, Increase Opportunity, and Promote Equality by Ganesh Sitaraman and Anne L. Alstott, published by Harvard University Press. Copyright © 2019 by Ganesh Sitaraman and Anne L. Alstott. Used by permission. All rights reserved. Ganesh Sitaraman is a Professor of Law at Vanderbilt Law School. Anne L. Alstott is the Jacquin D. Bierman Professor at the Yale Law School.

The ProMarket blog is dedicated to discussing how competition tends to be subverted by special interests. The posts represent the opinions of their writers, not necessarily those of the University of Chicago, the Booth School of Business, or its faculty. For more information, please visit the ProMarket Blog Policy.