  The last fifty years have witnessed a quiet reimagining of what companies are for and how they operate. But what brought us to this point?

  * * *

  THE HISTORY OF the social contract is a story of power and how it redistributes over time. Throughout history, the rights and responsibilities of capital, labor, and the state have been mostly determined by whichever group possesses the most power and is able to set the terms without overplaying its hand to the point of creating unrest or revolution. In the agricultural societies of the past, sovereign rulers exercised near-absolute authority over the economic lives of their lords and the peasantry. During the Industrial Revolution, the scales tipped toward the wealthy and politically connected owners of capital. In the early 20th century, American and European workers reined in the power of corporations through labor unions and the ballot box. Today, power has concentrated in the private sector yet again.

  What caused this shift? How have corporations amassed so much size and power over the last half century? If you want to understand the trigger point, you can look to a single idea, a single sentence even.

  After the calamities of the Great Depression and World War II, the economy began to boom throughout the United States and Europe. But strong checks were placed on it by both organized labor and government regulators, who were all too familiar with the cost of monopolies and stock-market crises. In this context, the vast majority of businesses saw themselves as fitting within a clear niche in society. Companies were expected to turn a profit while also working to improve the well-being of their employees, support the communities where they did business, and generally serve the public good.

  Yet not everyone thought this model was sensible. In 1962, in his book Capitalism and Freedom, economist Milton Friedman wrote, “There is one and only one social responsibility of business—to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.”

  This idea marked a dramatic departure from the existing social contract—a world where a lifesaving patent was sold for a few dollars and where George Merck could speak of profit as a secondary motive for doing business. But in Friedman’s eyes, such decisions were inefficiencies, flaws in the market. And according to the theories of Friedman and his colleagues at the University of Chicago, the world would benefit several times over if every individual pursued maximum profit and then reinvested the gains. In Friedman’s view, a company’s only loyalty was to its shareholders, and any leadership decision that kept a dollar out of shareholders’ pockets was mismanagement. In time, this profit-optimizing philosophy would come to be known as shareholder primacy.

  The idea did not catch on right away. But it began resonating with a core group of supporters in the 1970s, when the booming postwar economy began to stagnate. Economists pointed to government regulation and inefficient management as the problems, and discontent opened the door for Friedman’s ideas to circulate. Then, in the 1980s, his philosophy hit the mainstream. Shareholder primacy melded perfectly with the Reagan and Thatcher eras, providing an intellectual cornerstone for deregulation and trickle-down economics. Soon, clear opposition to the New Deal–era checks on corporate power emerged. These critics argued that government had kept a lid on business for too long: managers of big businesses had grown complacent and had stopped driving profits, and the whole economy was stagnating as a result. If companies were turbocharged to maximize profits, it would jolt the whole country and the whole world into growth. To get there, all you had to do was prioritize profit. A pithier version of Friedman’s big idea soon swept through the culture, expressed by Gordon Gekko in Oliver Stone’s Wall Street: “Greed is good.”

  The effect of shareholder primacy was to draw a stark line between a company’s shareholders and its stakeholders—every other party affected by its business, including its employees, its community, its country, its customers, and the environment. Under the new model, shareholder profits came first and foremost, and any significant investment in other stakeholders became a liability.

  The mid-1980s saw the rise of hostile takeovers and corporate raiders, who were in many respects the vanguard of shareholder primacy. They would identify distressed or stagnant companies and buy up equity until they gained control. Then they would reorganize every division toward maximizing returns to shareholders, rooting out any inefficiencies they could find. This often meant cutting jobs, relocating headquarters, selling off real estate, and taking on loads of debt—using any tool in the arsenal to channel assets toward short-term returns. The prospect of a hostile takeover struck fear into the hearts of corporate leaders. And a series of legal decisions made it increasingly difficult to resist takeovers, especially when shareholders stood to turn a profit, which ultimately pushed many companies toward shareholder primacy just to avoid becoming targets of a sudden bid.

  Throughout the 1980s, mergers and acquisitions swept through the world’s largest economies, aided by increasingly laissez-faire attitudes toward regulation and antitrust. The United States had been the world leader in antitrust, standing firmly against monopolies since the early 20th century, when it had launched hundreds of lawsuits against large corporations to break up Gilded Age monopolies like Standard Oil and U.S. Steel. In the aftermath of World War II, policy makers established a new wave of antitrust after seeing how monopolies had contributed to the rise of the Third Reich and the authoritarian regimes of Japan and Italy. During the drafting of new constitutions and laws around the world, the United States often encouraged (and, in the case of Japan, imposed) tough new antitrust laws. But in the 1970s, a new school of thought came to dominate the debate around monopoly and competition.

  The newer theory, popularized by judge and legal scholar Robert Bork, held that economic concentration was a bad thing only if there was demonstrable harm in the form of higher prices to consumers. As long as monopolies charged fair prices, they were perfectly acceptable. Bork’s narrow interpretation of antitrust law aligned perfectly with the Friedman doctrine and the political atmosphere of the 1980s. Government watchdogs began to bring fewer suits against large companies, and Bork’s theory became the prevailing basis for the government’s approach to competition. All the while, the private sector grew steadily more consolidated through mergers and acquisitions. Like many people, I watched the local bank where I had an account get purchased by a larger regional bank, which was then purchased by a national bank, which then merged with another national bank. The same story played out across sectors of the economy.

  In this same period, as we will explore further in the next chapter, companies also began to recognize the benefits of expanding their influence in Washington. Under Friedman’s philosophy, businesses could maximize their profits as long as they followed the letter of the law. But through lobbying and unlimited political donations, companies could remake the boundaries by shaping the laws themselves, gaining outsized returns for relatively modest allocations of capital.

  Each of these trends amplified the others, and the result has been a rapid expansion of corporate size and power since the 1970s. Shareholder primacy unleashed the ugliest face of capitalism. In theory, profit in the hands of shareholders would benefit everyone, by increasing the overall efficiency of the economy and freeing up excess capital to be reinvested in communities.

  But in practice, it squeezed out other stakeholders like employees, local communities, and the environment. When the economy was booming in the decades after World War II, just about every medium-sized or larger city had a major corporate headquarters. Companies’ executives sat on the local boards. They supported everything from after-school programs to local arts and sports programs. The children of the CEO went to the same schools as the children of middle managers. If a downturn hit, the company did not lay off employees as soon as a consultant or MBA determined it was balance-sheet optimal; it waited until the last possible moment, because the company and community were inextricably tied, and each felt a responsibility to the other for the long term. This is how the social contract worked in a hyper-local context.

  But once shareholder primacy emerged, the thinking changed. The 1980s wave of mergers and acquisitions led to widespread layoffs in smaller cities, the uprooting of corporate headquarters to tax-optimized locations, and whole local economies spiraling into freefall. I saw this where I grew up in Charleston, West Virginia, when its banks, mining companies, and chemical companies were swallowed up by companies on the coasts. Shareholder capitalism meant that if moving headquarters promised an economic gain, a company that had spent decades growing alongside its community would up and leave, or at least change where it was nominally headquartered and paid taxes. In the United States, this resulted in two-thirds of all job growth taking place in just twenty-five cities and counties; the same pattern took place in Europe.

  * * *

  FAST FORWARD TO the present, and we have seen the first part of Friedman’s vision pan out: the world’s largest companies have posted remarkable profits, and shareholders have seen enormous returns. But the second part—the promise that these profits would come back around and benefit everyone—never arrived. We have seen all the growth in the last few decades go to senior executives and shareholders, not to workers. We have seen money drain out of individual communities where robust local economies had existed, routing instead to financial hubs and shareholders. We have seen major centralization and—instead of the healthy competition promised—a new age of monopoly.

  In 2019, the five hundred largest companies in the United States generated a combined $14.2 trillion in revenue, with nearly half coming from the country’s fifty biggest firms. That year, the gross domestic product of the United States was $21.7 trillion. That means the five hundred largest firms took in revenue equal to two-thirds of the country’s total economic output, and the top fifty alone took in the equivalent of a full third of national GDP. When the Fortune 500 list was first published in 1955, the fifty largest companies generated less than 16 percent of the country’s economic output. At no time since the Great Depression have so few firms controlled so much of the wealth in the United States.
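  As a quick back-of-the-envelope check of those shares, using only the figures above (note that setting firm revenue against GDP, a value-added measure, is an informal juxtaposition rather than a ratio of strictly comparable quantities):

\[
\frac{\$14.2\ \text{trillion}}{\$21.7\ \text{trillion}} \approx 0.65 \approx \frac{2}{3},
\qquad
\frac{0.5 \times \$14.2\ \text{trillion}}{\$21.7\ \text{trillion}} \approx 0.33 \approx \frac{1}{3}.
\]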

  There are certain benefits that big companies bring to the table. Large firms can take advantage of economies of scale—the cost savings that come from doing commerce in bulk—which theoretically lead to lower prices for consumers. But more often than not, large companies use their clout to entrench their own dominance. Thanks to their wealth and global distribution, these companies have access to resources that remain out of reach for their smaller competitors. Large firms can use the offshore financial system to optimize their tax bills. They can leverage global supply chains and labor markets to lower costs. They can wage effective influence campaigns to win over policy makers. They can weather price wars, raise barriers to entry, and buy out would-be rivals before they get off the ground. They can use their larger data sets to outmaneuver smaller rivals. They can create a feedback loop in which economic power begets more economic power. The rise of shareholder capitalism has pushed firms toward ever-larger size. And by reinforcing their own dominance, large companies can make it almost impossible for other firms to compete.

  The lifeblood of capitalism is competition. In the effort to outperform one another, firms are forced to respond to consumer demands, pursue more efficient operations, and pioneer new products. In a competitive market, consumers pay fair prices and companies earn fair profits. New firms strive to unseat the old guard or break into uncharted markets, and established companies aim to fend off rivals by investing in innovation. One of the reasons that socialist and communist economies stagnate and fail is their lack of competitive dynamism. The notion that an ingenious entrepreneur or scrappy start-up can succeed on the strength of their inspiration, perspiration, and persistence relies on the existence of competition.

  If companies become too entrenched to topple, innovation and entrepreneurship stagnate. That is exactly what we are seeing today. Not every multinational firm engages in anticompetitive practices, but when the opportunity arises, shareholder capitalism compels firms to seize it. And the consequences can be severe. As we saw in the pharmaceutical industry, the lack of competition can end up costing lives. Author and activist Matt Stoller told me that this amounts to a “crime spree.”

  The concentration that we see in the insulin market is now the new normal for many other sectors of the economy as well. Four airlines—American, Delta, Southwest, and United—control the vast majority of domestic air travel in the United States. This consolidation has reduced service and increased ticket prices in small- and medium-sized cities and created de facto monopolies in hubs where one airline dominates. Two telecommunications companies—AT&T and Verizon—provide cell service to nearly 70 percent of American mobile phone users. Charter Communications and the Comcast Corporation serve four in five American cable subscribers. AB InBev and Molson Coors are responsible for more than 70 percent of beer sales in the United States. Google handles more than 90 percent of worldwide internet searches.

  Matt Stoller’s research has shown that market consolidation has also taken hold in niche industries like portable toilets, prison phone services, mixed martial arts leagues, board games, and cheerleading equipment. He told me that “it’s simply a corrupt way to run a society. If you believe in corruption, which a lot of people seem to, subverting the public interest for private gain, then that’s fine, works well. If you don’t, then you should make sure that people are not subject to market power and it’s all just structured by the state. Corporations are just grants of state power.”

  As the United States becomes a country of conglomerates, the effects of mass mergers and acquisitions on customers, employees, communities, and more can be seen vividly in a vast range of industries. One of the most striking places to look is where the American dream once rang loudest: the family farm.

  Less than a ninety-minute drive from my home in Baltimore is the Delmarva Peninsula. A 170-mile-long comma separating the Chesapeake Bay and the Atlantic Ocean, the peninsula feels a world away from the knowledge-industry mania of the metropolitan East Coast. The landscape is flat and swampy, bordered by sleepy fishing villages and beach towns. Aside from tourism, the main driver of the Delmarva Peninsula’s economy is agriculture, specifically poultry farming. In Delaware—one of the three states that claim part of the peninsula—there are more than two hundred chickens for every human being. Drive around the peninsula and you will see countless rows of stout, six-hundred-foot-long by sixty-foot-wide aluminum buildings, inside of which live tens of thousands of chickens. You will also see a growing number of chicken houses sitting rusted and abandoned. They are the detritus of a decades-long battle to raise the most birds at the lowest possible cost.

  In the aftermath of massive farm failures during the Great Depression, the Roosevelt administration adopted policies that allowed small farmers to survive the sudden swings of food markets. If the price of meat, dairy, grain, or other crops suddenly fell, as happened in the Depression, farmers would need to produce more to cover their debts and living expenses. But the increased supply only drove prices down further. This chain reaction helped force one out of every nine farms in the United States into foreclosure and bankruptcy in just five years. The Roosevelt administration responded by creating a series of taxes, quotas, and subsidies to sustain farmers and stabilize the country’s food supply, offering security to a wide range of smaller farm operations in the decades that followed.

  However, in the 1970s, the government started stripping away these controls on price and supply. The world’s appetite for meat was growing, and the United States wanted to feed it. It would take a lot of grain to raise cows, pigs, and poultry, and the New Deal policies kept farmers from producing enough corn and soybeans to meet the demand. Earl Butz, a food industry executive who served as the US secretary of agriculture from 1971 to 1976, went on speaking tours to encourage American farmers to meet the growing global demand for grain. Conservation techniques like rotating crops and soil management started to go out the window as Butz encouraged farmers to maximize their grain output. “Get big or get out,” he liked to say. “Adapt or die.”

  Early on, many farmers were suspicious of the government’s push for large-scale monocropping, as well as the food companies that stood to benefit from the subsequent trade deals. At the time, Wisconsin senator Gaylord Nelson warned that “corporate farming threatens an ultimate shift in power in rural America.” Yet many bought into Butz’s vision anyway. Farmers took out loans to buy bigger plots of land, and the country’s food exports rose. But Butz’s vision for free-market farming quickly began to fall apart. After the Soviet Union invaded Afghanistan in late 1979, the United States placed an embargo on grain sales to the Soviets. Overnight, American farmers lost one of their biggest international customers and were left sitting on grain they could not sell. By 1984, the total debt held by the nation’s farms hit $215 billion, double the level of just six years earlier. Thousands of small farmers went under. The ones who survived bought up foreclosed farms on the cheap and rolled them into larger operations. The consolidation continued for decades, and by 2011, 11 percent of US farms controlled more than 70 percent of the country’s cropland.