In October 2019, a group of some 40 U.S. state attorneys general announced they were following the lead of New York state in looking into platform antitrust issues, and the Justice Department and Federal Trade Commission said they had opened new reviews of Google and Facebook for potential antitrust violations. A number of Democratic presidential candidates, led by Massachusetts Senator Elizabeth Warren, have included antitrust planks in their campaign platforms. There are good reasons for the U.S. legal community to take a new look at this issue, which will have profound implications both for the economy and for the future of American democracy.
The framework under which regulators and judges today look at antitrust was established during the 1970s and 1980s as a byproduct of the rise of the Chicago School of free-market economics. As chronicled in Binyamin Appelbaum’s recent book The Economists’ Hour,65 figures like George Stigler, Aaron Director, and Robert Bork launched a sustained critique of overzealous antitrust enforcement. The major part of their case was economic: antitrust law was being used against companies that had grown large because they were innovative and efficient. They argued that the only legitimate measure of economic harm caused by large corporations was lower consumer welfare, as measured by prices or quality. And they believed that competition would ultimately discipline even the largest companies. For example, IBM’s fortunes faded not because of government antitrust action, but because of the rise of the personal computer. With consumer welfare the only standard for bringing a government action, it was hard to make a case against companies like Google and Facebook that gave away their main products for free.
The Chicago School critique made a further argument, however: the original framers of the 1890 Sherman Antitrust Act were interested only in the economic impact of large scale, and not in the political effects of monopoly. They argued that many of the antitrust actions undertaken in the period after World War II were based on shifting and arbitrary standards, in which the harms either to the economy or to American democracy were not clearly defined.
We are in the midst of a major rethinking of that inherited body of law in light of the changes wrought by digital technology.66 Economists and legal scholars are beginning to challenge the consumer welfare standard as the sole measure of harm caused by corporate scale. “Zero price” platforms like Google and Facebook have built enormous businesses around the exploitation of user data, which in Facebook’s case in particular have been sold despite repeated promises to respect its users’ privacy.67 Consumers who are harmed by that loss of privacy in return for the free services they receive may not understand the bargain they have made. In other instances, the harm of large scale lies in foregone innovation, as Google and Facebook buy up startups that might challenge them (DoubleClick and YouTube by Google; Instagram and WhatsApp by Facebook). Platform size gives them access to consumer data that makes it very hard to compete against them; Amazon, for example, is both a platform hosting other sellers and a seller itself that can compete against its own clients.
But the political harms caused by large scale are critical issues as well, and ought to be considered in antitrust enforcement. Social media have been weaponized to undermine democracy by deliberately accelerating the flow of bad information, conspiracy theories, and slander. Many political figures have called for stricter government regulation of speech as a result. The U.S. Constitution’s First Amendment contains very strong free-speech protections. But while many conservatives have accused Google and Facebook of “censoring” voices on the right, the First Amendment applies only to government restrictions on speech; law and precedent protect the ability of private parties like the internet platforms to moderate their own content. In addition, Section 230 of the 1996 Communications Decency Act exempts them from private liability that would otherwise deter them from curating content.
The U.S. government’s ability to regulate political speech exists, as evidenced by the Federal Communications Commission’s (FCC) “Fairness Doctrine,” which was used through the 1970s to mandate “balanced” coverage of political issues. But the doctrine came under sustained attack from conservatives and was rescinded in 1987 through an administrative decision. It is impossible to imagine today’s FCC articulating a modern equivalent of the Fairness Doctrine applied to digital platforms. Our politics are far more polarized; reaching agreement on what constitutes unacceptable speech would be politically, if not legally, impossible.
Europeans have been much more forthright in pursuing both regulatory and antitrust approaches. The German NetzDG, for example, imposes stiff fines on platforms that fail to promptly remove illegal content such as fake news and hate speech, though its very severity may have chilling consequences for legitimate political speech. European law regards privacy as a fundamental right, and has used the General Data Protection Regulation (GDPR) to limit platforms’ ability to make use of user data. While European competition law accepts many of the same premises as U.S. law, it is more flexible in allowing remedies for alleged violations, and has imposed substantial fines on both Google and Facebook.
A regulatory approach to content moderation is much more problematic in the United States. The problem with platform self-regulation is not that private companies are incapable of moderating content: we don’t complain that the New York Times refuses to publish a conspiracy theorist like Alex Jones, because the newspaper market is decentralized and competitive. The issue is rather one of scale: a decision by Facebook or YouTube not to carry Jones is much more consequential because of their monopolistic control over internet discourse. The government cannot legitimately delegate to a single private company (largely controlled by a single individual) the task of deciding what is acceptable political speech. We would worry much less about this problem if Facebook were part of a more decentralized, competitive platform ecosystem. Antitrust therefore becomes a serious alternative to state regulation if one worries about the impact of fake news and conspiracy theories on democracy.
Remedies will be very difficult to implement: it is the nature of networks to reward scale. As a recent European Commission study notes, digital platforms do not compete for market share, but for the market itself.68 It is not clear how a company like Facebook could be broken up horizontally as AT&T was in the 1980s, since a baby Facebook would likely end up occupying the same position as its parent over time. Other ideas to increase competition among platforms have been suggested. One is to facilitate data portability between platforms by mandating a common API, a proposition that immediately runs into privacy concerns, since a platform user’s friends are an integral part of the user’s profile. Another is to prohibit or more strictly limit the acquisition of startups by large platforms, or to bar them from moving into parallel markets where their access to large amounts of consumer data gives them an enormous advantage. Internet companies could be prohibited from being both platforms and sellers on those platforms. The United States could adopt privacy rules comparable to GDPR that could be used to limit the extent to which platforms can monetize the consumer data they hold. Finally, U.S. antitrust law could treat media companies differently from large companies in other sectors, given the political externalities generated by political media content. It is less clear in the media space that efficiency and consumer welfare are overriding goals, given these externalities. If the large platforms were held to be media companies, they could be subject to a different interpretation of existing U.S. antitrust law.
The increasing concentration of power in a handful of extremely large corporations that exert oligopolistic or monopolistic control over markets is an issue for many sectors beyond the digital platforms: pharmaceuticals, hospitals, internet providers and telcos, airlines, and many others have seen mergers and acquisitions in recent decades that have left them far less competitive than they were a generation ago. Economist Thomas Philippon has argued that the U.S. economy is now more concentrated than that of the European Union, and that this has been holding back both innovation and growth.69 The European Union, for its part, has been much more forthright in pushing competition policy against Google, Facebook, and other large internet companies. The current situation in the United States is the byproduct of an intellectual revolution that took place in the 1980s, which has left a legacy of judges and legal scholars who believe that antitrust law is largely an uninteresting issue because it long ago became settled law. Changing this perspective in light of the new challenges posed by digital technology will therefore require a similar intellectual and legal revolution.
65 Binyamin Appelbaum, The Economists’ Hour: False Prophets, Free Markets, and the Fracture of Society (Boston: Little, Brown, 2019).
66 See Lina M. Khan, “The Ideological Roots of America’s Market Power Problem,” Yale Law Journal Forum 127 (2017); Tim Wu, The Curse of Bigness: Antitrust in the New Gilded Age (New York: Columbia Global Reports, 2018); Jonathan B. Baker, The Antitrust Paradigm: Restoring a Competitive Economy (Cambridge, MA: Harvard University Press, 2019).
67 Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe (New York: Penguin Press, 2019).
68 Jacques Cremer et al., Competition Policy for the Digital Era (Brussels: European Commission Directorate General for Competition, 2019).
69 Thomas Philippon, The Great Reversal: How America Gave up on Free Markets (Cambridge, MA: Belknap/Harvard University Press, 2019).