How to Mitigate the Political Impacts of Social Media and Digital Platform Concentration

Ahead of its annual conference on Digital Platforms, Markets and Democracy, the Stigler Center formed a committee to produce independent white papers that will inform policymakers on how to address the political and economic issues raised by tech platforms. In preparation for the conference, we are publishing the executive summary of each preliminary report.

On May 15-16, the Stigler Center will host its third annual antitrust and competition conference. Titled “Digital Platforms, Markets and Democracy: A Path Forward,” the conference will bring together dozens of top scholars, policymakers, journalists, and entrepreneurs.

 

During the 2018 conference, a consensus emerged that the political and economic issues raised by the market power of tech platforms must be addressed. To provide independent expertise on the appropriate policy responses, the Stigler Center formed the Committee for the Study of Digital Platforms. The committee is composed of four specialized subcommittees: the economy and market structure; privacy and data protection; the media; and the political system. Each subcommittee comprises a chair and specialists in different fields (economics, law, data science, media, public policy, political science, venture capital, etc.). The committee's ultimate goal is to produce independent white papers that will inform decision-making and policymaking.

 

During the conference, the four subcommittees will discuss their initial conclusions. In preparation, we will publish the executive summary of each preliminary report.


The emergence of social media and its dominant platforms has profoundly transformed many aspects of economic and social life. But few of these transformations have been as heatedly debated as social media’s impact on political institutions and behavior. Social media was once touted as a powerful accelerant of democratization and democratic renewal. It would provide voice to the powerless and would spur collective action to overthrow authoritarians and reform democracies. But revelations that Russian intelligence sought to use social media platforms to influence the 2016 US presidential election and the UK Brexit referendum have cast a pall on this early optimism. Concern about the impact of social media has grown steadily as policymakers and the public increasingly view social media as a megaphone for fake news, the mobilizer of extremists, and a polarizer of society.

 

While innovations in communication, from the printing press to cable television, profoundly altered the relations between the governors and the governed and transformed the potential for democratic and accountable governance, there are valid concerns that the political impact of social media may be greater than that of these earlier innovations. The massive scale and reach of social media allow a single post to reach millions of users. Its platforms facilitate anonymity, which enables misinformation and promotes harassment and hate speech. Most importantly, these features are exacerbated by the network externalities that push social media platforms towards natural monopoly. Consequently, a technology with tremendous potential to reshape politics is controlled by a few firms. In a more competitive environment, users might flee platforms that are overly prone to electoral manipulation and misinformation. But platform monopolization removes the accountability of competition.

 

The Stigler Center has charged our committee with exploring the political impacts of social media and its most prominent platforms. In taking up this mandate, we focused not only on the ways in which social media usage has the potential to shape political outcomes but also the ways in which social media platform companies are emerging as uniquely powerful political actors. While political and legal scrutiny of the platform companies is currently high, these firms have a number of formidable political assets. Beyond sheer size and economic clout, these firms are advantaged by First Amendment protections, the complexity and opacity of their algorithms and internal policies, and their connectivity to users and others who may be politically mobilized. Moreover, social media platforms benefit from economic nationalism as countries race for advantages in digital technology and artificial intelligence. Few if any firms have ever had such a rich collection of advantages.

 

The political power of the platform companies complicates reforming social media. Most obviously, these political advantages can be employed to limit government oversight and regulation. Moreover, regulatory authority created during this period of political weakness may later be captured by the industry as public interest in reforming social media wanes. Most importantly, the power of these firms and their control of the relevant data allows them to avoid greater public scrutiny. 

 

Our report also maps out our current understanding of how social media and digital platforms impact the broader political system. As platforms such as Facebook, Twitter, and YouTube grow in importance as a medium for political debate, so too does their potential to impact political outcomes more broadly. However, our knowledge of the political impacts of social media remains in its infancy, given that the data necessary to independently evaluate social media's effect on political outcomes remain proprietary and largely unavailable to researchers and the public at large. Without better access to such information, academia, think tanks, and other civil society organizations can do little to hold social media accountable for possible distortions of our democracy.

 

“The political power of the platform companies complicates reforming social media. Most obviously, these political advantages can be employed to limit government oversight and regulation.” 

 

Policy Recommendations

 

We outline proposals designed to mitigate the political impact of social media and the political effects of digital platform concentration. Our most important recommendation is our concurrence with the other subcommittees that significant government regulation and greater antitrust scrutiny are warranted. Our contribution to that discussion focuses on regulatory structure, laying out several principles to help insulate regulatory authorities from excessive industry influence while preserving democratic accountability. We also address important issues related to disclosure and transparency. First, we endorse updating campaign finance law to cover spending on social media campaigns. Second, we call for more transparency in platform companies' support for research on social media, and for greater dissemination of internal research. Finally, we suggest that a new Digital Authority would be essential for facilitating greater independent research by ensuring that scholars have access to relevant social media data.

 

1. New Regulatory Authorities

 

We recommend the creation of a new regulator and enhanced arrangements for inter-agency cooperation. The following principles should guide the creation of a Digital Authority (DA) tasked with regulating digital platforms:

 

1. The DA should have a reasonable degree of autonomy from industry influences to make decisions about social media platforms in the public interest.  

 

2. The jurisdiction of the DA should cover as many social media-related functions as possible to prevent regulatory fragmentation.

 

3. Mechanisms for coordination with other agencies should be created.

 

4. The DA should have responsibility for rulemaking in the following areas:

    1. General consumer protection
    2. Privacy policies and disclosure
    3. Transparency
    4. Data portability
    5. Data and algorithmic access for external auditing and research

5. The DA should have authority to create mechanisms for real-time data collection from the platforms (subject to appropriate protections for user privacy).

 

6. The DA should have research capacity to undertake studies of the impact of the platforms on social and political outcomes.  

 

7. The DA should play a facilitating role in generating independent research by outside scholars.

 

8. The DA should have the authority to review relevant internal studies conducted by the platform companies. When release would pose no undue privacy risk or exposure of business secrets, such studies should be made publicly available.  

 

9. Rules and regulations should be crafted in a way that promotes innovation and competition in the digital media sphere.

 

2. Antitrust Enforcement or Other Policies to Prevent Political Market Concentration

 

Many of the negative political by-products of social media are associated with the lack of competitive markets for digital platforms. Therefore, policies aimed at reducing “political concentration” should be developed. 

 

Contemporary antitrust enforcement is generally predicated on a consumer welfare standard. In the case of social media, this standard may be inadequate to account for the political impact of concentration. First, economic concentration concentrates political power: large firms that lack competitors are hugely advantaged in the political marketplace. Second, concentration may exacerbate the negative consequences of the role of social media in the political system. The lack of competition deprives us of a marketplace of ideas that might serve to regulate the platforms' policies on speech and political activity. These political effects of concentration are unlikely to ever be captured by the consumer welfare standard. 

 

Whether antitrust law should broaden its scope beyond the consumer welfare standard is a complex and controversial issue. But the harms to citizens through the distortion of political processes should be given considerable weight in policies aimed at fighting market concentration. At a minimum, the DA should develop methodologies for evaluating the explicit political impact of social media concentration. Such methodologies may contribute to the establishment of a system of dual review like that in place for mergers involving broadcasters, where the FCC has a dual mandate that complements that of antitrust authorities but considers different criteria when assessing the consequences of concentration.

 

3. Role of Social Media in Campaigns and Elections

 

We endorse two campaign disclosure provisions that have been proposed as part of the Honest Ads Act. The first amends the definition of “electioneering communication” to include internet or digital communication. The second is a mandate that digital platforms compile publicly available databases of political advertisements that are run on the platforms.

 

These provisions do not cover, however, all political activity on digital platforms that we might like to be disclosed. We also endorse a disclosure requirement on political advertising paid for by foreign entities. Similarly, the Honest Ads Act contains no mechanism to compel buyers of political advertisements to truthfully reveal their identities. The records compiled by digital platforms would be more informative if these issues were addressed.

 

Furthermore, nothing in the Honest Ads Act requires that digital platforms themselves be politically neutral. There are concerns that the regulation or imposition of political neutrality by the DA might impinge upon First Amendment protections. We therefore support strong disclosure requirements that would reveal non-neutral platform policies. Such disclosures should cover situations i) in which platforms provide specific support or technical assistance to political parties, candidates, or advocacy groups, outlining what type of support was provided and what its outcome was; and ii) in which platforms make algorithmic changes that directly affect how users see political content, along with the outcome of such changes.  

 

4. Platform Liability

 

Section 230 of the Communications Decency Act of 1996 says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The Politics committee reached no consensus on the desirability of amending or repealing Section 230. Removing the liability protection of the media platforms would undoubtedly spur them to undertake much more aggressive content moderation in an effort to avoid litigation related to slanderous and harassing speech. But the absence of liability protection might induce the platforms to police over-aggressively and have an unduly chilling effect on speech.

 

5. Philanthropic Disclosure

 

Large digital firms have extensive philanthropic efforts that serve many worthwhile causes. Yet philanthropic efforts that support research and teaching on technology and associated policy issues often create conflicts of interest, as such support may make it more difficult for technology and policy scholars to criticize the platforms and their social impact.

 

First, there should be greater and more transparent disclosure of the philanthropic efforts of social media companies, especially those tangibly related to teaching and research. Second, the new DA should create an office of research to help facilitate independent research on social media companies and platforms. Such research would avoid the inherent conflicts of industry-supported research. We also encourage universities and academic associations to develop disclosure standards that would apply to scholars supported by social media firms. The disclosure policy of the American Economic Association is one plausible model.

 

6. Data Access for Academic and Independent Research

 

Independent research on the economic and political effects of social media is crucial to ensuring that the platforms enhance citizen well-being. Currently, the major impediment to such research is data access. The lack of access to data for academic researchers does not, of course, mean that no research is being conducted. Instead, it means that the only people able to conduct such research are those working inside the platforms. We offer two proposals. First, a major initiative of a new DA ought to be facilitating independent research. This could include making the data it obtains from social media firms available for research (with suitable restrictions to protect individual privacy and proprietary secrets). Second, we encourage reconsideration of the presumption that data collected by social media platforms ought to be considered proprietary at all. There are a number of possible proposals in this regard, the strongest of which would be to recast the platforms not as owners of the data provided by users but as stewards of that data, entitled to use it to improve their own business models but not necessarily to prevent others from using it for welfare-maximizing purposes.

Subcommittee on Political Systems:

 

  • Chair: Nolan McCarty, Susan Dod Brown Professor of Politics and Public Affairs, Princeton University
  • Rana Foroohar, Global Business Columnist and Associate Editor, Financial Times
  • Andrew Guess, Assistant Professor of Politics and Public Affairs, Princeton University
  • David Lazer, Professor of Political Science, Northeastern University
  • Alexandra Siegel, Postdoctoral Fellow, Stanford University
  • Nick Stephanopoulos, Professor of Law and Herbert and Marjorie Fried Research Scholar, University of Chicago
  • Joshua Tucker, Professor of Politics, New York University
  • Adam White, Research Fellow, The Hoover Institution, Stanford University

 

DISCLAIMER: The purpose of these preliminary reports is to identify the new challenges digital platforms pose to the economic and political structure of our countries. These reports also try to identify the set of possible tools that might address these challenges. Yet, there is potential disagreement among the members of the committees on which of these problems is most troubling, which tools might work best, whether some tools will work at all, or even whether the damage they might produce is larger than the problem they are trying to fix. Not all committee members agree with all the findings or proposals contained in this report. The purpose of these preliminary reports, thus, is not to unanimously provide a perfect list of policy fixes but to identify conceptual problems and solutions and start an academic discussion from which robust policy recommendations can eventually be drafted.  

 

The ProMarket blog is dedicated to discussing how competition tends to be subverted by special interests. The posts represent the opinions of their writers, not necessarily those of the University of Chicago, the Booth School of Business, or its faculty. For more information, please visit ProMarket Blog Policy