A new report by the journalist network Investigate Europe claims representatives of Facebook and Google pressured members of an EU working group on fake news to drop proposals that would have called for an examination of the role played by tech platforms’ business models and market power in the spread of disinformation online.
This week’s elections for the European Parliament, which kicked off on Thursday, are widely seen as the most consequential in recent memory. All eyes are therefore on whether digital platforms like Google and Facebook can curtail the kind of massive disinformation and manipulation drives that have become de rigueur during major elections in recent years.
In recent months, both the platforms and the European Union have publicized efforts to combat the spread of fake news stories and manipulative content. So far, however, the results are not very encouraging: while the platforms say they haven’t identified any large-scale attempts by foreign actors to suppress or manipulate voters, a report by the activist organization Avaaz finds evidence of large-scale coordinated efforts by domestic far-right groups to flood Facebook with propaganda and disinformation in the lead-up to the vote. Last week, the European Commission reprimanded Facebook, Google, and Twitter for not doing enough to curb the deluge of disinformation on social media ahead of the crucial elections.
Earlier this week, a new report by the journalist network Investigate Europe suggested that a key EU effort to tackle the disinformation plague had been “undermined” by Facebook and Google’s refusal to address one of the major obstacles to solving the problem: their own market power.
The report, published by Open Democracy, concerns one of the more high-profile efforts by the EU to fight mass disinformation: an expert committee set up by the European Commission in early 2018 to study the roots and scale of disinformation online and come up with policy recommendations. The “high-level group,” established in response to the outbreak of fake news and propaganda that preceded the Brexit referendum in the UK and the election of Donald Trump in the United States, comprised 39 members, among them scholars, journalists, NGOs, and industry groups representing publishers and tech platforms. Its work resulted in a 40-page report and a “code of practice”—a set of voluntary guidelines that the platforms were meant to comply with.
According to Investigate Europe, however, the expert group’s work was hindered by the platforms. Representatives of Facebook and Google, it alleges, pressured members of the group to drop proposals that would have called for an examination of the role played by the business models and market power of digital platforms in the spread of disinformation online. “There was heavy arm-wrestling in the corridors from the platforms to conditionalize the other experts,” one member is quoted as saying. Another, Monique Goyens, the director general of the European consumer association BEUC, told reporters: “We were blackmailed.”
According to the report, tensions first arose when Goyens and several other members of the group sought to examine whether the disinformation problem had been exacerbated by Google and Facebook’s business models and market power. “We wanted to know whether the platforms were abusing their market power,” Goyens told Investigate Europe.
Although the expert group was convened by Mariya Gabriel, the European commissioner for Digital Economy and Society (an office separate from that of the EU’s competition commissioner), Goyens and other members wanted to include a call for a so-called sector inquiry, which would have allowed the competition commissioner to gather more information about the platforms’ business models.
At this point, according to Goyens, Richard Allan, Facebook’s vice president for global policy, warned her that if the group continued to focus on the competition angle, “we will be controversial.” Another member, who spoke to Investigate Europe anonymously, said Allan “threatened that if we did not stop talking about competition tools, Facebook would stop its support for journalistic and academic projects.” As the Columbia Journalism Review’s Mathew Ingram notes, Facebook and Google have provided or promised at least $600 million in funding for various journalistic projects and training programs in the past two years; Google has also long funded academic research that supports its business and policy interests.
The expert group, according to Investigate Europe, included representatives from 10 organizations that had previously received funding from the platforms, a conflict of interest that, according to Goyens, “was not made transparent [to some members of the group].” Faced with increased scrutiny from European competition authorities, Google and Facebook have also substantially increased their lobbying in Europe in recent years.
In a statement to ProMarket, Facebook denied the report, saying: “This is a deliberate misrepresentation of a technical discussion about the best way to bring a cross-industry group together to address the issues around false news. We believe real progress has been made through the code of conduct process and we are looking forward to working with the European institutions to implement it.” (Google did not respond to a request for comment.)
Alexios Mantzarlis, a research fellow at TED and the former director of the International Fact-Checking Network (IFCN), himself a member of the EU working group, also disputed Investigate Europe’s characterization of the group’s discussions (“a little less House of Cards and a bit more Veep”), tweeting:
(3) Ultimately the document was hella imperfect but having worked for the UN in intergovernmental negotiations I have a sense of incremental victories that some "experts" who thought this was a high school debate club didn't consider. (They voted for the document too)
— Alexios (@Mantzarlis) May 21, 2019
Nevertheless, Mantzarlis too was critical of the implementation of the EU’s code of practice. Shortly after the code was published, he and several other members of the expert group wrote that the document
“contains no common approach, no clear and meaningful commitments, no measurable objectives or KPIs, hence no possibility to monitor process, and no compliance or enforcement tool; it is by no means self-regulation, and therefore the Platforms, despite their efforts, have not delivered a Code of Practice.”
The code of practice includes a set of commitments, among them greater transparency around political advertising and increased scrutiny of accounts and websites that spread disinformation. When it was published in September, the code was presented as a first-of-its-kind self-regulatory framework to fight disinformation, but EU officials have since criticized the “patchy, opaque and self-selecting” way in which the platforms were implementing it.
With the European elections well underway, it is unclear just how much of an effect disinformation and propaganda campaigns will have on the final results (exit polls in the Netherlands showed that right-wing populist parties fared badly). But the fact that Europe is still “drowning” in disinformation and far-right incitement, even after the platforms have taken far more aggressive measures than in the past, raises the question of whether the disinformation problem can ever be fully addressed without also addressing the issue of market power. It’s a question that a growing number of scholars have discussed in recent years, arguing, as Sally Hubbard did in 2017, that mass disinformation is an antitrust problem. Depending on the results of this week’s elections, it’s a question the EU’s next competition commissioner might also want to take up.