When Jeff Horwitz published the first story in the Wall Street Journal’s “Facebook Files” series in mid-September, his extraordinary reporting sparked a national conversation about the harms of social media, about the important role that public policy might play in addressing those harms, and about the culture within Facebook that may have led to product decisions many people find concerning.

Yet, as a former member of Facebook’s public policy team, I believe the current debate omits one of the most critical impacts of the Haugen leaks: how they will affect decision-making at Facebook and throughout the tech sector.

Since the 2016 election, Facebook has been plagued by leaks as employees have wrestled with the company’s role in spreading misinformation. The problem has become so rampant that when the company wants to make a public statement, it can do so simply by posting it on an internal message board.

I left Facebook in the fall of 2019. By that point, leaking was already changing the nature of internal decision-making at the company. Exchanges from Mark Zuckerberg’s weekly company Q&A routinely ended up in the news, and more and more press stories included specific details about specific meetings involving specific people. Reporters cultivated sources, offering their contact information on private messaging services like Signal to try to entice employees to share information.  

As the leaking escalated, the company’s open culture started to change. In internal discussions, people would hold back from making comments out of fear that something they said might appear the next day in The Washington Post or BuzzFeed. When executives posted on internal message boards, their comments started to feel more like press releases than honest explorations of difficult issues. Meetings became more choreographed and less spontaneous and exploratory. 

These changes were prudent because they reduced risk, but they were a sad departure from the openness and transparency that had been the hallmark of Facebook’s culture. A more closed, more hierarchical, less transparent work environment made it harder to reach good decisions.

In the wake of Haugen’s leaks, this trend is likely to accelerate, not just at Facebook but throughout the tech sector. Companies learning the lessons of the Haugen disclosures will likely reduce the number of people involved in hard decisions, reduce the volume of research they conduct on hard problems, and change hiring practices to reduce leaking. 

The biggest impact will likely be on the information that is used to make decisions. The most sensational information in the Facebook Papers stemmed from projects Facebook initiated to try to understand some of its greatest risks, ranging from how teens use and experience its products to how its design decisions may influence the spread of misinformation.  

Despite Zuckerberg’s assurances that the company will continue its research program, it seems almost inevitable that Facebook will narrow the scale and scope of the research into its products’ impacts, limit research into controversial topics, and share results much more narrowly within the company. Company researchers will have less autonomy to pursue independent projects, and will be directed toward low-risk projects that are valuable for the company’s bottom line but unlikely to pose a public relations threat.

Other companies in the sector will look at Facebook’s experience and follow a similar path. They will inventory their existing projects and deprecate those that seem too risky. These changes will make it less likely that companies will identify and tackle product problems. You can’t solve a problem if you don’t understand it.


Of course, some might argue that these changes won’t dramatically alter the status quo. Haugen alleges that Facebook didn’t try to solve the problems its internal research identified, and that the company routinely ignored or discontinued controversial research. If that’s the case, then scaling back research will have limited effect.

That view ignores the valuable role that research plays, even when decision-makers give it less weight than a researcher might want. Research identifies problems, quantifies costs and benefits, provides a basis for discussions about tradeoffs, and helps to generate alternative approaches. If tech companies diminish the scope of their research programs to try to minimize public relations risk, decision-making will likely suffer.

The Haugen leaks will also shrink the number of people who participate in decision-making. Haugen testified that she was shocked to learn that more than 60,000 Facebook employees had access to the documents. She accessed a vast amount of information that had nothing to do with her day-to-day responsibilities. 

No longer. Companies will segment information to ensure that access is based on need. Fewer employees will be included in critical decision-making meetings. The kinds of experiences that I cherished early in my Facebook career—being included in senior-level meetings on hard topics with energetic, exploratory debate—will be far rarer in the future.

Companies will also change their hiring practices to limit risk in the decision-making process. They will be more likely to hire institutionalists: people who derive satisfaction from working on change from within an institution, rather than criticizing it from the outside. 

In the past, Facebook has hired prominent critics, hoping to bring more dissenting opinions inside the product review process. But people who arrive at the company with a formidable external reputation may also be more likely to raise grievances externally. Going forward, companies will be less likely to hire people who pose a leaking risk, even though those voices tend to be valuable in bringing new perspectives to hard problems.  

The point here isn’t to debate whether Haugen’s decision to leak Facebook research was right or wrong; she’s not the first to weigh risk and reward in deciding to go public. History is full of controversial disclosures of confidential information by insiders, ranging from Daniel Ellsberg and the Pentagon Papers to Mark Felt as Deep Throat to former Department of Homeland Security staffer Miles Taylor and his anonymous op-ed about the Trump Administration. Ellsberg, Felt, and Taylor all brought important information into the public domain, and their disclosures had a profound impact on public policy and public discourse.

But whether you view Ellsberg as a traitor or a hero, it’s notable that in the 50 years since he leaked the Pentagon Papers to The New York Times, the Pentagon has never produced anything comparable. The type of research in the Pentagon Papers, a deep historical review of a foreign conflict, is unlikely ever to be commissioned again, even though a study of that kind might benefit future decision-making at the Defense Department. That journalists have chosen a similar moniker for the Haugen disclosures, the Facebook Papers, is prescient.

In an ideal decision-making environment, information is shared freely by a diverse group of stakeholders who openly consider costs and benefits. The Haugen leaks will almost certainly take us further from that ideal. Frances Haugen went public to try to promote transparency on important questions about the relationship between technology and society, but the likely result is a tech sector that is less open, with decisions made by fewer people on the basis of less information.

Disclosure: Matt Perault is the director of the Center on Technology Policy at the University of North Carolina at Chapel Hill and a professor of the practice at UNC’s School of Information and Library Science. The Center on Technology Policy receives funding from foundations and the private sector, including Facebook. Perault is also a consultant on technology policy issues; among his clients are firms in the tech industry.
