The following is the second part of the transcript of Federal Trade Commission Chairman Andrew Ferguson’s keynote at the 2025 Stigler Center Antitrust and Competition Conference. Part II includes Ferguson’s interview with University of Chicago law professor Eric Posner and the subsequent audience Q&A. You can read Part I, Ferguson’s speech, here.
Eric Posner
Thank you very much, Chairman Ferguson.
Your thesis is admirable in two ways. One is that it’s so responsive to the topic of the conference, so I’m sure people appreciate that. And then the other is that it’s extremely clear and crisp.
Let me summarize it like this. First, consumers want a free exchange of ideas. They go onto social media platforms in search of them. Second, the platforms (and I want to ask you about this) don’t actually do that, or haven’t done that. Instead, they’ve engaged in what you call censorious conduct, or content moderation, or they limit speech, however you want to put it. And therefore you’re arguing that—I guess maybe presumptively, or you’re claiming just as a matter of fact—the platforms aren’t actually engaged in competition.
So, on the first: Is it true that consumers are actually, or users are actually going onto social media platforms seeking the free exchange of ideas? You mentioned that some go on to, like, have fun, to troll other people, maybe to torment or bully people. It also seems to be the case that people don’t often want to actually hear the ideas of people who differ with them a great deal. So, what’s the basis of your view? It sounds like an idealized view of how people behave, rather than the reality.
Federal Trade Commission Chairman Andrew Ferguson
I do think it’s the reality.
Even people who are pretty strictly aligned with one side or the other do derive pretty obvious pleasure from engaging—in good or bad faith, it depends—with ideas from the other side. I mean, I meant what I said: You can’t troll unless there are people around that you disagree with. So, even someone who goes on just to, you know, bully, I’m not going to say harass, but even if someone is going on just because they want to pick on people they disagree with, you have to have access to those ideas, number one.
Number two, I think the reaction to the censorship epidemic of 2020, the public reaction, revealed a strong preference in favor of having access to ideas that you agree with, that you disagree with, or that you’re not sure about. I think the success of X after it was purchased by someone who had a relatively open commitment to free speech, especially after everyone promised it was going to fail when it was originally purchased, is pretty good evidence that there’s a preference for this.
Now, it’s also obvious that consumers don’t want no-holds-barred access to everything, including categorically illegal stuff. It’s definitely true that people like receiving some content more than others.
But I think it’s equally true that the average consumer does not want a platform where particular ideas are just categorically excluded. If they did, you would expect there to be platforms where the curation was aimed exclusively and expressly at just one side or the other. I guess Bluesky is a potential example, which has not exactly taken off.
I also think a lot of consumers began engaging with these social media platforms when they were quite openly touting their commitment to free speech and a free exchange of ideas, and then over time that commitment somewhat eroded. One of the things that the FTC, in this initial phase of the investigation, asked consumers to weigh in on was: What was your understanding of the terms of service for these platforms when you signed up on them; and then, was your being kicked off of these platforms inconsistent with their terms of service? A lot of these platforms had—often very flowery—commitments to free exchange of ideas in their terms of service, and a lot of consumers think that they didn’t live up to them. President Trump, in 2020, actually issued an executive order asking the FTC to look into this. The FTC did not, but that’s one of the things that we’re examining now.
Eric Posner
Right, and I do want to ask you about that.
But first, a number of people have complained that Musk has kicked them off X, or deplatformed them or whatever the term is. And then, I’m not on X myself, but I gather that you can’t really avoid Musk’s, um, tweets or x’s or whatever they’re called. So, he starts off with, I think you’re right, a commitment to freedom of expression. He loosens the moderation rules. But what should we think about what’s happening now? Is this evidence that there’s not competition in this type of social media, the fact that he’s deplatforming people?
Andrew Ferguson
I don’t know, as an empirical matter, whether he’s deplatforming people. I simply don’t know.
I’m sure some people—as is true of every platform—are removed for violating terms of service. I also think it’s very obvious that people are allowed to say and express ideas on X that you are not allowed to on other social media platforms. Facebook, for example, still has a vaccine misinformation policy. In terms of avoiding Musk’s tweets, it might just be my algorithm that gives me ready access to them. He’s also one of the world’s most famous men, and is an important leader in the most important government on earth. I don’t think it’s terribly surprising that the algorithm would be putting his ideas in front of people. I don’t think that suggests a competition problem, nor do I think the removal of any particular user does.
Insofar as the argument is that there wasn’t widespread idea-specific censorship, especially in 2020 and 2021, you’re just asking me not to believe my lying eyes. There quite obviously was.
Eric Posner
I want to come back to this point about exchange of information, and something one of the speakers earlier today reminded us: network effects.
You would think that, if people want to exchange ideas, be exposed to new ideas, you’d want as big a platform as possible. And that seems to lead in the direction of monopoly.
How do you reconcile your view that the best use of a platform is the exchange of ideas among as many people as possible with your desire, as well, to have competition in the social media sector?
Andrew Ferguson
I do not think that a particular platform’s propensity to censorship is categorically an indication of monopoly power, or is categorically an abuse of market power. Honestly, the social media problem that concerned me more, before the purchase of X, was the sort of eerie similarity of the censorship policies across all of these platforms, including the almost identical, coterminous decisions to eject Parler from the online world entirely at the exact same time, and to get President Trump off of all the social media platforms within a couple of hours of each other. The risk of collusion—which is made easier by an absence of competition—is what concerns me more, and I’ve written about this.
I don’t think you said this, but if you did, just to clarify: The mere fact that a platform conducts censorship is not itself an indication of market power. The thing that concerned, I think, a lot of Americans in 2020 and 2021 was that all these platforms seemed to basically move in lockstep on a variety of social issues. The reality of this became even clearer once we had discovery in Murthy against Missouri. They all were adopting the same policies, often in consultation with the government or with, you know, lists of experts, and they all were using the same experts. What concerns me the most is the idea of a handful of market participants just agreeing on what contours their product will have. And when your product is the exchange of ideas, that’s extremely dangerous.
Eric Posner
Right, right, and I did want to ask you about that. But, you quickly went from the… And as an antitrust lawyer, I’m sure you’re thinking about this.
You quickly went from the fact that they were all doing the same thing at the same time, to the possibility that they were agreeing with each other to do the same thing at the same time. And, as you know better than anybody else here, or as well as anybody else here, parallelism occurs all the time in markets. It can be prices, it can be quality, it can be all the rest. It can be all kinds of things.
To use one of the examples you gave: pushing Trump off the platforms. This was immediately in response to the election controversy.
Andrew Ferguson
Maybe.
Eric Posner
Well, I just mean as a temporal matter.
Andrew Ferguson
Right. It happened after. It does not follow that it was a response. It is entirely possible that they talked to each other about kicking off the world’s most famous man. And that’s the issue I’m concerned about.
If there are only a handful, that sort of collusion is very easy. Or if there’s a willing partner in the government to set, and to communicate amongst the various platforms, what the rules are going to be. It’s very easy to do if there’s only a small handful of them.
Eric Posner
So, you’re worried that they might have entered into an agreement, but you don’t have any evidence.
Andrew Ferguson
No. Nor did I say that. My contention is—same with the ejection of Parler—these decisions being made coterminously could just be conscious or unconscious parallelism. They could also be the result of collusion. Collusion is made easier if there are fewer participants. That is the concern that I’m discussing.
Eric Posner
Okay.
You also mentioned just now your concern that the government during the Biden administration pressured the social media platforms to, well, censor (as you would put it) or limit what people were saying about covid. Now, that doesn’t sound like an antitrust problem.
Do you think that’s an antitrust problem, if the government says to these platforms, Look, you’re allowing speech on your platforms that is causing people not to get vaccinated, with the result that this epidemic is going to get much worse?
Is that an antitrust problem? Or is it a different kind of problem? Or is it not a problem at all?
Andrew Ferguson
I think it’s a problem. It isn’t necessarily an antitrust problem.
Murthy itself revealed two things going on. There was pressure applied. The court said it wasn’t coercive for First Amendment purposes, but there was definitely pressure, consultation, collusion between the platforms and the government.
There also were repeated references in the record to all the platforms consulting with private outside experts on what the rules should be. That is potentially an antitrust problem, if there’s an agreement among them that, “We’re going to use these experts to help us set the rules.”
Even if the sort of collusion, pressure from the government, isn’t an antitrust problem, it is a problem for a civic society. And it is a problem made easier if there are fewer market participants.
If there’s a wider smattering of market participants, it’s harder for the government to get on the horn with everyone and pressure them equally. It’s easier basically to buck the government because there are more people. It’s harder for the government to apply pressure across the board. If there’s just a handful of these companies that the government has to pressure, it’s just easier to pull that off.
Even if that’s not an antitrust problem directly—like, that conduct isn’t an antitrust problem directly because of Noerr-Pennington or Interstate Act or whatever—I don’t think it follows that we throw up our hands and say, oh well, antitrust laws can’t address this directly.
That is a problem, and it is a problem made easier by concentration in the market.
Eric Posner
I read the Murthy opinion, and I guess I just had a different impression, so I’m curious. Maybe I missed… I read Alito’s concurrence, which sort of lays out the strongest case.
I got the sense what was going on is, you have a bunch of government officials—probably panicking—talking to Facebook and a bunch of these social media platforms and saying: Look, there’s a pandemic going on. People are dying. There’s this misinformation being circulated, which we think is causing some people not to get vaccinated, which of course will harm other people, you know, through contagion. So, Facebook, please stop allowing people to share this information. It’s causing as much harm or more harm than other types of policies that you have—against bullying or harassment or what have you.
Now, that just sounds to me like the government very strongly trying to encourage people in the private sector to act in a way that would advance the public good. Is that pressure? Is that unacceptable pressure in your view, for the government to say to these independent institutions, “We want you to do that”?
Or is there something else going on?
Andrew Ferguson
I think it depends on whose version you believe.
I don’t know if you read the District Court opinion. The District Court’s view was, this wasn’t just, “Hey, do us a solid,” or, “Do the country a solid.” It was like, “Do it, or else.”
Eric Posner
Yeah. What is the or else? That’s what I didn’t get. Like, what was the implicit threat, however you saw it?
Andrew Ferguson
I think it’s the same threat that a government can always potentially inflict on any marketplace participant, which is, “We can make your life difficult. The regulators can show up, they can audit, they can investigate, they can cost you a lot of money, and the path of least resistance is: Do what we say.”
This is not dissimilar to the issue the Supreme Court confronted with the New York financial regulators calling up people and saying, “Quit doing business with the NRA, okay? Or else.”
The or else is, “We have a tremendous array of investigative tools. Those tools are expensive when applied to you even if we don’t win at the end of the day, so knuckle under.”
The court’s conclusion for state action purposes was, it wasn’t sufficient to transform the social media companies into arms of the state. I just don’t think it follows from that sort of doctrinal nicety that the problem isn’t government officials backed with potential coercive power—formal and informal—calling up social media companies and saying, “The following ideas are to be proscribed. Proscribe them.”
Having worked in the government a fair amount, I think that your view is either very charitable, or maybe a little naïve, about what happens on a phone call between a private company and its regulator.
Eric Posner
So, the government shouldn’t call up private institutions and tell them, “Do that, stop saying this, or we’re going to punish you”?
Andrew Ferguson
I do generally think the government should not threaten private people with punishment because of things they’re saying.
Eric Posner
Okay.
Andrew Ferguson
Unless they are criminal, yeah.
Eric Posner
All right, good to know.
[laughter]
Let’s talk about the advertiser boycott issue, also.
You’ve said before that you’re worried about… You think that advertisers colluded, or might have colluded, to boycott social media platforms that displayed speech that they didn’t like. Could you tell us more about that?
Andrew Ferguson
I definitely did not say they did collude. I said…
Eric Posner
They might have, yes…
Andrew Ferguson
…that this risk is real and needs to be confronted and taken seriously.
Part of the way that people who have ideas get them out is, they make a living by propagating ideas. You are one of them. You get this incredibly protected job at the University of Chicago to, sort of, say whatever you want.
Other people participating in the marketplace of ideas can say what they want, as long as they can get the eyeballs to pay attention to them. And part of the way that you make sure you can keep getting the eyeballs is advertising. This isn’t just true of social media platforms. This is true for a huge array of individual speakers who have views, both acceptable to the elites and dissident. Bloggers, YouTubers, people on X. All over the place. Part of the way that they get their ideas out there, and they can afford to keep getting the ideas out there, is if they can attract advertisers.
If advertisers get into a back room and agree, “We aren’t going to put our stuff next to this guy or woman or his or her ideas,” then that is a form of concerted refusal to deal. The antitrust laws condemn concerted refusals to deal. Now of course, because of the First Amendment, we don’t have a categorical antitrust prohibition on boycotts. The courts have not been super clear about when a boycott ceases to be economic for purposes of the antitrust laws and becomes purely First Amendment activity. It’s sort of a “we know it when we see it” type of thing.
The concern I raised in that concurrence—and it is one that concerns me deeply—is, if advertisers either get in a room together and say, “We’re not going to do advertising next to this idea,” or they say, “We’re going to agree that this third party decides which ideas get advertisement and which don’t; we’re just going to let them do it,” then we are going to dry up the idea, because we are drying up the ability of the person who has the idea to make money off of that idea.
Unless those people can get tenure, they need to keep attracting eyeballs because that’s how they make a living. Drying up the advertising will dry up the idea. So, the risk of an advertiser boycott is a pretty serious risk to the free exchange of ideas.
Eric Posner
Just parenthetically: Having tenure does not guarantee you any eyeballs, I’m afraid. We write for a very small…
[laughter]
Andrew Ferguson
That’s right.
Eric Posner
Just a few pairs of eyeballs…
Andrew Ferguson
Although, having gone to law school, I know they pay a lot to see and hear you, so…
Eric Posner
Yeah, well that’s true. Again, though, I just want to press you on this a bit.
We see parallel conduct. We see a bunch of advertisers who are worried about their brand. And, as rational economic actors, they don’t want their advertisements to be on a platform that people associate with stuff they abhor, whether on the left or on the right. So, just seeing them pulling their ads from platforms that host offensive speech in whatever way, you know, it could be just speech that the advertiser itself doesn’t like. That’s not a problem. It’s only if they get in the back room.
But when we talk about advertisers, we’re talking about thousands of companies around the country. Or around the world. There’s no back room that they could all fit in together. Is it plausible at all that they engage in collusion?
You mentioned also that they consulted with third parties. It would seem to me this is just business as usual. There are third parties out there that study brand management and how people react, how the affiliation, association of an advertisement with a particular type of speech might affect business. What’s the basis for actually trying to investigate in this kind of setting?
Andrew Ferguson
Can they all get into a back room and agree? Having represented trade associations, the answer is yes.
Have they? I don’t know. No one’s looked at it. At least, no one has publicly looked at it. But is this a possibility? Of course, number one.
Number two, there are lots of advertisers, but they tend to use a small handful of advertising agencies to make a lot of these decisions for them. There also are a lot of advertisers, but there aren’t that many huge advertisers. And a decision by a bunch of huge advertisers to cut off access could have massive downstream economic effects on speakers who need access to advertising.
Again, you know, I don’t think you’re saying this. The fact that there is coterminous identical conduct—this is true across all of antitrust law—does not mean there’s an antitrust violation. Where there is smoke, there is not always fire. But there might be. And the whole point of having the antitrust enforcement agencies is, when you see smoke, at least take a look.
For example, one of the most prominent, you can call them brand safety managers if you want, was GARM. They confronted a single antitrust lawsuit from X and dissolved almost immediately. That could have been for a number of reasons; it doesn’t follow necessarily that they were committing the antitrust violations that they were accused of. But I do think that this risk is serious enough and, if it got really bad, would dry up enough access to ideas that it is incumbent upon the antitrust enforcers to take it seriously and, if they think there’s something there, to look at it.
Eric Posner
Okay, I have one more question (and then we’re going to take questions from the audience, so please get your questions ready).
In your remarks, you mentioned this kind of ideal of people exchanging ideas on platforms. Basically, you said you don’t like censorship, but you’ve also acknowledged that there needs to be some kind of regulation; or am I wrong about that? This strikes me as an important question because of your willingness to infer, from limitations on speech, that there might be a competitive problem.
At what point do we become suspicious of a social media platform? You’ve said, I think, that they could, like, block illegal behavior, like child pornography or something like that. But if they, you know, genuinely think that their users are offended by Holocaust denial or conspiracy theories or what have you, what do you… Are you saying that if they implement any kinds of limitations, that’s bad for society? It’s bad for the marketplace of ideas? And it suggests that they must be driven by anti-competitive goals?
Andrew Ferguson
No, I do not think that the imposition of any limitations is bad for society.
Eric Posner
So, I just want to ask you, what are those limitations?
Andrew Ferguson
I think it can vary from platform to platform. And I think, frankly, in a good competitive environment, we would expect some variation from platform to platform. Nor do I think that just the act of content moderation is itself evidence of a competitive problem.
But if any company is doing stuff that quite actively drives huge swaths of its consumers off its platform, and doesn’t seem to suffer competitive consequences—ad revenue doesn’t come down, what have you—that is at least a suggestion that there might be a competitive problem. The FTC, and the government generally… Actually no, not generally. The entire government is not the censorship police because it would be very difficult for the government to be the censorship police without winding up doing a lot of its own censoring.
But I do think that if there are businesses who are doing stuff that drives either a huge swath of their users, or those users’ preferences, off of their product, and nothing seems to happen to them—they don’t suffer any competitive consequences, they get bigger, they get more profitable—that is at least evidence that competition is not thriving in the space. But it can’t be—and it isn’t—that every time a company decides to censor, or content moderate, whatever you want to call it, the antitrust enforcers need to show up. I just disagree with what seemed to be the long-held position by many, which is that this isn’t evidence of a potential antitrust problem at all. I think it is.
Eric Posner
And just a quick follow-up. Was it actually the case, when Facebook and other social media platforms increased the limitations on speech, that lots of people left those platforms?
Andrew Ferguson
People left. It also got… Use of these platforms became super politicized, which is exactly what a lot of these platforms said they were trying to avoid. Facebook acknowledged in January of this year that it had been downgrading all political content, had been trying to keep it—across the board—suppressed. There’s evidence that that isn’t actually what was happening, but it’s sort of neither here nor there. It became this extremely political product at the same time.
They did not lose many users, even as the public impression and public polling about favorability of these platforms plummeted in 2020 and 2021. That doesn’t necessarily demonstrate… That isn’t the proof in the pudding. It doesn’t make the case. But when you are enforcing antitrust laws, you don’t show up at an antitrust problem when the case is fully baked and you just go. You’re constantly looking for smoke to see if there’s fire. And in my view, this is absolutely smoke, at least in some circumstances.
Eric Posner
Okay, questions. Yes, over there.
Audience Member 1 (Luigi Zingales)
Thank you, chairman, for your speech.
I’m very fascinated by your approach to collusion, or the potential risk of collusion. That seems to be quite novel, and I really endorse it. If there is smoke, take a look! Last year at this conference, I was asking people: There seems to be coordination in not publishing information about the famous Hunter Biden laptop; why is there not antitrust enforcement on this issue? Because that’s really a serious issue.
I hope that you apply the same rule across the board. There are a lot of potential risks of collusion, a lot of cases where there is smoke. If you take a look at all those cases, these would be the most interesting four years ahead of us.
Let me give you a suggestion, for example. In a lot of industries, not only are there trade associations, but there are also consultants. They play a role in giving the same suggestion to everybody. A colleague at a different university told me that he was, in a former life, a consultant. He went to a bank and suggested that this regional bank should increase its fees. Then what the regional bank did is recommend him to all its competitors. So he went to all the competitors, and they all increased their fees.
Unlike in the case of newspapers, where we maybe see the evidence, in the case of consultants, we don’t—because they don’t actually publish who they consult for. So I hope you look at the smoke, ask all the consultants to publish a list of who they consult for, and look into that.
Andrew Ferguson
I didn’t hear a question, but I can comment nonetheless. Two things.
My view of the antitrust agencies is that they’re cops on the beat. You’re supposed to walk around the markets, and if you see something that looks like trouble, you take a look. You don’t prejudge, but you take a look.
On the collusion thing, obviously, collusion under the antitrust laws requires some agreement on something. Trade associations play a valuable role, but there’s a pretty long history of trade associations acting as third-party facilitators of horizontal conspiracies. I think that there is a risk that that could be taking place in the advertising markets. It’s definitely not the only market where there’s a possibility of that, and I’m always interested in looking for the smoke.
To be clear: Sitting in Washington, or even with our regional offices scattered throughout the country (we’ve got one here in Chicago that I just came from), we can’t see it all. So, part of the role of people like you is to alert the agencies when something’s going on. We get complaints—of varying levels of politeness—about things happening in the market, including from some people sitting in the room right now. And they are often very helpful, because it won’t always be obvious to us, looking where we’re looking, what’s going on. So this is helpful.
Eric Posner
Yeah, Matt?
Audience Member 2 (Matt Stoller)
Thanks for the speech and the questions. I guess I have a question about your broad observation that consolidation can lead to certain forms of coordinated speech restrictions.
One of the things that I’ve heard a lot of when I go out to LA is, there’s a lot of fear of talking about China—particularly because of vertically integrated streaming systems that are highly consolidated.
The government actually put out reports in the mid-2010s, and you hear it all the time. You haven’t seen a Chinese villain in movies for a long time. There are a number of reasons for this.
How would you approach looking at a problem of consolidated, vertically integrated speech platforms (for lack of a better word) outside of the sort of partisan approach? Looking at something like Chinese control over Hollywood, which is more of a national security question?
Andrew Ferguson
What I said at my confirmation: It is not the role of antitrust law, or of antitrust enforcers, or frankly of government to ensure some particular quantum of competitors in a market. Otherwise, we’re going to end up with central planning, which is going to be worse than whatever we have now.
But I do think that we ought to care about consolidation, independent of the particular doctrines of antitrust law, because it is much easier for government to control what you think, believe, and do, and what I think, believe, and do if there are a limited number of suppliers above us. It’s just much easier for them to coerce a couple people than it is a bunch of people. It just categorically is, and that’s true of any government. I think vigorous antitrust enforcement that protects against consolidation and leaves a variety of suppliers makes it harder for the government to coerce you and me downstream by coercing the people we rely on for stuff.
How would I think about the particular problem of vertically integrated movie studios and streaming services and China? I don’t have a particular answer on that, just because I think there are potentially new fronts on which to think about antitrust law and our competitiveness vis-à-vis foreign rivals. That isn’t generally how antitrust law has been constructed, although I do think there are interesting antitrust theories about whether a merger that would give a foreign government more control over a particular industry should matter to the merger analysis. I’m open to that prospect. It isn’t traditionally what antitrust law has said, but that’s at least an interesting line of inquiry.
My view is, antitrust law does not… Its job is not just to be anti-consolidation and make sure there’s a certain number of competitors. If you’ve got vigorous antitrust enforcement, you can resist consolidation, and therefore resist the coercion of the individual or of the family downstream that comes from coercing the supplier. That’s the thing that makes me very nervous about a limited number of suppliers of anything. I mean, de-banking has become this catastrophic problem in this country, where people are being denied access to retail banking services on the basis of things they say or believe, or their affiliations. If you’ve only got a small number of very large banks, and the government has a particular view on whether someone’s idea should be permitted in the marketplace, it’s a lot easier to drive that idea out of the marketplace if all you have to do is have the banking regulator call a couple banks and be like, “We don’t like people who have these views banking with you.” Which is almost exactly what happened in the NRA against Vullo case, with New York’s banking regulator just calling financial institutions and being like, “You do business with the NRA, I’m coming for you.” That is an easier maneuver to pull if there are fewer people involved in the market. It’s a lot harder if there are a wide variety of participants.
That is one of the reasons that I think, even for libertarians on my side of the aisle who are focused more or mostly on the relationship between government and the citizen, you ought to care about antitrust law because the government has a hard time directly coercing 350 million people but a lot easier time coercing six or seven suppliers that all 350 million of us have to interact with.
Eric Posner
Yes, back there.
Audience Member 3
On the topic of speech, I appreciate your remarks.
Other members of your administration have broadly suggested that there hasn’t been enough crackdown on speech—particularly at protests on campuses, and other forms of speech at college campuses. Now, obviously, you’re an independent agency. You’re free to express your own views.
I’m curious whether we can take it that you’re with the university leaders, like Princeton’s Eisgruber, who think it’s important to protect freedom of expression of all points of view—no matter what—at places like college campuses?
Andrew Ferguson
I’m not sure exactly what the question was.
Audience Member 3
The question is whether you believe the freedom of speech principle extends to the rights of protests on campuses, and therefore you stand with universities and against other parts of the administration, who have threatened universities and suggested they should crack down on speech.
Andrew Ferguson
I am not aware of universities suggesting that there be a crackdown on speech. I am aware of the suggestion that universities should not make it so that students fear for their physical safety when they’re crossing campus on the basis of their religion, and I definitely agree with that.
I think that the value of free speech is fundamental to this society, but the right to speak does not include the right to intimidate on the basis of someone’s religion, to harass, whatever you want to call it.
I am quite firmly of the view that I’m a member of the most free speech administration of my lifetime. And I’m extremely proud of that fact.
Eric Posner
Yes, right over here.
Audience Member 4 (Barry Lynn)
Hi, this is Barry Lynn. I agree with a lot of what you said. Our organization, a few years back, just as a for instance, condemned Facebook for knocking Alex Jones off the platform, may he burn in hell. We condemned AWS for knocking Parler off. But I have a couple of quick questions.
So, amplification by the platforms. Because all amplification means is that other people’s voices are being obviously put down, in the order in which they’re being presented to an individual. Is amplification a problem? Is it a form of censorship?
And then the second thing is, you mentioned the Biden administration putting pressure on the platforms during the pandemic. Last August, then-former President Trump threatened—in writing—Mark Zuckerberg with life in prison unless he did the right thing. Was that a wrong thing for Donald Trump to do when he was running for office?
Andrew Ferguson
Is amplification a form of censorship? No. I think throwing someone off of a platform because of the view they express is a form of censorship. Also, you know, not the greatest example of civic discourse, to just shout out someone burning in hell. I don’t love that. I agree with you that they shouldn’t be throwing off people just on the basis of the views that they articulate.
Audience Member 4 (Barry Lynn)
The second thing, about President Trump?
Andrew Ferguson
I don’t remember him saying that, so I don’t have any comment on that.
Eric Posner
Over here?
Audience Member 5 (Maciej Bernatt)
I have a question.
When the agency is considering entering into new areas—and I think free speech would be one in the US—then potentially the institutional guarantees matter.
So I wanted to ask you about your view on the independence of competition agencies in the context of actions such as the ones you discussed, where potentially there might be smoke.
Andrew Ferguson
Sure. I’ve made my view on this extremely clear, a lot.
I think that the idea of independent organs of executive power is very difficult to square with democracy. When we vote for the president, we’re voting for the person who’s going to be in charge of the entire executive branch. I guarantee that almost no one in this room would want me to be able to wield executive power free of political accountability. I guarantee no one in this room would want that. And I don’t think, in a democratic society, any of us should want very powerful agencies like the FTC to be insulated from political accountability.
My view is that any agency that wields substantial executive power in the name of the American people has to be answerable to the American people. And the way that the executive branch in this country, under our form of government, is answerable to the American people is through the election of the president.
And so my view, which I’ve said over and over, is that the president has the constitutional authority to remove commissioners of the FTC, including me. And I am extremely confident that when this gets to the Supreme Court—which it is, I guess, there right now—that the president’s decisions will be vindicated as a question of constitutional law.
Eric Posner
I think we have time for one more question. Yes, right over there.
Audience Member 6
Thank you. Andrew, can I ask you a question?
Andrew Ferguson
Sure.
Audience Member 6
As FTC commissioner responsible for protecting consumers—and especially vulnerable populations like children—from unfair practices, do you believe that it’s in the public interest or in the interest of children themselves for our kids to be on social media platforms designed to capture their attention and monetize their data?
Andrew Ferguson
That is a good question. This is the subject of a roaring debate. The commission is going to host a full-day workshop on this question, where we’re going to bring in scholars and advocates in a series of panels and speeches and debates to address this question.
I think it has become very obvious that the relationship between children and social media is not a healthy one. It’s not healthy for them; it’s not healthy for the country. There are very good arguments that advertising targeted toward children is bad in every circumstance. Congress is going to have to make decisions on this. But I do think that we are headed towards a reckoning on the relationship between children and social media, and the effect that social media has on forming brains, with a product that is designed to get a dopamine drip going over and over and over again.
States are starting to address this. Congress has started to address this. Even under current laws, there are parts of this that government can address. The laws that I enforce were written in 1914 and 1938. The problem we’re confronting is quite new, so we need to be very careful about how we apply these old laws to new circumstances and not just blunder in, sight unseen, to address what is a real problem. But I do think it’s a real problem. Clare Morell wrote The Tech Exit. I think it is phenomenal; it’s also terrifying. She’ll be one of the participants in our workshop. But this is an issue that keeps me up at night. The relationship between children and social media. The difficulty that every parent in America has, regulating their child’s interactions with social media.
My own view—stated at a relatively high level of abstraction—is that the most important thing the government can do is interpose parents back between children and social media. That’s the healthiest response, rather than the government interposing itself directly between children and social media. But parents need help. Children get to interact with social media outside of their parents’ presence almost constantly. Screens are everywhere in our schools, whether you like it or not. Even if you are a family that has cut off your kids’ access to screens, their friends’ parents probably have not, and they’ll have access to it there.
Government needs to do something about this. I think the main thing it needs to do is put parents back in charge of their kids and make it easier for parents to interpose themselves between their children and social media.
Eric Posner
Thank you. Well, I’m afraid we’re out of time, sorry. But please join me in thanking Chairman Ferguson for joining us.
[applause]
Articles represent the opinions of their writers, not necessarily those of the University of Chicago, the Booth School of Business, or its faculty.