Do social media sites have a First Amendment right to choose which information they publish on their websites?
That’s the question the Supreme Court will address this term when it reviews two laws from Texas and Florida that would force businesses such as Facebook and YouTube to carry certain content that they do not want to feature. Under the guise of “prohibiting censorship,” these laws seek to replace the private entities’ editorial voice with preferences dictated by the government.
The court’s decision will define the public’s experience on the internet: How much control will the government have over public debate? How vigorous will our online conversations be if platforms feel pressured to avoid controversial topics? And when the government mandates what content and which speakers must appear, what art, news, opinion, and communities will we discover on the platforms that are so central to how we communicate?
To enable online speech and access to information, the ACLU has long urged social media companies to exercise great caution when deciding whether and how to remove or manage lawful posts. On the very largest platforms, free expression values are best served if companies choose to preserve as much political speech as possible, including the speech of public figures. But, regardless of what platforms ought to permit as a matter of corporate policy, the government can’t constitutionally mandate those choices.
Moreover, platforms have no choice but to prioritize some content over other content — something always has to come first. They make decisions to remove, demote, or hide lawful content to minimize speech that the business does not want to be associated with, that puts off its consumers or advertisers, or that is of little interest or value to its users. And they don’t all make the same decisions, reflecting their different editorial choices. Facebook, for example, prohibits nudity, while Twitter, now X, allows it.
Motivated by a perception that social media platforms disproportionately silence conservative voices, some states have sought to regulate platforms’ curatorial decisions. The Florida law at issue before the Supreme Court prohibits social media companies from banning political candidates, from limiting the distribution of their posts in any way, or from prioritizing posts by or about them; it also prohibits taking any action to limit the distribution of posts by “journalistic enterprises.” The Texas law bars larger social media platforms from blocking, removing, or demonetizing content based on the users’ views.
The government’s desire to have private speakers distribute more conservative — or for that matter, progressive, liberal, or mainstream — viewpoints is not a permissible basis for regulating the editorial decisions of private platforms. Choosing what not to publish and how to prioritize what is published is protected expression. In deciding what books to release or sell, publishers and booksellers are unquestionably exercising their free speech rights, as are curators of an art exhibit and editors deciding which op-eds to publish in a newspaper. The government can’t make those decisions for them.
This is why in the lower courts’ review of these laws, the ACLU submitted two friend-of-the-court briefs arguing that it is unconstitutional to force social media and other communications platforms to publish unwanted content.
This has long been settled law. For example, in a case called Miami Herald v. Tornillo, the Supreme Court held that a law requiring newspapers that published criticisms of political candidates to also publish any reply by those candidates was unconstitutional. The law had forced private publishers to carry the speech of political candidates, whether they liked it (or agreed with it) or not. As the Supreme Court explained in striking down the law, a government-mandated “right of access inescapably dampens the vigor and limits the variety of public debate.”
The Supreme Court’s established precedent for protecting editorial discretion applies to online platforms as well. Private speech on the internet should receive at least as much First Amendment protection as print newspapers and magazines do. And social media platforms, in combining multifarious voices, exercise their First Amendment rights while also creating the space for the free expression of their users.
These entities shouldn’t be required to publish, and their users shouldn’t be forced to contend with, speech that doesn’t fit the expressive goals of the platform or of the community of users. Nor should platforms be required to avoid certain topics entirely because they don’t want to publish or distribute all viewpoints on those topics. If these laws go into effect, then under the guise of “neutrality” we will be confronted with far more distracting, unwanted, and problematic content when using the internet.
For example, a platform should be able to publish posts about vaccination without having to present the views of a political candidate recommending that people drink bleach to combat COVID-19. Similarly, a platform should be able to welcome posts about anti-racism without having to host speech by neo-Nazis. And a social media site should be able to host speakers questioning the scientific basis for climate change or affirming the existence of God without having to publish contrary viewpoints. If people want any of this material, they can seek it out. But the government cannot force it upon either the platforms or the public that relies on them.
Social media and other online platforms are vital to online speech, enabling us to discuss ideas and share perspectives. Given their significant role, the major platforms should facilitate robust debate by erring on the side of preserving the public’s speech. And if they remove protected content, they should explain why up front and, at a minimum, stick to their own rules. Platforms should also offer opportunities for appeal when they inevitably get things wrong. But the government can’t force platforms to carry the speech or promote the viewpoints it prefers, any more than it could require a bookstore to stock books it did not want to sell.
Ultimately, users should have as much control as possible over what expression they can access. Even if we think the major platforms could be doing a better job, a government-mandated point of view would be a cure worse than the disease.