The case of Moody v. NetChoice1 has become a focal point in the debate over freedom of expression in the digital age. In a context where social media platforms are essential for disseminating information and shaping public discourse, the crucial question arises: Do these platforms possess rights protected under the First Amendment of the United States Constitution? This issue is fundamental to the future of internet regulation and the safeguarding of civil liberties, as the First Amendment protects, among other freedoms, those of speech and press, guaranteeing the right to express oneself and share information without government interference.
On July 1, 2024, the United States Supreme Court ruled on the cases “Moody v. NetChoice” and “NetChoice v. Paxton.” These arose from laws enacted in Florida and Texas that sought to limit the ability of major social media platforms to moderate user-generated content. These state laws aimed to prevent the platforms from removing or restricting content, arguing that such actions constituted censorship and violated individuals’ freedom of expression.
This case raises several important legal questions. First, it debates the nature of social media platforms, and the ruling employs various analogies to clarify it. It references the classification of “common carriers”: public services with neutrality obligations, like telephone or postal companies, that must allow the transmission of all types of content regardless of its nature. It also alludes, almost sarcastically, to a shopping mall where a variety of content is offered and users freely consume what they wish, guided mainly by market forces, without substantial filtering obligations. Lastly, it considers whether these platforms parallel publishers, who hold editorial discretion rights allowing them to moderate content according to their internal policies.
This distinction is crucial, as it determines the degree of control platforms can exercise over content and how First Amendment protections apply.
At this point, the scope of the First Amendment concerning content moderation decisions by private companies comes into question. Traditionally, the First Amendment protects against governmental restrictions on freedom of expression, but there is intense debate over whether this protection extends to the actions of private entities in digital spaces, which significantly influence public discourse.2 In other words, should entities with semi-public power to influence public opinion have the freedom to moderate content without restrictions?
Another essential aspect is finding the appropriate balance between platform self-regulation and state intervention. Excessive government regulation could limit platforms’ ability to combat harmful content like misinformation and hate speech. However, a lack of regulation could allow discriminatory practices or unjustified censorship, negatively affecting the free exchange of ideas and pluralism.
There are also duties of transparency in exercising content moderation, a persistent challenge for platforms’ internal enforcement. Even platforms with independent review bodies, such as the Meta Oversight Board, cannot always provide information and traceability about the decisions made regarding content curation and organization. The issue of transparency warrants an entirely separate article, one that would also tackle why the Eleventh Circuit and the Solicitor General disagreed on this specific point of the general ruling.
In essence, the cases “Moody v. NetChoice” and “NetChoice v. Paxton” bring to the forefront deep legal, social, and political debates about how information rights and freedom of expression should be exercised and protected in the 21st century.
Is it necessary to implement measures that limit the power of platforms to protect social interests? Should the digital public space be regulated? Are platforms’ internal terms and conditions a good foundation for democratic freedom?
The Supreme Court, upon reviewing these cases, reaffirmed that platforms’ content moderation is protected by the First Amendment as a form of “editorial discretion.”
The Court distinguished platforms from traditional common carriers, noting that unlike phone companies or postal services, social media platforms actively curate and organize content3, which the Court views as a form of protected speech. This stance means that attempts to regulate social media platforms as common carriers based solely on their market dominance face significant constitutional hurdles, as do attempts to treat them as mere neutral marketplaces of ideas.
That said, and although platforms are considered private online publishers, their exact nature remains unresolved—a pending issue for academics, policymakers, and third-sector institutions.
Furthermore, the Court vacated the lower courts’ judgments and remanded the cases, leaving it to the lower courts to make the final determination on the constitutionality of the Florida and Texas laws. By not issuing a definitive ruling on these laws, the Court keeps alive the legal and constitutional discussion about the nature of platforms and the degree of obligations the state may impose on them.
Therefore, many questions arise from these recent cases, opening the door to a global debate of great magnitude. By reinforcing the constitutional protection of platforms’ content moderation decisions, the ruling sets a legal precedent, influential well beyond the United States, that could limit future state attempts to regulate online content. Later, we will evaluate how Europe is regulating online content moderation and its repercussions for private entities.
In conclusion, the case initiated by NetChoice represents a milestone in the field of international digital constitutionalism and raises fundamental questions about the rights and responsibilities of social media platforms. It is imperative to address these issues carefully, considering both the protection of individual liberties and the collective well-being—legal interests whose optimization is not simple and requires collaboration between authorities and platforms. This collaboration aims to ensure a digital environment that respects fundamental rights and contributes to the development of an informed and democratic society.
1 Moody v. NetChoice, LLC, 603 U.S. _ (2024).
2 Paul Berman, Cyberspace and the State Action Debate: The Cultural Value of Applying Constitutional Norms to “Private” Regulation, 71 U. Colo. L. Rev. 1263 (2000).
3 Christopher S. Yoo, The First Amendment, Common Carriers, and Public Accommodations: Net Neutrality, Digital Platforms, and Privacy, 1 J. Free Speech L. 463 (2021).