U.S. Supreme Court Rebuffs Challenge to Tech Platform Immunity
The U.S. Supreme Court declined to reassess the broad legal immunity tech companies have over content hosted on their platforms.
Read the complete article from Reuters, US Supreme Court rebuffs challenge to federal protections for tech platforms.
According to the article, the U.S. Supreme Court recently declined a chance to reassess the broad legal immunity tech companies have over content hosted on their platforms, turning away an appeal in a lawsuit against Grindr by a male plaintiff who was raped at age 15 by adult men matched to him via the gay dating app.
The justices declined to hear the plaintiff's appeal of a lower court's ruling that dismissed his lawsuit seeking monetary damages against Los Angeles-based Grindr, which held that the company was protected from liability by a provision of federal law called Section 230 of the Communications Decency Act.
Section 230 of the Communications Decency Act
Enacted in 1996, Section 230 allows online platforms, including social media sites and online forums, to host user-generated content without being held legally responsible as the "publisher or speaker" of that content. The provision has shielded online platforms such as TikTok and Meta from a wide range of litigation.
The Plaintiff, “John Doe”
The plaintiff in the Grindr case has remained anonymous, being referred to as "John Doe" in court filings. His lawyers argued in a court filing that Section 230 has been a "goldmine for amoral companies who need not invest in providing safe products."
Lawyers for Doe said four adult men raped him on consecutive days in April 2019, when he was a high school student in a small town in the Canadian province of Nova Scotia, after he signed up for Grindr and falsely represented that he was at least 18 years old. Grindr requires users to be over 18 but does not verify the ages of users.
Three of the men were prosecuted in Canada and received multi-year prison terms, while the fourth remains at large, according to court papers.
The plaintiff filed his civil lawsuit in California state court in Los Angeles in 2023, accusing Grindr of negligence and unlawfully failing to warn users about the risks of child sexual abuse, among other claims, as well as defective app design for matching adults and children for illegal sexual activity. The lawsuit sought an unspecified sum of compensatory damages for physical and emotional harm, plus punitive damages of at least $66 million.
Procedural History of the Case
After the case was moved to federal court, the California-based Ninth U.S. Circuit Court of Appeals ruled in February that Section 230 barred Doe's state law claims.
In asking the Supreme Court to hear the case, his lawyers called it an optimal vehicle for addressing whether Section 230 "immunizes apps for their own conduct in marketing and designing defective products."
Proponents of Section 230 have argued that, without it, online services would face potentially crippling legal costs and would be incentivized to censor free expression on the web.
Republican President Donald Trump has been a critic of Section 230 and sought unsuccessfully during his first term in office to end the protections.
The Supreme Court last addressed Section 230 in a set of rulings from 2023. In those cases, the justices declined to chip away at Section 230's scope and rejected lawsuits that sought to hold tech giants including Alphabet liable for terrorism-promoting content on their platforms.
Online platforms have urged courts not to weaken Section 230's protections.
Discussion Questions
- What is an online “platform”?
An online platform is a digital service or website that enables users to interact, share, or transact over the internet. An online platform acts as a virtual space where different participants, such as individuals, businesses, or organizations, can connect for various purposes like communication, commerce, learning, or entertainment.
Examples of social media platforms include Facebook, Instagram, and TikTok, while examples of e-commerce online platforms include Amazon, eBay, and Shopify. Other examples of common types of online platforms include learning platforms like Coursera, streaming platforms such as YouTube, freelance marketplaces like Upwork, communication tools such as Zoom, and cloud services like Google Drive.
- Describe the general concept of legal immunity, as well as the concept of legal immunity in the context of the subject matter of this article.
Legal immunity is an exemption from legal duty or liability, granted by law or authority, which prevents legal action (civil or criminal) from being taken against a person or group in specific circumstances. There are various types of legal immunity:
(a) Sovereign immunity protects governments from being sued without their consent. As an illustration, a plaintiff generally cannot sue the U.S. federal government unless the government allows it (e.g., via the Federal Tort Claims Act).
(b) Diplomatic immunity grants foreign diplomats immunity from most local laws in the host country. This type of immunity is based on international law (e.g., the Vienna Convention on Diplomatic Relations).
(c) Legislative immunity protects lawmakers (e.g., members of Congress) from being sued or prosecuted for actions taken in their official legislative role.
(d) Qualified immunity shields government officials (e.g., police officers) from civil liability unless they violated “clearly established” constitutional rights.
(e) Witness immunity is offered to individuals in exchange for testimony, protecting them from prosecution based on that testimony. Witness immunity can be either transaction immunity, which is full protection, or use immunity, which only protects against the use of testimony, not other evidence.
In the context of this article, Section 230 of the Communications Decency Act is a foundational law for the internet that provides broad legal immunity to online platforms for content created by users, not by the platform itself.
According to the language of Section 230:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
This means that websites, apps, or platforms cannot be held legally responsible for most of the content their users post.
- In your reasoned opinion, should tech companies have legal immunity in terms of the content posted on their social media platforms? Why or why not?
This is an opinion question, so student responses will likely vary.
Your author supports providing tech companies legal immunity in terms of the content posted on their social media platforms. Such immunity allows the internet to function efficiently and effectively, enabling online platforms to host billions of posts, reviews, videos, and messages without being sued for such content. Immunity gives online platforms the flexibility to moderate harmful content without being treated as publishers. Without it, many websites would either shut down or drastically limit user interactions to avoid legal risks.
Even with Section 230 immunity for online platforms, an injured party (e.g., someone subject to online defamation) still has a remedy against the tortfeasor who posted the defamatory content.
One must also be mindful of the fact that there are exceptions to the legal immunity provided by Section 230 of the Communications Decency Act.
Section 230 does not provide immunity for federal criminal liability for child pornography or terrorism. It does not provide immunity for intellectual property violations such as copyright infringement. Finally, Section 230 does not provide immunity for content the platform itself creates or co-creates; in other words, if the platform is actively involved in developing illegal content, it may lose immunity.