According to the article, Meta, YouTube, TikTok and Snapchat know exactly how addictive their platforms can be to teens, yet they continue to target teen users.

Internal Documents

Those are allegations a group of school districts is making in a lawsuit against the social media giants, according to a newly unsealed legal filing that quotes the companies’ own internal documents.

“IG (Instagram) is a drug … we’re basically pushers,” Meta researchers said in an internal chat, according to the filing.

An internal TikTok report noted that “minors do not have executive mental function to control their screen time.”

Snapchat executives once acknowledged that users who “have the Snapchat addiction have no room for anything else. Snap dominates their life.”

And staffers within YouTube once said that “[d]riving more frequent daily usage [was] not well-aligned with … efforts to improve digital wellbeing,” the filing states.

The brief containing the internal comments, research and employee testimony has been presented as evidence in a massive lawsuit brought by hundreds of individuals, school districts and attorneys general from across the United States against the four companies — Instagram-parent Meta, Snap, TikTok and YouTube-parent Google — in the U.S. District Court for the Northern District of California.

The platforms “deliberately embedded design features in their platforms to maximize youth engagement to drive advertising revenue,” the complaint claims. And the school districts allege that the social media companies have contributed to a youth mental health crisis that schools must address by investing in counseling and other resources.

Companies Seek to Dismiss the Case

The companies have sought to dismiss the case. Spokespeople for Meta, TikTok and Snap said the recent filing paints a misleading picture of their platforms and safety efforts. 

The plaintiffs’ co-lead counsel, Lexi Hazam, said the companies bypassed parents and teachers to push their platforms into schools despite knowing they were addictive to kids.

The 235-page brief filed by the plaintiffs paints a picture of firms that were well aware their apps could harm teens and children, yet pursued young users anyway to juice engagement and profit. It also cites internal documents suggesting the companies know that their well-being and parental control features have limited effectiveness.

Prioritizing Profit over User Safety

Parents, researchers, whistleblowers and lawmakers have previously raised concerns that tech giants prioritize profit over user safety, especially for young people. At a Senate hearing in January 2024, Meta CEO Mark Zuckerberg and Snap CEO Evan Spiegel apologized to parents who said their children had been harmed by social media.

The companies face growing legal pressure. In addition to the Northern California case, the four companies are defendants in a consolidated lawsuit in Southern California claiming that they harmed young people’s mental health, which is set to go to trial in January. The companies have similarly pushed back on those allegations by claiming protection under Section 230, a law that shields tech companies from liability for users’ posts.

Each of the four companies has rolled out a series of youth safety and parental control features in recent years, such as “take a break” reminders, content restrictions for young users and default privacy protections. However, the recent filing alleges that, at least in some cases, the companies are aware those tools have limited efficacy.

“Is It Going to Look Like Tobacco Companies?”

The brief references internal documents from the tech companies indicating that researchers raised concerns about addiction and other mental health risks to young users and accuses the companies of hiding or downplaying those findings.

It cites, for example, a 2019 study Meta planned to conduct in partnership with Nielsen in which it would ask some users to quit Facebook and Instagram for a month and log how they felt afterwards. But after “pilot tests” of the study showed that people who paused their Facebook use for only a week “reported lower feelings of depression, anxiety, loneliness, and social comparison,” Meta allegedly stopped the research project.

“One Meta employee warned, ‘if the results are bad and we don’t publish and they leak, is it going to look like tobacco companies doing research and knowing cigs were bad and then keeping that info to themselves?’” the brief states, citing an internal conversation.

The filing mischaracterizes the study and Meta’s decision to end it, Meta spokesperson Andy Stone said. Meta researchers tried to design the study to overcome participants’ “expectation effects” — where users’ preexisting beliefs about the platform would color their responses. But the pilot showed the study design wasn’t able to account for this, “which is why this study didn’t continue,” Stone said in a post on X.

“Cherry-Picked Quotes and Misinformed Opinions”

In a statement, Stone said of the brief that “we strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions to present a deliberately misleading picture. The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens – like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences. We’re proud of the progress we’ve made, and we stand by our record.”

The filing raises questions about the family pairing tool TikTok rolled out in 2020 to give parents more control over what their teens can see and share on the app. One employee said that because teens could unlink their accounts from their parents’, Family Pairing was “kinda useless.” Another company leader said, “Family Pairing is where all good product design goes to die,” the filing states.

TikTok executives also allegedly rejected a proposal for a screen time limit that would kick users off the app once the limit was reached. Less time spent scrolling meant “fewer ads,” which would have a “significant” impact on revenue. The screen time tool TikTok adopted instead lets users enter a passcode to remain on the platform.

“This brief inaccurately rewrites our history and misleads the public about our commitment to youth safety in a cynical effort to gain an advantage in litigation,” a TikTok spokesperson said in a statement.

Since the app’s launch, “we have invested billions of dollars in Trust & Safety, and rolled out 50+ preset safety, privacy, and security settings for teens, including private accounts, content restrictions, and screen time tools. The plaintiffs’ claims distort this track record, along with the meaningful work we do with respected child-safety organizations to collaboratively build a healthier digital ecosystem,” they added.

The brief also alleges that late-night notifications, beauty filters that alter users’ appearances and endlessly scrolling feeds across Instagram, Snap, YouTube and TikTok have undermined users’ wellbeing.

YouTube, for example, recognized that short-form videos can trigger an “addiction cycle” but developed its Shorts feature anyway, according to the filing. An internal Snapchat document identified “infinite scroll and autoplay as ‘unhealthy gaming mechanics’” and observed that users “feel obligated” to maintain contact streaks with friends, “which ‘become[s] stressful,’” the document states.

“The allegations against Snap in this case fundamentally misrepresent our platform,” a Snap spokesperson said in a statement. “Snapchat was designed differently from traditional social media — it opens to the camera, not a feed, and has no public likes or social comparison metrics. The safety and well-being of our community is a top priority … We’ve built safeguards, launched safety tutorials, partnered with experts, and continue to invest in features and tools that support the safety, privacy, and well-being of all Snapchatters.”

A Google spokesperson said in a statement that “these lawsuits fundamentally misunderstand how YouTube works and the allegations are simply not true.”

Plaintiffs in the suit are seeking a jury trial, claiming in the brief that the tech giants have created a “public nuisance that burdens schools and communities.”

“Meta, Google, TikTok, and Snapchat designed social media products they knew were addictive to kids, and these internal company documents and testimony show they actively pushed these platforms into schools while bypassing parents and teachers,” plaintiffs’ co-lead counsel Hazam said. “They knew the serious mental health risks to kids but provided no warnings, leaving schools and families to suffer the consequences. As the case moves toward trial, we will continue working to uncover the full extent of these companies’ misconduct and ensure that they are held responsible for the impact their platforms have had on children and teens, and the school systems that serve them.”

Discussion Questions

  1. As the article indicates, the plaintiffs in the subject lawsuit are claiming that tech giants Meta, Google, TikTok, and Snapchat have created a “public nuisance that burdens schools and communities.” Define public nuisance.

    Legally, a public nuisance occurs when an individual’s actions or failures to act unreasonably impede the public’s enjoyment of property, safety, or public rights.

    Unlike a private nuisance, which affects a single person or a few neighbors, a public nuisance impacts a broader community or neighborhood, creating harm or inconvenience to the public at large. Depending on state law, a public nuisance can be considered both a civil and a criminal matter.

  2. Aside from their public nuisance claim, on what other legal theory/theories (if any) can the plaintiffs base their claims?

    In your author’s legal opinion, public nuisance is the appropriate theory in this case, since the lawsuit is being filed by school districts, rather than individual persons who have been adversely affected by the social media giants’ alleged practices.
     
    If individual persons file lawsuits on their own behalf (which is likely an inevitability), the most likely theory on which the plaintiffs could successfully base their claims is negligence.
     
    Negligence is defined as the failure to do what a reasonable party would do under the same or similar circumstances.
     
    Negligence is also defined by the following four elements:

    (a)  The defendant owes the plaintiff a duty of care;

    (b)  The defendant breached the duty of care owed to the plaintiff;

    (c)  The defendant caused the plaintiff harm; and

    (d)  The plaintiff experienced damages (emotional and/or physical) as a result.

  3. In your reasoned opinion, who should prevail in this case, and why?

    This is an opinion question, so student responses may vary.

    In your author’s opinion, this decision is best left to a jury, which will be charged with determining whether the social media giants’ practices constituted a public nuisance.
     
    It is important to note that in the tobacco litigation of the 1990s, tobacco companies agreed to a massive settlement with the states based on comparable facts: the tobacco companies knew that nicotine is addictive and that cigarette smoking causes cancer, yet failed to disclose that information to smokers.