Meta is facing a fresh storm of lawsuits that blame Instagram for eating disorders, depression and even suicides among children and teens — and experts say the suits are using a novel argument that could pose a threat to Mark Zuckerberg’s social-media empire.
The suits — which are full of disturbing stories of teens being barraged by Instagram posts promoting anorexia, self-harm and suicide — rely heavily on leaks by whistleblower Frances Haugen, who last year exposed internal Meta documents showing that Instagram makes body image issues and other mental health problems worse for many teens.
The leaks provide proof that Meta was well aware its products were hurting children but chose to put growth and profits over safety, the suits claim. Some of the suits also name Snapchat and TikTok, which the plaintiffs argue have also pushed addictive products despite knowing the deadly downsides.
“In what universe can a company have a product that directs this kind of vile filth, this dangerous content to kids — and get away with it?” said Matthew Bergman, the founder of the Social Media Victims Law Center, which has filed more than a half-dozen of the lawsuits. “These products are causing grievous harm to our kids.”
Bergman faces an uphill battle due to Section 230 of the Communications Decency Act, a law that has largely protected social-media companies from similar litigation. But Bergman also has a novel legal strategy based on Haugen’s leaks that the families he represents hope will force Meta to change its ways.
Meta and other tech companies have fought off lawsuits for years using Section 230, which was intended to preserve internet users’ free speech by preventing web platforms from being held legally liable for content posted by third parties.
But Bergman argues that the problem with Instagram is not just that third parties post harmful content on the app — it’s that Instagram’s design can intentionally route vulnerable users toward such content, as detailed by Haugen’s leaks. Therefore, he argues, the company shouldn’t be protected by Section 230.
“It’s our belief that when you attack the platform as a product, that’s different than Section 230,” Bergman said. “230 has been a barrier and it’s something we take seriously and we believe we have a viable legal theory to get around it.”
Meta did not return a request for comment.
Self-harm, addiction and death
One suit centers around a Louisiana girl named Englyn Roberts, who committed suicide in 2020 at age 14.
According to the suit filed in July in San Francisco federal court, Roberts’ parents had no idea the extent to which she was quietly being “bombarded by Instagram, Snapchat and TikTok with harmful images and videos,” including “violent and disturbing content glorifying self-harm and suicide.”
The more Roberts allegedly interacted with such photos and videos, the more the apps recommended similar content that kept her hooked in a vicious cycle. Roberts started exchanging self-harm videos with her friends, including one disturbing video in September 2019 of a woman hanging herself with an extension cord from a door, according to screenshots included in court papers.
In August 2020, Roberts appeared to imitate the video when she used an extension cord to hang herself from the door. Her parents found her hours later and she was rushed to the hospital. She was put on life support and died days later.
About a year after Roberts’ death, her father saw a report about Haugen’s leaks detailing Instagram’s harms. He subsequently searched his daughter’s old phones and social media accounts and uncovered her posts and messages about suicide.
“What became clear in September of 2021 is that Englyn’s death was the proximate result of psychic injury caused by her addictive use of Instagram, Snapchat, and TikTok,” the suit reads.
This maneuver around Section 230 means “Meta should be worried,” according to a recent analysis of one of Bergman’s suits by Gonzaga University School of Law professor Wayne Unger.
“The reasons for Section 230 immunity fall flat with respect to Spence’s lawsuit,” Unger wrote. “If the primary beneficiary of Section 230 protection is the internet user, then it follows that platforms should not be allowed to use Section 230 immunity for the harms the platforms directly cause their users.”
‘Knowingly releasing a toxin’
Bergman previously represented asbestos victims before switching to social media lawsuits last year in the wake of Haugen’s testimony.
“To me that was basically everything I’ve seen in the asbestos industry times one hundred,” Bergman said of Haugen’s leaks. “Both [asbestos producers and Meta] were knowingly releasing a toxin.”
Other alleged victims of social media represented by Bergman’s firm include two other teens from Louisiana and another from Wisconsin who all committed suicide after being hooked on social media apps.
An additional disturbing suit filed by a Connecticut mother alleges that her daughter killed herself at just 11 years old after becoming addicted to social media apps and being barraged by sexually explicit videos from strangers. The pre-teen girl even made a video of herself taking the pills that killed her, the suit claims.
Other suits have been filed by victims who are still alive but who say they have suffered from severe anorexia, mental trauma and other harms due to their social media use.