“I couldn’t get these cases before a jury,” Goldberg recalled, adding that if such cases were allowed to proceed to trial, they would succeed.
A lot has changed in the past five years, she said: The public’s trust in social media companies has waned, and courts have begun to entertain the notion that people should be able to sue tech platforms in the same way as providers of other consumer products or services.
In 2021, the 9th Circuit Court of Appeals in California ruled that Snap could potentially be held liable for the deaths of two boys killed in a high-speed car crash while using a Snapchat filter that their families say encourages reckless driving.
In October, the US Supreme Court agreed to hear a case alleging that Google materially supported terrorism on its YouTube video platform through its algorithmic recommendation of videos by the Islamic State terrorist group.
Legal experts said the case could set an important precedent for how Section 230 applies to the content recommendations that platforms’ algorithms make to users — including those made to children like Laurie’s daughter.
“The pendulum has really swung,” Goldberg said. “People no longer trust that these products are acting for the public good, and the courts are waking up.”
Outside the United States, the balance has shifted even further, and is beginning to be reflected in both consumer lawsuits and regulation.
In September, a British government inquiry blamed social media exposure for a 14-year-old girl’s suicide, and lawmakers are set to introduce tougher rules for age verification for social media firms.
But aside from a recent bill in California that mandates “age-appropriate design” decisions, efforts to pass new laws governing digital platforms in the United States have largely faltered.
Trial lawyers like Bergman say the issue should be left to them.
Consent and control
Laurie’s daughter got her first cellphone in sixth grade, when she started taking the bus to school alone. When her mental health soon began to deteriorate, her mother initially did not make the connection.
“In many ways I was a helicopter parent,” Laurie said. “I did everything right — I put the phone in the cupboard at night, we talked about the appropriate use of technology around the dinner table.”
Only later did Laurie learn that her daughter had secretly opened multiple social media accounts in an effort to escape her mother’s vigilance, logging on for hours at night in her bedroom.
Laurie soon realized that her daughter was wearing long-sleeved shirts to cover the cut marks on her arms.
When Laurie asked her about it, her daughter said, “Mom, there are videos on TikTok that show you how to do it, and Snapchat — they show you what tools to use.”
TikTok and Snap said that harmful content is not allowed on their platforms and they are taking steps to remove it.
Laurie’s daughter encountered older users on Snapchat and Instagram who sought to sexually exploit her, including by requesting sexually explicit images, according to her attorneys.
Although Laurie wanted to keep her daughter offline, the social media platforms designed their products “to evade parental consent and control,” her lawsuit alleges.
A Meta spokesperson pointed to a number of recent initiatives to give parents control over their children’s online activity, including a “Family Center” launched in 2022 that allows parents to monitor and limit the time their children spend on Instagram.
Laurie’s daughter had secretly opened five Instagram, six Snapchat and three TikTok accounts before she turned 13.
“There was no way for me to contact all these companies and say, ‘Don’t let my daughter log in,'” Laurie said.
Although Laurie wanted to further restrict her daughter’s social media access, she worried that — since all of her classmates were communicating on apps — her daughter would feel socially excluded without them.
Laurie’s daughter is just one data point in a trend that psychologists have been trying to understand for the past decade.
Between 2012 and 2015, the share of American teens reporting symptoms of depression rose 21%, with the increase for girls twice as large, said Jean Twenge, an American psychologist and researcher who studies mental health trends.
Twenge said that in 2015, the suicide rate among girls aged 12 to 14 was three times higher than in 2007.
Psychologist Grant said that until about 10 years ago, cases involving depression, self-harm and anxiety had held steady for decades.
“Then we see this big spike around 2012 — what happened in 2011? The advent of Snapchat and Instagram,” he said.
One driver of this trend, researchers say, is social comparison – the way products including Instagram and TikTok lead users to constantly compare themselves to their peers, which can erode self-esteem.
“She would say, ‘Mom, I’m ugly, I’m fat,’” Laurie recalled of her daughter. “Keep in mind: She’s 98 pounds (44 kg) and 5 foot 5 (165 cm).”
“So I would ask her, ‘Why do you think that?’ And she’d say, ‘Because I posted a picture and only four people liked it.’”
Grant said he sees children being hooked by specific design choices made by social media companies.
“Just think of endless scrolling — it’s based on the speed of slot machines — addictive gambling,” said Grant, who spent years treating adult addiction before turning his attention to children’s technology use.
Still, mental health experts are divided on the interplay between children’s mental health and social media use.
“Social media is often the scapegoat,” said Yalda Uhls, professor of developmental psychology at the University of California at Los Angeles (UCLA).
“It’s easier to blame [it] than the systemic issues in our society – there’s also inequality, racism, climate change and parental decisions.”
While some children may attribute a mental health challenge to social media, others say the opposite. A November poll by Pew showed that less than 10% of teens said social media was having a “mostly negative” impact on their lives.
Jennifer King, a research fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, said there are still big gaps in research on concepts such as social media addiction and digital harm to children.
“But the internal research – the Frances Haugen documents – is damning,” she said. “And of course, it was shark bait for the trial lawyers.”
Tony Roberts was watching CNN at 2 a.m. one winter morning in early 2022 when he saw an ad he never expected.
A woman on the screen invited parents to call a 1-800 number if they had a “child [who] was experiencing a mental health crisis, eating disorder, attempted suicide, or was sexually exploited through social media.”
“I thought, wait, that’s what happened to our daughter,” he recalled.
It had been more than a year since he found his 14-year-old daughter, Anglin, hanged in her room. She eventually died of her injuries.
Roberts later learned that his daughter had seen a video on Instagram depicting the specific suicide method, and that she had been drawn into an online world of self-harm material and abuse in the months leading up to her death.
He and his wife began combing through their daughter’s phone, creating a dossier of her mental health spiral, which they attributed to her use of Instagram, TikTok and Snapchat.
To their distress, they discovered that the video that may have played a role in her death was still circulating on Instagram months after she died.