Employees at Meta Platforms Inc and ByteDance Inc were aware of the harmful effects of their platforms on young children and teenagers but disregarded the information or in some cases sought to undermine it, court filings show.
The revelations were disclosed in a lawsuit over social media addiction that had been filed previously but with key portions sealed from public view. An unredacted version filed over the weekend in federal court in Oakland, California, offers details about how much engineers and others, including Meta CEO Mark Zuckerberg, knew about the harms of social media and their misgivings about it.
“No one wakes up thinking they want to maximise the number of times they open Instagram that day,” one Meta employee wrote in 2021, according to the filing. “But that’s exactly what our product teams are trying to do.”
The case in Oakland consolidates scores of complaints filed across the US on behalf of adolescents and young adults who allege that Facebook, Instagram, TikTok, Snapchat and Google’s YouTube caused them to suffer anxiety, depression, eating disorders and sleeplessness. More than a dozen suicides have also been blamed on the companies, based on claims that they knowingly designed algorithms that drew children down dangerous and addictive paths.
In their defense, the social media giants point to a 1996 law that gives Internet platforms broad immunity from claims over harmful content posted by users. Both sides are closely watching a Supreme Court case that will likely determine the fate of the litigation in Oakland.
According to the new filing, internal documents at TikTok parent ByteDance show that the company knows young people are more susceptible to being lured into trying dangerous stunts they view on the platform – known as viral challenges – because their ability to weigh risk isn’t fully formed.
Young people are more likely to “overestimate their ability to cope with risk,” and their “ability to understand the finality of death is also not fully fledged,” according to the filing.
Another unsealed portion of the filing contends that instead of moving to address the problems around children using Instagram and Facebook, Meta defunded its mental health team.
The filing says Zuckerberg was personally warned: “We are not on track to succeed for our core well-being topics (problematic use, bullying & harassment, connections, and SSI), and are at increased regulatory risk and external criticism. These affect everyone, especially Youth and Creators; if not addressed, these will follow us into the Metaverse.”
Snap and Meta had no immediate comment on the court filing. Representatives for TikTok didn’t immediately respond to a request for comment.
The companies have previously said that user safety is a priority and that they have taken affirmative steps to give parents more control over their kids’ use of the platforms and to provide more mental health resources. – Bloomberg