Report: TikTok boosts posts about eating disorders, suicide



TikTok’s algorithms are promoting videos about self-harm and eating disorders to vulnerable teens, according to a report published Dec 14 that highlights concerns about social media and its impact on youth mental health.

Researchers at the nonprofit Centre for Countering Digital Hate created TikTok accounts for fictional teen personas in the US, United Kingdom, Canada and Australia. The researchers operating the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.

Within minutes, the wildly popular platform was recommending videos about losing weight and self-harm, including ones featuring pictures of models and idealised body types, images of razor blades and discussions of suicide.

When the researchers created accounts with usernames that suggested a particular vulnerability to eating disorders – names that included the words “lose weight”, for example – the accounts were fed even more harmful content.

“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the centre’s CEO Imran Ahmed, whose organisation has offices in the US and UK. “It is literally pumping the most dangerous possible messages to young people.”

Social media algorithms work by identifying topics and content of interest to a user, who is then sent more of the same as a way to maximise their time on the site. But social media critics say the same algorithms that promote content about a particular sports team, hobby or dance craze can send users down a rabbit hole of harmful content.
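
The self-reinforcing loop the critics describe can be seen in a toy model – a hypothetical sketch in Python, not TikTok’s actual system, with every name, topic and weight invented for illustration: each “like” raises the user’s recorded interest in a video’s topics, and the ranker then favours videos sharing those topics, so a handful of interactions can quickly narrow the feed.

    from collections import defaultdict

    # Toy engagement-driven recommender -- a hypothetical sketch, NOT
    # TikTok's actual algorithm. A "like" boosts the user's affinity for
    # the liked video's topics; ranking then favours matching videos.

    videos = [  # (video_id, topics) -- invented catalogue for illustration
        ("v1", {"dance"}),
        ("v2", {"fitness"}),
        ("v3", {"fitness", "weight_loss"}),
        ("v4", {"weight_loss"}),
        ("v5", {"cooking"}),
    ]

    affinity = defaultdict(float)  # topic -> accumulated interest

    def record_like(video_id):
        """A like increases affinity for every topic on the liked video."""
        for topic in dict(videos)[video_id]:
            affinity[topic] += 1.0

    def recommend(n=3):
        """Rank the catalogue by overlap with the user's topic affinities."""
        ranked = sorted(videos,
                        key=lambda v: sum(affinity[t] for t in v[1]),
                        reverse=True)
        return [video_id for video_id, _ in ranked[:n]]

    record_like("v3")   # one "like" on a fitness/weight-loss video...
    print(recommend())  # ...and related topics dominate: ['v3', 'v2', 'v4']

Run repeatedly, the loop compounds: every recommendation that draws engagement feeds the same affinities back in, which is the “rabbit hole” effect the researchers measured.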

It’s a particular problem for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure or negative content about eating disorders or suicide, according to Josh Golin, executive director of Fairplay, a nonprofit that supports greater online protections for children.

He added that TikTok is not the only platform failing to protect young users from harmful content and aggressive data collection.

“All of these harms are linked to the business model,” Golin said. “It doesn’t make any difference what the social media platform is.”

In a statement from a company spokesperson, TikTok disputed the findings, saying that the researchers didn’t use the platform like typical users and that this skewed the results. The company also said a user’s account name shouldn’t affect the kind of content the user receives.

TikTok prohibits users who are younger than 13, and its official rules prohibit videos that encourage eating disorders or suicide. Users in the US who search for content about eating disorders on TikTok receive a prompt offering mental health resources and contact information for the National Eating Disorder Association.

“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the statement from TikTok, which is owned by ByteDance Ltd, a Chinese company now based in Singapore.

Despite the platform’s efforts, researchers at the Centre for Countering Digital Hate found that content about eating disorders had been viewed on TikTok billions of times. In some cases, researchers found, young TikTok users were using coded language about eating disorders in an effort to evade TikTok’s content moderation.

The sheer amount of harmful content being fed to teens on TikTok shows that self-regulation has failed, Ahmed said, adding that federal rules are needed to force platforms to do more to protect children.

Ahmed noted that Douyin, the version of TikTok offered to domestic Chinese audiences, is designed to promote content about math and science to young users, and limits how long 13- and 14-year-olds can be on the site each day.

A proposal before the US Congress would impose new rules limiting the data that social media platforms can collect on young users and create a new office within the Federal Trade Commission focused on protecting young social media users’ privacy.

One of the bill’s sponsors, Sen. Edward Markey, D-Mass., said Wednesday that he’s optimistic lawmakers from both parties can agree on the need for tougher regulations on how platforms are accessing and using the information of young users.

“Data is the raw material that big tech uses to track, to manipulate, and to traumatize young people in our country every single day,” Markey said. – AP
