Mother whose child died in TikTok challenge urges US court to revive lawsuit


FILE PHOTO: TikTok logo is seen in this illustration taken, June 2, 2023. REUTERS/Dado Ruvic/Illustration/File Photo

(Reuters) - A U.S. appeals court on Wednesday wrestled with whether the video-based social media platform TikTok could be sued for causing a 10-year-old girl's death by promoting a deadly "blackout challenge" that encouraged people to choke themselves.

Members of a three-judge panel of the Philadelphia-based 3rd U.S. Circuit Court of Appeals noted during oral arguments that a key federal law typically shields internet companies like TikTok from lawsuits for content posted by users.

But some judges questioned whether Congress, in adopting Section 230 of the Communications Decency Act in 1996, could have imagined the growth of platforms like TikTok that do not just host content but recommend it to users through complex algorithms.

"I think we can all probably agree that this technology didn't exist in the mid-1990s, or didn't exist as widely deployed as it is now," U.S. Circuit Judge Paul Matey said.

Tawainna Anderson sued TikTok and its Chinese parent company ByteDance after her daughter Nylah in 2021 attempted the blackout challenge using a purse strap hung in her mother's closet. She lost consciousness, suffered severe injuries, and died five days later.

Anderson's lawyer, Jeffrey Goodman, told the court that while Section 230 provides TikTok some legal protection, it does not bar claims that its product was defective and that its algorithm pushed videos about the blackout challenge to the child.

"This was TikTok consistently sending dangerous challenges to an impressionable 10-year-old, sending multiple versions of this blackout challenge, which led her to believe this was cool and this would be fun," Goodman said.

But TikTok's lawyer, Andrew Pincus, argued the panel should uphold a lower court judge's October 2022 ruling that Section 230 barred Anderson's case.

Pincus warned that to rule against his client would render Section 230's protections "meaningless" and open the door to lawsuits against search engines and other platforms that use algorithms to curate content for their users.

"Every claimant could then say, this was a product defect, the way the algorithm was designed," he said.

U.S. Circuit Judge Patty Schwartz, though, questioned whether that law could fully protect TikTok from "having to make a decision as to whether it was going to let someone who turned on the app know there's dangerous content here."

The arguments come as TikTok and other social media companies, including Facebook and Instagram parent Meta Platforms, are facing pressure from regulators around the globe to protect children from harmful content on their platforms.

U.S. state attorneys general are investigating TikTok over whether the platform causes physical or mental health harm to young people.

TikTok and other social media companies are also facing hundreds of lawsuits accusing them of enticing and addicting millions of children to their platforms, damaging their mental health.

