Facebook removes 8.7 million sexual photos of kids in last three months


FILE PHOTO: A Facebook page is displayed on a computer screen in Brussels, Belgium, April 21, 2010. REUTERS/Thierry Roge/File Photo

SAN FRANCISCO: Facebook Inc said on Oct 24 that its moderators removed 8.7 million user images of child nudity during the last quarter, with the help of previously undisclosed software that automatically flags such photos.

The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualised context.
