SAN FRANCISCO: Facebook Inc said on Oct 24 that its moderators removed 8.7 million user images of child nudity during the last quarter, with the help of previously undisclosed software that automatically flags such photos.
The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualised context.
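The article does not describe the tool's internals, but the "both nudity and a child" condition it mentions can be illustrated with a minimal sketch: two separate classifier scores are combined, and an image is flagged only when both signals exceed a threshold. The Detection fields, threshold values, and should_flag function below are hypothetical assumptions for illustration, not details of Facebook's actual system.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    nudity_score: float  # 0.0-1.0 from a hypothetical nudity classifier
    child_score: float   # 0.0-1.0 from a hypothetical minor-detection model

# Assumed operating points; real systems tune these against review capacity.
NUDITY_THRESHOLD = 0.9
CHILD_THRESHOLD = 0.9

def should_flag(d: Detection) -> bool:
    """Flag an image for moderator review only when both signals fire."""
    return d.nudity_score >= NUDITY_THRESHOLD and d.child_score >= CHILD_THRESHOLD

# An image scoring high on both signals is queued; high on only one is not.
print(should_flag(Detection(nudity_score=0.95, child_score=0.93)))  # True
print(should_flag(Detection(nudity_score=0.95, child_score=0.20)))  # False
```

Requiring both signals, rather than either alone, is one plausible way such a system would limit false positives on ordinary photos of children or on adult-only content.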