(Reuters) - Apple Inc on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service, to ensure the uploads do not match known images of child sexual abuse.
Detection of enough matching uploads to guard against false positives will trigger a human review and a report of the user to law enforcement, Apple said. It said the system is designed to reduce false positives to one in one trillion.
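The design Apple describes, fingerprinting each upload, comparing it against a database of known images, and escalating to human review only once matches cross a threshold, can be sketched roughly as follows. Everything here is illustrative: the hash function, database contents, and threshold value are assumptions. Apple's actual system uses a perceptual hash ("NeuralHash") and cryptographic threshold techniques, not SHA-256 or a plain set lookup.

```python
import hashlib

# Hypothetical database of fingerprints of known images (illustrative only;
# the real database is supplied by child-safety organizations).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-%d" % i).hexdigest() for i in range(5)
}

# Assumed review threshold; Apple has not published the real value.
REVIEW_THRESHOLD = 3

def image_fingerprint(data: bytes) -> str:
    """Stand-in fingerprint. SHA-256 is used only to keep the sketch
    self-contained; a cryptographic hash would not survive resizing or
    re-encoding the way a perceptual hash does."""
    return hashlib.sha256(data).hexdigest()

def should_flag_for_review(uploads: list[bytes]) -> bool:
    """Count uploads whose fingerprint matches the known database and
    flag the account only once the count clears the threshold,
    mirroring the threshold-before-human-review design described."""
    matches = sum(
        1 for data in uploads if image_fingerprint(data) in KNOWN_HASHES
    )
    return matches >= REVIEW_THRESHOLD
```

The threshold is what drives the false-positive guarantee: a single spurious match never triggers review, so the probability of wrongly flagging an account falls multiplicatively with each additional required match.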