Apple to detect, report sexually explicit child photos on iPhone


Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the US National Center for Missing & Exploited Children (NCMEC). — Photo by VASANTH on Unsplash

Apple Inc said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.

As part of new safeguards involving children, the company also announced a feature that will analyse photos sent and received in the Messages app to or from children to see if they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
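The article does not detail how the iCloud Photos comparison works, but detection against a database of *known* images generally relies on perceptual hashing: each photo is reduced to a compact fingerprint that stays stable under small edits, and the fingerprint is compared against hashes of known CSAM. The sketch below is a hypothetical illustration of that general idea using a simple "average hash" over an 8×8 grayscale grid; it is not Apple's actual NeuralHash algorithm, and the function names, threshold, and toy data are assumptions for demonstration only.

```python
# Hypothetical sketch of perceptual-hash matching against a database of
# known-image hashes. NOT Apple's NeuralHash; illustrative only.

def average_hash(pixels):
    """Hash an 8x8 grayscale grid (list of 64 ints, 0-255) to a 64-bit int.
    Each bit is 1 if that pixel is brighter than the grid's average."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_database(photo_hash, known_hashes, threshold=5):
    """Flag a photo whose hash is within `threshold` bits of any known hash,
    so near-duplicates (recompressed, slightly brightened) still match."""
    return any(hamming_distance(photo_hash, h) <= threshold
               for h in known_hashes)

# Toy data: a "known" image, a near-copy with slight brightness noise,
# and an unrelated (inverted) image.
photo = [i * 4 for i in range(64)]
near_copy = [min(255, p + 2) for p in photo]
unrelated = [255 - p for p in photo]

database = {average_hash(photo)}
print(matches_database(average_hash(near_copy), database))  # True
print(matches_database(average_hash(unrelated), database))  # False
```

The key design point is that matching is done on fingerprints, not raw images, and tolerates small pixel-level differences via a Hamming-distance threshold rather than requiring exact equality.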

