Apple Inc said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.
As part of new child-safety measures, the company also announced a feature that will analyse photos sent to or received by children in the Messages app to determine whether they are sexually explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three features on Thursday and said they would roll out later in 2021.