Hot topics

iPhones will soon be able to detect child abuse images

Apple child protection
© nextpit / screenshot: Apple


Apple has announced a handful of new features for its operating systems that will ramp up the fight against child abuse imagery, just hours after the Financial Times revealed the news. Updated versions of iOS, iPadOS, macOS, and watchOS are expected to roll out later this year with tools to combat the spread of such content.


TL;DR

  • The Messages app will alert you to sexually explicit content.
  • Child abuse material will be identified in iCloud Photos.
  • Siri and Search will gain additional tools to warn against child abuse.

The Financial Times published this news on Thursday afternoon (August 6), and shortly after that, Apple confirmed the new system to prevent child abuse with an official statement and a technical report (PDF) of how this feature will work.

Beginning with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey - initially in the US only - updated devices will gain additional features to prevent and warn against the spread of child abuse content.

Alerts for parents and guardians in Messages

The Messages app will be able to detect the receipt and sending of sexually explicit images. Received images will remain hidden behind a blur effect and can only be viewed after acknowledging an alert that the content may be sensitive (as seen in the third screen below).

Minors will be alerted to the presence of explicit content / © Apple

Parents or guardians will also have the option to be alerted should the child view explicit content identified by Messages. According to Apple, the analysis is performed on the device itself, without the company having access to the content.

This new feature will be integrated into the existing family account options in iOS 15, iPadOS 15, and macOS Monterey.

Detection in iCloud Photos

The feature that should attract the most attention is the new technology announced by Apple: the ability to detect images containing scenes of child abuse in iCloud Photos. This tool will be able to identify images that have been pre-registered by the NCMEC (National Center for Missing and Exploited Children), a US organization.

Although it identifies files stored in the cloud, the system performs the cross-check on the device itself, a point Apple has emphasized repeatedly, using hashes (identifiers) of images supplied by NCMEC and other organizations.

NeuralHash
The image identifier does not take into account attributes of the photo such as colors, compression, or file size / © Apple

According to Apple, the hash does not change if the file size changes, colors are removed, or the compression level of the image is altered. The company will be unable to interpret the analysis results unless the account exceeds a certain threshold (which remains undisclosed) of positive matches.
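
To illustrate the general idea, here is a minimal sketch in Swift of how threshold-based matching of perceptual hashes could work in principle. It is purely conceptual: the 64-bit hash format, the Hamming-distance tolerance, and the threshold of 30 matches are hypothetical placeholders, and Apple describes a cryptographic protocol that reveals nothing about individual results below the threshold; this sketch only mirrors the counting logic.

```swift
// Purely illustrative sketch of threshold-based hash matching.
// The 64-bit hash format, the tolerance, and the matchThreshold of 30
// are hypothetical placeholders, not Apple's actual NeuralHash parameters.

/// Number of differing bits between two 64-bit perceptual hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Counts how many of a photo library's hashes match a known database and
/// reports true only once the count passes the threshold.
func exceedsMatchThreshold(photoHashes: [UInt64],
                           knownHashes: Set<UInt64>,
                           tolerance: Int = 0,
                           matchThreshold: Int = 30) -> Bool {
    var matches = 0
    for hash in photoHashes {
        // tolerance == 0 is an exact lookup; a small tolerance lets
        // near-duplicates (recompressed or recolored copies) still match.
        let isMatch = tolerance == 0
            ? knownHashes.contains(hash)
            : knownHashes.contains(where: { hammingDistance($0, hash) <= tolerance })
        if isMatch { matches += 1 }
    }
    return matches >= matchThreshold
}
```

In this simplified picture, everything below the threshold stays on the device, and only the crossing of the threshold would ever be surfaced for review.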

Apple also claims that the system has a false-positive probability of less than one in one trillion per year. When an account is flagged, Apple will review the images in question; if the flag is confirmed, the account is deactivated and a report is sent to NCMEC, a decision that the account owner can appeal.

Even before the official announcement was made, encryption experts warned that the feature could open the door to similar algorithms being used for other purposes, such as surveillance by authoritarian governments, and to circumventing the protections of end-to-end encryption systems.

For now, Apple has not indicated when (or even if) the system will be available in other regions. Open questions remain, such as how the system fits with current laws around the world.

Siri also plays a role

Rounding off this collection of new features, Siri and the Search system across Apple's various operating systems will now provide information about online safety, including links for reporting instances of child abuse.

Siri will offer resources for getting help and reporting child abuse / © Apple

Like the other features, this one should initially be offered only in the United States, and there is no timeframe as to when it will be made available in other regions, if ever.

Do take note that most countries have a dedicated toll-free phone number that can be called anonymously to report cases of abuse and neglect against children and adolescents, with this service available 24 hours a day, 7 days a week. Apart from that, the country's Ministry of Women, Family and Human Rights (or its equivalent) should also be open to similar reports.

Source: Apple

Rubens Eishima
Writer

Having written about technology since 2008 for a number of websites in Brazil, Spain, Denmark, and Germany, I specialize in the mobile ecosystem, including various models, components, and apps. I tend to not only value performance and specifications, but also things like repairability, durability, and manufacturer support. I tend to prioritize the end-user's point of view whenever possible.

1 Comment

  • storm, Aug 9, 2021:

    The concept is noble; the implementation evil.
