Apple confirms it already scans iCloud Mail for child sexual abuse materials

Earlier this month, Apple announced plans to detect child sexual abuse material (CSAM) on customers’ devices, a move that raised privacy concerns. However, the company has confirmed to 9to5Mac that it already scans iCloud Mail for CSAM, and has been doing so since 2019.

“Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task,” wrote Ben Lovejoy of 9to5Mac.

The company, however, has not been scanning iCloud Photos or iCloud backups. Not yet, anyway. According to an archived version of Apple’s child safety page, its systems “use electronic signatures to find suspected child exploitation”, “much like spam filters in email”.

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation,” the page continued.
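Neither the archived page nor Apple’s statement explains how the matching works, but the basic idea behind such filters is to compare an “electronic signature” of each attachment against a database of signatures of known material. Here’s a minimal sketch, assuming a plain cryptographic hash as the signature; the KNOWN_SIGNATURES set, its contents, and the function names are hypothetical, and real systems typically use perceptual hashes such as PhotoDNA so that re-encoded copies still match:

```python
import hashlib

# Hypothetical database of signatures of known images. In practice this
# would be supplied by child-safety organisations, not hard-coded.
KNOWN_SIGNATURES = {
    "5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8",
}

def signature(attachment: bytes) -> str:
    """Compute a content signature for an attachment (here, SHA-256)."""
    return hashlib.sha256(attachment).hexdigest()

def should_flag(attachment: bytes) -> bool:
    """Flag the attachment if its signature matches a known one."""
    return signature(attachment) in KNOWN_SIGNATURES
```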

Apple’s anti-fraud chief, Eric Friedman, was also reported as saying that the company is “the greatest platform for distributing child porn”. The statement raises the question: how could Apple have known this if it wasn’t scanning iCloud accounts?

Apple declined to comment further on Friedman’s quote, but it stuck with its statement that the company has “never scanned iCloud Photos”.

One possible explanation is that other cloud services were scanning photos for CSAM while Apple wasn’t. If those services were disabling accounts over CSAM, the numbers could suggest that more of the content exists on Apple’s platform than anywhere else. But again, this is just one possible explanation for Friedman’s statement.

Regardless, Apple has already revealed that it will be introducing new child safety features, ones that also feel a little more invasive (and dangerous if the power is abused). The company will soon be able to detect known CSAM images stored in iCloud Photos using “new technology in iOS and iPadOS”.
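Known-image detection of this kind typically relies on perceptual hashes (Apple calls its version NeuralHash), which stay stable when an image is resized or recompressed. Below is a sketch of generic perceptual-hash matching with a Hamming-distance threshold; the function names and the 4-bit threshold are illustrative assumptions, and Apple’s actual protocol, which matches blinded hashes on-device, is considerably more involved:

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fixed-length hashes."""
    return bin(a ^ b).count("1")

def matches_known_image(image_hash: int, known_hashes: set, threshold: int = 4) -> bool:
    """Treat an image as a match if its perceptual hash is within
    `threshold` bits of any hash in the known-image database."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)

# Example: two hashes differing in a single bit still match.
print(matches_known_image(0b1011_0110, {0b1011_0111}))  # True
```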

[ SOURCE, IMAGE SOURCE ]
