Apple confirms it already scans iCloud Mail for child sexual abuse materials

Earlier this month, Apple announced plans to detect child sexual abuse material (CSAM) on customers’ devices, a move that raised privacy concerns. However, the company has confirmed to 9to5Mac that it already scans iCloud Mail for CSAM, and has been doing so since 2019.

“Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task,” wrote Ben Lovejoy of 9to5Mac.

The company, however, has not been scanning iCloud Photos or iCloud backups. Not yet, anyway. According to an archived version of Apple’s child safety page, its systems “use electronic signatures to find suspected child exploitation”, “much like spam filters in email”.

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation,” continued Apple.
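Neither the archived page nor Apple’s statement explains the matching system in detail, but signature-based scanning of this kind typically means looking up a file’s cryptographic hash in a database of hashes of known material. The sketch below is a minimal illustration of that idea in Python; the function names and the hard-coded signature set are hypothetical and not Apple’s actual implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical signature set for known material; a real system would source
# these hashes from a child-safety organisation, not hard-code them.
KNOWN_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_signature(path: Path) -> str:
    """Return the SHA-256 digest of a file's bytes as a hex string."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def attachment_matches_known_material(path: Path) -> bool:
    """Flag an attachment whose signature appears in the known-signature set."""
    return file_signature(path) in KNOWN_SIGNATURES
```

A check like this only catches exact copies of files already in the database, which is why it can run cheaply on unencrypted mail attachments passing through a server.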

Apple’s anti-fraud chief, Eric Friedman, was also reported to have said that the company is “the greatest platform for distributing child porn”. The statement raises an obvious question: how could Apple have known this if it wasn’t scanning iCloud accounts?

Apple would not comment further on Friedman’s quote, but it maintained that the company has “never scanned iCloud Photos”.

One possible explanation is that other cloud services were scanning photos for CSAM while Apple wasn’t. If those services were disabling accounts over CSAM, that could suggest more of the material exists on Apple’s platform than anywhere else. But again, this is just one possible explanation for Friedman’s statement.

Regardless, Apple has already revealed that it will be introducing new child safety features that feel a little more invasive (and dangerous if the power is abused). With “new technology in iOS and iPadOS”, the company will soon be able to detect known CSAM images when they are stored in iCloud Photos.
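Apple hasn’t described that new technology here beyond saying it matches known images, but image matching that tolerates minor edits (resizing, recompression) is usually done with perceptual hashes compared by Hamming distance rather than exact cryptographic hashes. The snippet below is a hedged sketch of that comparison step only; the threshold and function names are illustrative assumptions, not Apple’s actual parameters.

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions in which two perceptual hashes differ."""
    return bin(a ^ b).count("1")

def is_probable_match(image_hash: int, known_hashes: set, threshold: int = 4) -> bool:
    """Treat an image as a likely match if its hash is within `threshold`
    bits of any hash in the known set (threshold chosen for illustration)."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)

# Example: two hashes differing in a single bit are treated as the same image.
print(is_probable_match(0b1011_0110, {0b1011_0111}))  # True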

[ SOURCE, IMAGE SOURCE ]
