Apple confirms it already scans iCloud Mail for child sexual abuse materials

Earlier this month, Apple announced plans to help detect child sexual abuse material (CSAM) on customers' devices, a move that raised privacy concerns. However, the company has since confirmed to 9to5Mac that it already scans iCloud Mail for CSAM, and has been doing so since 2019.

“Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task,” wrote Ben Lovejoy of 9to5Mac.

The company, however, has not been scanning iCloud Photos or iCloud backups. Not yet, anyway. According to an archived version of Apple's child safety page, its systems "use electronic signatures to find suspected child exploitation", "much like spam filters in email".

"Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation," the page continued.
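Apple hasn't published the details of its "electronic signatures" system, but the spam-filter comparison suggests matching attachments against a database of signatures of known material. Here is a simplified, hypothetical sketch of that idea using plain SHA-256; real systems (such as Microsoft's PhotoDNA or Apple's later NeuralHash) use perceptual hashes that still match after resizing or re-encoding, whereas a cryptographic hash only catches byte-identical files:

```python
import hashlib

# Hypothetical database of signatures of known flagged images.
# (This example value is simply the SHA-256 digest of the bytes b"test".)
KNOWN_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def signature(data: bytes) -> str:
    """Compute a digest of an attachment's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(attachment: bytes) -> bool:
    """Return True if the attachment matches a known signature."""
    return signature(attachment) in KNOWN_SIGNATURES

print(is_flagged(b"test"))   # True: matches the database entry
print(is_flagged(b"other"))  # False: unknown content
```

Because email attachments pass through Apple's servers unencrypted, a lookup like this can be run server-side on every message, which is what makes the scanning "a trivial task" as Lovejoy describes.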

Apple's anti-fraud chief, Eric Friedman, was also reported as saying that the company is "the greatest platform for distributing child porn". The statement raises the question: how could Apple have known this if it wasn't scanning iCloud accounts?

Apple declined to comment further on Friedman's quote, but maintained its position that the company has "never scanned iCloud Photos".

One explanation might be that other cloud services were scanning photos for CSAM while Apple wasn't. If those services were disabling accounts over CSAM, that could suggest more of the content exists on Apple's platform than anywhere else. But again, this is just one possible explanation for Friedman's statement.

Regardless, Apple has already revealed that it will be introducing new child safety features that also feel a little more invasive (and dangerous if the power is abused). The company will soon be able to detect known CSAM images stored in iCloud Photos using "new technology in iOS and iPadOS".

[ SOURCE, IMAGE SOURCE ]
