
Apple’s Anti-Fraud Chief Said Company Was ‘the Greatest Platform for Distributing Child Porn’

An explanation for Apple’s controversial decision to begin scanning iPhones for CSAM has been found in a 2020 statement by Apple’s anti-fraud chief.

Eric Friedman stated, in so many words, that “we are the greatest platform for distributing child porn.” The revelation does, however, raise the question: How could Apple have known this if it wasn’t scanning iCloud accounts…?

The iMessage thread was spotted by The Verge as the publication works its way through the internal emails, messages, and other materials handed over by Apple as part of discovery in the Epic Games lawsuit.

Ironically, Friedman actually suggests that Facebook does a better job of detecting CSAM than Apple does.

The spotlight at Facebook etc. is all on trust and safety (fake accounts, etc). In privacy, they suck. Our priorities are the inverse. Which is why we are the greatest platform for distributing child porn, etc.

A fellow exec queries this, asking whether it can really be true.

Really? I mean, is there a lot of this in our ecosystem? I thought there were even more opportunities for bad actors on other file sharing systems.

Friedman responds with the single word, “Yes.”

The document is unsurprisingly labeled “Highly confidential – attorneys’ eyes only.”

The stunning revelation may well be explained by the fact that iCloud photo storage is on by default, even if it’s just the paltry 5GB the company gives everyone as standard. This means the service may be the most-used cloud service for photos – in contrast to competing ones where users have to opt in.

Apple has said that it has been looking at the CSAM problem for some time, and was trying to figure out a privacy-protecting way to detect it. It may well be this specific conversation that led the company to prioritize these efforts.

The background is unlikely to make much difference to the controversy surrounding Apple’s decision to carry out on-device scanning of customer photographs, but it does at least provide some context for a move that many consider inconsistent with the company’s strong messaging on privacy. Certainly if I’d been in CEO Tim Cook’s position when that was brought to my attention, I’d want to act.

However, it also raises the question: How did Friedman know this? iMessage is end-to-end encrypted, and Apple says it didn’t want to scan iCloud for CSAM. So if the company wasn’t doing this, how would it have any knowledge of the extent of the problem in its ecosystem?

There has been a flurry of recent developments in the ongoing CSAM saga. Governments and civil rights organizations have been calling on the company to abandon the planned rollout. A developer reverse-engineered the core hashing algorithm Apple is using, and another managed to trick it into producing a false match. However, the safeguards Apple applies mean that the real-life risk of a false positive appears, from what we know so far, to be very low.
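For readers curious why a match-threshold safeguard keeps false positives rare in practice, here is a minimal sketch in Python. It is not Apple’s NeuralHash or its private set intersection protocol: the toy hash function, the Hamming-distance cutoff, and the threshold value (Apple reportedly cited roughly 30 matching images before any human review) are illustrative assumptions only.

```python
# Minimal sketch (not Apple's implementation): shows how a multi-match
# threshold keeps the practical false-positive rate low. All parameters
# and functions here are illustrative stand-ins, not Apple's real values.

import hashlib
from typing import Iterable

MATCH_THRESHOLD = 30   # hypothetical: matches required before any review
HAMMING_CUTOFF = 8     # hypothetical: max bit difference to count as a match


def toy_perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a perceptual hash; a real one hashes visual features,
    so near-duplicate images produce nearby hashes. SHA-256 is used here
    only to keep the example self-contained."""
    return int.from_bytes(hashlib.sha256(image_bytes).digest()[:12], "big")


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def count_matches(photos: Iterable[bytes], blocklist: set[int]) -> int:
    """Count photos whose hash falls close enough to any known hash."""
    matches = 0
    for photo in photos:
        h = toy_perceptual_hash(photo)
        if any(hamming(h, known) <= HAMMING_CUTOFF for known in blocklist):
            matches += 1
    return matches


def account_flagged(photos: Iterable[bytes], blocklist: set[int]) -> bool:
    """A single coincidental match does nothing; only crossing the
    threshold across many photos would ever lead to human review."""
    return count_matches(photos, blocklist) >= MATCH_THRESHOLD


if __name__ == "__main__":
    blocklist = {toy_perceptual_hash(b"known-bad-image")}
    photos = [b"holiday-photo-%d" % i for i in range(100)]
    print(account_flagged(photos, blocklist))  # False: stray collisions stay below the threshold
```

The design point this illustrates is that one manufactured hash collision has no effect on its own; an account would need to accumulate many independent matches before reaching the review stage, which is why the per-user false-positive probability stays very small even though individual collisions have been demonstrated.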

Does this revelation change your view on Apple’s CSAM scanning plans? Please take our poll, and share your thoughts in the comments.

Authored by Ben Lovejoy via 9to5mac September 29th 2021
