September 4, 2021

Delayed.

Yesterday Apple announced they would be delaying the release of their new “CSAM scanning” software, originally slated for the upcoming iOS 15. For those unfamiliar, CSAM is short for Child Sexual Abuse Material, and most major cloud providers (Google, Dropbox, Microsoft, and Apple) scan photos uploaded to their servers for this material. This is a good thing. Getting the trafficking of CSAM under control is important, and frankly a no-brainer. No one wants to see children abused in any way, including this way.

Apple’s plan was different: scan images on the device, BEFORE they ever reached Apple’s servers. So, if you used iCloud to store your photos, Apple’s new software would scan each image on your actual device, using your device’s horsepower, before it made its way to the cloud. The scanning is based on hashes, which are compared against a third-party list of offending hashes. A hash is basically a binary/numeric fingerprint of an image, not the actual image. So, if Apple found a photo whose hash matched a listed hash, there could be trouble. The system is not 100% accurate, but Apple assured folks the odds of falsely flagging an account were about one in a trillion per year.
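To make the hash-lookup idea concrete, here is a toy sketch in Python. Everything in it is illustrative: the blocklist entry is just the SHA-256 digest of an empty byte string standing in for a real entry, and Apple’s actual system uses a perceptual hash (“NeuralHash”) plus cryptographic matching protocols, not a plain file hash like this.

```python
import hashlib

# Hypothetical blocklist: in reality this would be a database of hashes of
# known CSAM supplied by a third party. The single entry below is just the
# SHA-256 of empty bytes, used here as a placeholder.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(data: bytes) -> str:
    """Return a hex digest 'fingerprint' of the image bytes."""
    return hashlib.sha256(data).hexdigest()

def flag_before_upload(data: bytes) -> bool:
    """Simulate the on-device check: does this image's hash appear on the list?"""
    return image_hash(data) in KNOWN_BAD_HASHES

print(flag_before_upload(b""))            # matches the placeholder entry: True
print(flag_before_upload(b"vacation"))    # ordinary photo bytes: False
```

The key point the sketch captures is that the comparison happens on numeric fingerprints, not on the images themselves; the controversy is about *where* that comparison runs, not *how*.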

Many security professionals much smarter than me, and security-minded folks like me, obviously want to end the exchange of CSAM. CSAM is bad. But building software that amounts to a back door into everyone’s device, scanning all cloud-bound images, is really bad. Apple promised this scanning would be used for this purpose and this purpose only, but once a back door is created, it can be exploited and abused. Not to mention that bad actors could put non-CSAM images on the hit list: LGBTQ+ themed images, for example, in a country that doesn’t like that sort of thing.

Apple has walked back the release to re-evaluate it in light of customer feedback, consumer-group concerns, and the like.

It’s still not a good idea. If the images are on a cloud server, have at it. But scanning images directly on your device is essentially the same as inviting Apple into your house every night to rifle through your cabinets, wallets, and so on. It is a very bad idea. No one would allow this in the real world, and we shouldn’t give up our digital privacy so easily.