Apple scanning for CSAM
By Ariel Costas - August 12, 2021
Apple recently announced changes to their iThings to include “children protection” in iOthersComputer (a.k.a. iCloud) and iMessage. They will compare every photo uploaded to iCloud against a database of known CSAM (Child Sexual Abuse Material). They will also scan all iMessage images sent or received from minors’ accounts for sexually explicit material and notify their parents.
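To make the matching mechanism concrete, here is a toy sketch in Python. This is a deliberate simplification: Apple’s system reportedly uses a perceptual hash (NeuralHash) combined with a private set intersection protocol, not a plain cryptographic hash lookup, so the function names and the database contents below are illustrative assumptions only.

```python
import hashlib

# Hypothetical database of known-image hashes (illustrative values only;
# this entry is simply the SHA-256 digest of an empty byte string).
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the database.

    A real deployment would use a perceptual hash so that re-encoded
    or resized copies of the same image still match; a plain SHA-256
    lookup like this one only catches byte-identical files.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_database(b""))       # matches the placeholder entry above
print(matches_database(b"cat"))    # not in the database
```

The gap between this sketch and a perceptual hash is exactly where the policy risk lives: once fuzzy matching against an opaque database is on-device, nothing technical limits what that database contains.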
While the intention is not bad, this opens the (back)door to a new kind of surveillance by Apple and by governments. Today it is only CSAM, but tomorrow it can be anything. If Apple really cares about privacy, as they claim, they shouldn’t implement this no matter what.
More info at https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life