Then he can start by turning over his and his family's phones, along with all of their PIN codes, passwords, etc.
So I think it is safe to say that no one is 'pro CP', but building in image-scanning functionality is clearly a privacy violation and a security risk. That UK cybersecurity chiefs say it is OK does not mean it is OK.
What could possibly go wrong?
Still want to send Granny that picture of your new baby in the bath?
Will it match the hash of one of the existing CSAM images in the database?
(And, under the Apple plan, you'd also need to match N other images in the database before anything would be flagged for human review.)
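Apple's published design enforces that threshold cryptographically, via private set intersection and threshold secret sharing, but the gating logic amounts to something like this sketch. The threshold value and the names here are hypothetical, not Apple's:

```python
MATCH_THRESHOLD = 30  # hypothetical value; chosen for illustration only

def should_surface_for_review(db_match_count: int) -> bool:
    """Matches below the threshold reveal nothing and trigger nothing;
    only crossing it makes the account visible for human review."""
    return db_match_count >= MATCH_THRESHOLD
```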
Given that changing one pixel of an image changes its hash, I doubt the hash would be the sole criterion for flagging an image. The systems are much more likely to be relying on image-recognition software... ID young face or body... ID lack of clothing... ID adult present... <insert klaxon sound effects>
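That intuition does hold for cryptographic hashes: flip a single bit of input and the digest changes completely (the avalanche effect). A minimal sketch with SHA-256 over stand-in pixel bytes:

```python
import hashlib

# Simulate "an image" as raw pixel bytes; a real image file behaves the same way.
original = bytearray(b"\x00" * 1024)
tweaked = bytearray(original)
tweaked[0] ^= 0x01  # flip one bit in one "pixel"

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())
# The two digests share no visible structure: cryptographic hashes are
# designed so that any change, however small, alters the output unpredictably.
```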
> Given that changing one pixel of an image changes its hash
They don't use SHA/MD5 style hashes for CSAM image matching; they use "perceptual hashes"[1] which are resilient to changes (specifically, I believe, PhotoDNA[2] from Microsoft.)
[1] https://en.wikipedia.org/wiki/Perceptual_hashing
[2] https://www.microsoft.com/en-us/photodna
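For contrast, here is a minimal sketch of one of the simplest perceptual hashes, the difference hash ("dHash"). PhotoDNA is proprietary and far more sophisticated; this only illustrates why a one-pixel edit barely moves the fingerprint. Requires Pillow, and the file names are hypothetical:

```python
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    """Difference hash: compare each pixel to its right-hand neighbour
    on a tiny grayscale thumbnail, yielding a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")

# Hypothetical files: the second is the first with one pixel edited.
# A cryptographic hash would differ completely; the dHash distance is
# typically 0 or near 0, so a distance threshold still catches the match.
# print(hamming(dhash("photo.jpg"), dhash("photo_one_pixel.jpg")))
```

Matching is then a nearest-neighbour lookup against the database under that Hamming distance, rather than an exact equality check.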
Thanks for explaining. The original commenter just said 'hash', so I assumed SHA/MD5.
If a match to an existing image is required, wouldn't this be easily circumvented by violators moving back to laptops and desktop machines? They could still use their phones to create new child pornography, since it wouldn't match anything already in the database. As long as it is transferred off the phone before someone gets caught with it and its hash is entered into the database, they wouldn't be flagged.
> moving back to laptops and desktop machines?
Sure, as long as they never upload those images to somewhere that already does scanning (which is basically every cloud storage provider, I believe.)
> As long as it is transferred off the phone before someone gets caught with it and its hash entered into the database, they wouldn't be flagged.
Yep, that's definitely one flaw in the plan.