Apple CSAM hash collisions
2021-08-18 22:55:58.783442+02 by Dan Lyke 1 comments
Apple announced that they're going to root through your iPhone photos looking for "child sexual abuse material" (CSAM). Everyone with any knowledge at all about how this shit goes down rolled their eyes and said "oh hell no". Apple said "no, really, this will use hashes of the images for identification, it's not like we're actually looking through your photos!"
Everyone with any knowledge at all about how this shit goes down rolled their eyes and said "that's even worse."
Sure enough, there are now working hash collisions against Apple's NeuralHash.
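For the curious: this kind of matching relies on a perceptual hash, which deliberately maps lots of similar-looking images to the same short value, so collisions between unrelated images are baked into the approach. Here's a toy sketch of why (this is not Apple's NeuralHash, just a minimal "average hash" in Python to show the idea):

```python
# Toy illustration of why perceptual-hash matching admits collisions.
# NOT Apple's NeuralHash -- just a minimal "average hash" showing that
# many different images map to the same short hash value.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255):
    each bit records whether a pixel is above the image's mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two clearly different images (different brightness levels)...
image_a = [200] * 32 + [10] * 32   # bright top half, dark bottom half
image_b = [255] * 32 + [90] * 32   # brighter overall, same rough pattern

# ...that nevertheless produce the identical 64-bit hash.
assert average_hash(image_a) == average_hash(image_b)
print(f"collision: {average_hash(image_a):016x}")
```

Real perceptual hashes (NeuralHash included) are far more sophisticated, but the same property holds: because they're designed to tolerate resizing and recompression, you can also deliberately craft an unrelated image that hashes to a target value, which is exactly what the published collisions do.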