Pro@programming.dev to Technology@lemmy.world (English) · 1 day ago

**Google will use hashes to find and remove nonconsensual intimate imagery from Search** (blog.google)

Cross-posted to: technology@beehaw.org, technology@lemmy.ml
Lorem Ipsum dolor sit amet@lemmy.world · 4 hours ago

There was a GitHub thread about this when it came up for CSAM; people managed to circumvent it easily. I'm rather confident this will end up similarly.
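For context on what "using hashes" means here: Google's post does not say which hashing scheme it uses, but systems like this typically rely on *perceptual* hashes rather than cryptographic ones, so that near-duplicates of a known image still match. A minimal sketch, assuming a toy difference hash (dHash) over a small grayscale grid — real schemes (e.g. PhotoDNA, PDQ) are far more robust, but the principle is the same:

```python
# Toy "difference hash" (dHash): each bit records whether a pixel is
# brighter than its right-hand neighbour. Small edits flip few bits,
# so near-duplicates stay within a Hamming-distance threshold.
# This is an illustrative sketch, NOT Google's actual scheme.

def dhash(pixels, width=9, height=8):
    """pixels: row-major grayscale values for a width x height grid.
    Returns a (width-1)*height-bit integer hash."""
    bits = 0
    for row in range(height):
        for col in range(width - 1):
            left = pixels[row * width + col]
            right = pixels[row * width + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# "Original image": a simple synthetic gradient grid.
orig = [(r * 9 + c) % 256 for r in range(8) for c in range(9)]

# Slightly perturbed copy (e.g. recompression noise on a few pixels).
tweaked = list(orig)
tweaked[10] += 3
tweaked[40] -= 2

dist = hamming(dhash(orig), dhash(tweaked))
print(dist)  # only a couple of the 64 bits differ -> still a match
```

This also illustrates the commenter's point: a hash like this only matches *known* images and their near-duplicates, so deliberate transformations (cropping, mirroring, heavier distortion) can push the distance past the matching threshold, which is the kind of circumvention reportedly seen in the CSAM case.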