Importantly, this took deepfake undressing from a tiny niche practice to a widespread one:

This means it is no longer a niche or exceptional phenomenon: harassment of women by this method is now pervasive.

  • fuzzywombat@lemmy.world
    2 days ago

I’m pretty sure that if anyone else created a website that generated CSAM, they’d be in jail by now. Just because it’s Elon Musk doing it, the authorities are fine with it? At one time, law enforcement would at least make an effort to put up a facade of a justice system that treats everyone equally; that’s how they kept the masses from rising up and dethroning the status quo. Anyone remember Martha Stewart going to jail over insider trading?

Aren’t there Apple App Store and Google Play Store policies saying this isn’t allowed? How is the app still available on those mobile platforms? And what are EU regulators doing? No fines? Nothing?

We’ve basically reached a point where billionaires are publicly mocking the rest of us and daring us to react. Do these accelerationist billionaires really think they’ll come out ahead when the masses burn everything to the ground?

    • architect@thelemmy.club
      19 hours ago

Yes, they do. They’ve been threatening us with their murder robots for a decade. Who do you think they’re going to use those on?

      They view most everyone as takers.

    • REDACTED@infosec.pub
      23 hours ago

I actually refuse to believe you can simply undress people with Grok. More likely it’s easily jailbroken, which conveniently makes this not the company’s problem, since the service was essentially “cracked” and used outside its Terms of Service. Still, this IS a problem.