Summary

AI is increasingly exploited by criminals for fraud, cyberattacks, and child abuse, warns Alex Murray, the UK’s national police lead for AI.

Deepfake scams, such as impersonating executives for financial heists, and generative AI used to create child abuse images or “nudify” photos for sextortion are rising concerns.

Terrorists may exploit AI for propaganda and radicalization via chatbots.

Murray urged urgent action as AI becomes more accessible, realistic, and widely used, predicting significant crime growth by 2029.

  • ERROR: Earth.exe has crashed@lemmy.dbzer0.com · 1 month ago

    I actually see a positive in AI.

    Ever got nudes leaked? Well, now you can just claim they’re deepfakes. It’s such a relief that you can dismiss any photo or video of you doing embarrassing things as a deepfake.

    • Cris@lemmy.world · 1 month ago

      I’m gonna be really honest: I think a big part of what feels violating about people seeing your nudes in the first place is being sexualized without your consent, and losing agency over who you allow yourself to be sexualized by.

      That’s no different with deepfakes. I don’t think that would actually make the person feel much better. Maybe they can save face over having taken nudes depicting themselves in whatever way, but the thing that I think does the most emotional damage isn’t actually changed by saying “it’s not really me, those are deepfakes” :(

    • Gsus4@mander.xyz · 1 month ago

      Your partner and family members may know particular hidden features of your body and be able to tell whether it’s a deepfake, so it can still be damaging in private even if it isn’t in public.