return2ozma@lemmy.world to Technology@lemmy.world · 2 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
FreedomAdvocate@lemmy.net.au · 2 days ago
You don’t even need an LLM, just an internet-connected browser.
CodenameDarlen@lemmy.world · 1 day ago
You don’t need a browser, just use cURL.
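For example, a minimal sketch of fetching a page without any browser (the URL here is just a placeholder):

```sh
# Fetch a page over HTTPS from the command line;
# -s silences the progress meter, -L follows redirects,
# -o writes the response body to a local file.
curl -sL "https://example.com/some-article" -o article.html
```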
Echo Dot@feddit.uk · 2 days ago
Or literally just buy some fertiliser. We’ve all seen what happens when ammonium nitrate catches fire; if you have enough of it in one place, it’s practically a nuclear-bomb-level detonation.
MeThisGuy@feddit.nl · 13 hours ago
like this guy? https://wikipedia.org/wiki/Oklahoma_City_bombing