Importantly, this took deepfake undressing from a tiny niche to a huge thing:

This means it’s no longer a niche or exceptional thing; harassment of women with this method is now pervasive.
I’m pretty sure if anyone else created a website that generated CSAM they’d be in jail by now. Just because it’s Elon Musk doing it, authorities are fine with it? At one time law enforcement would at least make an effort to put up a facade of a justice system that treats everyone equally. That’s how you keep the masses from rising up and dethroning the status quo. Anyone remember Martha Stewart going to jail for insider trading?
Aren’t there Apple App Store and Google Play Store policies that say this isn’t allowed? How come the app is still available on those mobile platforms? What are EU regulators doing? No fines? Nothing?
We’ve basically reached a point where billionaires are publicly mocking and daring the rest of us to react. Do these accelerationist billionaires really think they’ll come out ahead when the masses burn everything to the ground?
Yes they do. They have been threatening us with their murder robots for a decade. Who do you think they are going to use those on?
They view most everyone as takers.
I actually refuse to believe you can simply undress people with Grok; more likely it’s easily jailbroken, which conveniently makes this not the company’s problem, since the service was essentially “cracked” and used outside its Terms of Service. Still, this IS a problem.
The real problem here is that Xitter isn’t supposed to be a porn site (even though it’s hosted loads of porn since before Musk bought it). They basically deeply integrated a porn generator into their very publicly-accessible “short text posts” website. Anyone can ask it to generate porn inside of any post and it’ll happily do so.
It’s like showing up at Walmart and seeing everyone naked (and many fucking), all over the store. That’s not why you’re there (though: Why TF are you still using that shithole of a site‽).
The solution is simple: Everyone everywhere needs to classify Xitter as a porn site. It’ll get blocked by businesses and schools and the world will be a better place.
I wonder if just another rename, X → XXX, would do the trick. It fits, doesn’t it?
The solution is simple: Everyone everywhere needs to classify Xitter as a porn site
I think a large part of its popularity has become the porn, because it passes all those filters. Especially since Musk-backed conservatives are blocking porn in red states, but as far as I know, never Twitter.
Treat it like a porn site and lots of Republicans would need to hand over their ID to prove they’re old enough. They can’t VPN around it because social media hates VPNs.
I bet that many are simply ignorant of this new problem
It’s like showing up at Walmart and seeing everyone naked (and many fucking), all over the store.
🤢🤮
I thought the real problem is that it is generating *illegal* porn.
Well, the CSAM stuff is unforgivable, but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I’m sure they’re working on it (it’s actually a hard computer science problem, because the tool is supposed to generate what the user asks for, and there will always be an infinite number of ways to trick it, since LLMs aren’t actually intelligent).
Porn itself is not illegal.
He has 100% control over the ability to alter or pull this product. If he’s leaving it up while it’s generating illegal pornography, that is on him.
And no s*** I’m concerned about the illegal stuff.
The real problem is that we ever gave a shit about human bodies, especially fake ones.
I don’t know how to tell you this but… Every body gives a shit. We’re born shitters.
(though: Why TF are you still using that shithole of a site‽).
Maybe some places don’t have any supplier other than Walmart? Similarly, some places have governments that still use only the porno social network for some services.
Why the &#**### is California putting Amber Alerts on a porn site?
I don’t know, man… Have you even seen Amber? It might be worth an alert 🤷
Bastard? Idk what other swear has that many letters.
But that doesn’t leave many options for the victims. Maddie, who said she’s a 23-year-old pre-med student, woke up on New Year’s Day to an image that horrified her. On X, she had previously published a picture of herself with her boyfriend at a local bar, which two strangers altered using Grok.
I’ve thought a lot of things would kill twitter…
But if every time a woman posts a picture of herself, and neckbeards reply asking twitter AI to sexualize her, and the AI responds right there with it where everyone following the original account can see…
I truly don’t understand how or why any women are still using it.
I truly don’t understand how or why any women are still using it.
FOMO, along with addiction to fake likes from fake friends for fake validation of their fake lives is a powerful mind control technique
how many years have people been saying “DELETE TWITTER ALREADY”
and now that they’re being turned into porn, they still won’t
Nobody with a face should use it anymore…but that will reduce traffic like…5%…
How is this gigantic website even legal and still online? In the civilized world, I mean.
It all starts with a little bill called the Telecommunications Act of 1996…
And yeah, loads of people said it would lead to this shit.
But Silicon Valley gave a shit ton of money to the Clintons for the Internet part, and the telecoms did for the part doing away with monopoly regulations.
Ban twitter 🇧🇷🤝 🇪🇺
I’d just rather it died and everyone stopped using it tbh
I misread the title and thought it meant thousands of undressed Musk images per hour.
The horror!
Politicians are already hellbent on “age”-verifying social media, but Elon seems to believe there’s a lack of urgency in this regard… Please regulate social media harder, daddy! Please, we’ve had the resources to comply with these perverse regulations for a while now. I didn’t hijack this platform just for the lefties to be able to speak their minds on alternative platforms…
Scrolled “@grok undress” and “bikini” for a bit; most of it is girls jumping on the trend asking it to change their own photos, plus humor.