content warning: besides the discussion of CSAM, the article contains an example of a Grok-generated image of a child in a bikini. at least it was consensually generated, by the subject of the photo, I guess?
Samantha Smith, a survivor of childhood sexual abuse, tested whether Grok would alter a childhood photo of her. It did. “I thought ‘surely this can’t be real,’” she wrote on X. “So I tested it with a photo from my First Holy Communion. It’s real. And it’s fucking sick.”

@spit_evil_olive_tips
> Yes. Thank you. Exactly.
> #Grok is not a person. It’s a plug-in.
> It has no awareness, intelligence, values, ethics, or even standards.
> Grok is an appliance. Like a toaster. It is not sentient.
> Blaming Grok for ‘making porn’ is like blaming a browser for ‘showing porn’.
> Some person used Grok to make porn.

More importantly, xAI is hosting and creating a tool which can easily create this content.