They definitely prefer to spend their money on development, rather than adding safeguards
I don’t believe people misusing ChatGPT helps them in any way; it’s just that adding protections has a cost
but they aren’t able to tell when a child is at risk and report that as well?
Maybe the police actually sort and filter reports manually, but don’t want to bother with mental health issues? You know how the USA works; I don’t believe OpenAI will go too far, they’ll just report at random.
Might even be reported for all I know; sometimes I just like to see the reaction of LLMs when I say I’ll commit horrible stuff like school shootings or terrorism. The NSA will just feed it into their mass-surveillance algorithm to flag the highest-priority profiles, and that’ll be that.
The war on drugs is so much more important than mental health detection, y’know. It sells more.