cross-posted from: https://sh.itjust.works/post/1062067
In a similar case, the US National Eating Disorder Association laid off its entire helpline staff. Soon after, its replacement chatbot was disabled for giving out harmful information.
Because they can’t, or aren’t willing to, investigate what happened at this particular company or to its staff. The thrust of the story is therefore about what’s happening on Twitter (“getting absolutely roasted”), because people connect with action.
A better story could recount the events up to now. Maybe something like this?
Finding this information and weaving it into a story that makes people go “And then what happened?!” is difficult and takes time. It’s hard to justify when you can get clicks from shit like this article.