
  • Nobody is bitching about photoshopping, a thing that has existed for decades and that almost anybody can use to put some person’s face on a naked body or whatever situation they want. Suddenly, journalists are inventing a new moral panic around LLMs, saying people can do whatever they want with pictures, despite the fact that this capability already existed; it’s just a little bit easier now. It’s not a new problem, so reporting on it as one just shifts the blame onto a new boogeyman.

    See, the magic formula is to slap the word “AI” on a headline and boom, instant attention! It doesn’t matter what it’s about, whether it’s actually a new problem, or whether it’s only tangentially related to the root cause… As long as you’re talking shit about every angle of AI in the most extreme way possible, mission accomplished. It is outrage reporting, because there are no solutions offered and no historical context. The sole purpose is the outrage, because outrage generates clicks. It’s too hard for journalists to think outside the outrage box.




  • LLM liability is not exactly cut-and-dry, either. It doesn’t really matter how many rules you put on an LLM not to do something; people will find a way to make it do the thing it said it wouldn’t. For fuck’s sake, have we really forgotten the lessons of Asimov’s I, Robot short stories? Almost every one of them was about how the “unbreakable” three laws were in fact very breakable, because absolute laws don’t make sense in every context. (While I hate using AI fiction for LLM comparisons, this one fits.)

    Ultimately, the responsibility lies with the person who told it to do the thing and got the thing they asked for. LLMs are a tool, nothing more. If somebody buys a hammer and misuses it by bashing somebody’s brains in, we arrest the person who committed the murder. If there’s a security hole on a website that a hacker used to steal data, then depending on how negligent the company was, it carries some liability for not adequately protecting that data. But the hacker 100% broke the law and would be convicted if caught.

    Regardless of all of that, LLMs aren’t fucking sentient and these dumbass journalists need to stop personifying them.




  • Download all existing literature to build a library for preservation and you’re called a pirate.

    Said library contains petabytes of the exact text of each and every piece of literature.

    Download all existing literature from the aforementioned library to train an LLM and you’re a tech innovator.

    Said model contains gigabytes of weights that can never be turned back into the exact words of the books (rough numbers below).

    What a strange world we live in.
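
    A rough back-of-the-envelope on that size gap. The 10 PB and 100 GB figures are round numbers I’m assuming purely for illustration, not stats from anywhere:

```python
# Can gigabytes of weights store petabytes of text verbatim?
# (10 PB and 100 GB are assumed round numbers, not real measurements.)
corpus_bytes = 10 * 1024**5   # ~10 PB of raw text in the hypothetical library
model_bytes = 100 * 1024**3   # ~100 GB of model weights

ratio = corpus_bytes / model_bytes
print(f"Implied compression ratio: {ratio:,.0f}:1")  # ~104,858:1

# Good *lossless* text compressors top out around 10:1, so at ~100,000:1
# the weights mathematically cannot be storing the corpus word-for-word.
```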

    It’s not strange at all. It’s degrees of compression. Compress a JPEG to the point that it’s unrecognizable and it’s no longer breaking copyright. An LLM is essentially trying to rewrite a book it just read from memory.
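
    A minimal sketch of those degrees of compression, assuming Pillow is installed and a local file named photo.png exists (both are my assumptions, just for illustration):

```python
# Save the same image losslessly (PNG) and at an aggressively lossy JPEG
# quality, then compare sizes. Requires Pillow (pip install Pillow).
from io import BytesIO

from PIL import Image

img = Image.open("photo.png").convert("RGB")

lossless = BytesIO()
img.save(lossless, format="PNG")           # lossless baseline

lossy = BytesIO()
img.save(lossy, format="JPEG", quality=5)  # throw away most of the detail

print(f"PNG  (lossless): {lossless.tell():,} bytes")
print(f"JPEG (q=5):      {lossy.tell():,} bytes")

# The JPEG still decodes to an image, but the detail it discarded is gone
# for good: nothing can reconstruct the original pixels from the lossy copy.
```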