They’re glorified autocompletes. Way too much attention is being given to LLMs in isolation. By themselves, they’re not a silver bullet.
But when you call them in a chain . . . now that raises some eyebrows.
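Rough sketch of what I mean by “called in a chain” (pure plumbing, no real API here, call_llm is just a made-up placeholder for whatever model you’re actually hitting):

```python
def call_llm(prompt: str) -> str:
    # Placeholder: pretend this sends the prompt to a real model and
    # returns its text response.
    return f"<model response to: {prompt!r}>"

def run_chain(task: str, steps: list[str]) -> str:
    # Each step's output becomes the next step's input. That feedback loop,
    # not any single call, is where things start getting interesting.
    result = task
    for instruction in steps:
        result = call_llm(f"{instruction}\n\nInput:\n{result}")
    return result

print(run_chain(
    "Summarize the new data-retention policy.",
    ["Extract the key obligations",
     "Draft a one-paragraph summary",
     "Check the summary for errors"],
))
```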
Because China . . . *spooky noises*
Ah, but that’s the thing. Training isn’t copying; it’s pattern recognition. If you train a model on “The dog says woof” and then ask the model “What does the dog say?”, it’s not guaranteed to answer “woof”.
Similarly, just because a model was trained on Harry Potter, all that means is it has a good statistical picture of how the sentences in that book tend to go.
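Here’s a toy version of what I mean (a tiny bigram counter, nowhere near a real LLM, sentences made up): it only stores which words tend to follow which, so asking it to continue a prompt gives a weighted guess, not a guaranteed replay of any one training sentence.

```python
import random
from collections import defaultdict

# "Training data": the model only ever sees these strings.
corpus = [
    "the dog says woof",
    "the dog says hello",
    "the cat says meow",
]

# "Training": count which word follows which. No sentence is stored whole.
follows = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev].append(nxt)

def continue_text(prompt: str, length: int = 3) -> str:
    # "Inference": sample a likely next word, don't copy a fixed answer.
    words = prompt.split()
    for _ in range(length):
        options = follows.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

print(continue_text("the dog says"))  # might be "woof", might be "hello"
```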
Thus the distinction. Can I train on a comment section discussing the book?
We have to distinguish between LLMs.
They are not one and the same.
Quit looking at my screen, SNOOGGUMS!
ooooo don’t get me started!
I hope they’re all finally free now
Bethesda seems to have lost sight of what Elder Scrolls is about.