• 5 Posts
  • 83 Comments
Joined 3 years ago
Cake day: June 1st, 2023

  • It's an interesting perspective, except… that's not how AI works (even if it's advertised that way). Even the latest approach in ChatGPT is not perfect memory. It's a glorified search feature. When you type a prompt, the system can choose to search your older chats for related information and pull it into context… what makes that information related is the big question here - it uses an embedding model to index and compare your chats. You can imagine it as a fuzzy paragraph search - not exact paragraphs, but paragraphs that roughly talk about the same topic (there's a rough sketch of that retrieval step at the end of this comment)…

    It's not guaranteed that mentioning you don't like sushi in one chat means a later chat about restaurants will pull in the sushi chat. And even if it does pull that in, the model may choose to ignore it. And even if the model doesn't ignore it, you can choose to ignore it. Of course, the article talks about healing, so I imagine instead of sushi we're talking about some trauma… OK, so you can choose not to reveal details of your trauma to AI (that's an overall good idea right now anyway). Or you can choose to delete the chat - it won't index deleted chats.

    At the same time, there are just about as many benefits to the model remembering something you didn't. You can imagine a scenario where you mentioned your friend being mean to you and later they are manipulating you again. Maybe having the model remind you of that last bad encounter is good here? Just remember - AI is a machine, and you control both its inputs and what you do with its outputs.
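
    For anyone curious what that "fuzzy paragraph search" looks like, here's a minimal sketch. It is not OpenAI's actual system; `embed()` here is a toy bag-of-words stand-in for a real embedding model, just to show the shape of the retrieval step:

    ```python
    # Toy sketch of embedding-based retrieval over past chats.
    # A real system would use a neural embedding model; this bag-of-words
    # version only shows the index-then-rank-by-similarity shape.
    import math
    from collections import Counter

    def embed(text: str) -> Counter:
        # Stand-in "embedding": word counts instead of a dense vector.
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[w] * b[w] for w in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    # Index: every past chat snippet gets an embedding.
    past_chats = [
        "I really don't like sushi, raw fish isn't for me",
        "planning a hiking trip next month",
        "my friend was really mean to me at dinner",
    ]
    index = [(snippet, embed(snippet)) for snippet in past_chats]

    # New prompt: rank past snippets by similarity and pull the best into context.
    prompt = "where should we go for dinner, maybe a sushi restaurant?"
    query = embed(prompt)
    for snippet, vec in sorted(index, key=lambda p: cosine(query, p[1]), reverse=True):
        print(f"{cosine(query, vec):.2f}  {snippet}")
    ```

    Note how lossy this is: the sushi chat only surfaces because the new prompt happens to share overlapping terms, which is exactly why retrieval is not a guarantee.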

  • The Test part of TDD isn't meant to encompass your whole need before developing the application. It works function by function. It also forces you not to write giant functions. Let's say you're making a compiler. First you need to parse text. Idk what the language structure is yet, but first we need to tokenize our stream. You write a test that feeds "hello world" into your tokenizer and expects two tokens back. Then you start implementing your tokenizer. Repeat for the parser. Then you realize you need to tokenize numbers too, so you go back and add a tokenizer test for numbers.

    So you don't need to write all the tests ahead of time. You just expand one small test at a time (rough sketch below).
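
    A minimal sketch of that loop, assuming a Python tokenizer and unittest (the names like `tokenize()` are made up for illustration; the point is the order - smallest failing test first, then just enough code to pass it):

    ```python
    # TDD loop sketch: test first, then the simplest implementation that passes.
    import unittest

    def tokenize(source: str) -> list[str]:
        # First pass: just split on whitespace. Written only after the first
        # test below existed and failed.
        return source.split()

    class TokenizerTests(unittest.TestCase):
        def test_two_words_produce_two_tokens(self):
            # The very first test: "hello world" in, two tokens out.
            self.assertEqual(tokenize("hello world"), ["hello", "world"])

        def test_numbers_are_tokenized_too(self):
            # Added later, once we realized numbers matter. If it fails, we go
            # back and extend tokenize() just enough to make it pass.
            self.assertEqual(tokenize("print 42"), ["print", "42"])

    if __name__ == "__main__":
        unittest.main()
    ```

    Each new requirement (numbers, strings, operators…) becomes one more small test before any tokenizer code changes.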