• 3abas@lemm.ee · 10 hours ago

    You can run a model locally on your phone and it will answer most prompts without breaking a sweat. It actually uses way less energy than googling and loading the content from a website that's hosted 24/7, just waiting for you to access it.

    Training a model is expensive; using it isn't.
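
    For reference, here's a minimal sketch of what "running a model locally" can look like on CPU-only hardware, assuming llama-cpp-python and a small quantized GGUF model (the file name below is only a placeholder):

        # CPU-only inference sketch using llama-cpp-python (pip install llama-cpp-python).
        # The GGUF path is a placeholder: any small quantized instruct model (~0.5-3B params) works.
        from llama_cpp import Llama

        llm = Llama(
            model_path="./qwen2.5-0.5b-instruct-q4_k_m.gguf",  # placeholder file name
            n_ctx=2048,     # context window
            n_threads=4,    # CPU threads; a phone or GPU-less server handles a model this size
            verbose=False,
        )

        out = llm(
            "Summarise why on-device inference can use little energy.",
            max_tokens=64,
        )
        print(out["choices"][0]["text"])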

    • squaresinger@lemmy.world · 4 hours ago

      Nice claim you have there. Do you have anything to back that up?

      If it’s so easy, it shouldn’t be hard for you to link a model like that.

    • Witziger_Waschbaer@feddit.org · 9 hours ago

      Can you link the model you're talking about? I experimented with running some models on my server, but had a rather tough time without a GPU.

    • bystander@lemmy.ca · 9 hours ago

      I would like to learn a bit more about this; I keep hearing it in conversations here and there. Do you have links to studies or data on this?