• @[email protected]
    8 months ago

    You need that amount of power to provide that service for hundreds of millions of people simultaneously, like ChatGPT. Do you seriously think it takes that amount of equipment and power to output to a single device?

    I linked you to one that runs locally on a phone, dude. Here’s a whole list of pre-trained LLMs you can run on an average computer. 🤷‍♀️
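    For scale, the single-device claim can be ballparked with napkin math (the parameter counts and bits-per-weight below are assumptions typical of quantized open models, not figures from this thread):

```python
# Napkin math: weight memory for a quantized LLM running on one device.
# Assumption: ~4.5 bits per parameter (4-bit quantization plus overhead).

def model_size_gb(n_params_billions: float, bits_per_param: float = 4.5) -> float:
    """Approximate RAM needed just to hold the model weights, in GB."""
    return n_params_billions * 1e9 * bits_per_param / 8 / 1e9

# A 3B-parameter model: ~1.7 GB, well within a modern phone's RAM.
print(f"3B model: {model_size_gb(3):.2f} GB")
# A 7B-parameter model: ~3.9 GB, fine for an average laptop.
print(f"7B model: {model_size_gb(7):.2f} GB")
```

    Nothing like a data center's worth of hardware is needed just to serve one user; the big clusters exist to serve millions of users concurrently.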

    • @[email protected]
      8 months ago

      your tiny PC- and phone-based LLMs are going to be fuck-all useful after the apoc. Oh yeah, “ClimateBert’s Hugging Face” sounds like just the thing to help you survive.

      the only significant advantage an LLM is going to offer is the illusion of company, and the only way you’ll get that is from a giant data center.

      you’d be better off having wikipedia summarized by a chatbot, but again, it’s gonna require grunt and storage.
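      The storage half of that claim can be ballparked (the sizes below are rough assumptions, not measured dump or model figures):

```python
# Ballpark storage budget for an offline kit: compressed English Wikipedia
# text plus a small quantized LLM. All sizes are rough assumptions.

WIKIPEDIA_TEXT_GB = 25.0  # assumed: order of magnitude for a text-only dump
SMALL_LLM_GB = 4.0        # assumed: ~7B model, 4-bit quantized
SD_CARD_GB = 128.0        # a cheap commodity microSD card

total_gb = WIKIPEDIA_TEXT_GB + SMALL_LLM_GB
print(f"total: {total_gb} GB; fits on {SD_CARD_GB:.0f} GB card: {total_gb <= SD_CARD_GB}")
```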

      just because something can be stripped down to run on any device doesn’t make it useful.

      • @[email protected]
        8 months ago

        So you do unironically think it takes that amount of equipment and power to output to a single device lmao

        • @[email protected]
          8 months ago

          I can’t tell if you’re fucking dense or can’t read.

          AN LLM RUN ON A PHONE WILL DO YOU FUCK-ALL GOOD.

          you ignorantly think you can run an AI worth a damn on your phone - and the corpus to teach it?

          fuck off, you stupid git. good luck with your HAL 9000. You’re gonna walk through the apocalypse with a moron. Which fits; you’ll be equals.