• alphabethunter@lemmy.world · 6 days ago

    I find it unlikely that someone could be on Lemmy and not be aware of the basics of how “AI” actually works. But if you don’t, here’s the rundown: all of these AI apps you use are just an interface for sending a request to an “AI” somewhere else. The “AI” is not running on your computer. It’s like messaging someone “hey, do this for me,” letting them do the work, and then claiming no energy was spent because you don’t feel tired.
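A minimal sketch of what such an app does on your machine. The payload shape follows the common OpenAI-style chat API; the model name and prompt are illustrative assumptions, and no network call is made here:

```python
import json

def build_chat_request(prompt: str) -> dict:
    """Build the JSON payload a typical chat app POSTs to a provider's API.
    The field names mirror the OpenAI-style chat endpoint; the model name
    is a placeholder."""
    return {
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("hey, do this for me")
print(json.dumps(payload, indent=2))
# Your machine only serializes a few hundred bytes of JSON and waits;
# the provider's servers do all of the actual computation.
```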

    You can use a 15-year-old PC or a top-of-the-line gaming rig, go to the ChatGPT website, and request something, and the result will be the same, because it’s not your machine doing the work.

    Now, you can indeed run AI locally on your machine, and if you try, you’ll quickly see that you need beefy hardware, that your power use spikes dramatically, and that the results still come back far slower than from an app or website. Which makes it obvious that the providers are running stronger (and more) hardware than you are, and therefore using far more energy than you are.
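A back-of-envelope sketch of that energy comparison. Every wattage and duration below is an illustrative assumption, not a measurement:

```python
def inference_energy_wh(power_watts: float, seconds: float) -> float:
    """Energy for one request: watts × seconds, converted to watt-hours."""
    return power_watts * seconds / 3600

# Hypothetical local run: a 350 W gaming GPU grinding for 120 s on one answer.
local_wh = inference_energy_wh(350, 120)

# Hypothetical hosted run: a multi-GPU server drawing far more power,
# even though it finishes each request much faster.
remote_wh = inference_energy_wh(10_000, 5)

print(f"local:  {local_wh:.2f} Wh per request")
print(f"remote: {remote_wh:.2f} Wh per request")
```

The exact figures don’t matter; the point is that energy is power × time, and on the provider’s side the power term is enormous.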

    • Clay_pidgin@sh.itjust.works · 6 days ago

      I find it unlikely that someone could be on Lemmy and not be aware of the basics of how “AI” actually works.

      That feels unnecessarily aggressive?

      I don’t really use it because I don’t like it, so I haven’t bothered to read much about it. I tried some local Stable Diffusion image generation and it didn’t seem to strain my gaming computer much, but I don’t have a power-measuring device to check.

      I tried stuff like Claude, which I know runs remotely, and I assumed Gemini is remote too, since Google wouldn’t pass up a chance to exfiltrate more of my data.