This seems like a waste of time to me when you could instead focus on coal or things that matter

    • LaLuzDelSol@lemmy.world · 15 days ago

      Local AI probably emits more CO2 per prompt than a datacenter, unless you’re running off your own solar panels or something

      • Honytawk@feddit.nl · 14 days ago

        You’ve really bought into the AI phobia.

        Even 1000 non-local AI prompts use about as much energy as running your microwave for an hour.

        Datacenters running AI are crunching billions of prompts a second.

          • Honytawk@lemmy.zip · 14 days ago

            A single prompt is about 1 - 1.5 W/h.

            A microwave is 1000 - 1500 W/h.

            It doesn’t matter how long you run either one. Most prompts only take seconds.
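            Rough back-of-envelope in Python, treating those figures as watt-hours per prompt (both numbers are loose assumptions picked from the middle of the ranges above; published per-prompt estimates vary widely):

            # Back-of-envelope: energy for 1,000 prompts vs one microwave run.
            # Both figures below are assumptions, not measurements.
            WH_PER_PROMPT = 1.25      # watt-hours per prompt (assumed ballpark)
            MICROWAVE_WATTS = 1200    # microwave power draw in watts (assumed)

            prompts_wh = 1000 * WH_PER_PROMPT                      # energy for 1,000 prompts
            microwave_minutes = prompts_wh / MICROWAVE_WATTS * 60  # equivalent microwave runtime

            print(f"1,000 prompts ~= {prompts_wh:.0f} Wh")
            print(f"~= running the microwave for {microwave_minutes:.0f} minutes")
            # -> 1,000 prompts ~= 1250 Wh, roughly an hour of microwave time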

            • LaLuzDelSol@lemmy.world · 13 days ago

              Watts per hour is not a unit of energy or power; do you mean watt-hours? Neither of those numbers seems right if so. The amount of energy consumed by a prompt also varies wildly based on the size of the model, your hardware, what your prompt is, etc.

              My point is that, with two identical prompts on two identical models, one run in a specialized datacenter and one run at home, the one at home will probably use more power because it’s less efficient. Therefore, if we’re concerned with how much power AI datacenters are using, switching from datacenters to home computing is clearly not a solution.
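              To make the efficiency point concrete, here’s a toy sketch (every number below is a made-up assumption, purely to illustrate how batching changes per-prompt energy):

              # Illustrative only: same prompt at home vs in a datacenter.
              # All figures are assumptions for the sake of the argument, not measurements.
              HOME_GPU_WATTS = 350           # consumer GPU under load (assumed)
              HOME_SECONDS_PER_PROMPT = 20   # serving one request at a time (assumed)

              DC_GPU_WATTS = 700             # datacenter accelerator under load (assumed)
              DC_PROMPTS_PER_BATCH = 32      # batching amortizes power across requests (assumed)
              DC_SECONDS_PER_BATCH = 10      # time to serve one batch (assumed)

              home_wh = HOME_GPU_WATTS * HOME_SECONDS_PER_PROMPT / 3600
              dc_wh = DC_GPU_WATTS * DC_SECONDS_PER_BATCH / 3600 / DC_PROMPTS_PER_BATCH

              print(f"home: {home_wh:.2f} Wh/prompt, datacenter: {dc_wh:.2f} Wh/prompt")
              # -> home: 1.94 Wh/prompt, datacenter: 0.06 Wh/prompt under these assumptions

              The exact numbers don’t matter; the point is that batching and high utilization push a datacenter’s per-prompt energy well below a home setup’s.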