Comment by getwiththeprog 18 hours ago

20 replies

This is a great article for discussion. However, articles like this must link to references. It is one thing to assert, another to prove. I do agree that heating/cooling, car and transport use, and diet play massive roles in climate change that should not be subsumed by other debates.

The flip side to the author's argument is that LLMs are not only used by home users doing 20 searches a day. Governments and Mega-Corporations are chewing through GPU hours on god-knows-what. New nuclear and other power facilities are being proposed to power their use; this is not insignificant. Schneider Electric predicts AI will be drawing 93 GW of power by 2028. https://www.powerelectronicsnews.com/schneider-electric-pred...
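
For a sense of scale, here is a rough conversion of that projected draw into annual energy. Year-round full utilization is my assumption, not part of the Schneider Electric figure:

  # Rough scale check: 93 GW of continuous draw expressed as annual energy.
  # Assumption: the hardware runs year-round at full utilization.
  POWER_GW = 93
  HOURS_PER_YEAR = 8760
  annual_twh = POWER_GW * HOURS_PER_YEAR / 1000  # GW * h = GWh; /1000 -> TWh
  print(f"{POWER_GW} GW running year-round is about {annual_twh:.0f} TWh/year")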

simonw 18 hours ago

The question this is addressing concerns personal use. Is it ethical to use ChatGPT on a personal basis? A surprising number of people will say that it isn't because of the energy and water usage of those prompts.

  • strogonoff 17 hours ago

    I would be surprised if many people said it is unethical to use LLMs like ChatGPT for environmental reasons, as opposed to over ethical concerns such as encouraging unfair use of IP and copyright violation.

    Still, LLM queries are not all equal. The environmental justification does not take into account models querying other services, like the famous case where a single ChatGPT query resulted in thousands of HTTP requests.

    • simonw 17 hours ago

      I see people complaining that ChatGPT usage is unethical for environmental reasons all the time. Here's just the first example I found from a Bluesky search (this one focuses on water usage): https://bsky.app/profile/theferocity.bsky.social/post/3lfckq...

      "the famous case where a single ChatGPT query resulted in thousands of HTTP requests"

      Can you provide more information about that? I don't remember hearing about that one - was it a case of someone using ChatGPT to write code and not reviewing the result?

      • jazzyjackson 16 hours ago

        Bluesky is a special place; I am not surprised you will bump into the sort of activists there who will critique every bucket of water that goes into making a hamburger. For those of us who avoid contact with the axe-grinding hordes of short-form media, meeting somebody who spends any portion of their day fretting about the water usage of language models is indeed a rarity.

        • JimDabell 15 hours ago

          It’s not specific to Bluesky. I’ve seen it on Threads, X, Facebook, and Reddit. It’s a talking point for people who hate AI and they tell anybody who will listen.

    • minimaxir 17 hours ago

      > I would be surprised if many people said it is unethical to use LLMs like ChatGPT for environmental reasons, as opposed to ethical principles such as encouraging unfair use of IP and copyright violation.

      Usually they complain about both.

  • fulafel 8 hours ago

    I feel it's great that people have gotten invested in energy use this way, even if it's a bit lopsided. We should use it in a positive way to get public opinion and the political Overton window behind rapid decarbonization and the closure of oil fields.

BeetleB 18 hours ago

> Governments and Mega-Corporations are chewing through GPU hours on god-knows-what.

The "I don't know so it must be huge" argument?

jonas21 18 hours ago

> However articles like this must link to references.

There are links to sources for every piece of data in the article.

  • blharr 17 hours ago

    Where?

    One of the most crucial points is: "Training an AI model emits as much as 200 plane flights from New York to San Francisco"

    This seems to come from this blog https://icecat.com/blog/is-ai-truly-a-sustainable-choice/#:~....

    which refers to this article https://www.technologyreview.com/2019/06/06/239031/training-...

    which is talking about models like *GPT-2, BERT, and ELMo* -- _5+ year old models_ at this point.

    The keystone statement is incredibly vague, and likely misleading. What is "an AI model"? From what I found, this is referring to GPT-2.

    • gloflo 15 hours ago

      Just 200 flights? I would have expected a number at least 100 times that. 200 flights of that range are what, 0.1% of a single day of global air traffic?

      All of that is crazy in terms of environmental destruction, but it makes AI training seem like nothing worth focusing on, to me.
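
      A quick sanity check on that fraction; the worldwide daily flight count is an assumed ballpark, not a figure from the article:

        # Share of one day of global air traffic represented by ~200 flights.
        # ~100,000 commercial flights per day worldwide is an assumed ballpark.
        GLOBAL_FLIGHTS_PER_DAY = 100_000
        EQUIVALENT_FLIGHTS = 200
        share = EQUIVALENT_FLIGHTS / GLOBAL_FLIGHTS_PER_DAY
        print(f"About {share:.1%} of one day of flights")  # roughly 0.2%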

    • mmoskal 16 hours ago

      I assume this comes from the 60 GWh figure, which does translate to about 200 flights (assuming the energy density of gasoline; in actual CO2 emissions it was probably less, since the training likely ran on cleaner energy than planes burn).
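
      A back-of-envelope version of that conversion; the per-flight fuel burn and the fuel energy density below are assumed figures, not numbers from the thread:

        # How many JFK->SFO flights equal ~60 GWh of training energy?
        # Assumed: ~25 t of fuel per transcontinental flight, ~43 MJ/kg fuel energy.
        FUEL_PER_FLIGHT_KG = 25_000
        ENERGY_DENSITY_MJ_PER_KG = 43
        TRAINING_ENERGY_GWH = 60
        flight_energy_gwh = FUEL_PER_FLIGHT_KG * ENERGY_DENSITY_MJ_PER_KG / 3.6e6  # 1 GWh = 3.6e6 MJ
        print(f"Energy per flight: {flight_energy_gwh:.2f} GWh")
        print(f"Equivalent flights: {TRAINING_ENERGY_GWH / flight_energy_gwh:.0f}")  # ~200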

    • moozilla 15 hours ago

      The link the article uses to source the 60 GWh claim (1) appears to be broken, but all of the other sources I found give similar numbers; for example, (2) gives 50 GWh. That is specifically for training GPT-4; GPT-3 was estimated in (3) to have taken 1,287 MWh, so the 50 GWh number seems reasonable.

      I couldn't find any great sources for the 200 plane flights number (and, as you point out, the article doesn't source this either), but I asked o1 to crunch the numbers (4) and it came up with a similar figure (50-300 flights depending on the size of the plane). I was curious whether the numbers would be different if you considered emissions instead of directly converting jet fuel energy to watt hours, but the end result was basically the same; a rough sketch of that emissions cross-check follows the links below.

      [1] https://www.numenta.com/blog/2023/08/10/ai-is-harming-our-pl...

      [2] https://www.ri.se/en/news/blog/generative-ai-does-not-run-on...

      [3] https://knowledge.wharton.upenn.edu/article/the-hidden-cost-...

      [4] https://chatgpt.com/share/678b6178-d0e4-800d-a12b-c319e324d2...
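
      A minimal sketch of that emissions-based cross-check; the fuel burn per flight, the CO2 factor for jet fuel, and the grid carbon intensity are assumed figures, not numbers from the linked sources:

        # Emissions-based cross-check of the "flights per training run" figure.
        # Assumed: ~25 t fuel per JFK->SFO flight, 3.16 kg CO2 per kg of fuel burned,
        # and ~0.4 kg CO2 per kWh for the electricity used in training.
        FUEL_PER_FLIGHT_KG = 25_000
        CO2_PER_KG_FUEL = 3.16
        GRID_KG_CO2_PER_KWH = 0.4
        TRAINING_ENERGY_KWH = 50e6  # 50 GWh
        flight_co2_t = FUEL_PER_FLIGHT_KG * CO2_PER_KG_FUEL / 1000
        training_co2_t = TRAINING_ENERGY_KWH * GRID_KG_CO2_PER_KWH / 1000
        print(f"Per flight: {flight_co2_t:.0f} t CO2; training run: {training_co2_t:.0f} t CO2")
        print(f"Equivalent flights by CO2: {training_co2_t / flight_co2_t:.0f}")  # same ballpark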

    • KTibow 17 hours ago

      If I understand TFA correctly, that's a claim it's covering and arguing against, not arguing for.

      • blharr 17 hours ago

        By TFA do you mean the author of the article? It seems to be using an outdated [and incorrect] claim (as far as I know, there's no report of GPT-4 taking 200 flights' worth of energy to train), and arguing against it by saying those numbers are especially small, when they are potentially significantly larger.
