Comment by jonas21
> However articles like this must link to references.
There are links to sources for every piece of data in the article.
I'm the author. The 200 flights number is taken from posts made by environmentalists and seems to match the public numbers given (about 50 GWh): https://www.forbes.com/sites/arielcohen/2024/05/23/ai-is-pus....
If you think the numbers I used are wildly off I'd really appreciate any source saying so and I'll update the post with the correct amounts.
Just 200 flights? I would have expected a number at least 100 times that. 200 flights of that range are what, 0.1% of a single day of global air traffic?
All of that is crazy in terms of environmental destruction, but it makes AI training seem like nothing worth focusing on to me.
The link the article uses to source the 60 GWh claim (1) appears to be broken, but all of the other sources I found give similar numbers; for example, (2) gives 50 GWh. That figure is specifically for training GPT-4. GPT-3 was estimated in (3) to have taken 1,287 MWh, so the 50 GWh number seems reasonable.
I couldn't find any great sources for the 200 plane flights number (and as you point out the article doesn't source this either), but I asked o1 to crunch the numbers (4) and it came up with a similar figure (50-300 flights depending on the size of the plane). I was curious if the numbers would be different if you considered emissions instead of directly converting jet fuel energy to watt hours, but the end result was basically the same.
[1] https://www.numenta.com/blog/2023/08/10/ai-is-harming-our-pl...
[2] https://www.ri.se/en/news/blog/generative-ai-does-not-run-on...
[3] https://knowledge.wharton.upenn.edu/article/the-hidden-cost-...
[4] https://chatgpt.com/share/678b6178-d0e4-800d-a12b-c319e324d2...
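The flight comparison above can be sketched with some rough arithmetic. The fuel burn, energy density, and training-energy figures below are my own ballpark assumptions (not numbers from the article), but they land in the same 50-300 flight range:

```python
# Back-of-envelope: how many NY-SF flights equal ~50 GWh of training energy?
# All constants below are rough assumptions for illustration only.

JET_FUEL_MJ_PER_L = 35          # approximate energy density of jet fuel
FUEL_BURN_L = 4_500 * 3.785     # ~4,500 US gal for a narrow-body NY-SF leg
TRAINING_ENERGY_GWH = 50        # widely cited GPT-4 training estimate

# Convert one flight's fuel energy from MJ to MWh (1 MWh = 3,600 MJ)
flight_energy_mwh = FUEL_BURN_L * JET_FUEL_MJ_PER_L / 3_600

# Training energy in MWh divided by per-flight energy
flights = TRAINING_ENERGY_GWH * 1_000 / flight_energy_mwh
print(round(flights))
```

With these assumptions the result comes out around 300 flights; a larger wide-body aircraft burning more fuel per leg pushes the count toward the low end of the range.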
By TFA do you mean the author of the article? The article seems to be using an outdated [and incorrect] claim (as far as I know, there's no report of GPT-4 taking 200 flights' worth of energy to train), and arguing against it by saying those numbers are especially small, when they are potentially significantly larger.
Where?
One of the most crucial points "Training an AI model emits as much as 200 plane flights from New York to San Francisco"
This seems to come from this blog https://icecat.com/blog/is-ai-truly-a-sustainable-choice/#:~....
which refers to this article https://www.technologyreview.com/2019/06/06/239031/training-...
which is talking about models like *GPT-2, BERT, and ELMo* -- _models that are 5+ years old_ at this point.
The keystone statement is incredibly vague, and likely misleading. What is "an AI model"? From what I found, this is referring to GPT-2,