Comment by YetAnotherNick 16 hours ago

Why do you believe this? Datacenters use just 1-1.3 percent of electricity from the grid, and even if you suppose AI increased that usage by 2x (which I really doubt), the number would still be tiny.

Also, AI training is the easiest workload to regulate, as you can train only when you have cheaper green energy.
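
A back-of-the-envelope sketch of that share argument in Python (the 1-1.3 percent range is from this comment; the doubling factor and the snippet itself are just illustrative assumptions):

  # Hypothetical numbers except the quoted share; purely illustrative.
  datacenter_share = 0.013      # upper end of the 1-1.3 percent range above
  ai_growth_factor = 2.0        # the "2x" hypothetical from the comment

  new_share = datacenter_share * ai_growth_factor
  print(f"Share after a 2x increase: {new_share:.1%}")  # ~2.6% of grid electricity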

kolinko 14 hours ago

I also had doubts, but I asked ChatGPT and it confirmed it's an issue, including sources.

https://chatgpt.com/share/678b6b3e-9708-8009-bcad-8ba84a5145...

The issue is that data centers are often localised, so even if it's just 1% of power overall, it can cause issues.

Still, grid issues by themselves don't mean climate issues. And to be reliable, any argument complaining about the CO2 cost should also consider the alternative cost. Even if AI were causing 1%, 2%, or 10% of energy use, the real question is how much it saves by making society more efficient. And even if it didn't save anything, it's again more a question about energy companies polluting with CO2.

Microsoft, which hosts OpenAI, is famously good in terms of its CO2 emissions; so far it has gone well beyond what other companies were doing.

  • baobun 3 hours ago

    ChatGPT didn't "confirm" anything there. It is not a meaningful reference.

  • YetAnotherNick 10 hours ago

    What do you mean by confirms the issue? What's the issue exactly?

    • oneplane 5 hours ago

      The issue is that when you have high local usage, your grid loses the ability to respond to peaks, since that capacity is now always in use. Essentially it raises the baseline use, which means your elasticity is pretty much gone.

      A grid isn't a magic battery that is always there; it is constantly fluctuating, regardless of the intent of producers and consumers. You need enough elasticity to deal with that fact. Changing that is hard (and expensive), but it is the only way (such is the technical reality).

      The solution is not to create, say, 1000 extra coal-fired generating facilities, since you can't really turn them on or off at will. The same goes for gas, nuclear, etc. You'd need a few of them for your baseline load (combined with other sources like solar, wind, hydro, whatever), then make sure your non-renewable sources have margin and redundancy, and use storage for the rest. This was always the case, and it will always be the case.

      But now, with information technology, the degree to which you can permanently raise demand on the grid is where the problem becomes much more apparent. And because it's not manufacturing (which is an extreme consumer of energy), you don't really get the "run on lower output" option. You can't have an LLM do "just a little bit of inferencing", just like you can't have Netflix send only half a movie to "save power".

      In the past we had the luxury of lower nighttime demand, which meant industry could raise its usage then, but datacenters don't sleep at night. And they also can't wait for batch processing during the day.
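
      A rough sketch of that baseline/headroom point, in Python, with made-up numbers (the capacity, swing, and datacenter figures are assumptions, not from this thread):

        # Illustrative only: a constant datacenter load eats into peak headroom.
        GRID_CAPACITY_MW = 1000   # assumed dispatchable capacity
        BASE_DEMAND_MW = 600      # assumed average demand without the datacenter
        DAILY_SWING_MW = 250      # assumed swing above average at the daily peak
        DATACENTER_MW = 120       # assumed new, roughly constant load

        def peak_headroom(extra_constant_load_mw):
            """Capacity left over at the daily demand peak."""
            peak_demand = BASE_DEMAND_MW + DAILY_SWING_MW + extra_constant_load_mw
            return GRID_CAPACITY_MW - peak_demand

        print(peak_headroom(0))              # 150 MW of slack before
        print(peak_headroom(DATACENTER_MW))  # 30 MW of slack after

      The datacenter is "only" 12% of capacity here, but it removes most of the slack that was there to absorb peaks.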

      • YetAnotherNick 4 hours ago

        Except neither ChatGPT nor the sources say this. The first source says:

        > Gas-fired generation could meet data centers’ immediate power needs and transition to backup generation over time, panelists told the Northwest Power and Conservation Council.

        What you are saying has nothing to do with locality; it has to do with large, abrupt changes in electricity usage, and datacenter electricity usage is generally more predictable and smooth than that of most other industries.

        • oneplane 3 hours ago

          I'm not talking about fluctuations (i.e. a datacenter with fluctuating usage). I'm talking about adding a datacenter to an existing grid. That significantly changes the baseline load on the grid, and that is a local problem because transmission capacity is not evenly distributed across an entire grid.

          If your transmission line is saturated, it doesn't matter how much more generation you add on the source end, it's not gonna deliver 'more' over the transmission lines.
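
          A toy model of that congestion point, in Python (the line rating and demand figures are assumptions, not from the article or this thread):

            # Illustrative only: past the line rating, extra generation is stranded.
            LINE_CAPACITY_MW = 400   # assumed rating of the line into the local area
            LOCAL_DEMAND_MW = 500    # assumed demand after a datacenter is added

            def delivered_power(generation_mw):
                """Power that actually reaches the local area."""
                return min(generation_mw, LINE_CAPACITY_MW, LOCAL_DEMAND_MW)

            for generation in (400, 600, 800):
                print(generation, "MW generated ->", delivered_power(generation), "MW delivered")
            # 400 MW delivered in every case: adding generation at the source end
            # does nothing once the transmission line is saturated.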

          And that is just a simplistic local example, because it's not a single-producer, single-consumer, single-transmission-line scenario. ChatGPT and the article aren't diving into that. The closest they might get is congestion, but even then you already have to know about the issue to be able to ask about it.

          As far as the article itself is involved here, this thread mostly goes into why global usage percentages don't mean there are no problems. It's like saying gerrymandering has no impact because of some percentages elsewhere.