oneplane 5 hours ago

The issue is that when local usage is persistently high, your grid loses the ability to respond to peaks, since that capacity is now always in use. Essentially it raises the baseline load, which means your elasticity is pretty much gone.

A grid isn't a magic battery that is always there; it is constantly fluctuating, regardless of the intent of producers and consumers. You need enough elasticity to deal with that fact. Changing that is hard (and expensive), but it is the only way (such is the technical reality).

The solution is not to create, say, 1000 extra coal-fired generating facilities, since you can't really turn them on or off at will. The same goes for gas, nuclear, etc. You'd need a few of them for your baseline load (combined with other sources like solar, wind, hydro, whatever), then make sure your non-renewable sources have margin and redundancy, and use storage for the rest. This was always the case, and it will always be the case.
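The capacity-planning arithmetic above can be sketched in a few lines. All the numbers here are made up for illustration; they are not real grid data:

```python
# Illustrative headroom arithmetic (hypothetical numbers, not real grid data).
# Baseline load must be covered by always-on generation with some margin;
# anything above baseline has to come from flexible sources or storage.

baseline_mw = 8_000          # steady round-the-clock demand
peak_mw = 11_000             # worst-case demand
margin = 0.15                # redundancy margin on baseline generation

baseload_capacity_mw = baseline_mw * (1 + margin)
flexible_needed_mw = peak_mw - baseline_mw   # storage / peakers / hydro

print(f"baseload capacity needed: {baseload_capacity_mw:.0f} MW")
print(f"flexible capacity needed: {flexible_needed_mw:.0f} MW")

# Adding a large constant new consumer (e.g. a datacenter) raises the
# baseline, which eats the headroom that used to absorb peaks:
datacenter_mw = 1_500
new_baseline_mw = baseline_mw + datacenter_mw
remaining_headroom_mw = baseload_capacity_mw - new_baseline_mw
print(f"baseload headroom after datacenter: {remaining_headroom_mw:.0f} MW")
```

With these toy numbers the headroom goes negative, which is the point: the new load didn't fluctuate at all, yet the grid can no longer cover even its own baseline without leaning on capacity that was meant for peaks.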

But now, with information technology, demand on the grid can be raised permanently and to an extreme degree, which is where the problem becomes much more apparent. And because it's not manufacturing (itself an extreme consumer of energy), you don't really get the "run on lower output" option. You can't have an LLM do "just a little bit of inference", just like you can't have Netflix send only half a movie to "save power".

In the past we had the luxury of lower nighttime demand, which meant industry could increase its usage then, but datacenters don't sleep at night. And they can't defer their work to batch processing during the day either.

  • YetAnotherNick 4 hours ago

Except neither ChatGPT nor the sources say this. The first source says:

    > Gas-fired generation could meet data centers’ immediate power needs and transition to backup generation over time, panelists told the Northwest Power and Conservation Council.

What you are saying has nothing to do with locality; it has to do with large, abrupt changes in electricity usage, and datacenter electricity usage is generally more predictable and smooth than that of most other industries.

    • oneplane 3 hours ago

I'm not talking about fluctuations (i.e. a datacenter with fluctuating usage). I'm talking about adding a datacenter to an existing grid. That significantly changes the baseline load on the grid, and that is a local problem because transmission capacity is not uniform across an entire grid.

If your transmission line is saturated, it doesn't matter how much more generation you add on the source end; it's not gonna deliver 'more' over the transmission lines.
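A toy model of that point, assuming a single hypothetical line (real grids are a network-flow problem, as the next paragraph notes):

```python
# Toy model: a saturated transmission line caps delivery no matter how much
# generation is added on the source end. Hypothetical single-line scenario;
# real grids solve a much harder network flow problem.

LINE_CAPACITY_MW = 2_000

def delivered(generation_mw: float) -> float:
    """Power that actually reaches the load side of the line."""
    return min(generation_mw, LINE_CAPACITY_MW)

print(delivered(1_500))   # 1500 -- line still has headroom
print(delivered(2_500))   # 2000 -- extra generation is stranded
print(delivered(5_000))   # 2000 -- doubling supply again changes nothing
```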

And that is just a simplistic local example, because it's not a single-producer, single-consumer, single-transmission-line scenario. ChatGPT and the article aren't diving into that. The closest they might get is congestion, but even then you already have to know about the issue to be able to ask about it.

As far as the article itself is involved here, this thread mostly goes into why global usage percentages don't mean there are no problems. It's like saying gerrymandering has no impact because of some percentages elsewhere.