Comment by yapyap 18 hours ago

Such a stupid post, I know people on HN don’t like absolute descriptors like that and sorry for that.

Obviously LLMs like ChatGPT don’t use the most energy when answering your question; they churn through insane amounts of water and energy during training, so much so that big tech companies do not disclose those amounts and try to obscure them as much as possible.

You aren’t destroying the environment by using it RIGHT NOW, but you are telling the company that owns the LLM you use, en masse, “there is interest in this product”. With those interest signals they will plan for the future, and plan for even more environmental destruction.

nick__m 18 hours ago

It's not like they are mixing that water with oil and pumping it into the aquifer. Water evaporates and turns into clouds, which precipitate into rain that falls on the ground and into water bodies, where it can be used again. So what's the problem with datacenter water usage? Has the water cycle stopped and I was not informed?

  • ternnoburn 17 hours ago

    Fresh water is finite. Infinitely reusable, yes, but we can only take so much from a river before that river ceases to be. If you have a megabit connection, it doesn't matter that your cloud backups have infinite storage; you are still limited by bandwidth.

    Water vapor can stay aloft for a while, so there's no guarantee it returns to the same watershed it was drawn from.

    It's also a powerful greenhouse gas, so even though it's removed quickly, raising the rate we produce it results in more insulation.

    It's not an infinite resource, so we need to be judicious and wise in how we allocate it.

simonw 18 hours ago

Plenty of companies have revealed exactly how much energy and CO2 they used training a model. Just off the top of my head, I've seen those numbers published for Meta's Llama models, Microsoft's Phi series and DeepSeek's models - including their impressive DeepSeek v3, which trained for less than $6m in cost - a huge reduction compared to other similar models, and a useful illustration of how much more efficient this stuff can get on the training side of things.

fulafel 14 hours ago

Anyone care to have a go at a back-of-the-envelope number for training energy use amortized per query for ChatGPT's models? Is the training or the inference going to dominate?
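
A rough sketch of that amortization (in Python), with every figure below an invented placeholder rather than a disclosed number - training energy, query volume, model lifetime and per-query inference energy are all assumptions; the point is only how the ratio behaves as query volume changes:

    # Back-of-the-envelope: training energy amortized per query vs. marginal inference energy.
    # Every constant here is an illustrative assumption, not a disclosed figure.
    TRAINING_ENERGY_GWH = 50          # assumed one-off training energy for a large model, in GWh
    QUERIES_PER_DAY = 1_000_000_000   # assumed query volume served by that model
    MODEL_LIFETIME_DAYS = 365         # assumed time the model stays in production
    INFERENCE_WH_PER_QUERY = 0.3      # assumed marginal energy per query, in Wh

    total_queries = QUERIES_PER_DAY * MODEL_LIFETIME_DAYS
    training_wh_per_query = TRAINING_ENERGY_GWH * 1e9 / total_queries  # convert GWh to Wh

    print(f"training, amortized: {training_wh_per_query:.3f} Wh/query")  # ~0.137 with these inputs
    print(f"inference:           {INFERENCE_WH_PER_QUERY:.3f} Wh/query")

Under these made-up assumptions inference dominates slightly; cut the query volume by 10x and the amortized training energy dominates instead.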

jna_sh 18 hours ago

Similar feelings about the repeated references to the apparently agreed-upon consensus that individual action is pointless vs. systemic change like switching to a renewable energy system. Jevons Paradox would like a word.

monero-xmr 18 hours ago

I don’t care about energy usage. How exhausting it must be to be a climate-hysterical person and try to factor in the climate cost of every single action you take in life.

Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great. But I refuse to worry and be burdened by anxiety over every single unit of electricity consumed. Such a tiring, bullshit part of life the progressives have foisted on elites. And it is elites only as poors don’t give a shit

  • ben_w 14 hours ago

    > Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great.

    > And it is elites only as poors don’t give a shit

    Poor people are also consumers; raising energy prices for that group is a fantastic way to get kicked out of office, even if you're an actual literal dictator.

    People are complex.

    The screeds you're objecting to are part of the political process of telling governments to do something, even if that something ends up being a mix of what you suggest plus subsidies for the poor, or something completely different; in any case, something done to avoid being defenestrated.