Comment by titanomachy 17 hours ago

The major players in AI are collectively burning 1-2 gigawatts, day and night, on research and development of the next generation of LLMs. That's as much power as my city of a million people uses. The impact is real, and focusing on inference cost per query kind of misses the point. Every person who uses these tools contributes to the demand and bears some of the responsibility, similar to how I bear responsibility for the carbon emissions of a flight, even if the plane would have flown without me.

I'm saying this as someone who finds LLMs helpful, and uses them without feeling particularly guilty about it. But we should be honest about the costs.

nharada 5 hours ago

Agreed. I feel like the main response seems to be "Does Not!", but it's reasonable to accept that a thing you like has a cost. We all emit carbon every day doing things we don't 100% need to, and we should just be willing to admit there's a cost and try to move toward paying it.

Personally, I'm not tripping too hard about datacenter energy long term, because it's comparatively easy to make carbon free (unlike, say, ICE cars or aircraft). But it would be nice to see some effort to incentivize green energy for those datacenters instead of just saying "whatever" and powering them with coal.