Comment by HendrikHensen 2 days ago
I think this is a strange, and honestly worrying, stance.
Just because there are worse problems doesn't mean we shouldn't care about lesser ones. Dismissing a problem on those grounds is a logical fallacy; I think it's called relative privation.
Further, there is only an extremely limited number of problems that I, personally, can have any impact on. That doesn't mean the problems I can't influence aren't real problems, or that I shouldn't worry about them.
My country is being filled up with data centers. Since the rise of LLMs, the pace at which they are being built has increased tremendously. Everywhere I go, there are these huge, ugly, energy- and water-devouring behemoths of buildings. If we used technology only (or even primarily) for useful things, we would need maybe a tenth of the data centers, and my immediate living environment would benefit from it.
Finally, the site could perhaps be run on a Raspberry Pi. But the site itself is not the interesting part; it's the LLMs using it that are.
I don't think it's odd at all. Having taken a deep look at the potential impact and problems surrounding AI, including training and data centers, I've concluded that they are about as trivial and low-ranking a problem as deciding what color seatbelts should be in order to optimize driving safety. There are so many more important things to attend to.

By all means, do the calculus yourself: be honest about consumed resources and environmental impacts, but also include the benefits and the honest economics, and assess the cost/benefit ratio for yourself. Then look at the potential negatives. Even in a worst-case scenario, these aren't problems that outweigh nearly any other important thing you could spend your time worrying about, or better yet, attempting to fix.