Comment by snailmailman 6 hours ago
I’m curious where you got any of those numbers. Many laptops use <20W, but most local AI inference requires high-end, power-hungry Nvidia GPUs that draw hundreds of watts. There’s a reason those GPUs are in high demand, with prices sky-high: the same (or similar) power-hungry chips fill data centers.
Compared to traditional computing, it seems to me there’s no way AI is power efficient, especially when so many of the generated tokens are just platitudes and hallucinations.
> The agreed-on best guess right now for the average chatbot prompt’s energy cost is actually the same as a Google search in 2009: 0.3 Wh. This includes the cost of answering your prompt, idling AI chips between prompts, cooling in the data center, and other energy costs in the data center. This does not include the cost of training the model, the embodied carbon costs of the AI chips, or the fact that data centers typically draw from slightly more carbon-intense sources. If you include all of those, the full carbon emissions of an AI prompt rise to 0.28 g of CO2. This is the same emissions as we cause when we use ~0.8 Wh of energy.
How concerned should you be about spending 0.8 Wh? 0.8 Wh is enough to:
- Stream a video for 35 seconds
- Watch an LED TV (no sound) for 50 seconds
- Upload 9 photos to social media
- Drive a sedan at a consistent speed for 4 feet
- Leave your digital clock on for 50 minutes
- Run a space heater for 0.7 seconds
- Print a fifth of a page of a physical book
- Spend 1 minute reading this blog post

If you’re reading this on a laptop and spend 20 minutes reading the full post, you will have used as much energy as 20 ChatGPT prompts. ChatGPT could write this blog post using less energy than you use to read it!
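The quoted numbers do hang together. A quick sketch of the implied arithmetic (the grid intensity of ~350 g CO2/kWh is my assumption, backed out from the post's 0.28 g ↔ 0.8 Wh equivalence; the 48 W laptop draw is implied by "1 minute of reading = 0.8 Wh"):

```python
# All figures are the quoted post's estimates, not measurements.
PROMPT_CO2_G = 0.28      # full lifecycle emissions per prompt (g CO2)
GRID_G_PER_WH = 0.35     # assumed grid intensity: ~350 g CO2/kWh

# Emissions expressed as equivalent grid energy
equivalent_wh = PROMPT_CO2_G / GRID_G_PER_WH   # ~0.8 Wh per prompt

# Implied laptop draw if 1 minute of reading costs 0.8 Wh
laptop_w = equivalent_wh * 60                  # ~48 W

# 20 minutes of reading vs. prompts
reading_wh = laptop_w * 20 / 60                # ~16 Wh
prompts_equiv = reading_wh / equivalent_wh     # ~20 prompts
print(round(equivalent_wh, 1), round(prompts_equiv))
```

Note the 48 W figure is a fairly beefy laptop under load; on a <20 W machine (as mentioned above) the reading-vs-prompting comparison would be less flattering to the prompts.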
I found this helpful.