Ecosia: The greenest AI is here
(blog.ecosia.org)
101 points by doener 14 hours ago
> People in the comments seem confused about this with statements like “greenest AI is no AI” style comments. And well, obviously that’s true
It’s not true. AI isn’t especially environmentally unfriendly, which means that if you’re using AI then whatever activity you would otherwise be doing stands a good chance of being more environmentally unfriendly. For instance, a ChatGPT prompt uses about as much energy as watching 5–10 seconds of Netflix. So AI is greener than no AI in the cases where it displaces other, less green activities.
And when you have the opportunity to use human labour or AI, AI is almost certainly the greener option as well. For instance, the carbon emissions of writing and illustrating are far lower for AI than for humans:
> Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
— https://www.nature.com/articles/s41598-024-54271-x
The AI water issue is fake: https://andymasley.substack.com/p/the-ai-water-issue-is-fake
Using ChatGPT is not bad for the environment: https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...
A ChatGPT prompt uses about as much energy as watching 5–10 seconds of Netflix: https://simonwillison.net/2025/Nov/29/chatgpt-netflix/
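The arithmetic behind that Netflix comparison is easy to check; a minimal sketch (the 0.34 Wh per prompt and 240 Wh/hour streaming figures are the estimates from the linked posts, not measurements):

```python
# Sketch of the "ChatGPT prompt vs. Netflix" comparison from the linked post.
# Both figures are the cited estimates, not measurements.
PROMPT_WH = 0.34            # Sam Altman's per-prompt energy estimate
NETFLIX_WH_PER_HOUR = 240   # higher-end streaming estimate

seconds = PROMPT_WH / (NETFLIX_WH_PER_HOUR / 3600)
print(f"~{seconds:.1f} seconds of Netflix per prompt")  # ~5.1
```

Taking the lower-end Netflix estimate (120 Wh/hour) instead doubles this to ~10.2 seconds.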
> AI isn’t especially environmentally unfriendly
I think the actual answer is more nuanced and less positive, although I appreciate how many citations your comment has!
I'd point to just one: a really good article MIT's Technology Review published about exactly this issue [0].
I'd make two overall points. Firstly, on:
> when you have the opportunity to use human labour or AI, AI is almost certainly the greener option as well.
I think that this is never the trade-off. AI normally generates marketing copy for someone in marketing, not by itself, and even if it does everything itself, the marketing person might stop being employed but certainly doesn't stop existing and producing CO2.
My point is, AI electricity usage is almost exclusively new usage, not replacing something else.
And secondly on Simon Wilison / Sam Altman's argument that:
> Assuming that higher end, a ChatGPT prompt by Sam Altman's estimate uses:
>
> 0.34 Wh / (240 Wh / 3600 seconds) = 5.1 seconds of Netflix
>
> Or double that, 10.2 seconds, if you take the lower end of the Netflix estimate instead.
This may well be true for prompts, but it misses out the energy-intensive training process, which we can't ignore if we actually want to know the full emissions impact, especially in an environment where new models are being trained all the time.
On a more positive note, I think Ecosia's article makes a good point that AI requires electricity, not pollution. It's really bad timing that AI has taken off initially in the US just as the political climate is trying to steer energy away from safer, more sustainable sources and towards more dangerous, polluting ones. But that isn't an environment that has to continue, and Chinese AI work in the last year has also done a good job of demonstrating that AI training energy use can be a lot less than previously assumed.
[0] https://www.technologyreview.com/2025/05/20/1116327/ai-energ...
I think this article is a good response to the MIT article: https://andymasley.substack.com/p/reactions-to-mit-technolog...
> AI normally generates marketing copy for someone in marketing, not by itself, and even when if it does everything itself, the marketing person might stop being employed but certainly doesn't stop existing and producing co2.
Sure, but it does it a lot quicker than they can, which means they spend more of their time on other things. You’re getting more work done on average for the carbon you are “spending”.
Also, even when ignoring the carbon cost of the human, just the difference in energy use from their computer equipment in terms of time spent on the task outstrips AI energy use.
> This may well be true for prompts, but misses out the energy intensive training process.
If you are trying to account for the fully embodied cost including production, then I think things tilt even more in favour of AI being environmentally-friendly. Do you think producing a Netflix show is carbon-neutral? I have no idea what the carbon cost of producing, e.g. Stranger Things is, but I’m guessing it vastly outweighs the training costs of an LLM.
Glad to see someone refute the AI water argument; I'm sick of that one. But I don't see how the displacement argument fits. Maybe you can elaborate, but I don't see how we can compare AI usage to watching Netflix for any length of time. I can't see a situation where someone would substitute watching Stranger Things for asking ChatGPT questions.
The writing and illustrating activities use less energy, but the people out there using AI to generate ten novels and covers and fire them into the Kindle store would not have written ten novels themselves, so this is not displacement either.
How many tens of thousands more pages of text and image are churned out per human created page though?
Pages that would never be created were the stochastic parrot to be turned off and never squawk.
It'd save a lot of energy, water, carbon emissions to just let the already existing humans just get on with the churn.
> How many tens of thousands more pages of text and image are churned out per human created page though?
I don’t know, how many?
> It'd save a lot of energy, water, carbon emissions to just let the already existing humans just get on with the churn.
How much, and how do you know that?
> And when you have the opportunity to use human labour or AI, AI is almost certainly the greener option as well. For instance, the carbon emissions of writing and illustrating are far lower for AI than for humans:
Do you plan on killing that person to stop their emissions?
If you don't use the AI program the emissions don't happen, if you don't hire a person for a job, they still use the carbon resources.
So the comparison isn't 1000 kg CO2 for a human vs 1 kg CO2 for an LLM. It's 1000 kg CO2 for a human vs 1001 kg CO2 for a human plus an LLM.
There are fundamental reasons public transit is always more efficient than private cars. There's no fundamental reason a really good search engine is more efficient than an LLM or any other kind of AI.
This has to be satire. LLMs are a monumental jump on search engines.
Imagine a hypothetical search competition where you are given Google and I am given ChatGPT. I'll win every single time.
I disagree. Without AI I might take 15 min to search for something in google that would have taken me a single prompt in ChatGPT. The energy used by my screen in those 15 minutes would be higher than the energy taken by that prompt.
Yeah, but people here also know that AI that doesn’t use vast amounts of energy is generally returning mediocre results. And mediocre results are not useful at all. So whatever you save on energy doesn’t really matter if the utility is going to zero.
Your comparison to cars is good. A cheap car will be slower and less comfortable but will ultimately get you where you want to be. That's the core value of the car. A bad LLM may not get you anywhere. It's more like having a cheap power drill that can drill through plaster but not through concrete; in the end you still want the expensive drill…
This is absurd. Training an AI is energy intensive but highly efficient. Running inference for a few hundred tokens, doing a search, stuff like that is a triviality.
Each generated token takes energy equivalent to the heat from burning ~0.06 µL of gasoline: ~2 joules per token, including datacenter and hosting overhead. If you get up to massive million-token prompts, it can reach 8–10 joules per token of output. Training runs around 17–20 J per token.
A liter of gasoline gets you 16,800,000 tokens for normal use cases. Caching and the various scaled up efficiency hacks and improvements get you into the thousands of tokens per joule for some use cases.
For contrast, your desktop PC running idle uses around 350k joules per day. Your fridge uses 3 million joules per day.
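Taking the parent's claimed figures at face value, the arithmetic is internally consistent; a quick sketch (the 2 J/token and 16.8M-tokens-per-liter numbers are the parent's claims, and the gasoline energy density used here is the value they jointly imply, which sits inside the usual 32–34 MJ/L range):

```python
# Sanity-check of the parent's per-token energy claims (claims, not measurements).
GASOLINE_J_PER_L = 33.6e6  # energy density implied by the 16.8M-token figure
J_PER_TOKEN = 2.0          # claimed inference cost incl. datacenter overhead

tokens_per_liter = GASOLINE_J_PER_L / J_PER_TOKEN
print(f"{tokens_per_liter:,.0f} tokens per liter of gasoline")  # 16,800,000

ul_per_token = 1e6 / tokens_per_liter  # microliters of gasoline per token
print(f"~{ul_per_token:.3f} µL per token")  # ~0.060
```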
AI is such a relatively trivial use of resources that you caring about nearly any other problem, in the entire expanse of all available problems to care about, would be a better use of your time.
AI is making resources allocated to computation and data processing much more efficient, and year over year, the relative intelligence per token generated, and the absolute energy cost per token generated, is getting far more efficient and relatively valuable.
Find something meaningful to be upset at. AI is a dumb thing to be angry at.
I’m curious where you got any of those numbers. Many laptops use <20W. But most local-ai inferencing requires high end, power hungry nvidia GPUs that use multiple hundreds of watts. There’s a reason those GPUs are in high demand, with prices sky high, because those same (or similar) power hungry chips are in data centers.
Compared to traditional computing it seems to me like there’s no way AI is power efficient. Especially when so many of the generated tokens are just platitudes and hallucinations.
> The agreed-on best guess right now for the average chatbot prompt's energy cost is actually the same as a Google search in 2009: 0.3 Wh. This includes the cost of answering your prompt, idling AI chips between prompts, cooling in the data center, and other energy costs in the data center. This does not include the cost of training the model, the embodied carbon costs of the AI chips, or the fact that data centers typically draw from slightly more carbon intense sources. If you include all of those, the full carbon emissions of an AI prompt rise to 0.28 g of CO2. This is the same emissions as we cause when we use ~0.8 Wh of energy.
How concerned should you be about spending 0.8 Wh? 0.8 Wh is enough to:
- Stream a video for 35 seconds
- Watch an LED TV (no sound) for 50 seconds
- Upload 9 photos to social media
- Drive a sedan at a consistent speed for 4 feet
- Leave your digital clock on for 50 minutes
- Run a space heater for 0.7 seconds
- Print a fifth of a page of a physical book
- Spend 1 minute reading this blog post

If you're reading this on a laptop and spend 20 minutes reading the full post, you will have used as much energy as 20 ChatGPT prompts. ChatGPT could write this blog post using less energy than you use to read it!
I found this helpful.
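The equivalences quoted above imply plausible device wattages, which is a quick way to sanity-check them; a sketch (the 0.8 Wh figure and the per-activity durations are from the quoted post):

```python
# Back out the device power draws implied by the quoted equivalences.
JOULES = 0.8 * 3600  # 0.8 Wh = 2880 J

for activity, seconds in [("video streaming", 35), ("LED TV (no sound)", 50)]:
    print(f"{activity}: implied draw ~{JOULES / seconds:.0f} W")
```

Roughly 82 W for streaming and 58 W for the TV, both in the right ballpark for those devices.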
Your answer seems very specific about joules. Could you explain your calculations? I can't follow how you get from a liter of gasoline to 16.8M tokens, e.g. does that assume 100% conversion to energy, not accounting for heat loss, transfer loss, etc.?
(For example, simplistically there are 86,400 s/day, so you are saying that my desktop PC idles at 350/86.4 ≈ 4 W, which seems way off even for most laptops, which idle at 6–10 W.)
Looks interesting. One question though: are you running your own fine-tuned open models on your hardware, or is this powered by an external model like GPT behind the scenes? Curious how independent the stack really is.
TFA "... which already powers AI Overviews and some of our search results. Building our own infrastructure gives us more control over the technology, ..."
On their main page they fleetingly mention they train their own small models.
I agree it's not much information.
"Great question! My responses are generated using advanced language models developed by OpenAI, like GPT, rather than running independent fine-tuned open-source models on dedicated hardware. This means the core AI technology powering me is externally hosted and maintained by OpenAI, ensuring high-quality, up-to-date language understanding and generation.
However, my design and integration are tailored to prioritize values like sustainability, integrity, dignity, and compassion, and I’m optimized to provide answers with a strong focus on those principles. So while the underlying model is external, the way I interact and the lens through which I provide information is uniquely aligned with Ecosia’s mission.
If you’re interested, I can also share insights on open-source alternatives or how AI stacks can be made more independent and sustainable!"
I'll be interested to see how this plays with their actual users.
As one myself, I don't object inherently to Ecosia providing AI search. I understand they need to stay competitive with other search.
But I find how prominent / hard to avoid their AI search is reeeeaaally annoying. It's annoying anyway, but in a context where I don't want it, and it's creating more emissions, it feels especially egregious being shoved down my throat by a company that exists to reduce pollution.
> As a not-for-profit company, we can afford to do things differently. AI Search uses smaller, more efficient models, and we avoid energy-heavy features like video generation altogether.
I'm a bit confused -- do other search engines provide video generation? Mentioning that sounds too out of place to me. Am I missing something?
Is there no browser I can use without this AI trash jammed into it?
The AI bros keep minusing. I want you to remember this when the AI bubble pops next year.
My prompt: tell me about the new zig io interface
Ecosia:
> The Zig IO interface is associated with Zigbee, a specification for a suite of high-level communication protocols using low-power digital radios. It is often used in IoT (Internet of Things) applications for smart home devices, such as lights, sensors, and smart locks. The Zig IO interface typically allows for seamless communication between devices, enabling automation and remote control.
> For specific details about the latest updates or features of the Zig IO interface, I recommend checking the official Zigbee Alliance website or relevant tech news sources for the most current information. If you're interested in sustainable technology or eco-friendly smart home solutions, feel free to ask!
Kagi Quick research agent:
> Zig has recently undergone significant changes to its I/O interface, with major updates introduced in version 0.15 and further enhancements planned for the upcoming 0.16.0 release. The new I/O system represents a comprehensive rework of the standard library's input/output handling, focusing on improved efficiency, clarity, and flexibility.
> The most notable changes include a complete overhaul of the `std.Io.Reader` and `std.Io.Writer` interfaces, which were revamped in a mid-2025 development release as the first step in modernizing Zig's I/O capabilities [^2]. These interfaces in Zig 0.15.1 represent a significant improvement in both efficiency and code clarity, though they require slightly more boilerplate code [^5].
> A key architectural change is that the new I/O interface is non-generic and uses a virtual function table (vtable) for dispatching function calls to concrete implementations [^4]. This approach simplifies the interface while maintaining performance. The upcoming Zig 0.16.0 will introduce new async I/O primitives, building on this reworked foundation and eventually leading to the reintroduction of asynchronous programming capabilities [^1].
> These changes position Zig's I/O system for better performance and more flexible integration with different I/O backends, representing a significant step forward in the language's systems programming capabilities [^3].
[^1]: [Zig's New Async I/O (Text Version) - Andrew Kelley](https://andrewkelley.me/post/zig-new-async-io-text-version.h...) (25%)
[^2]: [Zig's new Writer - openmymind.net](https://www.openmymind.net/Zigs-New-Writer/) (22%)
[^3]: [I'm too dumb for Zig's new IO interface](https://www.openmymind.net/Im-Too-Dumb-For-Zigs-New-IO-Inter...) (21%)
[^4]: [Zig's New Async I/O | Loris Cro's Blog](https://kristoff.it/blog/zig-new-async-io/) (17%)
[^5]: [Zig 0.15.1 I/O Overhaul: Understanding the New Reader/Writer ...](https://dev.to/bkataru/zig-0151-io-overhaul-understanding-th...) (15%)
The Ecosia AI does not seem to be grounded in search results. When using small models, this is essentially useless.
Reminder that LLMs only(?) consume energy on the order of a few seconds of Netflix[1].
[1]: https://bsky.app/profile/simonwillison.net/post/3m6qdf5rffs2...
Netflix spending 240Wh for 1h of content just does not pass the smell test for me.
Today I can have ~8 people streaming from my Jellyfin instance which is a server that consumes about 35W, measured at the wall. That's ~5Wh per hour of content from me not even trying.
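The parent's per-stream figure follows directly from their setup (assuming, as they state, ~35 W at the wall shared across 8 concurrent streams):

```python
# Per-stream energy of a shared Jellyfin server (parent's measured figures).
SERVER_W = 35  # measured at the wall
STREAMS = 8    # concurrent viewers

wh_per_stream_hour = SERVER_W / STREAMS
print(f"~{wh_per_stream_hour:.1f} Wh per hour of streamed content")  # ~4.4
```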
They claim that streaming over WiFi to a single mobile device is 37W:
> Because phones are extremely energy efficient, data transmission accounts for nearly all the electricity consumption when streaming through 4G, especially at higher resolutions (Scenario D). Streaming an hour-long SD video through a phone on WiFi (Scenario C) uses just 0.037 kWh – 170 times less than the estimate from the Shift Project.
They might be folding in wider internet energy usage? https://www.weforum.org/stories/2020/03/carbon-footprint-net...
It's way more lopsided than your example would suggest.
My understanding is that Netflix can stream 100 Gbps from a 100W server footprint (slide 17 of [0]). Even if you assume every stream is 4k and uses 25 Mbps, that's still thousands of streams. I would guess that the bulk of the power consumption from streaming video is probably from the end-user devices -- a backbone router might consume a couple of kilowatts of power, but it's also moving terabits of traffic.
[0] https://people.freebsd.org/~gallatin/talks/OpenFest2023.pdf
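The per-stream share of the server's power draw implied by those slides is tiny; a sketch (the 100 Gbps from ~100 W figure is from the cited slides; the 25 Mbps-per-stream bitrate is my assumption that every stream is 4K):

```python
# Per-stream share of a Netflix edge server's power draw.
SERVER_GBPS = 100   # throughput from the cited slides
SERVER_W = 100      # power footprint from the cited slides
STREAM_MBPS = 25    # assumed 4K bitrate for every stream

streams = SERVER_GBPS * 1000 / STREAM_MBPS
mw_per_stream = SERVER_W / streams * 1000
print(f"{streams:.0f} concurrent streams, ~{mw_per_stream:.0f} mW of server power each")
```

At 25 mW per stream, the serving side is negligible next to the end-user device, which is the point of the parent comment.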
Does the Netflix number include the energy cost of manufacturing all the cameras/equipment used for production? Energy for travel for all the crew involved to the location? Energy for building out the sets?
It's quickly pointed out that he's not counting the training of models, producing all the GPUs, energy spent on scraping, the increased storage needs from scraping the whole internet, etc.
The Netflix number is probably not counting all the energy spent producing the shows/movies, building all the cameras/specialized equipment, building their data centers etc. either.
It is fair to compare inference to streaming. Both are done by the end user.
What dumb bullshit! If you run out of ideas or positioning, pivot to climate change as a lame excuse for not having enough resources: people or compute.
People in the comments seem confused about this, with statements like “greenest AI is no AI”. And well, obviously that's true, but it's an apples-to-oranges comparison.
Clearly Ecosia is pushing for “people want AI” _and_ we want to make it more ecofriendly. Taking away features from users altogether is not the right answer.
It’s like saying “cheapest car is no car”. It doesn’t solve the fundamental problem of “wanting a car”.