Comment by jonas21 13 hours ago
What exactly are you basing this assertion on (other than your feelings)? Are you accusing Google of lying when they say in the technical report [1]:
> This impact results from: A 33x reduction in per-prompt energy consumption driven by software efficiencies—including a 23x reduction from model improvements, and a 1.4x reduction from improved machine utilization.
followed by a list of specific improvements they've made?
[1] https://services.google.com/fh/files/misc/measuring_the_envi...
Unless a company's marketing blog specifically says which model it is talking about, we should always assume they're hiding/conflating/mislabeling/misleading in every way possible. This is corporate media literacy 101.
The burden of proof is on Google here. If they've reduced Gemini 2.5 energy use by 33x, they need to state that clearly. Otherwise we should assume they're fudging the numbers, for example:
A) they've chosen one particular tiny model for this number
or
B) it's a median across all models including the tiny one they use for all search queries
EDIT: I've read over the report and it's B) as far as I can see; a quick sketch of why that matters is below.
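To illustrate the concern with B), here's a minimal sketch of how a median over prompts gets dominated by a tiny, high-volume model and therefore says little about a large model like Gemini 2.5. All model names, energy figures, and prompt counts here are invented for illustration, not taken from the report:

    import statistics

    # Invented per-prompt energy figures (arbitrary units) and prompt counts --
    # purely illustrative, not Google's data.
    per_prompt_energy = {"tiny-search-model": 0.1, "large-model": 10.0}
    prompt_counts = {"tiny-search-model": 9_000, "large-model": 1_000}

    # Build one energy sample per prompt; most prompts hit the tiny model.
    samples = []
    for model, count in prompt_counts.items():
        samples.extend([per_prompt_energy[model]] * count)

    # The median is set entirely by the tiny model, because it serves the
    # majority of prompts. Halving or doubling the large model's per-prompt
    # energy would not move this number at all.
    print(statistics.median(samples))  # 0.1

Under these made-up numbers the "median per-prompt energy" is 0.1 no matter what the large model does, which is exactly why a headline median can't be read as a claim about Gemini 2.5.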
Without more info, any other reading of this is a failing on the reader's part, or wishful thinking if they want to feel good about their AI usage.
We should also be ready to change these assumptions if Google or another reputable party does confirm this applies to large models like Gemini 2.5, but should assume the least impressive possible reading until that missing info arrives.
Even more useful would be knowing how much electricity Google uses per month, and whether that has gone down or continued to grow since this announcement, because total energy use across their whole AI product range, including training, is the only number that really matters.
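For the same reason, a per-prompt improvement and total consumption can move in opposite directions. A minimal sketch with made-up numbers (none of these figures come from Google):

    # Hypothetical numbers, purely for illustration -- not from Google's report.
    energy_per_prompt_before = 33.0                          # arbitrary units
    energy_per_prompt_after = energy_per_prompt_before / 33  # the claimed 33x reduction

    prompts_per_month_before = 1_000_000
    prompts_per_month_after = 50_000_000   # assume prompt volume grew 50x meanwhile

    total_before = energy_per_prompt_before * prompts_per_month_before
    total_after = energy_per_prompt_after * prompts_per_month_after

    print(total_before)  # 33,000,000 units
    print(total_after)   # 50,000,000 units -- per-prompt energy fell 33x, total still rose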