The original press release and report are at [1]; I couldn't find a link to them in the article.
> In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity
If they're running on, say, two RTX 6000s for a total draw of ~600 watts, that would be a response time of 1.44 seconds. So obviously the median prompt doesn't go to some high-end thinking model users have to pay for.
It's a very low number; for comparison, an electric vehicle might consume 82kWh to travel 363 miles. So that 0.24 watt-hours of energy is equivalent to driving 5.6 feet (1.7 meters) in such an EV.
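For what it's worth, the arithmetic checks out. A quick sketch (the 600 W draw and the EV figures are this comment's assumptions, not measured values):

```python
# Sanity-check the figures above (all inputs are the comment's assumptions).
PROMPT_WH = 0.24                     # claimed median energy per prompt

gpu_draw_w = 600                     # e.g. two GPUs at ~300 W each
seconds = PROMPT_WH * 3600 / gpu_draw_w
print(f"Response time at {gpu_draw_w} W: {seconds:.2f} s")    # 1.44 s

wh_per_mile = 82_000 / 363           # 82 kWh over 363 miles
feet = PROMPT_WH / wh_per_mile * 5280
print(f"Equivalent EV distance: {feet:.1f} feet")             # ~5.6 feet
```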
When I hear reports that AI power demand is overloading electricity infrastructure, it always makes me think: Even before the AI boom, shouldn't we have a bunch of extra capacity under construction, ready for EV driving, induction stoves and heat-pump heating?
> When I hear reports that AI power demand is overloading electricity infrastructure ...
It feels like dog-whistle tactics. "Aren't the technology companies bad for the environment!" "What about the water usage?" "What about the electricity?"
For me the peak of this is complaining about water consumption at The Dalles datacentre [0]. The buildings are next to the Columbia River and a few miles from The Dalles Dam [1], which generates an average of 700MW. River water could be used for cooling: take some water out, warm it by a few degrees, and return it to the river; one might argue that this simply returns to the river the heat that would otherwise have come from the water flowing downhill.
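To put rough numbers on once-through cooling (the heat load and withdrawal rate below are illustrative assumptions, not figures for The Dalles site):

```python
# Rough once-through cooling estimate: how much would a datacenter's
# waste heat warm the water it withdraws from a river?
HEAT_LOAD_W = 100e6        # assumed 100 MW of waste heat
FLOW_M3_S   = 50           # assumed withdrawal, a small fraction of a
                           # large river's typical flow
C_WATER     = 4186         # J/(kg*K), specific heat of water
RHO_WATER   = 1000         # kg/m^3, density of water

delta_t = HEAT_LOAD_W / (FLOW_M3_S * RHO_WATER * C_WATER)
print(f"Temperature rise of withdrawn water: {delta_t:.2f} degC")  # ~0.48
```

Even under these made-up numbers, the returned water is only about half a degree warmer, which is the intuition behind the comment.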
What's the dog whistle? People are concerned about the impact industry has on the environment, and they are stating those concerns plainly. I don't think the non-profit WaterWatch's true goal is to destroy big tech.
I think you're oversimplifying the "just use rivers" idea. Most data centers (80% for Google) require potable water for cooling, and that can't come straight from a river. Plus, using potable water for cooling leaves mineral deposits in it, so it would require treatment to be drinkable again.
"Aren't technology companies terrible. They destroy the natural environment for their profit. Look at these large numbers out of context."
> Most data centers (80% for Google) require potable water for cooling, and it can't come straight from a river.
Well, there are two kinds of water that are required for cooling; that which circulates around the datacentre, and the water used to take away the excess heat. These can be different, using heat exchangers to move the heat from one to the other.
> I think you're oversimplifying the "just use rivers" idea.
The most difficult bit appears to be dealing with conservative (with a little c) people and environmental regulation.
> Plus, using potable water in cooling adds mineral deposits to the water
Because in most cases it kind of is. It's not that the H2O molecules are forcefully disintegrated, but most data centers use evaporative cooling, meaning that whatever water is fed to the datacenter through the municipal water system ends up as moisture in the atmosphere. This is in effect equivalent to a sizable leak in the water infrastructure.
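The scale of that "leak" is easy to estimate from the latent heat of vaporization (a rough sketch; the ~2.45 MJ/kg figure is for evaporation near ambient temperature):

```python
# Evaporative cooling: water consumed per unit of heat rejected.
LATENT_HEAT_J_KG = 2.45e6      # latent heat of vaporization near ambient
MWH_J = 3.6e9                  # joules in one megawatt-hour

kg_per_mwh = MWH_J / LATENT_HEAT_J_KG
print(f"~{kg_per_mwh:.0f} kg (~{kg_per_mwh / 1000:.1f} m^3) of water "
      f"evaporated per MWh of heat rejected")   # ~1469 kg, ~1.5 m^3
```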
Yes, but you can neither drink rainwater nor use it to cool a data center. And the rain may end up falling thousands of miles away. Excessive use of water reduces flow in natural bodies of water and can mess up local ecosystems.
Not sure I'd drink most river water either, and I would hope most data centers don't pull water straight from the aquifer (though maybe they do). Fair points though.
I'd guess that you're not from somewhere where water is scarce. In the American West there's not really a lot of water. I won't turn this into a multi-paragraph lecture, but I'll give a few bullet points to give you a sense of what it's like:
- There's an entirely different legal framework around how water from rivers is allocated. The "normal" flow of the most important river, the Colorado River, was calculated during a time of unusually high flow, so there's a lot of tension between different states about whether they're getting their fair share.
- To give you a sense of how "thirsty" the west is, the Colorado River rarely reaches the ocean anymore.
- Groundwater use is generally much less regulated, which is causing problems like the Ogallala Aquifer dropping at an alarming rate. Many aquifers are permanently damaged if they're overpumped, because the weight of the ground above them crushes the empty spaces.
- Bad actors in the groundwater space can lower the water table and make other people's wells run dry.
- This is made more complicated by the fact that surface water and groundwater interact with each other: reducing stream flow can affect groundwater, and pumping groundwater can affect stream flow.
If you want an approachable and entertaining introduction to some western water issues (including the backstabbing and plotting that inspired Chinatown!), I'd suggest reading Cadillac Desert by Marc Reisner.
Surely all uses of water are part of the closed-loop water cycle? Other than launching it into space for astronauts to drink, and using it in permanent reactions like concrete hydration?
Drinking water, spraying it on crops, using it to clean a car, or using it to flush a toilet all end up with the water evaporating, or making its way to the ocean and evaporating from there.
Ultimately, if a river provides a certain number of acre-feet of fresh water, evaporating it to cool a data centre uses it just as much as evaporating it to grow alfalfa in a desert, except perhaps more usefully.
Fresh water isn't meaningfully a closed loop. We are draining freshwater aquifers, causing the land above them to sink downwards, eliminating the voids where fresh water was stored, and moving the formerly fresh water into the ocean, where it is no longer drinkable, usable for growing crops, or usable for most industrial purposes.
We do get new fresh water at a reasonable pace thanks to rain - but in many parts of the world we are using it faster than that, and not just depleting the stored volume of fresh water but destroying the storage "containers" themselves.
That's not what a dog whistle is. A dog whistle is when someone isn't doing something, but their ideological opponent wants to imply they are doing that thing, so they accuse them of "dog whistling" the thing. Like if Elon Musk says something that categorically isn't racist but his opponents want to call him racist anyway then they just say he's "dog whistling" to racists.
That's not what "dog whistles" are, lol. Dog Whistle means "coded language" basically.
Dog whistles are where someone says something that their audience will understand to mean a specific thing, but will be inaudible or neutral sounding to people who are not in their audience. They are named that because they are like the whistles only dogs can hear, while most people cannot.
"Inner city" is a canonical example of a dog whistle. Where the literal meaning is the districts in a city in the urban center, but is often used to denote poor minority communities. (If the literal meaning is only "city centers", then would you describe Manhattanites as inner city?)
On the left, "tax the rich" might be a dog whistle that carries a similar literal meaning disjoint from the understood meaning within the community.
> Dog whistles are where someone says something that their audience will understand to mean a specific thing, but will be inaudible or neutral sounding to people who are not in their audience. They are named that because they are like the whistles only dogs can hear, while most people cannot.
That's basically what I said, except you're missing that more often than not it's an intentional stretching of a literal phrase in order to cast aspersions on someone who didn't do the thing you're mad about.
For example, here was one of the top results when I googled "trump dog whistle",
> In February 2018, during Trump’s first term as president, the Department of Homeland Security issued a 14-word press release titled “We Must Secure The Border And Build The Wall To Make America Safe Again.” I and other investigators of far-right extremism attributed this phrase’s use to a clear dog whistle of the common white supremacist saying known as “the 14 words” – “we must secure the existence of our people and a future for white children.”
Or this top result from the search "musk dog whistle",
> Omar Suleiman has called on Elon Musk to stop blowing political "dog whistles of Islamophobia"
> Yet, for the past week, you have blown every conceivable dog whistle of Islamophobia, by highlighting a select group of (horrifying) incidents supposedly in the name of Islam
In this case absolutely no examples were given, but that's the great thing about accusing someone of dog whistling - you don't need to provide any evidence! In fact, literally any evidence you can provide would only serve to weaken your accusation because by definition anyone who isn't whichever -ist you're accusing them of will literally be unable to decode the -ism in their phrasing. If it sounds obviously -ist then by definition it can't be a dog whistle.
Just because you can find a bad article with bad examples (and some are surely coincidences) doesn't mean it's not true. Musk did heil, and Musk does post well-known white-supremacy signals. Trump might be a racist who likes fascist power, but he is not a white-supremacist Christian like the rest of his cabinet of Project 2025 people.
I don't know why, but it always irks me when a corporation puts out a document like this cosplaying as peer-reviewed research, and they don't bother to put in even a lip-service "conflict of interest" disclosure. I should expect it at this point.
> If they're running on, say, two RTX 6000s for a total draw of ~600 watts, that would be a response time of 1.44 seconds. So obviously the median prompt doesn't go to some high-end thinking model users have to pay for.
You're not accounting for batching to get optimal GPU utilization: a batch might take 30 seconds, but it completes 30 requests.
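A sketch of that point, using the 600 W figure from the parent and the hypothetical 30-request batch:

```python
# Batched inference: at a fixed GPU power draw, energy per request is
# the batch's energy divided by the batch size (hypothetical numbers).
power_w = 600          # assumed draw of the serving GPUs
batch_seconds = 30     # one batch takes 30 s...
batch_size = 30        # ...but completes 30 requests concurrently

wh_per_request = power_w * batch_seconds / 3600 / batch_size
print(f"{wh_per_request:.3f} Wh per request")   # 0.167 Wh
```

So the naive "1.44 seconds of exclusive GPU time" calculation overstates the per-request cost whenever requests share the hardware.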
> When I hear reports that AI power demand is overloading electricity infrastructure, it always makes me think: Even before the AI boom, shouldn't we have a bunch of extra capacity under construction, ready for EV driving, induction stoves and heat-pump heating?
One of the problems might be that a data center puts a lot of demand into a small area and it needs that power soon.
Those other things are being phased in over time, so we only need modest annual capacity growth to deal with them, and they are spread out.
I'm not sure why they would report on the median prompt, and not the average, which would give a better sense of (well, average) consumption in this case.
100%, and I say that as someone who often thinks the average is misleading, but in this case it makes no sense to report the median (unless you are working at Google and trying to chase tail usage).
> I'm not sure why they would report on the median
The why is an easier question. They probably picked the lower of the two numbers because it lies in their interest to state they are energy efficient.
Well, first of all, you're implying this measures our consumption at all, but it's left completely vague what this is a median of. They said "A point-in-time analysis quantified the energy consumed per median Gemini App text-generation prompt, considering data from May 2025".
Considering what data? All queries sent to Gemini? Real users? A select few? Test queries from Google?
Does it include AI summaries of google searches? Because if the data includes stuff as simple as "How tall is Lee Pace," that is obviously going to bring the median query down, even if the top distribution is using many times more energy.
But still, the median is not useful by itself. It tells us 50% of the measured queries were under 0.24 Wh. Leaving out the mean obviously obscures policy-relevant information, but it also obscures what I can do individually without more details on the data. Where do I fall relative to this median?
It makes the most sense to provide the entire distribution and examples of data points.
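A toy illustration of why the median alone can mislead for a long-tailed workload (the numbers are made up, not Google's distribution):

```python
# Toy long-tailed workload: many trivial prompts, a few heavy ones.
energies_wh = [0.05] * 90 + [0.24] * 5 + [10.0] * 5   # 100 "prompts"

xs = sorted(energies_wh)
median = (xs[49] + xs[50]) / 2                 # middle of 100 samples
mean = sum(energies_wh) / len(energies_wh)
print(f"median = {median} Wh, mean = {mean:.3f} Wh")
# median = 0.05 Wh, mean ~ 0.56 Wh: the tail dominates total consumption
```

Here the mean is more than ten times the median, so total energy use is driven almost entirely by queries the median says nothing about.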
>Even before the AI boom, shouldn't we have a bunch of extra capacity under construction, ready for EV driving, induction stoves and heat-pump heating?
When it comes to EVs, the answer is simple: the EV takeover "by 2030" was 100% wishful thinking. The capacity is nowhere near there, starting with scaling battery production, never mind charging capacity.
No, mostly a misunderstanding. ~95% of all cars sold in Norway are EVs, yet only ~25% of the cars on the road are EVs. Most cars predate the EV transition. It'll take another ~20 years until ~95% of the cars on the road are EVs.
We'll have the battery capacity and charging capacity to allow 100% of cars sold in 2030 to be EVs. We only need 2 capacity doublings for batteries, and currently doublings happen every ~18 months. Charging capacity is even easier: we just need to increase electricity production by 1-2% per year for a couple of decades to support the transition to EVs.
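The growth claims above are easy to sanity-check (the 1.5% rate is just the midpoint of the comment's 1-2% range):

```python
# Battery output: 2 doublings at ~18 months per doubling.
months = 2 * 18
print(f"4x battery production in ~{months / 12:.0f} years")     # ~3 years

# Grid: 1.5% annual growth compounded over 20 years.
factor = 1.015 ** 20
print(f"Electricity production after 20 years: {factor:.2f}x")  # ~1.35x
```

So a modest compounding rate covers roughly a third more generation over the transition window, which is the order of magnitude EV charging is usually estimated to need.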
> No, mostly a misunderstanding. ~95% of all cars sold in Norway are EVs, yet only ~25% of the cars on the road are EVs
Norway is a tiny market which had big artificial tax/cost incentives to buy an EV. Norway could be 100% EV and it wouldn't make any dent to global adoption.
Norway is a tiny market that is an example of what most Western nations will look like in 2030 or 2035. Despite ~95% of sales being EVs, downtown Oslo is still noisy and stinky because the vast majority of cars are still ICE.
Contrast with China: downtown Shanghai has the vast majority of cars being EVs, despite the EV sales rate in China being only ~50%.
The existence of the "2030 deadline" was, and is, a significant factor by itself. (The current state would be less electrified without that arbitrary and over-optimistic fantasy deadline.)
That seems irrelevant to the question. If I took your prompt, my response would be: do you suggest we should run on thin margins of transmission capacity while we watch an explosion of demand coming at us?
If you forecast 100, build 20, and sell 10, you have oversupply relative to the demand that actually materialized, but a huge shortfall compared to the extrapolated demand that never arrived.
China, however, continuously builds out double the energy capacity it currently requires, only to find every two years or so that it actually did end up getting used.
Yes we should, which is yet another example of ineffective vision from the prior administration and outright incompetence and sabotage by the current one, regarding renewables and transmission infrastructure.
[1] https://cloud.google.com/blog/products/infrastructure/measur...