Google Claims One Gemini AI Prompt Uses Five Drops of Water

How much does one Gemini prompt really cost the planet? Google says one Gemini app text prompt requires just 0.24 watt-hours of energy, emits 0.03 grams of CO₂, and uses 0.26 milliliters of water — roughly five drops.
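For scale, the "five drops" framing follows from a common rule of thumb of roughly 0.05 mL per drop (about 20 drops per millilitre); that drop size is an assumption for illustration, not a figure from Google's report.

```python
# Converting Google's stated per-prompt water use into "drops".
# The 0.26 mL figure is Google's; the ~0.05 mL drop size is an assumed
# rule of thumb (about 20 drops per millilitre), not from the report.
WATER_PER_PROMPT_ML = 0.26
ML_PER_DROP = 0.05  # assumed

print(f"{WATER_PER_PROMPT_ML / ML_PER_DROP:.1f} drops per prompt")  # ~5.2
```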

The tech giant paired these statistics with a proposed framework, released Thursday, that aims to bring consistency to how big tech reports AI’s environmental footprint.

“We’re hoping to foster greater industry-wide consistency in calculating the impact of AI,” said Savannah Goodman, Google’s head of Advanced Energy Labs, during a press briefing held ahead of the report’s release.

The transparency of Google’s report on AI’s environmental impact has drawn support from academics and researchers; it is rare for major generative AI companies to release resource-usage figures this specific. That said, Google’s overall emissions rose 11% in 2024, driven by greenhouse gas emissions in its supply chain, which cancelled out gains in Scope 1 and Scope 2.

Gemini prompts use far less energy and carbon after one year

Google measured the environmental footprint (electricity, water, carbon emissions) of an average Gemini query on the mobile app. The company estimated water use from the energy consumed per prompt, applying Google’s 2024 average fleetwide water usage effectiveness.

Similarly, emissions per prompt were estimated from the energy consumed per prompt, applying Google’s 2024 average fleetwide grid carbon intensity. For context, Gemini had 350 million monthly active users in April.
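Google has not published the exact formulas, but the description above implies a straightforward per-prompt accounting: multiply energy per prompt by a water usage effectiveness (WUE) factor and by grid carbon intensity. The sketch below is an illustration only; the 0.24 Wh energy figure is Google’s, while the WUE and carbon-intensity values are placeholders chosen to roughly reproduce the reported results, not numbers from the report.

```python
# Hedged sketch of the per-prompt water and carbon estimate described above.
# Only ENERGY_PER_PROMPT_WH (0.24 Wh) comes from Google's report; the WUE and
# grid-carbon-intensity values are assumed placeholders, not Google's figures.
ENERGY_PER_PROMPT_WH = 0.24        # Google's reported energy per app text prompt

WUE_L_PER_KWH = 1.0                # hypothetical fleetwide water usage effectiveness
GRID_INTENSITY_G_PER_KWH = 125.0   # hypothetical fleetwide grid intensity (g CO2e/kWh)

energy_kwh = ENERGY_PER_PROMPT_WH / 1000.0

water_ml = energy_kwh * WUE_L_PER_KWH * 1000.0    # litres converted to millilitres
carbon_g = energy_kwh * GRID_INTENSITY_G_PER_KWH  # grams of CO2e

print(f"water per prompt : {water_ml:.2f} mL")    # ~0.24 mL with these placeholders
print(f"carbon per prompt: {carbon_g:.3f} g CO2e")  # ~0.030 g with these placeholders
```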

Gemini’s median energy consumption per app text prompt fell 33-fold over a 12-month period, Google said. Likewise, its carbon footprint decreased 44-fold. (Google said those results are based on overall decreases in emissions from data center energy use.)

To pursue comprehensive efficiency, Google is trying to account for the full-stack impact of running AI — data centers, power delivery, cooling, networking, agent and application design, and more — as well as to shift machine learning and AI workloads away from peak hours.

The research focuses only on Gemini apps because adding other Gemini products — including the AI answers included in many Google searches — can turn the calculations into a case of “apples and oranges,” said Partha Ranganathan, Google vice president and engineering fellow, at the press briefing.

“The folks who come to the Gemini app have a very different modality of interaction than the folks who come to Google Search,” he said.

Google shared its methodology for measuring AI’s environmental impact

Measuring the impact of text prompts on Gemini apps is just the beginning of figuring out how to paint a comprehensive picture of AI’s energy use.

“We have little consensus on how to comprehensively measure the serving environmental impact of even text generation,” said Goodman. She noted that it’s an ongoing project to include the environmental impact of images, videos, or deep research in the calculations.

Energy used while training the AI isn’t included in the data, and independent organizations have not verified Google’s claims.

Google wants its methodology for determining those numbers to be adopted by other AI companies. The methodology, detailed in Google’s technical report, includes ways to measure:

  • Energy and water use of AI model infrastructure both in use and when idle.
  • Energy use by CPUs and RAM.
  • Energy use by cooling systems, power distribution, and other data center overhead.
  • Water consumption.

Measuring full-system dynamic power “includes not just the energy and water used by the primary AI model during active computation, but also the actual achieved chip utilization at production scale, which can be much lower than theoretical maximums,” wrote Amin Vahdat, vice president and general manager of AI and infrastructure at Google Cloud, and Jeff Dean, chief scientist at Google DeepMind and Google Research, in a blog post.
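Taken together, the listed components and the utilization point from Vahdat and Dean suggest a per-prompt accounting that attributes active accelerator energy, host CPU and RAM, idle reserved capacity, and data center overhead to each prompt served. The sketch below is a reconstruction under assumed numbers, not Google’s actual model; the function name and every value in it are hypothetical.

```python
# Illustrative per-prompt energy accounting in the spirit of the methodology above:
# active accelerators + host CPU/RAM + idle reserved capacity, all scaled by data
# center overhead (PUE). Every number below is an assumed placeholder.

def energy_per_prompt_wh(
    accelerator_active_wh: float,  # accelerator energy spent on one prompt
    cpu_ram_wh: float,             # host CPU and RAM energy attributed to the prompt
    idle_overhead_wh: float,       # share of idle/reserved machines amortized per prompt
    pue: float,                    # power usage effectiveness (cooling, power delivery, etc.)
    achieved_utilization: float,   # fraction of theoretical chip throughput actually achieved
) -> float:
    # Lower achieved utilization at production scale means each prompt carries
    # more accelerator energy than the theoretical maximum would suggest.
    active = accelerator_active_wh / achieved_utilization
    it_energy = active + cpu_ram_wh + idle_overhead_wh
    return it_energy * pue

# Hypothetical example values, chosen only to show the shape of the calculation.
print(energy_per_prompt_wh(
    accelerator_active_wh=0.10,
    cpu_ram_wh=0.04,
    idle_overhead_wh=0.03,
    pue=1.10,
    achieved_utilization=0.70,
))  # ~0.23 Wh under these assumptions
```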

Because this is the first time the research has been presented publicly, no other companies have signed on to use the same methodology, Goodman said. Google plans to continue to work with other companies, as well as with standards and policy bodies.

In other AI news, companies and utilities are debating who pays for the electricity powering giant data centers.
