Google recently published new data showing how much energy it takes to run its Gemini AI assistant. What makes this release important is not just the numbers, but how Google measured them.
How Google measures AI’s energy use
Most existing estimates of AI energy use, such as AIEnergyScore, focus only on the power used by the chips that perform the core computations. These approaches often simplify the picture by excluding other parts of the system, such as supporting processors, memory, and cooling infrastructure, and by using limited workloads. For example, some studies restrict batch sizes to just one prompt at a time. Even Google’s own chip-only estimate for Gemini draws from just the top 10% of its most energy-efficient data centers.
Google’s more comprehensive approach to calculating AI’s energy consumption shows that typical estimates can miss more than half of the energy actually used. For example, Google found that a single Gemini prompt in May 2025 used about 0.24 watt-hours of energy when measured comprehensively, compared to just 0.10 watt-hours using the narrower, chip-only method.
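The gap between the two figures is easy to work out from Google's published numbers. The sketch below uses only the two values cited above; the component labels in the comments are illustrative assumptions, not Google's published breakdown.

```python
# Google's May 2025 figures for a median Gemini prompt (watt-hours).
comprehensive_wh = 0.24  # full-system measurement: accelerators plus host
                         # processors, memory, idle capacity, and cooling
chip_only_wh = 0.10      # narrower, chip-only measurement

# Share of actual energy use the narrow method captures vs. misses.
captured = chip_only_wh / comprehensive_wh
missed = 1 - captured
print(f"Chip-only method captures {captured:.0%} of actual energy use,")
print(f"missing {missed:.0%} -- more than half.")
```

Running this shows the chip-only method captures roughly 42% of the energy a prompt actually consumes, which is why narrow estimates can miss more than half of the true footprint.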
Google’s latest disclosures offer a valuable benchmark for understanding the true energy use of AI inferencing. Their broader, system-wide measurement approach reveals that many existing estimates may significantly understate AI’s actual energy footprint. As the sector scales, relying on narrow metrics could lead to poor infrastructure planning and misguided sustainability strategies. Transparent and comprehensive measurement will be essential for guiding smarter decisions around energy, infrastructure, and climate impact.

Exhibit 1: Energy consumption of the median Gemini prompt (Watt-hours per prompt). (Source: Google, ADI Analytics)
Drastic improvements in energy efficiency
Over the past year, Google achieved a 33x reduction in energy use per prompt through a combination of software and hardware improvements. On the software side, Gemini models were redesigned for efficiency using techniques like Mixture of Experts, speculative decoding, and quantization. These changes allow the models to activate only the components needed for each prompt and to process data more efficiently. Google also introduced smaller, faster models like Gemini Flash that consume less electricity when handling user prompts.
On the hardware side, Google’s custom-built TPUs were co-designed with its models to maximize performance per watt. They also improved how prompts are grouped and scheduled across machines, which helped reduce idle time and made better use of available computing resources.
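One reason a 33x gain is plausible is that largely independent efficiency improvements compound multiplicatively. The toy model below illustrates this; the individual factors are hypothetical assumptions chosen for illustration, not Google's published attribution of its gains.

```python
# Hypothetical illustration: independent per-prompt efficiency gains
# multiply together. Each factor below is an illustrative assumption,
# NOT a figure Google has disclosed.
factors = {
    "Mixture of Experts (activate only needed weights)": 4.0,
    "Speculative decoding (fewer full-model passes)": 2.0,
    "Quantization (lower-precision arithmetic)": 2.0,
    "Smaller models (e.g., Gemini Flash)": 1.5,
    "TPU perf/watt, batching, and scheduling": 1.4,
}

combined = 1.0
for name, factor in factors.items():
    combined *= factor
    print(f"{name}: {factor}x (cumulative {combined:.1f}x)")
```

With these assumed factors, the cumulative improvement comes to roughly 34x, showing how several modest, stacked gains can reach the order of magnitude Google reports without any single breakthrough.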
These results show that there’s still significant headroom for improving the energy efficiency of AI systems. If similar gains are achieved across the industry, the long-term energy impact of AI could be far lower than many forecasts suggest. This raises important questions: Are current projections for AI-driven electricity demand overstated? How quickly can other companies replicate these kinds of improvements? Google’s progress suggests that innovation, not just scale, will shape the energy future of AI.
Connecting the dots: AI, energy, and infrastructure
At first, these two findings might seem to point in opposite directions. Google’s broader measurement approach shows that many estimates may not fully capture the total energy used to process AI prompts. At the same time, their rapid efficiency gains suggest that future energy demand could be lower than expected. Together, these insights reflect how quickly the AI energy landscape is evolving. They show that both the way we measure energy use today and the assumptions we make about tomorrow’s demand can shift significantly in a short time.
Google’s progress also underscores the need for smarter energy strategies as AI continues to grow. At ADI, we’re helping companies navigate this evolving landscape. Our work spans clean power sourcing, data center energy efficiency, and the growing role of nuclear energy in meeting AI-driven electricity demand. We also examine the water intensity of AI workloads and their impact on local communities. As AI’s energy footprint becomes a more pressing issue, better data and thoughtful design will be essential for building a more sustainable digital future.
– Piercen Hoekstra
About ADI Analytics
ADI is a prestigious, boutique consulting firm specializing in oil and gas, energy, and chemicals since 2009. We bring deep expertise across a broad range of markets, supporting Fortune 500, mid-sized, and early-stage companies as well as investors with consulting services, research reports, and data and analytics. Our goal is to deliver actionable outcomes that help our clients achieve tangible results.
We also host the ADI Forum, which brings C-suite executives together for meaningful dialogue and strategic insights across the oil & gas, energy transition, and chemicals value chains. Learn more about the ADI Forum.
Subscribe to our newsletter or contact us to learn more.