There are a number of important questions on AI and energy that we're working on at Google: How do we use AI to increase energy efficiency? How do we leverage AI to help address environmental and climate challenges? Can those approaches and their beneficial impact be scaled globally? At the same time, a key question is how much energy is needed to use generative AI tools and apps, and whether it can be reduced. To help answer this question, my colleagues and I did the math on the energy, carbon and water footprint per Gemini prompt. This is an area where the state of the art is rapidly evolving, but we believe a clear understanding is important to enable us and others to continue to improve efficiency.

Here's a quick summary: We estimate the median Gemini Apps text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (or about five drops) of water. One way to put the per-prompt energy impact in perspective: it's equivalent to watching TV for less than 9 seconds.

We're sharing our findings and methodology in our Technical Paper (link below), aiming to encourage industry-wide progress toward more comprehensive reporting on the energy use of AI.

We've been making rapid, ongoing progress on the efficiency of our AI systems. Over a period of 12 months, as we continued to deliver higher-quality responses using more capable models, the median energy consumption and carbon footprint per Gemini Apps text prompt decreased by factors of 33x and 44x, respectively. All this thanks to our full-stack approach to AI development, our latest data-center efficiencies, decades of experience serving software at scale (details in the Technical Paper) – and the work of many teams at Google. To be clear, there is much more still to do.
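The "less than 9 seconds of TV" comparison can be sanity-checked with simple unit arithmetic. A minimal sketch, assuming a TV that draws roughly 100 W (an illustrative figure of my choosing, not from the post or the paper):

```python
# Check the "TV seconds" comparison: convert the per-prompt energy (Wh)
# into seconds of TV viewing at an assumed TV power draw.
TV_POWER_W = 100.0        # assumed typical TV power draw (illustrative)
PROMPT_ENERGY_WH = 0.24   # median Gemini Apps text prompt, from the post

# Energy in Wh -> watt-seconds (x3600), then divide by power in W.
tv_seconds = PROMPT_ENERGY_WH * 3600.0 / TV_POWER_W
print(f"{tv_seconds:.1f} seconds of TV viewing")  # ~8.6 s, i.e. "less than 9 seconds"
```

At 100 W this works out to about 8.6 seconds, consistent with the "less than 9 seconds" framing; a more efficient TV would yield a longer equivalent, a larger one a shorter equivalent.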
We're continuing to improve efficiency at every layer of our operations (hardware, software, models, data centers), and we're continuing to invest in energy infrastructure, grid improvements, and clean energy (existing tech and next-gen initiatives such as geothermal and nuclear).

Thanks to my co-authors on this study: Cooper Elsworth, Tim Huang, Dave Patterson, Ian Schneider, Robert Sedivy, Savannah Goodman, Ben Townsend, Partha Ranganathan, Jeff Dean, Amin Vahdat, and Ben Gomes.

If you'd like to learn more, here are some resources:
Blog post on our approach to energy innovation and AI: https://lnkd.in/gh8HX7zR
Blog post on the math behind the numbers we shared today: https://lnkd.in/gQgq-Eir
Technical Paper: https://lnkd.in/g8T88xMf
Currently designing an AI agent system for sales, I'm looking at energy per completed task, because an efficient failure is still a failure. I'm seeing the best results with narrow agents that nail it on the first try. I'd love to hear more about case studies measuring total energy cost per successful outcome.
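The metric this comment describes can be sketched in a few lines: if failed attempts still burn energy, the expected cost per successful outcome is the per-attempt cost divided by the success rate. The numbers below are purely illustrative (reusing the post's 0.24 Wh figure as a stand-in per-attempt cost; real agent tasks may cost far more):

```python
# Illustrative sketch of "energy per completed task": failures consume
# energy too, so a lower success rate inflates the effective cost.
def energy_per_success(energy_per_attempt_wh: float, success_rate: float) -> float:
    """Expected energy to reach one successful outcome.

    With independent retries, the expected number of attempts per
    success is 1 / success_rate.
    """
    assert 0.0 < success_rate <= 1.0
    return energy_per_attempt_wh / success_rate

# A narrow agent that usually nails it first try vs. a broad one that retries:
narrow = energy_per_success(0.24, 0.90)  # ~0.27 Wh per completed task
broad = energy_per_success(0.24, 0.30)   # 0.80 Wh per completed task
```

This is why "efficient failure is still a failure": a threefold drop in success rate triples the effective energy per outcome, regardless of how cheap each individual attempt is.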
James Manyika The transparency here matters. Industry-wide reporting is the only way to drive real efficiency gains.
Big respect to Google for leading with transparency and responsibility in AI efficiency 👏. Truly inspiring work toward a sustainable, scalable AI future 🚀🙌
This level of transparency on Google’s AI energy footprint is market-shaping -- thanks for sharing! These metrics begin setting baselines for trust, policy, and how fast responsible adoption can scale.
Very impressive and well articulated! Efficiency is what lets users enjoy AI tools without friction or difficulty. Thanks for sharing, James Manyika
Thank you for sharing, James
Interesting stats! It would be great to see a comparative table with other models
Sizani Weza - I love the ability to quantify the amount of energy being used for Gemini in realistic terms - the number of seconds of TV watched. It's also great that the conversation on AI is moving toward applying it to world challenges such as environmental and climate change. It's a hard sell, I personally believe, but if AI tech can be made more efficient, then it should be doable.
Thanks for sharing, James