Ever wonder how much energy it takes to produce the near-instant AI responses we are increasingly encouraged to depend upon? For those curious, the amount is tremendous. In 2022, training the AI model behind the first version of ChatGPT consumed a staggering amount of electricity: as much as 130 U.S. households use in an entire year. And that was three years ago. Last year, for the first time, the International Energy Agency (IEA) included projections for the electricity consumed by data centers, cryptocurrency, and artificial intelligence in its forecast of global energy use over the next two years. Taken together, the IEA projects that by 2026 these consumers will use roughly as much electricity as the entire country of Japan.
One of the fastest-growing energy hogs is the form of machine learning called generative AI. As previously noted, training a model the size of ChatGPT slurps up 1,300 megawatt-hours (MWh), again, enough to power 130 U.S. homes for a year. Even simple tasks eat power. The IEA reports that a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request needs 2.9 watt-hours, enough to run an old-school 60-watt incandescent bulb for about three minutes. If ChatGPT were integrated into the more than 9 billion searches performed worldwide each day, electricity demand would increase by roughly 10 terawatt-hours (TWh) a year, equal to the annual electricity use of 1.5 million European Union residents.
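For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python. It assumes only the figures quoted above (0.3 Wh and 2.9 Wh per query, 9 billion daily searches, 1,300 MWh of training energy) plus a 365-day year; the variable names are illustrative, not from any official source.

```python
# Back-of-envelope check of the energy figures quoted above.
# Assumptions (all from the article, except the 365-day year and
# the illustrative variable names): 0.3 Wh per Google search,
# 2.9 Wh per ChatGPT request, 9 billion searches per day, and
# 1,300 MWh to train a ChatGPT-sized model.

WH_PER_GOOGLE_SEARCH = 0.3   # watt-hours, per the IEA
WH_PER_CHATGPT_QUERY = 2.9   # watt-hours, per the IEA
SEARCHES_PER_DAY = 9e9       # worldwide daily searches
DAYS_PER_YEAR = 365
WH_PER_TWH = 1e12            # 1 terawatt-hour = 10^12 watt-hours

# Total annual demand if every search ran through ChatGPT.
total_twh = (WH_PER_CHATGPT_QUERY * SEARCHES_PER_DAY
             * DAYS_PER_YEAR / WH_PER_TWH)

# Net increase over today's conventional searches.
extra_twh = ((WH_PER_CHATGPT_QUERY - WH_PER_GOOGLE_SEARCH)
             * SEARCHES_PER_DAY * DAYS_PER_YEAR / WH_PER_TWH)

print(f"Total demand:   {total_twh:.1f} TWh/year")  # ~9.5 TWh
print(f"Net increase:   {extra_twh:.1f} TWh/year")  # ~8.5 TWh

# Sanity check on the training figure: 1,300 MWh spread across
# 130 homes works out to 10 MWh (10,000 kWh) per home per year,
# in line with average U.S. household consumption.
print(f"Per-home share: {1300 / 130:.0f} MWh/year") # 10 MWh
```

The total lands near 9.5 TWh, which rounds to the roughly 10 TWh cited above; the net increase over conventional searches comes out a little lower, closer to 8.5 TWh.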