It'd be nice to have some official numbers from OpenAI, though.
It’s easy to slate AI in all its manifestations (trust me, I should know, I do so often enough), but some recent research from Epoch AI (via TechCrunch) suggests we might be a little hasty in trashing its energy use. Yes, that’s the same Epoch AI that recently dropped a new, difficult math benchmark for AI. According to Epoch AI, a typical ChatGPT query likely consumes just 0.3 Wh of electricity, “10 times less” than the popular older estimate, which put it at about 3 Wh.
Given that a Google search amounts to 0.0003 kWh (0.3 Wh) of energy consumption per search, and based on the older 3 Wh estimate, two years ago Alphabet Chairman John Hennessy said that an LLM exchange would probably cost 10 times more energy than a Google search. If Epoch AI’s new estimate is correct, a typical GPT-4o interaction actually consumes about the same amount of energy as a Google search.
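To make the comparison concrete, here’s that arithmetic as a quick sanity check (the figures are just the estimates quoted above, nothing official):

```python
# Back-of-envelope comparison of the per-query energy figures above (all in watt-hours).
GOOGLE_SEARCH_WH = 0.0003 * 1000  # 0.0003 kWh per search = 0.3 Wh
OLD_CHATGPT_WH = 3.0              # the older, widely cited ChatGPT estimate
NEW_CHATGPT_WH = 0.3              # Epoch AI's new estimate

print(f"Old ChatGPT estimate vs Google search: {OLD_CHATGPT_WH / GOOGLE_SEARCH_WH:.0f}x")  # 10x
print(f"New ChatGPT estimate vs Google search: {NEW_CHATGPT_WH / GOOGLE_SEARCH_WH:.0f}x")  # 1x
```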
Server energy use isn’t something that tends to cross most people’s minds while using a cloud service—the ‘cloud’ is so far removed from our homes that it seems a little ethereal. I know I often forget there are any additional energy costs at all, other than what my own device consumes, when using ChatGPT.
Thankfully I’m not a mover or a shaker in the world of energy policy, because of course LLM interactions consume energy. Let’s not forget how LLMs work: they’re trained on shedloads of data (consuming shedloads of energy in the process), and even once trained, every interaction still has to run through a gigantic model, however simple the instruction or query. That’s the nature of the beast. And that beast needs feeding to keep it up and running.
It’s just that, apparently, each interaction uses less energy than we originally thought: “For context, 0.3 watt-hours is less than the amount of electricity that an LED lightbulb or a laptop consumes in a few minutes. And even for a heavy chat user, the energy cost of ChatGPT will be a small fraction of the overall electricity consumption of a developed-country resident.”
Epoch AI explains that there are a few differences between how it worked out this new estimate and how the original 3 Wh estimate was calculated. Essentially, the new estimate uses a “more realistic assumption for the number of output tokens in a typical chatbot usage”, whereas the original assumed output tokens equivalent to about 1,500 words on average (tokens are essentially units of text, such as a word). The new estimate also assumes servers running at just 70% of peak power, and computation performed on a newer chip (Nvidia’s H100 rather than the A100).
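Epoch AI’s actual methodology is more involved than this, but to give a feel for how those assumptions stack up, here’s a rough sketch of the sort of back-of-envelope sum at play. Every number below is my own illustrative assumption, not a figure from Epoch AI:

```python
# Illustrative back-of-envelope maths for a per-reply energy estimate.
# All four inputs are my own assumptions for illustration, not Epoch AI's figures.

GPU_PEAK_WATTS = 700     # approximate peak draw of an Nvidia H100 (SXM)
UTILISATION = 0.70       # the "70% of peak server power" assumption
OUTPUT_TOKENS = 300      # a "more realistic" short reply, not ~1,500 words
TOKENS_PER_SECOND = 150  # assumed generation speed (illustrative)

generation_seconds = OUTPUT_TOKENS / TOKENS_PER_SECOND        # 2 seconds
energy_wh = GPU_PEAK_WATTS * UTILISATION * generation_seconds / 3600

print(f"~{energy_wh:.2f} Wh per reply")  # ~0.27 Wh, in the ballpark of 0.3 Wh
```

Change any of those assumptions (longer replies, busier servers, older chips) and the figure moves around a lot, which is exactly the uncertainty Epoch AI flags.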
All these changes (which seem reasonable to my eyes and ears) paint a picture of a much less power-hungry ChatGPT. However, Epoch AI points out that “there is a lot of uncertainty here around both parameter count, utilization, and other factors”. Longer queries, it says, could increase energy consumption “substantially to 2.5 to 40 watt-hours.”
It’s a complicated story, but should we expect any less? In fact, let me muddy the waters a little more for us.
We also need to consider the benefits of AI for energy consumption. A productive technology doesn’t exist in a vacuum, after all. For instance, use of AI such as ChatGPT could help bring about breakthroughs in energy production that decrease energy use across the board. And use of AI could increase productivity in areas that reduce energy in other ways; for instance, a manual task that would have required you to keep your computer turned on and consuming power for 10 minutes might be done in one minute with the help of AI.
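To put toy numbers on that last example (the laptop wattage is my own assumption, purely for illustration):

```python
# Toy comparison: a 10-minute manual task vs a 1-minute AI-assisted one.
# The laptop wattage is an illustrative assumption, not a measured figure.

LAPTOP_WATTS = 50        # assumed average draw of a laptop in use
CHATGPT_QUERY_WH = 0.3   # Epoch AI's per-query estimate

manual_wh = LAPTOP_WATTS * (10 / 60)                      # ~8.3 Wh for 10 minutes
assisted_wh = LAPTOP_WATTS * (1 / 60) + CHATGPT_QUERY_WH  # ~1.1 Wh for 1 minute + 1 query

print(f"Manual: {manual_wh:.1f} Wh vs AI-assisted: {assisted_wh:.1f} Wh")
```

Obviously you can pick numbers to make that sum say almost anything, which is rather the point: the per-query cost is small next to the device time it might save.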
On the other hand, there’s the cost of AI training to consider. But on the peculiar third hand (where did that come from?), the gains from large-scale LLM training are reportedly starting to plateau, which could mean less large-scale data training going forward. Plus, aren’t there always additional variables? With Google search, for instance, there’s the presumed cost of constant web indexing and so on, not just the search interaction and results page generation.
In other words, it’s a complicated picture, and as with all technologies, AI probably shouldn’t be looked at in a vacuum. Outside of the mathematician’s paper, energy consumption is never an isolated variable. Ultimately, what we care about is the health and productivity of the entire system: the economy, society, and so on. As always, such debates boil down to a cost-benefit analysis over many, many variables, and it’s difficult to get the full picture, especially when much of that picture depends on an uncertain future.
Which somewhat defines the march of capitalism, does it not? The back-and-forth of ‘but actually’ that characterises these discussions gets trampled under the boots of a technology that marches ahead regardless.
And ultimately, while this new 0.3 Wh estimate is certainly a pleasant development, it’s still just an estimate, and Epoch AI is very clear about this: “More transparency from OpenAI and other major AI companies would help produce a better estimate.” More transparency would be nice, but I won’t hold my breath.