ChatGPT currently consumes as much electricity as about 33,000 households. That amount of energy could supply 70,000 to 80,000 people for an entire year.
However, OpenAI's plans are of course premised on a future that looks like Google's present. Google's search queries alone consume a considerable amount of energy.
According to various sources, all of which cite a somewhat dated figure, each search query uses about 0.3 Wh, and roughly 8.5 billion search queries are made per day worldwide.
That adds up to almost 1 terawatt hour per year just to process search queries. Converted back into households, that is already 90,000 to 100,000 of them, roughly the electricity consumption of all households in Reno or Buffalo.
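The arithmetic behind these figures can be sketched quickly. The per-query energy and the daily query count come from the text; the average annual household consumption of about 10,000 kWh is an assumption made here for the conversion:

```python
# Back-of-envelope check of the search-query figures above.
wh_per_query = 0.3          # Wh per search query (from the text)
queries_per_day = 8.5e9     # searches per day worldwide (from the text)

wh_per_year = wh_per_query * queries_per_day * 365
twh_per_year = wh_per_year / 1e12

# Assumed average annual electricity use of a US household (not from the text)
kwh_per_household = 10_000
households = wh_per_year / 1e3 / kwh_per_household

print(f"{twh_per_year:.2f} TWh/year")   # ~0.93 TWh
print(f"{households:,.0f} households")  # ~93,000
```

Both results land where the article puts them: just under 1 TWh per year, and on the order of 90,000 households.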
Still, the figure for Google, gigantic as it is, is only about three times as high as the one for ChatGPT. And in both cases we are talking about just one facet of the respective company's operations.
According to Google's own company data, its total consumption actually comes to 6 terawatt hours, six times as much as the search queries alone, albeit worldwide. Measured against total US electricity consumption, for example, that figure corresponds to just over 0.1 percent.
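The 0.1 percent claim is easy to verify. Google's 6 TWh comes from the text; the figure of roughly 4,000 TWh for annual US electricity consumption is an assumption used here, a commonly cited ballpark:

```python
# Rough check of Google's reported consumption against US electricity use.
google_total_twh = 6      # from the text
us_total_twh = 4_000      # assumed ballpark, not from the text

share = google_total_twh / us_total_twh
print(f"{share:.2%}")     # ~0.15%, i.e. just over 0.1 percent
```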
However, according to OpenAI, a ChatGPT query uses 4 to 5 times as much energy as a conventional search query. On top of that, the number of search queries doubles every few years.
So if today's search volume were handled by ChatGPT and its competitors, the figure derived for Google could easily grow fivefold.
That would be 30 terawatt hours, an amount of electricity that all of Los Angeles, with its 4 million residents, gets by on. On the other hand, 30 TWh is only a tiny fraction of global electricity production, yet it would account for AI use worldwide.
And yet: if development continues on its current path, search queries come to be answered only by AI, and their number keeps growing at the same rate, the total will be closer to 60 or even 100 TWh. The potential for savings here seems undeniable.
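The scaling argument of the last two paragraphs can be sketched in the same fashion. Both inputs, Google's 6 TWh total and OpenAI's factor of 4 to 5, come from the text; the doubling step reflects the projected growth in query volume:

```python
# Sketch of the scaling argument: today's search volume handled by AI models
# that use 4-5x the energy per query, applied to Google's ~6 TWh figure.
google_total_twh = 6              # from the text
factor_low, factor_high = 4, 5    # per-query energy factor, per OpenAI

ai_search_twh = google_total_twh * factor_high   # ~30 TWh
with_doubling = ai_search_twh * 2                # ~60 TWh if query volume doubles

print(ai_search_twh, with_doubling)  # 30 60
```

The 100 TWh figure in the text would follow from further growth beyond a single doubling.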
Of course, the technology can also be used to save electricity. If I get better search results, I don't have to click through four or five websites; the first one will do. I can generate a photo instead of shooting and editing one myself. I can edit documents with colleagues and friends without traveling to meet them, which of course also saves energy (although no AI is required for that).
Nevertheless, it remains important to raise awareness that AI can and must be made more efficient. Simply trusting that enough cheap energy will be available could prove an all too convenient fallacy.
In this respect, Sam Altman's wake-up call is a welcome signal. His plan to balance everything out with nuclear fusion, on the other hand, seems rather naïve.