AI's energy use: Separating fact from fiction
Artificial intelligence is undoubtedly energy-intensive. Viewed from a broader perspective, however, the reality is less clear-cut.
Artificial intelligence (AI) attracts media and public attention for many reasons, and one of them is its impact on global energy consumption.
It is no wonder these concerns arise: AI not only requires a great deal of electricity but is also rapidly becoming part of everyday life. As Axios notes, within just two years generative artificial intelligence has reached 39% of households in the USA. For comparison, two years after their commercial introduction, personal computers were in approximately 15% of households, and the internet in just over 20%.
How much energy does AI require?
New data, however, indicates that AI is not the energy-hungry vampire it is often made out to be.
In Ireland, data centres already consume over 20% of electricity, and in some US states, it's about 10%. Although these examples are frequently cited, they are exceptions.
According to the International Energy Agency's (IEA) analysis, data centres worldwide currently consume about 2% of electricity. This covers not only artificial intelligence but all digital activity: from watching YouTube videos on a laptop to running a smart home.
More energy will be consumed by air conditioning
What does the future hold? The honest answer is: we don't know, because the development of AI depends on many factors.
The IEA's central projections indicate that between 2023 and 2030, the growth of data centres will increase electricity demand by about 800 terawatt-hours, while overall demand will rise by about 21,600 terawatt-hours. Data centres will thus account for only about 3% of this increase, and artificial intelligence will not be the sole cause even of that.
Among the drivers of that growth, data centres rank only eighth. The projected increase in electricity demand from electric vehicles is four times higher, from air conditioning three times higher, and from various industries more than ten times higher.
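The share quoted above follows from simple division of the two projected figures; the ratio is the same whichever energy unit they are expressed in. A quick check:

```python
# Projected electricity-demand growth for 2023-2030, as cited in the article.
data_centre_growth = 800    # added demand attributed to data centres
total_growth = 21_600       # added demand from all sources combined

# Data centres' slice of the overall increase, as a percentage.
share = data_centre_growth / total_growth * 100
print(f"Data centres' share of demand growth: {share:.1f}%")
```

This yields roughly 3.7%, a low single-digit share consistent with the "only about 3%" cited.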
"The world operates digitally now. Stop our internet services, and everything around us will collapse. A few per cent of the world's electricity to maintain this seems more than reasonable to me," comments Hannah Ritchie, a data analyst from Our World in Data, on her blog.
Efficiency is key
The future energy demand of AI heavily depends on further efficiency improvements, both in hardware and software.
"The efficiency of AI-related computer chips doubles roughly every 2.5–3 years, and a modern AI-related chip uses 99% less energy than a 2008 model to perform the same calculations. New cooling technologies are being developed, and AI models are becoming increasingly efficient," write Thomas Spencer and Siddharth Singh, IEA analysts.
Hannah Ritchie, however, points out that concerns about a sharp increase in energy demand by digital technologies have arisen before. For example, between 2010 and 2018, the computing power of data centres increased by over 550%, yet their energy consumption increased by only 6%.
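Those two percentages imply a steep fall in the energy needed per unit of computing over 2010–2018, which follows directly from their ratio:

```python
# Computing power up by over 550% (i.e. at least 6.5x), energy use up only 6%.
compute_factor = 1 + 550 / 100   # 6.5x the computing power
energy_factor = 1 + 6 / 100      # 1.06x the energy use

# Energy required per unit of computing, relative to the 2010 level.
energy_per_compute = energy_factor / compute_factor
print(f"Energy per unit of computing fell to about {energy_per_compute:.0%} "
      f"of its 2010 level")
```

In other words, each unit of computing needed roughly a sixth of the energy it had eight years earlier.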
Further uncertainties concern how we will use AI: generating video is far more energy-intensive than generating text. The technology's development may also be shaped by chip manufacturing capacity and by expectations of returns on investment.
On the other hand, the development of artificial intelligence can itself make energy systems more efficient and ease the introduction of innovations. This, too, must be factored into the overall calculation, and it carries its own uncertainties.
Mainly local significance?
Examples from some US states and Ireland show that data centres can indeed consume significant amounts of electricity. For this reason, cases where their expansion is blocked are becoming increasingly frequent.
"The development of data centres can lead to significant strain on local power grids, exacerbated by the huge mismatch between the fast construction times for data centres and the often slow pace of expanding and strengthening grids and generation capacity. There have already been cases where jurisdictions halted new data centre contracts due to the increase in applications. In regions or countries that are particularly affected, the increasing electricity consumption from data centres can hinder achieving climate objectives," assess IEA analysts.
Big Tech's recent surge of interest in nuclear energy is a direct consequence. Large corporations such as Microsoft and Nvidia hope that new nuclear plants will meet the local demand for powering data centres.
What does all this mean? Firstly: artificial intelligence will certainly pose a localised challenge. Secondly: while its global significance remains uncertain, it is unlikely to be as large as often portrayed.