Artificial intelligence (AI) and sustainability are both being framed as the future of business. However, the energy-intensive nature of AI has exposed a tension between the two and fuelled growing concerns that the current state of AI is a sustainability disaster.
Data centres – the massive repositories of servers around the world that power the internet – have long been rapacious users of energy. In 2023 they accounted for 1-1.5% of global electricity use, according to the International Energy Agency, but the dawn of AI has really ramped up energy demand.
The specialised computer chips required for generative AI, which are designed to process vast amounts of data, use far more electricity than traditional chips. They also generate much more heat, meaning even more energy and water are needed to keep them cool.
By way of comparison, a ChatGPT-powered search consumes 10 times as much power as a search on Google without AI, according to analysis by the International Energy Agency. A recent study estimated that new AI servers manufactured in 2027 alone could consume 0.5% of the world's electricity – roughly what a small country uses in a year.
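To make that 10-times figure concrete, a quick back-of-envelope calculation shows how small per-query differences add up. The figures below are illustrative assumptions, not measured or disclosed values:

```python
# Back-of-envelope estimate of the extra energy implied by AI-assisted search.
# All figures below are illustrative assumptions, not measured values:
# ~0.3 Wh for a conventional web search, ~3 Wh for an AI-assisted one
# (roughly the 10x ratio cited by the IEA), and ~9 billion searches per day.

CONVENTIONAL_WH = 0.3      # assumed energy per conventional search (Wh)
AI_ASSISTED_WH = 3.0       # assumed energy per AI-assisted search (Wh)
SEARCHES_PER_DAY = 9e9     # assumed global daily search volume

extra_wh_per_day = (AI_ASSISTED_WH - CONVENTIONAL_WH) * SEARCHES_PER_DAY
extra_twh_per_year = extra_wh_per_day * 365 / 1e12  # Wh -> TWh

print(f"Extra demand: roughly {extra_twh_per_year:.1f} TWh per year")
# ~8.9 TWh/year under these assumptions - already in the territory of a
# small country's annual electricity consumption.
```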
Percentages aside, this surge in energy consumption presents very real issues for the regions that host data centres. The electricity grid in California, home to the Silicon Valley headquarters of several AI-led tech firms, is already struggling to keep pace with rising demand. More worryingly, blackouts during peak periods are becoming a real possibility.
Then there’s the issue of increased carbon emissions, which sits at odds with global net zero goals to limit warming. Data centre expansion driven by the rapid rise of generative AI models has led to a marked increase in the greenhouse gas emissions reported by tech companies.
Microsoft, an investor in OpenAI, the company behind ChatGPT, disclosed in its 2024 sustainability report that its CO2 emissions had risen nearly 30% since 2020; Google, meanwhile, reported that its emissions were almost 50% higher than in 2019.
Although the chips running AI are made in so-called ‘clean rooms’, the manufacturing process is actually quite a dirty business: it is very energy- and water-intensive and produces copious amounts of chemical waste.
Energy – AI takes, but may also give?
Of course, many of the arguments presented thus far fail to take into account the potential energy-saving benefits of AI; there is a balance to be struck and a need to see the bigger picture. For example, what of the sustainability gains to be made by using generative AI to replace hundreds of people in a call centre? Or by catalysing breakthroughs in green energy sources, such as nuclear fusion, or optimising solar and wind farms so that AI helps offset its own energy usage?
Human need is the mother of invention, and the need for more energy is no exception. The growth of AI may indeed be a game-changer, and tech companies are already pouring money into researching renewables; Microsoft, for example, announced just this week that it is working on a $30bn fund with BlackRock to develop renewable energy to power data centres.
With advances in computing power and the quest to improve chip efficiency, plus the growing use of quantum computing (which uses quantum mechanics to solve complex problems faster than classical computers can), it is possible that the energy required by these large AI models will fall. New technologies such as accelerators, 3D chips and chip-cooling techniques offer the promise of improved energy performance.
Similarly, we have seen innovative uses of data centre energy byproducts to reduce energy consumption elsewhere in the ecosystem; one local council, for example, is using heat from a data centre to warm its public swimming pool. Shifting AI workloads to times of lower energy demand, or siting data centres close to renewable energy sources in regions with spare grid capacity (for example, running jobs overnight in a desert location on stored solar power), can also lead to substantial energy savings.
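As a minimal sketch of what ‘shifting workloads’ might look like in practice – assuming a forecast of grid carbon intensity is available; the figures and function below are hypothetical – a scheduler can simply defer a flexible job to the cleanest hour of the day:

```python
# Minimal sketch of carbon-aware scheduling: defer a flexible AI workload
# to the hour with the lowest forecast grid carbon intensity.
# The forecast values below are hypothetical, purely for illustration.

hourly_forecast = {          # hour of day -> forecast gCO2 per kWh
    0: 180, 3: 120, 6: 90,   # overnight and early morning: cleaner grid
    9: 220, 12: 160, 15: 200,
    18: 310, 21: 260,        # evening peak: dirtiest hours
}

def best_hour(forecast: dict[int, float]) -> int:
    """Return the hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

if __name__ == "__main__":
    hour = best_hour(hourly_forecast)
    print(f"Schedule deferrable training job at {hour:02d}:00 "
          f"({hourly_forecast[hour]} gCO2/kWh)")
```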
A possible solution – transparency through regulation
One of the problems with addressing the energy and water consumption of AI is that it can be very difficult to quantify how much of these resources a model uses. A machine learning model’s energy usage depends on variables including its phase of development and its size. The training phase tends to be disproportionately energy intensive, and the upward trend in model size has also pushed energy usage higher.
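To see why the training phase dominates, a commonly used back-of-envelope approach multiplies accelerator-hours by average power draw and the data centre’s power usage effectiveness (PUE). The sketch below uses assumed figures purely for illustration; none are disclosed values for any real model:

```python
# Back-of-envelope training-energy estimate: accelerator-hours x power draw x PUE.
# Every figure here is an assumption for illustration, not a disclosed value.

NUM_GPUS = 1_000           # assumed accelerators used for training
TRAINING_DAYS = 30         # assumed wall-clock training time
GPU_POWER_KW = 0.7         # assumed average draw per accelerator (kW)
PUE = 1.2                  # assumed data centre power usage effectiveness

gpu_hours = NUM_GPUS * TRAINING_DAYS * 24
energy_mwh = gpu_hours * GPU_POWER_KW * PUE / 1_000  # kWh -> MWh

print(f"Estimated training energy: {energy_mwh:,.0f} MWh")
# ~600 MWh under these assumptions; doubling model size or training time
# scales the figure accordingly, which is why larger models cost more energy.
```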
The tech companies most involved in AI almost certainly have the data, but many shy away from disclosing exact details of their AI energy usage – partly due to concerns about competition as AI becomes more profitable, but also to deflect criticism and avoid comparison with the perceived wastefulness of cryptocurrencies.
The lack of meaningful information means it is incredibly difficult to work out how to improve the energy efficiency of AI. It also makes it nigh on impossible for people to make informed decisions about the sustainability benefits of new technologies compared with their traditional counterparts – for example, traditional search versus AI-enabled search.
Improved transparency through regulation – for example, requiring AI developers to disclose their energy usage – could go some way to addressing this issue. Earlier parliamentary versions of the EU AI Act came close to doing this, by including requirements for systems to be designed with the capability of logging their energy consumption and calling on providers to assess environmental impact throughout the system’s lifecycle.
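By way of illustration, logging energy consumption need not be onerous. The hedged sketch below assumes an NVIDIA GPU and the nvidia-smi utility, and simply samples power draw at intervals to build up an energy total:

```python
# Sketch of the kind of consumption logging the draft EU AI Act envisaged:
# periodically sample GPU power draw and integrate it into an energy total.
# Assumes an NVIDIA GPU with the nvidia-smi utility available on the system.

import subprocess
import time

def gpu_power_watts() -> float:
    """Read the current GPU power draw (W) via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])

def log_energy(duration_s: int = 60, interval_s: int = 5) -> float:
    """Sample power every interval_s seconds and return energy used in Wh."""
    energy_wh = 0.0
    for _ in range(duration_s // interval_s):
        energy_wh += gpu_power_watts() * interval_s / 3600  # W*s -> Wh
        time.sleep(interval_s)
    return energy_wh

if __name__ == "__main__":
    print(f"Energy used over the sampled minute: {log_energy():.2f} Wh")
```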
In the meantime, it’s fair to say that sustainability and AI are by no means mutually exclusive, but increased transparency and a healthy dose of human ingenuity will likely be needed to address the issues posed by the confluence of these two fields.