How to address AI's growing energy needs
18 November 2024
By AJ Dellinger, Tech Reporter, IBM

In the nearly two years since the release of OpenAI’s ChatGPT, the adoption of artificial intelligence has grown exponentially. Organizations’ reported use of generative AI nearly doubled from 2023 to 2024, according to a McKinsey Global Survey, and Bloomberg Intelligence expects the generative AI market to grow by more than 40% annually over the next decade.

That growth does not come without a cost. As demand for AI implementation grows, so, too, does the need for data centers capable of processing a high volume of complicated computations and requests. McKinsey predicts that global demand for data center capacity could rise by as much as 22% annually for the rest of the decade.

As those data centers come online, they will use a considerable amount of energy. In fact, McKinsey estimates that data centers’ energy demand could reach 298 gigawatts by 2030, roughly the amount needed to power more than 200 million homes. Goldman Sachs projects a 160% increase in data center power demand by 2030, with data centers expected to consume as much as 4% of all energy generated by the end of the decade.

The stress of AI’s energy needs

“As AI adoption grows, the demand for data centers is likely to rise, leading to an increase in energy use and potentially placing further pressure on electrical grids, as they already look to meet the needs of electrification and decarbonized energy,” says Phil Spring, Senior Partner and EMEA Energy & Resources Industry Leader at IBM Consulting.

Some places are already feeling that stress. Northern Virginia, home to an explosion of data centers over the past half-decade, has seen demand come close to outpacing capacity. Similar problems have plagued parts of Georgia and Arizona.

Nor is the strain limited to the US. Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam and founder of the research company Digiconomist, points to Ireland as an example of a grid under pressure from data centers.

For the most part, de Vries says, electricity grids should hold up in the short term because “device manufacturing and data center construction takes time, and the pressure of AI on electricity grids should be limited right now.” However, he notes that the rapid adoption of AI, and the expansion of infrastructure to support it, will likely present problems in the future if not properly addressed.

Can AI solve itself?

Some believe the tool best suited to curbing AI’s energy demands is AI itself. Earlier this year, NVIDIA CEO Jensen Huang argued that investment in the technology would lead to “better energy, better carbon-capture systems, better energy-generating materials” and, ultimately, a grid capable of supporting AI energy demands.

Former Google CEO Eric Schmidt has advocated for the rapid development of AI, even at the cost of abandoning current climate goals, arguing that AI itself is more likely to find solutions to the problem than conservation. “I’d rather bet on AI solving the problem than constraining it and having the problem,” he said during an AI summit earlier this year.

Spring agrees that AI tools have a place in the process of improving sustainability. “AI and generative AI in particular hold great promise in accelerating sustainability initiatives,” he says. “With its ability to accelerate the conversion of data into actionable insight, it provides the tools necessary to make more informed, sustainable decisions in the context of a low-carbon energy system.” He adds, however, that to effectively reap the benefits of AI, companies deploying the technology must take “a strategic approach that considers its broader environmental implications as well.”

The energy sector is embracing AI as a potential solution. A recent IBM study found that 63% of energy and resources executives expect to realize value from generative AI and automation. The hope is that the technology can help mitigate increasing energy demands—not just from AI, but from other major energy consumers, including electric vehicles.

De Vries, for his part, is skeptical about AI’s capacity to solve all of the energy-related issues that it creates. “This would be wishful thinking,” he says, explaining that “historically, efficiency gains have also often led to increased demand, sometimes resulting in more demand for resources in total than before the efficiency gains were realized.” De Vries says it would be shortsighted to bet on future efficiency while ignoring the real effects that AI’s growing demand is already having on energy production and consumption.

Solutions for slashing AI energy needs

Experts say that there are a number of ways to reduce AI’s environmental impact. But addressing the issue, Spring says, requires “making more sustainable choices at every step of the AI lifecycle.”

One approach that can help limit energy demands is to use foundation models: general-purpose models that are trained once on broad data and then adapted to many different tasks through lightweight fine-tuning, rather than training a new model from scratch for each use case.

Spring points to IBM’s geospatial foundation model, developed in collaboration with NASA, which organizations can fine-tune to track deforestation, detect greenhouse gases or predict crop yields without having to train a new model from scratch.

“It’s also important to keep in mind model size,” Spring says. “Bigger isn’t always better. In fact, smaller models trained on high-quality, curated data can often achieve similar or even better results, while being more energy-efficient.”
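The energy stakes of model size can be made concrete with a common back-of-envelope heuristic: training compute is often approximated as roughly 6 × parameters × training tokens in floating-point operations. The sketch below turns that into a rough energy estimate; the hardware figures, utilization rate and model sizes are illustrative assumptions, not numbers from this article.

```python
# Back-of-envelope training-energy estimate using the common
# heuristic: training FLOPs ~= 6 * parameters * tokens.
# All hardware numbers below are illustrative assumptions.

def training_energy_kwh(params, tokens,
                        gpu_flops=1e15,    # ~1 PFLOP/s sustained throughput (assumed)
                        gpu_power_kw=0.7,  # per-accelerator power draw in kW (assumed)
                        utilization=0.4):  # fraction of peak throughput achieved (assumed)
    flops = 6 * params * tokens                 # total training compute
    seconds = flops / (gpu_flops * utilization)  # wall-clock accelerator time
    return gpu_power_kw * seconds / 3600         # energy in kWh

# A hypothetical 7B-parameter model vs. a 70B model, same training data:
small = training_energy_kwh(7e9, 1e12)
large = training_energy_kwh(70e9, 1e12)
print(f"7B model:  {small:,.0f} kWh")
print(f"70B model: {large:,.0f} kWh")
print(f"ratio: {large / small:.0f}x")
```

Under this heuristic, energy scales linearly with parameter count for a fixed dataset, which is the arithmetic behind Spring’s point: a model a tenth the size trained on the same curated data needs roughly a tenth of the training energy.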

Choosing the right processing location can also help to mitigate excess energy demands. Spring says a hybrid cloud approach can help organizations reduce energy consumption and provide additional visibility into overall usage, allowing them to make more informed decisions about how to run AI technologies.

“Situating data centers in areas with renewable energy sources can lead to significant energy savings and a reduced carbon footprint,” Spring says, noting that 74% of the electricity IBM data centers consumed in 2023 came from renewable sources.

Ultimately, the rise of AI presents significant opportunities for advancing sustainability. But it also creates a pressing dilemma: balancing the transition to clean energy with AI’s growing energy demands. Despite this challenge, 90% of business leaders still believe that AI can have a positive impact on sustainability goals. Moving forward, the key will be managing AI’s energy consumption responsibly, harnessing its potential in ways that contribute to, rather than detract from, a sustainable future.