Artificial intelligence (AI) is already changing how we live and work, and it has the potential to revolutionize industries and the world at large. It is expected to produce trillions in value (link resides outside ibm.com), doing everything from improving predictions of catastrophic weather events to speeding the discovery and delivery of lifesaving drugs.
Individuals use it as a virtual assistant and copilot. Companies and employees deploy it to achieve efficiencies in key areas such as customer service and finance.
In May, a McKinsey report (link resides outside ibm.com) found that the share of organizations using generative AI (gen AI) had nearly doubled to 65% in just 10 months. A recent IBM Institute for Business Value (IBV) study found that 77% of respondents felt that they needed to adopt gen AI quickly to keep up with customers.
The rapid growth in AI adoption has also resulted in dramatic increases in energy use. Energy is needed both to build and train AI models and then to power the complex math that a model completes each time it is asked for information or to generate content.
The International Energy Agency (IEA) has suggested (link resides outside ibm.com) that integrating AI into existing tools such as internet search engines might result in a tenfold increase in electricity demand. By 2030, the IEA projects the share of global electricity that powers data centers will double.
AI is not the first technology to raise energy consumption challenges. Cloud computing prompted similar concerns in the early 2000s, which were fortunately allayed through efficiency innovations (link resides outside ibm.com). Nevertheless, as AI adoption continues and businesses seek stable, affordable electricity from competitive energy markets, the topic is top of mind for many executives.
Yet, throughout this AI boom, many companies are still pursuing ambitious sustainability goals. Some 45% of S&P companies (link resides outside ibm.com) have made net-zero commitments, and Gartner reports that 42% of executives (link resides outside ibm.com) consider their sustainability efforts a key differentiator.
As a result, many companies are now facing a dual task: accounting for increased, AI-driven energy use in their sustainability goals, while supporting industry-wide efforts to make AI more energy-friendly.
No one expects AI adoption to slow, because too many companies and executives see it as an indispensable part of their future. Marrying these two ambitions, tapping AI's benefits while progressing on net-zero pledges, requires a smart approach.
Fortunately, numerous industry experts are working on a range of solutions. Those solutions include:
Power-capping hardware has been shown to decrease energy consumption by up to 15%, while only increasing the time it takes to return a result by a barely noticeable 3%.
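The appeal of power capping follows from simple arithmetic: energy is power multiplied by time, so trimming energy by 15% while runtime grows only 3% implies a meaningful drop in average power draw. A minimal back-of-envelope sketch, using only the figures quoted above (the function name is illustrative, not from any real tool):

```python
def capped_power_ratio(energy_saving: float, slowdown: float) -> float:
    """Average power under a cap, relative to uncapped power.

    Energy = power x time, so if energy falls by `energy_saving`
    while runtime grows by `slowdown`, average power scales by
    (1 - energy_saving) / (1 + slowdown).
    """
    return (1 - energy_saving) / (1 + slowdown)

# The 15% energy savings and 3% slowdown figures cited above
ratio = capped_power_ratio(energy_saving=0.15, slowdown=0.03)
print(f"Average power under the cap: {ratio:.1%} of uncapped")
```

In other words, the quoted trade-off corresponds to running hardware at roughly four-fifths of its uncapped power draw for a barely longer job.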
AI energy use can also be reduced by using carbon-efficient hardware (link resides outside ibm.com), which “matches a model with the most carbon-efficient mix of hardware,” according to MIT.
New and improved chips are another solution for energy issues. IBM recently released architecture details for its upcoming IBM Telum® II Processor and IBM Spyre Accelerator, which are designed to reduce AI-based energy consumption and data center footprint when released in 2025.
In general, larger models—such as generalist large language models (LLMs) used by ChatGPT and Google Gemini—require more energy (link resides outside ibm.com) than smaller ones. Such generalist models can be useful for wide-ranging consumer-facing needs, but for businesses with specific use cases (link resides outside ibm.com), IBM and other companies recommend smaller, more efficient, more affordable and less energy-hungry models.
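A rough rule of thumb helps show why model size matters so much at inference time: a transformer forward pass costs on the order of 2 FLOPs per parameter per generated token, so energy per response scales roughly linearly with parameter count. The sketch below is a back-of-envelope illustration only; the hardware-efficiency constant and model sizes are assumed values, not measurements:

```python
FLOPS_PER_JOULE = 5e11  # assumed accelerator efficiency (placeholder)

def joules_per_response(params: float, tokens: int) -> float:
    """Rough inference energy: ~2 FLOPs per parameter per generated token."""
    flops = 2 * params * tokens
    return flops / FLOPS_PER_JOULE

# Hypothetical 70B-parameter generalist vs. 7B-parameter specialist,
# each generating a 500-token response
large = joules_per_response(params=70e9, tokens=500)
small = joules_per_response(params=7e9, tokens=500)
print(f"large: {large:.0f} J, small: {small:.0f} J")
```

Under these assumptions, a tenfold reduction in parameters yields a tenfold reduction in energy per response, which is why matching model size to the use case is such a common recommendation.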
Existing methods of training models require significant energy because AI developers often use several previous models (link resides outside ibm.com) as a starting point to train new models. Running all these models increases the power required.
However, researchers are working to predict earlier which models are outperforming or underperforming expectations, so that underperforming training runs can be stopped early to save energy. This is part of the burgeoning “designing for sustainability” movement, which defines workload parameters to use energy more efficiently.
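The early-stopping idea can be sketched in a few lines: periodically compare each candidate run's validation score against the current leader and terminate runs that lag beyond a tolerance. This is a generic illustration of the technique, not IBM's or any researcher's actual method; the run names, scores and tolerance are invented:

```python
def prune_runs(scores_by_run: dict, tolerance: float = 0.05) -> set:
    """Return the names of training runs worth continuing.

    scores_by_run maps run name -> latest validation accuracy.
    Runs more than `tolerance` below the current best are stopped
    early, so no further energy is spent training them.
    """
    best = max(scores_by_run.values())
    return {name for name, s in scores_by_run.items() if best - s <= tolerance}

survivors = prune_runs({"run-a": 0.81, "run-b": 0.78, "run-c": 0.62})
print(sorted(survivors))  # run-c lags the leader by 0.19 and is stopped
```

In practice the comparison would repeat at checkpoints throughout training, with the tolerance tightened as runs mature, but the energy-saving logic is the same: stop paying for compute that is unlikely to produce the winning model.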
All companies should look to build or use data centers that are close to areas (link resides outside ibm.com) where renewable energy is abundant. Sourcing from green data centers, which run on renewable and sustainable energy, is an effective way to reduce environmental impact.
Companies operating in the AI space should not let excessive competition stand in the way of sharing tips and tools (link resides outside ibm.com) that can help society reap the benefits of AI models with fewer energy demands.
IBM has collaborated with Columbia University (link resides outside ibm.com) to produce meaningful solutions to the energy crisis, including modeling how AI behaves on different hardware, developing lower-power chips, eliminating software bloat and optimizing AI systems.
In addition to these various approaches, AI itself can help with problem-solving around its energy needs.
A recent IBM study found that 74% of companies surveyed in the energy and utility industry are embracing AI to tackle data-related challenges. This could help them increase efficiency, lessening their impact on the environment. From grid maintenance to load forecasting (link resides outside ibm.com), AI has the potential to have a huge impact on the energy industry, enabling energy to be delivered more efficiently to all other industries.
IBM has taken a leadership role in the clean energy transition, creating the Clean Electrification Maturity Model (CEMM) in coordination with APQC to help energy companies conduct maturity assessments to benchmark their results and hasten their energy transitions.
The same study showed that, by the end of 2024, 63% of businesses surveyed plan to apply gen AI in sustainable IT initiatives. Yet, only 23% are currently considering sustainability assessments during the design and planning stages of IT projects to a great extent. This needs to change.
It is good that there is already a robust discussion about energy use and AI, and hopefully, more breakthroughs for minimizing energy usage follow those we are already making.
IBM is especially committed to helping identify smaller, more effective models and smarter hardware to minimize energy usage. Improving AI while decreasing energy use creates even more opportunities to embed the technology into our daily lives. The likelihood that using AI can help the world solve its biggest environmental challenges makes the goal of minimizing energy use even more important.