It’s no secret that artificial intelligence (AI) has quickly become a part of everyday life for most of us. Whether you need help drafting an email, planning a trip, or answering a simple question, it’s easy to type a prompt and get an instant response. But behind that everyday convenience is a massive surge in computing power for model training and inference.
All that computing power requires energy. A data center is a specialized facility that houses the computer systems and equipment needed to store, manage, and process data and is powered by the local electrical grid. Most power grids around the world still rely heavily on fossil fuels (coal, natural gas, oil) for power generation, which is a significant source of emissions.
Electricity usage on a national or global scale is measured in terawatt-hours (TWh); 1 TWh is roughly enough to power a medium-sized town of about 100,000 people for a year. In 2024, US data centers consumed 183 TWh of electricity, equivalent to the annual electricity use of an entire state such as Arizona or Washington. And that total is projected to increase by 133% by 2030.
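For scale, the growth projection above can be worked out directly from the figures quoted here (a back-of-the-envelope calculation, nothing more):

```python
# US data center electricity use, from the figures in this article:
# 183 TWh in 2024, projected to grow by 133% by 2030.
consumption_2024_twh = 183
growth_fraction = 1.33  # a 133% increase

projected_2030_twh = consumption_2024_twh * (1 + growth_fraction)
print(f"Projected 2030 consumption: {projected_2030_twh:.0f} TWh")  # ~426 TWh
```

In other words, 2030 demand would land at well over double today’s figure, roughly the annual electricity use of two more Arizonas.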
The adoption of AI is accelerating in almost every sector, and each new application is increasing energy demand. Businesses are leveraging AI to make their operations more efficient and sustainable, but efficiency alone won’t offset the carbon challenges AI creates. Without access to low-carbon energy solutions, AI expansion could derail rather than support global sustainability efforts.

Fortunately, there are opportunities to meet AI’s growing energy demands in ways that align with long-term carbon constraints. That means looking beyond the walls of data centers and focusing on reducing the carbon intensity of the energy grids that power them and the infrastructure that society relies on.
Responsible progress starts with better visibility into how different AI applications drive energy use and environmental impact. By understanding where the greatest pressures are coming from and what trade-offs they make, organizations can make more informed choices about how to sustainably scale AI. Doing so will facilitate the continued adoption of AI in nearly every industry in the world, while reducing environmental impact and, importantly, making business operations more efficient and cost-effective.
Why demand for AI will continue to grow despite environmental concerns
AI is being incorporated into almost every sector and field. In healthcare, AI enables personalized medicine, improves diagnostic accuracy, and accelerates drug discovery. In supply chains, AI supports demand forecasting, inventory optimization, and more efficient logistics planning. Self-driving vehicles and AI-powered traffic management systems provide safer and more efficient transportation options. In agriculture, AI-driven precision farming is increasing crop yields and optimizing irrigation and fertilization practices.
These applications of AI only scratch the surface. As energy requirements increase with each new use case and adoption expands globally, the cumulative impact on power consumption will grow. Even as individual models become more efficient and require less computing power, the overall demand curve will trend upward as AI becomes more accessible and pervasive. This is known as the “rebound effect.”

A rebound effect occurs when improved efficiency makes a technology more available and affordable, increasing overall usage. In the context of AI, even if cooling systems become more efficient or models require less energy per task, total energy consumption can still rise because the number of tasks grows faster than per-task energy falls.
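The rebound effect is easy to see in a simple worked example. The numbers below are hypothetical, chosen only to illustrate the dynamic: per-task energy is cut in half, but cheaper access quadruples usage.

```python
# Illustrative rebound-effect arithmetic (hypothetical numbers).
energy_per_task = 1.0   # arbitrary baseline energy units per task
tasks = 1_000_000       # baseline number of tasks

baseline_total = energy_per_task * tasks

improved_energy_per_task = energy_per_task * 0.5  # 50% more efficient
increased_tasks = tasks * 4                       # usage quadruples

rebound_total = improved_energy_per_task * increased_tasks
print(rebound_total / baseline_total)  # prints 2.0 -- total demand doubles
```

Despite each task using half the energy, total consumption doubles. This is why per-model efficiency gains alone cannot be assumed to reduce aggregate demand.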
The sustainability challenge behind AI
Training large language models (LLMs) and serving inference for even simple ChatGPT-style queries requires significant energy and increases demands on other resources, both directly and indirectly.
These processes require huge amounts of computing power, which leads to high power consumption and has a direct impact. Data center cooling systems consume large amounts of water to maintain optimal temperatures, adding another layer of resource usage. In regions already facing water scarcity, this places further stress on local supplies.
Indirect impacts extend beyond electricity and water. Hardware manufacturing relies on mining rare earth minerals and other critical resources, practices that can damage ecosystems and cause pollution. The rapid turnover of hardware compounds the problem by generating e-waste, which is difficult to recycle and often contains hazardous substances that can leach into soil and water if not managed responsibly. Data centers can also strain local power grids, increase dependence on fossil fuel plants, amplify pollution in surrounding areas, and create public health burdens similar to those seen with oil and gas.
While AI will not cause oil spills, unchecked energy demand could cause similar systemic harms, including pollution, resource depletion, and public health challenges concentrated in vulnerable regions. These similarities are a reminder that technological advances without sustainability plans can repeat old mistakes with long-term impacts on people and places. Avoiding this trajectory requires solutions that reduce AI’s growing energy demands and the waste it creates, without slowing down innovation.
Building blocks for responsible AI growth
AI sustainability challenges cannot be solved by technology alone. As AI adoption increases, more people across industries are starting to focus on its environmental impact and how to better measure it. This includes understanding how much energy AI systems use and finding ways to report emissions more consistently. A standard way to track these impacts makes it easier for organizations to compare progress, understand where improvements are needed, and make better decisions. Without a clearer way to measure carbon impact, it will be more difficult to assess whether new efficiencies are actually working.
Progress also depends on how well different groups work together. It becomes difficult to scale AI responsibly when approaches vary widely across regions and industries. But when researchers, companies, and public institutions share their learnings, they can help create better practices to understand and reduce the environmental impact of AI. Over time, this type of collaboration can also support renewable energy integration, carbon-aware computing, and more efficient hardware efforts.
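To make “carbon-aware computing” concrete, here is a minimal sketch of one common pattern: shifting a deferrable batch workload to the hour of the day when forecast grid carbon intensity is lowest. The forecast values below are hypothetical placeholders, not real grid data.

```python
# Minimal carbon-aware scheduling sketch.
# forecast maps hour of day -> forecast grid carbon intensity (gCO2/kWh).
# Values are hypothetical; real deployments pull these from a grid API.
forecast = {
    "00:00": 420,
    "06:00": 380,
    "12:00": 210,  # midday dip, e.g. from solar generation
    "18:00": 450,
}

# Run the deferrable job in the cleanest forecast hour.
best_hour = min(forecast, key=forecast.get)
print(f"Schedule job at {best_hour} ({forecast[best_hour]} gCO2/kWh)")
```

Even this simple time-shifting can meaningfully cut emissions for workloads, like model training or batch inference, that don’t need to run immediately.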
Sustainability is also becoming part of a broader conversation about ethical and trustworthy AI. Environmental impact is now considered alongside equity, accountability, and reliability. This change will encourage clearer communication about the energy demands of AI systems, as well as more attention to hardware and software choices that impact waste and resource use.
Education ties all of this together. Developers can benefit by learning how to build and deploy models that use fewer resources. Business leaders and policymakers need a clearer understanding of how the implementation of AI will impact energy use and emissions. And as the public becomes more aware of these impacts, the community will be ready to engage in conversations about how digital technologies can support more sustainable infrastructure.
Today’s reality and what it means for tomorrow
The rapid increase in the use of AI is making it difficult for many technology companies to meet their carbon reduction targets. While AI can help other industries reduce emissions, its own environmental impact is rapidly increasing. If data centers continue to rely on fossil fuel-based power grids, emissions will continue to rise and climate impacts will last for decades.
To understand your AI footprint, you need to look beyond your daily power usage. A life cycle assessment provides a more complete picture by examining every step of the process, including how raw materials are extracted, how the hardware is manufactured, how much power and cooling systems are required during operation, and what happens to the equipment at the end of its useful life. Without this broader perspective, improvements in AI model efficiency or data center performance can mask impacts occurring earlier or later in the chain. A full-systems perspective increases the credibility of sustainability claims and helps organizations make informed decisions as AI adoption increases.
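The life cycle stages described above lend themselves to a simple accounting roll-up. The sketch below shows the structure of such an assessment; the stage values are hypothetical placeholders, since real figures come from a formal life cycle assessment, not this illustration.

```python
# Sketch of a life cycle assessment roll-up for an AI hardware fleet.
# Stage values (tonnes CO2-equivalent) are hypothetical placeholders.
lifecycle_tco2e = {
    "raw material extraction": 120.0,
    "hardware manufacturing": 340.0,
    "operation (power + cooling)": 2600.0,
    "end of life / e-waste": 45.0,
}

total = sum(lifecycle_tco2e.values())
operational_share = lifecycle_tco2e["operation (power + cooling)"] / total
print(f"Total: {total:.0f} tCO2e, operational share: {operational_share:.0%}")
```

The point of the structure is the denominator: reporting only the operational stage would overstate progress from efficiency gains while hiding upstream manufacturing and downstream e-waste impacts.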
Over time, the sustainability of AI will depend on reducing the carbon intensity of the energy that powers it. Therefore, expanding energy generation, improving storage capacity, and modernizing transmission infrastructure must proceed in parallel with increased operational efficiency, improved reporting standards, and smarter system design.
With steady investment and cross-sector collaboration, AI can advance in ways that support long-term climate goals and generate benefits that extend beyond the technology industry.
—Dr. Anastasia Behr is Senior Director of Sustainability Science and Technology at UL Solutions. Dr. Young Lee is Principal Engineer for Artificial Intelligence at UL Solutions.
