As AI energy demand grows, so should corporate transparency

The rise of commercial AI systems has produced striking headlines: every email ChatGPT drafts is said to pour out a bottle of water, and each prompt it answers is said to consume roughly as much electricity as powering a light bulb for about 20 minutes.

As the technology becomes more integrated into everyday life, researchers are trying to quantify its environmental impact. Yet while concern about AI's growing energy demand and ubiquity mounts, there is no coordinated effort among policymakers to track or regulate the industry's footprint. This is largely due to a lack of relevant data and reporting from companies in the technology and energy sectors.

What are AI's energy costs, and where do they come from?

The direct comparisons used in news articles highlight only one aspect of AI's energy demand. The technology's ecological impact spans the months, sometimes years, it takes to train and deploy AI models, and the different stages of development rely on specialized hardware with varying, but generally large, energy needs.

During the training phase, models "learn" by running algorithms over vast amounts of data. The graphics processing units (GPUs) used in this process frequently run 24 hours a day, producing an energy demand that rises quickly with the complexity of the model.

For example, researchers from Google and UC Berkeley estimated that training OpenAI's GPT-3 model consumed enough electricity to power around 120 US homes for a year. The lack of access to this type of data makes the power demand of GPT-4 harder to calculate, but researchers estimate it likely required about 50 times more electricity.

One estimate based on International Energy Agency (IEA) data suggests that a ChatGPT query uses almost 10 times more electricity than a standard Google search. These numbers will only grow as more users adopt generative AI tools in place of traditional search engines. In fact, researchers estimate that if Google handled its roughly 9 billion daily searches with AI, it could require 23 to 30 times the energy of a regular search.
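The rough arithmetic behind the "almost 10 times" comparison can be sketched as follows. Both per-query figures below are commonly cited IEA-based estimates and are assumptions here, not numbers stated in this article.

```python
# Back-of-envelope comparison of per-query energy use.
# Both figures are assumed, commonly cited IEA-based estimates.
google_search_wh = 0.3   # assumed watt-hours per conventional Google search
chatgpt_query_wh = 2.9   # assumed watt-hours per ChatGPT prompt
ratio = chatgpt_query_wh / google_search_wh
print(f"A ChatGPT query uses roughly {ratio:.1f}x the energy of a search")
```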

Some experts, such as Sasha Luccioni, Hugging Face's AI and climate lead, have built an "AI Energy Score" dashboard to report these differences; it shows a nearly 62,000-fold gap between the best and worst energy needs across different models and use cases. The energy needed to generate text, images, and video varies enormously, with video generation the most energy-intensive. MIT Technology Review recently found that producing a single 5-second video takes an AI model around 3.4 million joules, roughly the energy needed to run a microwave for over an hour.
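As a sanity check on the video-generation figure above, the 3.4 million joules can be converted into microwave runtime. The 900 W microwave rating is an assumed typical value, not a figure from the article.

```python
# Express 3.4 MJ per 5-second generated video as microwave runtime.
# The 900 W microwave power draw is an assumed typical value.
video_energy_j = 3.4e6   # joules per 5-second generated video
microwave_w = 900        # assumed microwave power draw, in watts
minutes = video_energy_j / microwave_w / 60
print(f"Equivalent to running the microwave for about {minutes:.0f} minutes")
```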

Much of this computation happens in data centers, which is why data centers consumed 4.4% of US electricity demand in 2023, a share that may grow to 6% by 2026. Globally, these centers account for 1-2% of energy needs, but given the growing demand for AI, some expect that figure to reach 21% by 2030.

This does not account for the water needed to cool the hardware. Some UK researchers warn that global water use for data processing could reach half that country's water usage by 2027.

Maintaining the physical infrastructure itself, including computing hardware, storage systems, and network equipment, adds further indirect energy costs.

There are also end-of-life costs. A GPU is typically used for about four years before being discarded or repurposed. There is currently little information on how these devices are disposed of, or what the environmental impact of this accumulating waste will be.

A lack of data obscures AI's energy consumption

These estimates are further complicated by the fact that technology and electric companies typically do not release energy or water consumption data and are under no obligation to disclose it.

Most estimates are uncertain because of insufficient source data. Google, Microsoft, and Meta have declined to share per-prompt energy figures for their AI models. Recently, OpenAI CEO Sam Altman wrote that the average query uses about 0.34 watt-hours, roughly what a light bulb uses in a few minutes, and about 0.000085 gallons of water. But these numbers reflect only per-query energy use and exclude the much larger energy demands of model training.
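To see what Altman's per-query figures imply in aggregate, they can be scaled to a daily query volume. The 1 billion queries per day below is an illustrative assumption, not a figure from the article.

```python
# Scale the stated per-query figures to an assumed daily query volume.
# The 1 billion queries/day volume is an illustrative assumption.
wh_per_query = 0.34              # watt-hours per query (Altman's figure)
gal_per_query = 0.000085         # gallons of water per query (Altman's figure)
queries_per_day = 1_000_000_000  # assumed illustrative volume

daily_mwh = wh_per_query * queries_per_day / 1e6   # megawatt-hours per day
daily_gallons = gal_per_query * queries_per_day
print(f"{daily_mwh:.0f} MWh and {daily_gallons:,.0f} gallons of water per day")
```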

Efficiency gains in AI energy consumption

This level of consumption is not lost on those leading the charge. OpenAI CEO Sam Altman admits that AI is consuming more power than expected.

Efficiency has improved somewhat, but these gains have recently plateaued. Larger data centers offer greater opportunities for energy efficiency, as exemplified by the construction of "hyperscale" centers. Other examples include hardware-level improvements such as power capping, which limits the amount of power supplied to processors and GPUs. This technique has been shown to reduce energy consumption by up to 15% with minimal impact on user experience.

Other efficiencies come from building smaller models, pruning or quantizing them, designing better algorithm architectures, switching to renewable energy sources, increasing collaboration on AI between companies, or using AI to identify possible improvements to itself and other models. For example, Google DeepMind said it managed to reduce energy consumption in its data centers by 30% by using AI to better predict cooling needs.

However, these improvements often require running additional AI, which itself increases energy consumption even after the new efficiencies are taken into account. And lower energy costs for AI systems can produce a rebound effect: more efficient technology and lower production costs drive greater demand and adoption, and consumption rises again. This cycle is sometimes called the Jevons paradox, a term first coined when steam engines became more efficient.
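The rebound dynamic described above can be illustrated with a toy calculation: even if per-query energy falls 20% a year, total consumption still rises if adoption grows 50% a year. All numbers here are illustrative assumptions, not figures from the article.

```python
# Toy illustration of the Jevons paradox: per-unit efficiency improves,
# but total consumption rises because demand grows faster.
wh_per_query = 0.34         # starting per-query energy (watt-hours), assumed
queries_per_day = 1_000_000 # assumed starting daily volume
baseline_kwh = wh_per_query * queries_per_day / 1000

for year in range(1, 6):
    wh_per_query *= 0.80                          # assumed 20% annual efficiency gain
    queries_per_day = int(queries_per_day * 1.5)  # assumed 50% annual demand growth
    total_kwh = wh_per_query * queries_per_day / 1000
    print(f"year {year}: {total_kwh:,.0f} kWh/day")

# Despite a cumulative ~67% efficiency gain, total_kwh ends well above baseline_kwh.
```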

Unfortunately, the impact of rising energy demand is not distributed equally. Some regions and states already expend far more resources than others. Virginia, home to "Data Center Alley," provides an example: it hosts over 300 data centers, the largest number in the US. Residents are fighting the continued expansion of local data centers, rising energy bills, increased water demand, and deteriorating air quality from the facilities' backup diesel generators.

Further south, the NAACP recently sued Elon Musk's xAI for allegedly operating turbines for a data center in South Memphis without proper permits. The NAACP emphasized that toxic emissions from the turbines are directed primarily at Black neighborhoods, which bear the brunt of this environmental racism.

What will powering AI look like in the future?

To ensure that the increased adoption of AI does not exacerbate environmental degradation, policymakers, businesses, and communities need better data on emissions and energy consumption to understand where to focus attention and resources.

MIT scientists found that companies that use independent auditors to verify their numbers reduced their total emissions by 7.5% per year, even when they emitted more than comparable firms. Still, existing estimates may not be sufficiently detailed; the IEA's figures, for example, cover data center activity broadly without a direct focus on AI applications. That ambiguity will persist unless tech companies disclose how much of their energy use is tied to AI.

Last year, Senator Ed Markey (D-MA) introduced the "Artificial Intelligence Environmental Impacts Act of 2024." To better measure AI's impact, the Act would require the Environmental Protection Agency (EPA) to convene a consortium of stakeholders through the National Institute of Standards and Technology (NIST) to conduct comprehensive research on the issue, in addition to creating a voluntary reporting system.

The US Government Accountability Office (GAO) published an April 2025 report on the environmental and human effects of generative AI. It outlines six possible policy options, including encouraging developers to share details about the infrastructure used to train and run generative AI models, and offering government incentives for more resource-efficient models and training methods.

At the state level, at least 60 bills have been introduced nationwide to address the impact of data centers, but they have produced little meaningful change. Harvard University researchers report that the carbon intensity of electricity used by data centers is 48% higher than the national average, because data centers are often built in areas with dirtier electric grids.

Large companies such as Meta and Microsoft have turned to alternative forms of energy, including nuclear power, to run their data centers. But as MIT Technology Review notes, nuclear power accounts for only about 20% of US electricity production, and clean but intermittent technologies such as wind and solar cannot always deliver electricity when it is needed.

Given the many unknowns, it is time for the US to support further research into ways to reduce the environmental impact of AI, while continuing to increase transparency and set standards in this critical area. Otherwise, we will not only accelerate the climate crisis but also risk slowing the adoption of AI applications and the benefits they bring.


