Generative AI will account for 1.5% of global electricity consumption by 2029

Generative AI is big business, and the massive hardware required to run these applications accounts for a growing share of the world's electricity consumption. “AI chips will account for 1.5% of electricity use over the next five years, representing a significant portion of global energy,” semiconductor research firm TechInsights said in a research note last month.

TechInsights relied on data from the U.S. Energy Information Administration to obtain a baseline measurement of total global electricity consumption from 2025 to 2029, totaling 153,000 TWh.

The research firm estimated that AI accelerators will consume 2,318 TWh of electricity worldwide during the same period, amounting to 1.5% of global electricity consumption.
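The headline share follows directly from the two totals quoted above; a quick sanity check in Python:

```python
# Sanity-check the 1.5% figure from TechInsights' quoted totals (2025-2029).
ai_twh = 2_318        # estimated consumption by AI accelerators, in TWh
global_twh = 153_000  # EIA-based baseline for global consumption, in TWh

share = ai_twh / global_twh
print(f"{share:.1%}")  # prints "1.5%"
```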

The measurements were made assuming each GPU uses 700W, which is the power draw of Nvidia's flagship Hopper GPU. Nvidia's next-generation GPU, Blackwell, is faster but will consume 1,200W of power.

TechInsights' assumptions only include the power consumed by the chip and do not include measurements of the storage, memory, networking and other components used for generative AI.
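Chip-only accounting of this kind boils down to GPUs × watts × utilization × hours. Here is a minimal sketch of that arithmetic; the fleet size and utilization rate below are illustrative assumptions, not TechInsights figures:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def fleet_energy_twh(num_gpus, watts_per_gpu, utilization, years):
    """Chip-only energy in TWh; excludes storage, memory and networking,
    mirroring the scope of TechInsights' estimate."""
    watt_hours = num_gpus * watts_per_gpu * utilization * HOURS_PER_YEAR * years
    return watt_hours / 1e12  # 1 TWh = 10**12 Wh

# Hypothetical example: 10 million Hopper-class GPUs (700 W each)
# running at 80% utilization for one year.
print(round(fleet_energy_twh(10_000_000, 700, 0.8, 1), 1))  # ~49.1 TWh
```

Varying the utilization parameter shows why Rogers' point about running these expensive assets hard matters: halving utilization halves the chip-only energy.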

“Such high utilization rates are achievable given the massive demand for GPU capacity and the need to run these expensive assets to generate profits,” TechInsights analyst Owen Rogers said in a research report.

Nvidia's Blackwell GPU requires 1,200W of power (Source: Nvidia)

Green or not?

According to a McKinsey AI survey, 65% of respondents plan to adopt generative AI.

To meet demand, cloud providers and hyperscalers are investing billions of dollars in expanding GPU capacity. Microsoft uses Nvidia GPUs to power its AI infrastructure, and Meta has said it will have the equivalent of “roughly 600,000 H100” GPUs.

Nvidia shipped about 3.76 million GPUs in 2023, up from about 2.6 million GPUs in 2022, according to TechInsights.

Last year, Gartner made a more aggressive prediction about power consumption, saying AI “could consume up to 3.5% of the world's electricity.” Gartner's methodology was unclear, but it could include networking, storage and memory.

The AI race rewards the companies that can offer the fastest infrastructure and the best results, and the rush to bring AI into business is disrupting long-established corporate sustainability plans.

Microsoft, Google and Amazon are spending billions of dollars building out giant data centers equipped with GPUs and AI chips to train and serve ever-larger models, increasing their power burden.

Cost Issues

A server can be purchased for $20,000, but businesses also need to account for rising electricity prices and the challenges facing the power grid, Rogers said in a research note.

Data centers also need to be designed to accommodate the power requirements of AI, a demand that can be constrained by grid capacity and the availability of backup power.

Energy suppliers will also be responsible for building power infrastructure such as power plants, solar farms, and transmission lines to prepare for the AI era.

“If demand cannot be met, energy suppliers will take a market-based approach to managing capacity – increasing prices to reduce consumption rather than denying capacity – which, again, could have cost implications for users of AI technologies,” Rogers said.

The US government has set a goal of 100% clean electricity generation by 2035, which would reduce strain on the power grid and make room for more AI data centers.

Productive use of energy

AI's electricity consumption reflects an earlier trend of cryptocurrency mining straining the power grid: According to a February report from the U.S. Energy Information Administration, cryptocurrency mining accounts for about 2.3% of U.S. electricity consumption.

But energy industry observers agree that, compared with Bitcoin mining, AI puts electricity to more productive use.

Nvidia's focus on AI also extends to productive energy use: to reduce power consumption, Nvidia's GPUs incorporate proprietary chip technology, and the company is switching from air to liquid cooling for its Hopper GPUs.

“The opportunity here is to help you get the most performance out of your fixed-megawatt data center at the best possible cost,” Ian Buck, vice president and general manager of Nvidia's hyperscale and HPC computing business, said at an investor event last month.

HPC Providers, AI, and Sustainability

A panelist at the recent ISC 24 supercomputing conference mocked Nvidia for claiming that its 1,000-watt GPUs are “sustainable.”

Government labs have also noted that GPUs paired with direct liquid cooling offer better performance scaling than past CPU-based systems.

Lawrence Livermore National Laboratory, which is developing a 2-exaflops supercomputer called El Capitan, has added 18,000 tons of cooling capacity, bringing the total to 28,000 tons, and increased its power supply for current and future systems to 85 megawatts.

LLNL's El Capitan (Source: HPE)

“El Capitan will be just under 40 megawatts, about 30 megawatts, but that's still a lot of power,” Bronis de Supinski, LLNL's chief technology officer, told the breakout group.

He acknowledged that while the El Capitan supercomputer may not be considered environmentally friendly, the focus should also be on the results it achieves within its capabilities and power budget: For example, the energy it uses may be well worth it if it solves the climate problem.

“A 30-megawatt supercomputer? I'm not saying that's a sustainable resource, but it could be very helpful in solving the societal problems that we want to solve,” de Supinski said.

Laboratories are also moving towards renewable energy and liquid cooling. Liquid cooling, for example, “saves around 50 percent of the energy needed for cooling,” LRZ Chairman Dieter Kranzlmüller said during the ISC 24 session.

A sustainable computing environment also considers carbon offsetting, waste heat capture and reuse, and material reuse.

HPC's past drives its future

Decades of work on making supercomputers more energy efficient are now being applied to squeeze more out of every watt consumed by AI processing.

At the HPE Discover conference last month, CEO Antonio Neri said the company is porting energy-efficiency techniques used in Frontier and El Capitan to AI systems powered by Nvidia GPUs.

“HPE has one of the largest liquid cooling manufacturing capabilities in the world. Why? Because we have to do it for supercomputers,” Neri said.

Nvidia CEO Jensen Huang, who was also on stage, joked that the future of liquid cooling will “bring about everything: increased performance, lower infrastructure costs, lower operational costs.”

AI Offloading

Consumer device makers are promoting PCs and mobile devices with neural chips for on-device AI, which can run AI models locally, reducing the load on GPUs in the cloud.

This is central to Apple's vision of a combined on-device and cloud AI strategy: if your iPhone or Mac determines that an AI task can't be performed on-device, it reroutes the query to cloud servers in Apple's data centers.

Apple users can also choose whether they want the AI to run on their device or via the cloud.

Microsoft encourages the use of AI chips in Windows devices. Qualcomm's AI Hub allows users to run benchmarks to see how AI models perform on their devices, allowing users to decide whether to run inference on-device or in the cloud.

But there's no killer app for AI PCs that provides a concrete example of a PC offloading AI workloads to a GPU in the cloud.


