Energy-hungry applications put strain on the power grid and threaten to slow the transition to cleaner energy sources.

Rosalind Stefanac
While artificial intelligence offers significant benefits for business operations and productivity, the energy-intensive applications it powers threaten to strain the power grid and slow the transition to cleaner energy sources.
According to Goldman Sachs Group's 2024 report, “Generational Growth: AI, Data Centers and the Coming U.S. Power Demand Surge,” U.S. data center electricity demand will increase by 160% by 2030, with AI accounting for one-fifth of that increase.
Researchers at the New York-based investment firm say that an average ChatGPT query requires six to 10 times more power than a Google search. Additionally, carbon dioxide emissions associated with data centers are expected to double by 2030, and renewable energy sources such as solar, wind and battery technology are not expected to scale to meet energy demands anytime soon.
“It's true that large language models like GenAI do consume energy, which translates into carbon emissions and all the problems that come with that,” says Krish Banerjee, managing director and leader of Data and AI at Accenture Canada, “but it's also important to recognize that there are methods and approaches that can be adopted as ways to mitigate that.”
One Size Doesn't Fit All
In addition to developing alternative, more sustainable energy sources to address immediate AI demands, Banerjee said companies need to strategically determine which problems truly need to be solved with large-scale, energy-hungry language models.
“If you have to solve every problem with the biggest hammer, then you're probably not using your tools effectively,” he said.
When companies don't know how to route queries to the right problem-solving tools, they will inevitably turn to the applications that require the most energy and produce the most carbon dioxide, even when those applications are unnecessary, he said.
Banerjee also expects AI to play a key role in finding long-term solutions to current power issues.
“How can we use AI to better understand what energy sources we should be using?” he said. “The idea is to use AI for a greater good.”
For all the “game-changing solutions” that AI will help companies realize in the future, they can't ignore the growing sustainability issue today, said Jas Jaaj, managing partner of AI and data at Deloitte Canada. He advises any organization considering building or modernizing a data center to choose its suppliers carefully.
“Everyone needs to have strong measurement, monitoring and reporting mechanisms when it comes to (carbon) emissions,” he said.
To stay disciplined and meet global sustainability commitments by 2030, he said, companies need to push themselves, their suppliers and “all the players in the ecosystem we work with to level up in terms of using renewable energy sources and providing innovative solutions.”
Jaaj expects to see a more decentralized approach to data centers going forward as companies seek to reduce their reliance on external sources to power AI solutions, especially mission-critical processes.
“There may be cases where companies want to own their own infrastructure in their own data centers to be more resilient,” he said.
Companies such as Nvidia Corp., which designs systems and software to help data centers manage the computing demands of AI, are helping organizations build their own data center solutions, Jaaj said.

“Their growth is a signal of how the world is changing,” he said.
Meanwhile, sustainable solutions to power AI-driven data centers may be much closer than expected: Miami-based startup Exowatt, backed by OpenAI OpCo LLC co-founder Sam Altman, is developing a self-contained solar-powered product that it plans to deploy commercially later this year.
The company's P3 system is designed to fit into the space of a standard 40-foot shipping container and combines a solar thermal collector, a thermal battery and a heat engine that can provide dispatchable electricity and heat.
“The beauty of this system is that you can linearly scale the number of modules you put into a project, regardless of the size of the project, whether it's a small megawatt-scale or a large gigawatt-scale [data center],” said chief executive and co-founder Hannan Parvizian, noting that there is already a “huge backlog” of customers and projects interested in bringing it online.
He has heard stories from companies developing data center projects that have bought everything from the real estate to the computing power they need, only to realize they can't afford to connect it to the power grid.
“We need to build additional generating capacity and we cannot wait another decade for new hydroelectric and geothermal plants to come online,” he said.
Because the P3 stores energy as heat rather than electricity, its costs are minimal compared with those of current battery architectures, and the company's goal is to provide renewable energy for commercial, energy-intensive applications like AI at a cost of 1 cent per kilowatt-hour.
“I think customers are really looking to us in terms of market readiness,” Parvizian said. “The real tragedy is that data center customers are going back to fossil fuels because there are no renewable energy alternatives in the market.”