Michael Dempsey, Technology Reporter
It's difficult to imagine. Around the world, approximately $3tn (£2.2tn) will be spent on data centres supporting AI between now and 2029.
That estimate comes from investment bank Morgan Stanley, with roughly half of the total going on construction costs and the other half on the expensive hardware powering the AI revolution.
To put that number into perspective, it is close to the value of the entire French economy in 2024.
It is estimated that in the UK alone, another 100 data centres will be built over the next few years to meet the demand for AI processing.
Some of these will be built for Microsoft, which announced a $30 billion (£22 billion) investment in the UK AI sector earlier this month.
But how do AI data centres differ from the traditional buildings whose humming ranks of computer servers host our personal photos, social media accounts and workplace applications?
And are they worth such staggering sums?
Data centres have grown in size over the years. The term hyperscale was coined by the technology industry to describe sites with power requirements running to dozens of megawatts.
But AI has supercharged the game. Most AI models rely on expensive computer chips from Nvidia to handle their tasks.
Nvidia's chips sit in large cabinets costing around $4m each. And those cabinets hold the key to why AI data centres are different.
The large language models (LLMs) behind AI software are trained by breaking language down into tiny elements of meaning. That is only possible with banks of computers working in unison, and in very close proximity.
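As a toy illustration of what "breaking language down" means (real LLM tokenisers use subword schemes such as byte-pair encoding, which are far more sophisticated than this simple whitespace split):

```python
# Toy illustration only: real tokenisers (e.g. byte-pair encoding)
# split text into subword units, not whitespace-separated words.
def toy_tokenise(text: str) -> list[str]:
    """Split text into lower-cased word-level 'tokens'."""
    return text.lower().split()

print(toy_tokenise("Data centres power the AI revolution"))
# ['data', 'centres', 'power', 'the', 'ai', 'revolution']
```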
Why is proximity so important? Any distance between two chips adds nanoseconds, billionths of a second, to processing time.
That may not sound like much, but across a warehouse full of computers these microscopic delays stack up, diluting the performance needed for AI.
AI processing cabinets are packed tightly together to eliminate this latency, enabling what the tech sector calls parallel processing: machines acting as one huge computer. It all spells density, the magic word in AI infrastructure circles.
Density eliminates the processing bottlenecks seen in ordinary data centres, where processors sit several metres apart.
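As a rough back-of-the-envelope sketch of why those metres matter (assuming signals travel at about two-thirds of the speed of light, typical for copper and optical fibre; real interconnect latency also includes switching overheads not modelled here):

```python
# Rough illustration: one-way signal propagation delay between two chips.
# Assumes ~2/3 the speed of light, a typical figure for copper traces
# and optical fibre. Switching and serialisation overheads are ignored.

SPEED_OF_LIGHT_M_S = 3.0e8
PROPAGATION_SPEED_M_S = SPEED_OF_LIGHT_M_S * 2 / 3  # ~2e8 m/s

def propagation_delay_ns(distance_m: float) -> float:
    """Return one-way propagation delay in nanoseconds."""
    return distance_m / PROPAGATION_SPEED_M_S * 1e9

print(propagation_delay_ns(0.5))   # ~2.5 ns: chips within one cabinet
print(propagation_delay_ns(10.0))  # ~50 ns: racks across a room
```

A few tens of nanoseconds per hop looks trivial, but training runs involve constant chatter between thousands of chips, so the delays multiply.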
However, these dense cabinets devour gigawatts of power, and training an LLM creates spikes in that appetite for electricity.
The spikes are the equivalent of thousands of homes switching kettles on and off in unison every few seconds.
Irregular demand of this kind has to be managed carefully by local grids.
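To give a sense of scale (assuming a typical UK kettle draws about 3kW; the figures are illustrative, not measured), those synchronised kettles add up fast:

```python
# Rough scale of a synchronised demand spike. Assumes a typical UK
# kettle rating of about 3 kW; illustrative figures only.

KETTLE_KW = 3.0

def spike_megawatts(homes: int) -> float:
    """Power swing if every home switches a kettle on at once."""
    return homes * KETTLE_KW / 1000  # kW -> MW

print(spike_megawatts(10_000))  # 30 MW swinging on and off every few seconds
```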
Daniel Bizo of data centre engineering consultancy the Uptime Institute analyses data centres for a living.
“Normal data centres are a steady hum in the background compared to the demands AI workloads place on the grid,” he says.
Like those synchronised kettles, a sudden AI surge presents what Bizo calls a singular problem.
“Single workloads at this scale are unprecedented,” Bizo says.
Data centre operators are tackling the energy issue in a number of ways.
Speaking to the BBC earlier this month, Nvidia CEO Jensen Huang said that in the short term he hoped the UK would add more gas turbines “off the grid”, so as not to take power away from people.
He argued that AI itself will design better gas turbines, solar panels, wind turbines and fusion energy, making power generation more cost-effective and sustainable.
Microsoft is investing billions of dollars in energy projects, including a deal with Constellation Energy that will see nuclear power generation resume at Three Mile Island.
Google, owned by Alphabet, is investing in nuclear power as part of its strategy to run on carbon-free energy by 2030.
Meanwhile, Amazon Web Services (AWS), part of retail giant Amazon, says it is already the world's largest corporate buyer of renewable energy.
The data centre industry is keenly aware that politicians are watching the drawbacks of these power-hungry AI factories, which can strain local infrastructure and the environment.
One of those environmental impacts involves the large volumes of water needed to cool the chips.
In the US state of Virginia, where the data centres that keep tech giants like Amazon and Google in business are multiplying, a bill regulating the water consumption of new sites is under consideration.
Meanwhile in the UK, a proposed AI factory in north Lincolnshire has been challenged by Anglian Water, the utility responsible for keeping taps running in the area around the proposed site.
Anglian Water points out that it is under no obligation to supply water for non-domestic use, and suggests recycled water from the final stage of waste water treatment, rather than drinking water, as a coolant.
Given the practical issues and enormous costs facing AI data centres, is the whole movement one big bubble?
One speaker at a recent data centre conference coined the term “bragawatts” to describe how the industry talks up the scale of proposed AI sites.
Zahl Limbuwala is a data centre specialist at tech investment adviser DTCP. He acknowledges the big questions over the future of AI data centre spending.
“It's very difficult to believe in the current trajectory. There's certainly a lot of bragging going on. But the investment has to offer returns, otherwise the market will fix itself.”
With that caution in mind, he still believes AI deserves a special status in investment terms. “AI will have more impact than previous technologies, including the internet. So we do need all of these gigawatts.”
Bragging aside, he notes that AI data centres are “real estate in the high-tech world”. Speculative technology bubbles, such as the dot-com boom of the 1990s, had no bricks-and-mortar foundation. AI data centres are solid indeed. But the spending boom behind them cannot last forever.

