EDB offers "intelligence per watt" for the AI data layer

EDB has introduced what it calls an “intelligence per watt” framework in EDB Postgres AI with the goal of reducing token consumption and data center emissions associated with AI workloads.

This announcement focuses on performance claims across data layer operations, including vector indexing, query processing, and infrastructure usage, as enterprises deploy more AI agents and search systems.

EDB argues that the energy debate around artificial intelligence focuses too much on models and graphics processors, and too little on the databases and data dynamics behind AI systems. The company says enterprises have more direct control over efficiency at the data layer than at the model layer.

The platform targets energy usage on two fronts: the core infrastructure required to run enterprise applications and the database activities behind agent AI such as search, retrieval, and vector indexing.

An analysis of three banking, financial services, and insurance customers operating more than 120 data centers found significant reductions in computing resources. According to EDB, the results were independently validated by Incendium Consulting and showed reductions of up to 94% in computing cores in some cases and up to 87% in projected emissions.

According to EDB, that level of emission reduction corresponds to approximately 153,000 tonnes of CO2e, roughly equivalent to removing 33,000 cars from the road.
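The stated equivalence can be sanity-checked with back-of-envelope arithmetic. The per-car conversion factor below is an assumption (the commonly cited EPA figure of roughly 4.6 tonnes CO2e per typical passenger car per year); the article does not state which factor EDB used.

```python
# Sanity check of "153,000 tonnes CO2e ~= 33,000 cars off the road".
# The 4.6 t/car/year factor is an assumed value, not from the article.
total_reduction_tonnes = 153_000
tonnes_per_car_per_year = 4.6  # assumed EPA-style conversion factor

cars_equivalent = total_reduction_tonnes / tonnes_per_car_per_year
print(f"{cars_equivalent:,.0f} cars")  # roughly 33,000, matching the article
```

Under that assumption the two figures are consistent with each other.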

Data layer focus

EDB also announced benchmark numbers for vector indexing, a process used in AI search and retrieval systems. The company says EDB Postgres AI achieved vector index builds 5x to 12x faster while processing 1 billion vectors on a 128 GB server, compared with the more than 1,000 GB required by traditional vector engines.

Another result came from a pilot with a global communications provider. In its tests, EDB said its software reduced AI token usage by up to 57% while maintaining 90% quality, and recorded a 72% scenario win rate.

These numbers are important because token usage is closely related to the running costs of large language model-based applications. As enterprises deploy more AI agents into production environments, repeated database calls, indexing tasks, and data transfers can increase both power demand and operational costs.
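The link between token reduction and running cost is straightforward to sketch. The 57% reduction is the figure from the pilot above; the daily token volume and per-token price are illustrative assumptions, not EDB numbers.

```python
# Hypothetical cost impact of the reported 57% token reduction.
# Volume and pricing below are assumed for illustration only.
tokens_per_day = 1_000_000_000       # assumed agent-fleet volume
price_per_million_tokens = 2.00      # assumed USD price per 1M tokens
reduction = 0.57                     # reduction reported in the pilot

baseline_cost = tokens_per_day / 1_000_000 * price_per_million_tokens
optimized_cost = baseline_cost * (1 - reduction)

print(f"baseline:  ${baseline_cost:,.0f}/day")   # $2,000/day
print(f"optimized: ${optimized_cost:,.0f}/day")  # $860/day
```

Because token spend scales linearly with volume, the same percentage saving compounds as agent deployments grow.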

EDB framed the announcement against a broad rise in AI infrastructure consumption. According to the company, enterprises are expected to expand their use of AI agents over the next few years, with systems consuming trillions of tokens every day as deployments scale up.

EDB also noted that global data center power demand is expected to more than double by the end of the decade, driven largely by AI. This backdrop has shifted attention beyond the costs of training and inference to the storage, retrieval, and orchestration work that supports those processes.

“The AI energy conversation has been about what happens with the models and the GPU. Almost no one is talking about what happens at the data layer, which every agent, every model, every inference call relies on,” said Quais Taraki, chief technology officer at EDB.

“You can’t control consumption at the model layer. Agents consume what they consume. But you can control efficiency at the data layer. For most companies, that’s really the only lever they have,” Taraki said.

Three principles

EDB said the framework is based on three principles: measurement, optimization and governance. In practice, this means quantifying the energy and infrastructure costs of AI output, reducing the compute, storage, and network demands of each operation, and maintaining oversight of the databases, indexes, and pipelines created by autonomous systems.

The company tied the framework to broader claims about its platform’s operational efficiency, saying EDB Postgres AI has shown up to 58% lower costs and less performance degradation under concurrent load compared with competing cloud analytics platforms, as well as analytic workloads completed 50x to 100x faster on live operational data.

For EDB, the economics and environmental impact of AI will increasingly depend on how data systems are designed. This view reflects broader industry debate about whether improvements in infrastructure efficiency can offset rapid growth in demand from generative AI and agent-based software.

Kevin Dallas, CEO of EDB, linked energy efficiency to economic benefits from AI projects.

“Companies that are successfully deploying AI at scale are 275% more likely to prioritize energy-efficient data infrastructure than other companies in the market, and they also see five times the ROI. Much of the industry is missing out on that. This concept of ‘intelligence per watt’ is not just an environmental metric, it’s a performance metric. Companies that are getting the most from AI are also demanding the most from their data layer,” Dallas said.
