Meeting AI's massive energy demands

Writing in Fast Company, Moshe Tanach argues that “the Federal-Aid Highway Act offers valuable lessons on one of the biggest challenges facing the tech industry today: how to produce enough electricity to power the growing number of AI-specific data centers.”

Tanach explains that the computing power required to advance AI is doubling roughly every 100 days, a pace that demands new strategies for meeting AI's growing energy needs.

This is where the highway analogy comes in: we can either generate more energy to power our AI (build more highways) or make AI's energy use cheaper and more efficient (invest in high-speed rail). One path leads to a power-hungry, climate-destroying future; the other is sustainable and profitable.

Tanach describes an “efficiency-first” approach focused on processing AI tasks using less energy and breaking the “vicious cycle of increased usage leading to increased energy consumption.” Tanach specifically supports finding ways to eliminate the central processing units (CPUs) in AI inference servers: “For the future of AI, we can either invest heavily in outdated power delivery methods that put further strain on our current power grid, or we can find ways to reduce costs at the source – the AI data center itself – with embedded systems engineering that does most of the heavy lifting.”


