Device brings significant energy savings to AI applications

Image: ©Just_Super | iStock

Engineers at the University of Minnesota, Twin Cities have unveiled a cutting-edge hardware device that promises to change the game for artificial intelligence (AI) computing.

Their research, published in the journal npj Unconventional Computing, demonstrates technology that can cut the energy consumption of AI tasks by an unprecedented factor of more than 1,000.

Energy and AI

Traditional AI processes often generate large energy demands due to the frequent transfer of data between logic and memory components.

To address this challenge, a team from the University of Minnesota developed Computational Random Access Memory (CRAM). Unlike traditional approaches, CRAM processes data entirely within the memory array, eliminating the need to shuttle data between memory and the processing unit.
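To see why eliminating data movement matters so much, the toy model below compares the two approaches in back-of-the-envelope terms. All energy figures here are illustrative assumptions for the sake of the arithmetic, not measurements from the paper: moving an operand between memory and processor is assumed to cost far more energy than the logic operation itself.

```python
# Toy energy model of the memory-transfer bottleneck.
# The per-operation energies below are illustrative assumptions,
# not figures from the CRAM paper.

E_LOGIC_PJ = 0.1      # assumed energy per logic operation, in picojoules
E_TRANSFER_PJ = 10.0  # assumed energy to move one operand to the processor

def energy_conventional(num_ops: int, operands_per_op: int = 2) -> float:
    """Conventional layout: each operation first fetches its operands
    from memory, then performs the logic."""
    return num_ops * (operands_per_op * E_TRANSFER_PJ + E_LOGIC_PJ)

def energy_in_memory(num_ops: int) -> float:
    """In-memory computing: operands never leave the memory array,
    so only the logic energy is paid."""
    return num_ops * E_LOGIC_PJ

ops = 1_000_000
conv = energy_conventional(ops)
cram = energy_in_memory(ops)
print(f"conventional: {conv / 1e6:.1f} uJ, "
      f"in-memory: {cram / 1e6:.1f} uJ, "
      f"ratio: {conv / cram:.0f}x")
```

Under these made-up numbers the transfer cost dominates, so avoiding it yields a savings ratio of a couple of hundred times; the actual factors reported for CRAM depend on the workload and the hardware measured.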

Yang Lv, a postdoctoral researcher at the University of Minnesota and first author of the paper, describes CRAM as a revolutionary leap in efficiency, saying: “This work is the first experimental demonstration of a CRAM that can process data entirely within the memory array, without ever leaving the grid where the computer stores the information.”

Making AI more energy efficient

Innovations like CRAM come at a critical time, with global energy consumption for AI expected to double by 2026. According to Lv and his colleagues, CRAM-based systems can deliver energy savings of up to 2,500 times compared with traditional methods, establishing a new benchmark for energy-efficient computing.

CRAM leverages magnetic tunnel junctions (MTJs), advanced nanostructures that use electron spin instead of charge to store data. This spintronics approach not only improves speed and efficiency, but also ensures durability in demanding environments.

Ulya Karpuzcu, an expert in computing architecture and co-author of the paper, highlighted CRAM's versatility, saying: “CRAM is a highly energy-efficient, digital-based, in-memory computing substrate that is extremely flexible in that computations can be performed anywhere within the memory array.”

Going forward, the University of Minnesota team plans to collaborate with semiconductor industry leaders to scale their CRAM technology and pave the way for its integration into mainstream AI applications. Their efforts are supported by funding from DARPA, NIST, NSF, and Cisco, underscoring the national importance of this project and its potential for global impact.

The launch of CRAM is both a technological milestone and a step toward a future where AI can achieve unprecedented efficiency without compromising performance.


