DARPA wants AI to know that it's an energy pig


While it is notoriously difficult to consistently measure the energy usage of AI models, DARPA wants to end that uncertainty with a new “energy-aware” machine learning system.

The ML2P program, which opened for solicitations on Tuesday, aims to do something simple, at least on paper. As the name suggests, it wants to map the efficiency of various forms of machine learning directly to physics, in this case using what the Defense Advanced Research Projects Agency calls "accurate, granular measurements of joules."

"When we build machine learning models today, we optimize them for performance only and overlook other characteristics," said McShea. "A very important one is the amount of energy we are using."

"What we want to do is map the performance of machine learning models to physical properties," added McShea. "We want to do this so that we can balance the performance of the model against the amount of resources it consumes."
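The balance McShea describes can be sketched as a simple selection criterion. Everything below is hypothetical: the model names, the accuracy and energy figures, and the accuracy-per-joule metric are made up purely to illustrate the trade-off ML2P wants to make explicit, not anything the program has published.

```python
# Hypothetical illustration of the performance-vs-energy trade-off:
# given measured accuracy and energy per inference for candidate models,
# rank them by accuracy delivered per joule instead of accuracy alone.

candidates = {
    # model name: (accuracy, joules per inference) -- invented numbers
    "large-transformer": (0.94, 12.0),
    "distilled-model":   (0.91, 1.5),
    "tiny-cnn":          (0.84, 0.2),
}

def accuracy_per_joule(stats):
    """Score a (accuracy, joules) pair; higher means more efficient."""
    accuracy, joules = stats
    return accuracy / joules

# The most accurate model is no longer automatically the winner.
ranked = sorted(candidates.items(),
                key=lambda kv: accuracy_per_joule(kv[1]),
                reverse=True)
best_name, (best_acc, best_joules) = ranked[0]
```

Under these invented numbers the small model wins on efficiency despite the lowest raw accuracy, which is exactly the kind of battery-constrained decision the program is aimed at.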

In DARPA's case, concerns about AI energy use are particularly apt given the battlefield and edge applications the Pentagon wants to put into soldiers' hands. In the field, such systems are generally battery powered, and fighters can come up short if those systems aren't optimized to balance performance and energy use.

"We don't want to optimize for accuracy alone; instead, we want to understand what level of performance we get back for every joule of electricity," says McShea. "This allows us to build smarter, leaner, and more useful AI."

However, the ML2P team isn't going to stop there. Performers selected for the program must release their documents, algorithms, code, and tutorials under a permissive open source license, so the tools built to help track AI's energy use will be available to the wider research community.

"The main transition objective for ML2P is for ML2P software to become a gold standard for simulating ML design and power-usage trade-offs," McShea told The Register by email.

Others have tried to estimate AI's energy usage before, but those numbers are often incomplete due to a lack of visibility into the closed-source models that dominate the field. Factors such as hardware, workload type, location, and even operating conditions can all affect the energy used per query. McShea said he hopes ML2P will account for all of these upstream and downstream machine learning design choices and return accurate, real-world data.
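As one concrete example of the kind of granular joule measurement involved, Linux exposes Intel's RAPL energy counters through the powercap sysfs interface, and sampling such a cumulative counter around a workload gives per-workload energy. This is only a sketch of the general technique, not ML2P's tooling: the counter path is platform-dependent, and the reader function is injectable so the logic works with any energy meter.

```python
# Sketch: measure the energy a workload consumes by reading a cumulative
# energy counter before and after it runs. On Linux x86, one such counter
# is Intel RAPL's powercap file, reported in microjoules.

def read_rapl_uj(path="/sys/class/powercap/intel-rapl:0/energy_uj"):
    """Read a cumulative energy counter in microjoules (Linux RAPL)."""
    with open(path) as f:
        return int(f.read())

def measure_joules(workload, read_uj=read_rapl_uj):
    """Run workload() and return the joules consumed per the counter.

    Note: real RAPL counters wrap around periodically; wrap handling is
    omitted from this sketch for brevity.
    """
    start = read_uj()
    workload()
    end = read_uj()
    return (end - start) / 1e6  # microjoules -> joules
```

Real measurement pipelines must also handle counter wraparound, attribute energy between concurrent processes, and account for GPU and memory power, which is part of why per-query numbers in the wild vary so much.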

"ML2P redefines power as a 'first-class citizen' throughout the ML lifecycle," DARPA added in its program solicitation document.

Selected performers will have their work cut out for them, and a tight schedule in which to complete it. DARPA plans to split ML2P into two 12-month phases. The first six months will be dedicated to experiment setup, and the remaining 18 months of the two-year program will be used to collect experimental data on the approaches the teams develop. Starting in the seventh month, government test and evaluation teams will run in parallel for the rest of the program, examining performers' findings and identifying which approach works best.

McShea also hopes that ML2P will lead to the development of more energy-efficient AI hardware.

“Enabling principled simulation of machine learning models' performance in general-purpose computing systems provides insight into how hardware can be optimized for AI workloads,” says McShea.

Hopefuls, whom DARPA wants to draw from fields including electrical engineering, mathematics, logic, and machine learning, must submit proposals for a slice of the expected $5.9 million ML2P budget by December 8. ®


