Do you really need a GPU or NPU for your AI apps? • The Register


There's no avoiding AI and LLMs this year, with the technology being crammed into everything from office software to phone apps.

Nvidia, Qualcomm and others are aggressively pushing the idea that this machine learning work should be done on an accelerator, whether a GPU or an NPU. Arm argued on Wednesday that its CPU cores, used in smartphones and other devices around the world, are ideal for running AI software without the need for a separate accelerator.

In this week's Kettle episode (playable below), our vultures discuss the benefits of running AI workloads on CPUs, NPUs, and GPUs; the power and infrastructure needed to do so, from personal devices to massive datacenters; and how this artificial intelligence is being used, including the deployment of Palantir's AI targeting system across the US military.

Youtube Video

This week, host Iain Thomson is joined by Chris Williams, Tobias Mann, and Brandon Vigliarolo. Producer/Editor is Nicole Hemsoth Prickett.

If you just want to listen to the audio, the Kettle series is available as a podcast via RSS and MP3, Apple, Amazon and Spotify. Let us know what you think in the comments. ®


