
Apple's MLX machine learning framework, originally designed for Apple Silicon, is getting a CUDA backend. That means developers will be able to run MLX models directly on Nvidia GPUs, which is a pretty big deal. Here's why.
The effort is being led by developer @zcbenz on GitHub (via AppleInsider), who began prototyping CUDA support a few months ago. Since then, he has split the project into smaller pieces and gradually merged them into Apple's MLX main branch.
The backend is still a work in progress, but core operations such as matrix multiplication, softmax, reduction, sorting, and indexing are already supported and tested.
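For a rough sense of what that coverage means in practice, here is a minimal sketch using MLX's Python API that exercises those same operations. This is an illustration, not code from the MLX repo, and behavior on the in-progress CUDA build may differ:

```python
import mlx.core as mx

# A small random matrix to exercise the ops reported as working.
a = mx.random.normal((4, 4))

b = mx.matmul(a, a)          # matrix multiplication
p = mx.softmax(b, axis=-1)   # softmax along the last axis
s = mx.sum(p, axis=0)        # reduction
o = mx.sort(s)               # sorting
first = o[0]                 # indexing

mx.eval(first)               # MLX is lazy; force evaluation
print(first)
```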
Wait, what's CUDA?
In a nutshell, CUDA (short for Compute Unified Device Architecture) is Nvidia's proprietary computing platform. It runs on the company's own GPUs and is purpose-built to get the most out of them for high-performance parallel computing tasks.
For many, CUDA is the standard way to run machine learning workloads on Nvidia GPUs, and it is used throughout the ML ecosystem, from academic research to commercial deployments. Frameworks like PyTorch and TensorFlow, increasingly familiar names even outside deep ML circles, all rely on CUDA for GPU acceleration.
So why is Apple's MLX adding CUDA support?
MLX is tightly integrated with Metal on Apple platforms, so it wasn't originally built to run outside of macOS.
A CUDA backend changes that: researchers and engineers can prototype locally on their Macs, using Metal and Apple Silicon, then run the same code on large Nvidia GPU clusters.
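To make the "same code, two backends" idea concrete, here is a hedged sketch: on a Mac, MLX's default device is the Metal GPU, while on a Linux machine with a CUDA build of MLX the same unmodified script should target the Nvidia GPU instead. The printed device name is an assumption, and the CUDA build steps live in the MLX repo rather than here:

```python
import mlx.core as mx

# The same script, unchanged: on a Mac this dispatches to Metal;
# on a Linux box with a CUDA build of MLX, the same ops should
# dispatch to the Nvidia GPU instead.
print(mx.default_device())   # e.g. Device(gpu, 0)

x = mx.random.normal((1024, 1024))
y = mx.matmul(x, x.T)
mx.eval(y)
```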
That said, there are still limitations, and much of the work is ongoing. For example, not all MLX operations are implemented yet, and AMD GPU support is even further off.
Still, being able to bring MLX code to an Nvidia GPU without a rewrite opens the door to faster testing, experimentation, and the kinds of research use cases AI developers care about.
If you want to try it yourself, the details are available on GitHub.
