Posted by libpcap 4/16/2025

Microsoft researchers developed a hyper-efficient AI model that can run on CPUs(techcrunch.com)
146 points | 64 comments | page 2
instagraham 4/17/2025
> it’s openly available under an MIT license and can run on CPUs, including Apple’s M2.

Weird comparison? The M2 already runs 7B or 13B Llama and Mistral models with relative ease.

M-series MacBooks are so ubiquitous that perhaps we're forgetting how weak the average CPU (think i3 or i5) can be.
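
For a sense of scale, CPU-only inference with one of those quantized models is only a few lines with llama-cpp-python; the GGUF filename and thread count below are placeholders, not a recommendation:

    # Sketch: CPU-only inference of a quantized 7B GGUF via llama-cpp-python.
    # Model path and n_threads are placeholders; no GPU offload is requested.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # any quantized GGUF file
        n_ctx=2048,    # context window
        n_threads=8,   # CPU threads; n_gpu_layers defaults to 0 (CPU only)
    )

    out = llm("Q: Name the planets of the solar system. A:", max_tokens=64)
    print(out["choices"][0]["text"])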

nine_k 4/17/2025
The M-series have a built-in GPU and unified RAM accessible to both. Running a model on an M-series chip without using the GPU is, imho, pointless. (That said, it's still a far cry from an H100 with a ton of VRAM, or from a Google TPU.)

If a model can be "run on a CPU", it should run acceptably on a general-purpose 8-core CPU, like an i7, an i9, or a Ryzen 7, or even an ARM design like a Snapdragon.
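
A rough way to see the gap: single-stream token generation is mostly memory-bandwidth bound, so an upper bound on tokens/sec is bandwidth divided by the bytes read per token (roughly the quantized model size). The figures below are approximate spec-sheet numbers, not benchmarks:

    # Back-of-envelope ceilings: tokens/sec ~ memory bandwidth / model size.
    # Bandwidth values are approximate published specs, not measured numbers.
    model_gb = 4.0  # ~7B parameters at 4-bit quantization

    bandwidth_gbs = {
        "desktop i7/Ryzen 7, dual-channel DDR5": 77,
        "Apple M2 (unified memory)": 100,
        "NVIDIA H100 SXM (HBM3)": 3350,
    }

    for name, bw in bandwidth_gbs.items():
        print(f"{name}: ~{bw / model_gb:.0f} tok/s ceiling")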

1970-01-01 4/17/2025
...and eventually the Skynet Funding Bill was passed.