Ask HN: Anyone Using a Mac Studio for Local AI/LLM?

Posted by UmYeahNo 1 day ago

Curious to know your experience running local LLMs on a well-specced M3 Ultra or M4 Max Mac Studio. I don't see a lot of discussion of the Mac Studio for local LLMs, but it seems like you could fit big models in memory thanks to the shared VRAM. I assume token generation would be slow, but you might get higher-quality results because you can load larger models.
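For what it's worth, here's the back-of-envelope math I've been using: decode is roughly memory-bandwidth-bound, since every generated token streams all the weights through memory once. A rough sketch (the bandwidth figure is the M3 Ultra spec-sheet number, and the model size/quant are my assumptions, not measurements):

    # back-of-envelope decode ceiling: each generated token has to read all
    # the weights once, so tok/s <= bandwidth / model size
    bandwidth_gbs = 819        # M3 Ultra unified memory bandwidth, GB/s (spec sheet)
    params_b = 70              # a hypothetical 70B model
    bytes_per_param = 0.5      # assuming 4-bit quantization
    model_gb = params_b * bytes_per_param            # ~35 GB of weights
    print(f"~{bandwidth_gbs / model_gb:.0f} tok/s ceiling")   # ~23 tok/s

Real throughput lands below that ceiling, but it gives a feel for what "big model in memory" buys you.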
44 points | 27 comments
stoneforger 18 hours ago|
M4 Pro Mac mini, 24 GB, running qwen3-8b-mlx and others. Speed is fine; the problem is the context window. In theory CoreML would be better from an efficiency perspective, but I think it's non-trivial to run models with CoreML (could be wrong).
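For anyone who wants to try the same setup, the mlx-lm path is about this simple (the repo name is my guess at the mlx-community quant; swap in whichever one you actually pulled):

    # minimal mlx-lm run of a Qwen3 8B quant
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Qwen3-8B-4bit")
    prompt = tokenizer.apply_chat_template(
        [{"role": "user", "content": "Why does context length hurt on a Mac?"}],
        add_generation_prompt=True, tokenize=False)
    # verbose=True also prints prompt-processing and generation tok/s
    print(generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True))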
b_brief 8 hours ago||
My experience with the Mac Studio is that memory bandwidth matters more than raw core count for reasonable local LLM throughput. Curious what others find for models >13B parameters?
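Rough numbers to illustrate the point (spec-sheet bandwidths, 4-bit quant assumed; real throughput lands below these ceilings):

    # same arithmetic across chips: a 13B model at 4-bit is ~6.5 GB of
    # weights, and the decode ceiling scales with bandwidth, not core count
    machines = {"M2 Pro": 200, "M4 Pro": 273, "M4 Max": 546, "M3 Ultra": 819}  # GB/s
    weights_gb = 13 * 0.5
    for name, bw in machines.items():
        print(f"{name}: ~{bw / weights_gb:.0f} tok/s ceiling")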
manarth 23 hours ago||
There's this post and thread from 7 weeks ago: https://news.ycombinator.com/item?id=46319657
giancarlostoro 1 day ago||
Not a Mac Studio, but I use a basic MacBook Pro with 24 GB of RAM (16 usable as VRAM), and I can run a number of models on it at decent speed. My main bottleneck is context window size, but if I'm asking single-purpose questions I'm fine.
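The "16 usable" part is just the OS default cap on GPU wired memory, which you can inspect (and raise) via sysctl. A quick sketch; note the iogpu.wired_limit_mb key is what recent macOS on Apple Silicon uses, so treat the name as an assumption if you're on an older release:

    # check how much unified memory macOS will let the GPU wire down;
    # 0 means "OS default", which is why a 24 GB machine sees ~16 GB usable
    import subprocess

    def sysctl(name):
        out = subprocess.run(["sysctl", "-n", name], capture_output=True, text=True)
        return int(out.stdout.strip()) if out.returncode == 0 else None

    ram_gb = sysctl("hw.memsize") / 2**30
    limit_mb = sysctl("iogpu.wired_limit_mb")
    print(f"RAM: {ram_gb:.0f} GB, GPU wired limit: "
          f"{f'{limit_mb / 1024:.0f} GB' if limit_mb else 'OS default'}")
    # raising it (at your own risk): sudo sysctl iogpu.wired_limit_mb=20480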
UmYeahNo 1 day ago||
Yeah. I'm currently on a Mac mini M2 Pro with 32 GB of RAM, and I was curious how much more I could get out of the Apple ecosystem. Thanks for your perspective.
StrangeSound 1 day ago||
What models are you running?
giancarlostoro 4 hours ago||
The biggest I've run was a 20B GPT model. I also run SDXL, which runs rather quickly via the Draw Things app; there's an 8-step LoRA that cuts image generation down to just 8 steps.
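If anyone wants the same trick outside Draw Things, here's roughly what it looks like with diffusers. I'm assuming the ByteDance SDXL-Lightning LoRA, which is the usual 8-step one; Draw Things may bundle something different:

    import torch
    from diffusers import StableDiffusionXLPipeline, EulerDiscreteScheduler

    # SDXL base plus the 8-step Lightning LoRA, on the Apple GPU backend
    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
    ).to("mps")
    pipe.load_lora_weights("ByteDance/SDXL-Lightning",
                           weight_name="sdxl_lightning_8step_lora.safetensors")
    pipe.fuse_lora()
    # Lightning expects trailing timestep spacing and no classifier-free guidance
    pipe.scheduler = EulerDiscreteScheduler.from_config(
        pipe.scheduler.config, timestep_spacing="trailing")
    image = pipe("a lighthouse at dusk, photo", num_inference_steps=8,
                 guidance_scale=0).images[0]
    image.save("out.png")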
callbacked 12 hours ago||
Can only speak for myself here, but prompt processing speeds on Apple Silicon are too slow, especially for any meaningful usage.
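Easy to check on your own machine. Something like this separates prefill from decode (the repo name is my guess at a community quant, and the subtraction is approximate since the second call re-runs prefill from scratch):

    import time
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Qwen3-8B-4bit")
    prompt = "word " * 4000              # long prompt to stress prefill

    t0 = time.time()
    generate(model, tokenizer, prompt=prompt, max_tokens=1)
    prefill_s = time.time() - t0         # dominated by prompt processing

    t0 = time.time()
    generate(model, tokenizer, prompt=prompt, max_tokens=129)
    decode_s = (time.time() - t0) - prefill_s   # ~time for 128 extra tokens

    print(f"prefill ~{4000 / prefill_s:.0f} tok/s, decode ~{128 / decode_s:.0f} tok/s")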
mannyv 1 day ago||
Mine is an M1 Ultra with 128 GB of RAM. It's fast enough for me.
UmYeahNo 1 day ago|
Thanks for the perspective!
Adanos 21 hours ago|
Nope, my MacBook Pro is enough for now