Posted by guiand 3 days ago
Is this part of Apple's plan to build out server-side AI support using their own hardware? If so, they would need more physical data centres. I'm guessing they too would be constrained by RAM.
(The cord is $50 because it contains two active chips BTW.)
The ability to also deliver 240W (IIRC?) over the same cable sets it apart too; it's more like FireWire than a standard networking cable.
rdma_ctl enable
Using more, smaller nodes means your cross-node IO is going to explode. You might save money on the compute hardware, but I wouldn't be surprised if you ended up with an even greater cost increase on the network hardware side.
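A rough back-of-envelope sketch of that scaling, assuming a ring all-reduce as the collective (each node sends 2*(N-1)/N of the tensor size per reduction; the 100 MiB payload is a made-up figure just for illustration):

```python
def ring_allreduce_bytes_per_node(tensor_bytes: int, nodes: int) -> float:
    """Bytes each node sends (and receives) in one ring all-reduce."""
    if nodes < 2:
        return 0.0
    return 2 * (nodes - 1) / nodes * tensor_bytes

payload = 100 * 2**20  # hypothetical 100 MiB synced per step
for n in (2, 4, 8, 16):
    per_node = ring_allreduce_bytes_per_node(payload, n)
    total = per_node * n  # aggregate cross-node traffic per step
    print(f"{n:2d} nodes: {per_node / 2**20:6.1f} MiB/node, "
          f"{total / 2**20:7.1f} MiB total")
```

Per-node traffic plateaus near 2x the tensor size, but the aggregate cross-node traffic (2*(N-1)*S) grows almost linearly with node count, so each doubling of node count roughly doubles what the network fabric has to carry.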
I don't think I can recommend the Mac Studio for AI inference until the M5 comes out. And even then, it remains to be seen how fast those GPUs are or if we even get an Ultra chip at all.