
Posted by zdw 5 days ago

Intel Arc Pro B70 Review (www.pugetsystems.com)
175 points | 105 comments
userbinator 12 hours ago|
It should be possible to use the VRAM as extra swap space, when you're not using it for AI or gaming or anything else. 32GB is already more than a lot of computers have as just regular RAM, even sufficient to hold an OS installation:

https://www.tomshardware.com/news/lightweight-windows-11-run...

ycui7 9 hours ago||
It is weird that the reviewer does not mention the RTX PRO 6000 96GB, but does mention the RTX PRO 5000 72GB. The 72GB RTX PRO 5000 is a special order, and far fewer people are aware of it. The RTX PRO 6000 is known to nearly everyone in the LLM world.

I cannot understand why a tech reviewer would do that.

jbellis 13 hours ago||
How should I update my simplistic understanding that decode is bandwidth-bound, given these results showing the B70 decoding faster than a 4090 (which has about 50% more bandwidth)?
rao-v 12 hours ago|
I doubt you'd get the same sort of result on a modern-ish MoE or dense model via a more standard inference engine like llama.cpp or vLLM. I don't think MLPerf is a reasonable benchmark at this point.

Edit: here is a simple llama.cpp comparison where the token-generation results match the rule of thumb.

https://www.reddit.com/r/LocalLLaMA/comments/1st6lp6/nvidia_...
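As a sanity check on the bandwidth-bound rule of thumb discussed above, here is a back-of-envelope sketch. The bandwidth and model-size numbers are illustrative assumptions, not measured figures for any card in the thread:

```python
# Rule of thumb: decode on a dense model is memory-bandwidth-bound, because
# generating each token streams (roughly) all model weights through memory once.

def max_tokens_per_s(mem_bw_gb_s: float, model_size_gb: float) -> float:
    """Theoretical upper bound on decode tokens/s for a dense model."""
    return mem_bw_gb_s / model_size_gb

# e.g. a hypothetical card with ~1000 GB/s running a 14 GB (8B @ fp16) model:
print(round(max_tokens_per_s(1000, 14), 1))  # ~71.4 tok/s ceiling
```

Real numbers land below this ceiling (kernel overhead, KV cache traffic), which is why measured results that exceed the rule of thumb warrant suspicion of the benchmark.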

XCSme 15 hours ago||
Can you use those AI cards for gaming too?

Or the makers intentionally nerf them, in order to better segment the markets/product lines?

ZiiS 15 hours ago||
The drivers often need per-game optimisations, which these will be missing, but I doubt Intel would nerf them; they just rely on you not wanting to pay a lot for RAM the game won't use.
XCSme 15 hours ago||
I actually meant it in a different way. I would get it for local AI stuff, but being able to game on it would be a huge plus, otherwise I would need two different machines.
ZiiS 4 hours ago|||
Much as I want diversity, a 3090 would be a billion times better for games and can probably hold its own for a broader AI workload: anything other than running highly quantised models that don't fit in 24GB with relatively small contexts.
XCSme 2 hours ago||
A 3090 is what I have now.

But I hope to somehow have 48GB or 64GB of VRAM in a GPU that's also gaming-ready.

I was looking at maybe getting a Mac Studio for this reason, but I don't think a Mac is really good for gaming.

MrDrMcCoy 4 hours ago|||
It'll work just fine for gaming. It's what the B770 would have been if it had 32GB RAM and ever got released.
wmf 14 hours ago||
They nerf gaming cards to make money on the pro cards. Since this is a pro card it's not nerfed.
wg0 3 hours ago||
Could we not have a PCIe card that's an ASIC (not a GPU) with even DDR4 or DDR5 memory (say, 128 GB) onboard, so you could shove four of them into a consumer-grade motherboard and use them in parallel?

Noob question.

alex43578 49 minutes ago|
DDR4 or DDR5 would really run into bandwidth and value issues.

Bandwidth on that memory interface, set up for dual channel, would be significantly worse than Strix Halo, which already exists and could be an entire compute setup with no need for an ASIC.
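The bandwidth gap is easy to ballpark. The transfer rates below are assumed round numbers for illustration, not vendor specs:

```python
# Peak theoretical bandwidth = channels * bus width (bytes) * transfer rate (MT/s)

def ddr_bw_gb_s(channels: int, mt_s: int, bus_bytes: int = 8) -> float:
    """Peak theoretical DDR bandwidth in GB/s (64-bit channels by default)."""
    return channels * bus_bytes * mt_s / 1000

dual_ddr5 = ddr_bw_gb_s(2, 6400)   # dual-channel DDR5-6400: ~102 GB/s
wide_lpddr5x = ddr_bw_gb_s(4, 8000)  # a 256-bit LPDDR5X-8000 setup: ~256 GB/s
print(dual_ddr5, wide_lpddr5x)
```

A dual-channel DDR5 card would sit around 100 GB/s per card, an order of magnitude below the GDDR/HBM bandwidth that makes GPUs useful for inference.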

numpad0 14 hours ago||
$950 for 23TF fp32? Has GPU performance grown at all in the past 5-10 years?
wmf 13 hours ago||
Are you comparing against gaming or workstation cards?
numpad0 10 hours ago||
1080Ti had >10TF in 2017. Or Titan XP too for that matter. ~10 years ago.
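For reference, peak fp32 throughput is roughly shader count × 2 ops/cycle (FMA) × clock, and the 1080 Ti figure above checks out (the clock used here is the nominal boost clock):

```python
# Peak fp32 TFLOPS = shader units * 2 (fused multiply-add) * clock in GHz / 1000

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision throughput in TFLOPS."""
    return shaders * 2 * clock_ghz / 1000

gtx_1080_ti = fp32_tflops(3584, 1.582)  # 3584 CUDA cores @ 1.582 GHz boost
print(round(gtx_1080_ti, 1))  # ~11.3 TF (2017)
```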
Readerium 7 hours ago||
AI workloads are all about memory size and bandwidth not compute
100ms 15 hours ago||
These seem amazing for hobbyists, but that TDP relative to the performance might be an issue when deploying a lot of them.
zrm 15 hours ago|
Its performance is pretty unbalanced. If you're using it for the couple of things that it's good at, the TDP is competitive.
unethical_ban 14 hours ago||
It looks like, if one can afford it, the R9700 is worth the extra money.

I read that Intel is getting out of the dGPU space, but then again, their iGPUs are really getting good. I can't understand why they'd give up the space when the AI market is so insane.

timschmidt 14 hours ago||
Rumors of their exit from dGPU predate Battlemage, so I wouldn't put a ton of credence in them. But Intel is quite talented at snatching defeat from the jaws of victory.
yurishimo 14 hours ago|||
I hope not. They’ve been flip flopping too much and the market needs more dGPU competition.

The team working on drivers is doing a good job playing catch up and I hope intel will continue to invest in cards that focus on graphics workloads and not just on AI inference.

ycui7 5 hours ago||
Exiting dGPU for gaming, but staying in the LLM world.
cubefox 14 hours ago||
Why are they still using their old Xe2/Battlemage architecture rather than their new Xe3/Celestial? They already used it in their Panther Lake chips.
phonon 14 hours ago||
That's coming out in https://www.phoronix.com/review/intel-crescent-island by around the end of the year.
fc417fc802 11 hours ago|||
Another comment here claims Celestial is cancelled. Has Intel indicated their intentions for the consumer dGPU space?
cubefox 5 hours ago||
There are only rumours apparently: https://www.club386.com/intel-arc-celestial-cancelled-leak/
wmf 13 hours ago||
It looks like B70 was delayed 1-2 years for some reason.
driverdan 15 hours ago|
From what I've read the Intel drivers are terrible and holding back using them for LLMs.
martinald 15 hours ago||
Don't think that's true. The drivers are bad (not sure "terrible" is fair, they have improved a lot), especially for older DirectX games. But Vulkan support is pretty good, and that's all you really need for LLMs.
marshray 14 hours ago|||
I don't know about LLMs, but I tried an Intel card when Ubuntu Wayland couldn't initialize a 2 year old Nvidia. It just works.
lukan 13 hours ago|||
That is just Linux and politics. Linux wants to force vendors to open-source their drivers; Intel plays along, but Nvidia, as the market leader, does not, so you have to use their proprietary driver, which most distros do not ship by default.
driverdan 11 hours ago|||
Interesting. I had read that Intel's Linux drivers were far behind their Windows versions. I haven't checked in a few months though.
otherme123 5 hours ago||
That is compatible with the comment you are replying to: you don't need much to beat Nvidia's open drivers on Linux. Intel's Linux drivers might be behind their Windows drivers, but still ahead of Nvidia's.

Nvidia has zero incentive to play open on Linux: they release the binary blobs with next to zero docs and support, and you deal with it. The last Nvidia card I bought was 20 years ago, and it was so bad on Linux (low performance and freezes with the open drivers, manual reinstall hell and praying on each kernel update with the binaries) that I switched to ATI. Since then, ATI or Intel have always been decent with zero headaches.

999900000999 15 hours ago||
Everyone has terrible drivers here aside from Nvidia.

Intel looks like they'll leave the dedicated GPU space, so it's doubtful whether the drivers will ever catch up.

reallytD91 7 hours ago||
What makes you think Intel will leave the GPU space?
999900000999 6 hours ago||
https://www.tomshardware.com/pc-components/gpus/intel-has-re...

I've seen several stories like this. Which is a shame since Intel offers the best value GPUs on the market.

I guess it's possible they'll still make workstation GPUs while skipping the consumer market.

reallytD91 1 hour ago||
I guess they need to up their marketing game, as a lot of people I know are still unaware of Intel GPUs. It's either Nvidia or AMD.