
Posted by ingve 7 hours ago

Raspberry Pi's New AI Hat Adds 8GB of RAM for Local LLMs (www.jeffgeerling.com)
157 points | 116 comments | page 2
JustFinishedBSG 2 hours ago|
It's useless for LLMs and it's actually slower than the Hailo 8H for standard vision tasks, so why?
speedgoose 6 hours ago||
Is there any real use for these small large language models, outside of perhaps embeddings and learning?

I fail to see the use case on a Pi. For learning you can get access to much better hardware for cheaper. Perhaps you could use it as a slow and expensive embedding machine, but why?
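
For concreteness, the embedding case needs nothing special anyway; a sketch of what it looks like on the bare CPU with sentence-transformers, where the model name is just an example, not a recommendation:

    # Sketch: local embeddings on the CPU with sentence-transformers.
    # The model name is just an example of a small embedding model.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")

    docs = [
        "Turn off the kitchen lights",
        "What is the temperature in the living room?",
    ]
    embeddings = model.encode(docs, normalize_embeddings=True)
    print(embeddings.shape)  # (2, 384) for this particular model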

kouteiheika 4 hours ago|
A natural language based smart home interface, perhaps?

Tiny LLMs are pretty much useless as general purpose workhorses, but where they shine is when you finetune them for a very specific application.

(In general this is applicable across the board: if you have a single, specific use case and can prepare appropriate training data, then you can often fine-tune a smaller model to match the performance of a general-purpose model that is 10x its size.)
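
As a rough sketch of what that fine-tuning can look like with Hugging Face Transformers (the model name, the two example pairs and the hyperparameters are all placeholders for illustration, not a recipe):

    # Toy sketch: fine-tune a small causal LM on task-specific pairs,
    # e.g. mapping natural-language requests to smart-home commands.
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              Trainer, TrainingArguments)
    from datasets import Dataset

    model_name = "Qwen/Qwen2.5-0.5B"  # any small base model works
    tok = AutoTokenizer.from_pretrained(model_name)
    if tok.pad_token is None:
        tok.pad_token = tok.eos_token
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Hypothetical task data: request -> structured command.
    pairs = [
        {"text": "Turn off the kitchen lights -> light.kitchen off"},
        {"text": "Set the thermostat to 20 degrees -> climate.main set_temp 20"},
    ]

    def tokenize(batch):
        out = tok(batch["text"], truncation=True, max_length=64,
                  padding="max_length")
        # Toy setup: labels are just the inputs (padding is not masked out).
        out["labels"] = [ids.copy() for ids in out["input_ids"]]
        return out

    ds = Dataset.from_list(pairs).map(tokenize, batched=True,
                                      remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="out", num_train_epochs=3,
                               per_device_train_batch_size=2, logging_steps=1),
        train_dataset=ds,
    )
    trainer.train()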

michaelmior 4 hours ago||
I think there's a lot of room to push this further. Of course there are LLMs being used for this case, and I guess it's nice to be able to ask your house who the candidates were in the Venezuelan presidential election of 1936, but I'd be happy if I could just consistently control devices locally, and a small language model definitely makes that easier.
vander_elst 4 hours ago||
I had a couple of Pis that I wanted to use as a media center, but I always had some small issues that made for a suboptimal experience. I went for a regular second-hand amd64 box in a small form factor and never looked back: much better userspace support and, for my use case, a much smoother experience, with no lag and no memory swapping, and if needed I can just buy a different memory module or a different component. I have no plans to use a Raspberry Pi any time soon. I'm not sure these days whether they still have a niche to fill and, if so, how large that niche is.
Barathkanna 6 hours ago||
As an edge computing enthusiast, this feels like a meaningful leap for the Raspberry Pi ecosystem. Having a low-power inference accelerator baked into the platform opens up a lot of practical local AI use cases without dragging in the cloud. It’s still early, but this is the right direction for real edge workloads.
saagarjha 5 hours ago|
Which ones?
Croak 3 hours ago|||
Camera-based object detection, for example. People have previously used Google's Tensor USB dongles for that.
phito 7 hours ago||
Sounds like some PM just wanted to shove AI marketing where it doesn't make sense.
Lio 4 hours ago||
I've seen the AI-8850 LLM Acceleration M.2 Module advertised as an alternative RPi accelerator (you need an M.2 hat for it).

That's also limited to 8GB of RAM, so again you might be better off with a larger 16GB Pi and using the CPU, but at least the space is heating up.

With a lot of this stuff it seems to come down to how good the software support is. Raspberry Pis generally beat everything else for that.
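
For reference, the CPU-only route mentioned above is already pretty simple with llama.cpp's Python bindings; the GGUF filename below is a placeholder for whatever small quantized model you pick:

    # Minimal sketch of running a small quantized model on the Pi's CPU
    # with llama-cpp-python. The model file is a placeholder.
    from llama_cpp import Llama

    llm = Llama(
        model_path="qwen2.5-0.5b-instruct-q4_k_m.gguf",  # any small GGUF model
        n_ctx=2048,   # context window
        n_threads=4,  # the Pi 5 has four cores
    )

    out = llm(
        "Q: Turn the living room lights off.\nA:",
        max_tokens=32,
        stop=["\n"],
    )
    print(out["choices"][0]["text"])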

nottorp 4 hours ago||
Hmm. Can this "AI" hardware - or any other "AI" hardware that isn't a GPU - be used for anything other than LLMs?

YOLO for example.

dismalpedigree 3 hours ago|
Yes. The Hailo chips are mainly for AI vision models; this is the first time I have seen them pushed for LLMs. They are very finicky and difficult to set up outside of the examples. Documentation is inconsistent and the models have to be converted to a different format to run. It is possible to run a custom YOLOv8 model, but it is challenging.
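
For anyone curious what "converted to a different format" involves, the first half of the flow is fairly generic: export the model to ONNX and sanity-check it on the CPU, then feed that into Hailo's own Dataflow Compiler (that compile step is vendor tooling and not shown here). Paths and model names below are placeholders:

    # Sketch: export a (custom-trained) YOLOv8 checkpoint to ONNX, the
    # interchange format vendor toolchains typically start from.
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")  # or your custom-trained weights
    onnx_path = model.export(format="onnx", imgsz=640)
    print("Exported:", onnx_path)

    # Sanity-check the exported model on the CPU before handing it to
    # the accelerator's compiler.
    import numpy as np
    import onnxruntime as ort

    sess = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
    dummy = np.zeros((1, 3, 640, 640), dtype=np.float32)
    outputs = sess.run(None, {sess.get_inputs()[0].name: dummy})
    print([o.shape for o in outputs])
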
wyldfire 4 hours ago||
I wonder -- how does this thing compare against the Rubik Pi [1]?

[1] https://rubikpi.ai/

philipallstar 4 hours ago|
Is that affiliated with raspberry pis in some way, or are they just freeloading on the "pi" suffix to confuse people?
wyldfire 1 hour ago|||
They are "freeloading" indeed but in this case it is a valuable indicator that it should be physically compatible with many add-ons/attachments that are compatible with the Raspberry Pi.
Elfener 4 hours ago|||
> are they just freeloading on the "pi" suffix to confuse people?

Yes, but that is normal I guess:

- https://banana-pi.org/

- http://www.orangepi.org/index.html

- https://radxa.com/products/rockpi

huntercaron 6 hours ago|
Glad Jeff was critical here; they need a bit of a wake-up call, it seems.