Obviously not for any hard applications, but it'd be great for significantly better autocorrect, local next-word prediction, and file indexing (tagging, I suppose).
The efficiency of such a small model should theoretically be great!
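For anyone curious what "local next-word prediction" looks like in practice, here's a minimal sketch. `distilgpt2` is just a stand-in for whatever tiny model you'd actually run, and it assumes the Hugging Face `transformers` and `torch` packages:

```python
# Minimal sketch: top-k next-word prediction with a tiny local model.
# "distilgpt2" is a placeholder for whatever small model you'd run;
# assumes the Hugging Face `transformers` and `torch` packages.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")
model.eval()

def next_word_candidates(prefix: str, k: int = 5) -> list[str]:
    """Return the k most likely next tokens for a text prefix."""
    inputs = tokenizer(prefix, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)
    top = torch.topk(logits[0, -1], k)  # distribution over the next token
    return [tokenizer.decode(int(t)).strip() for t in top.indices]

print(next_word_candidates("I ordered a footlong at"))
```

On a fast enough accelerator that whole forward pass is effectively instant, which is the entire appeal here.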
The sheer speed at which this thing can "think" is insane.
Anyway, I imagine these are incredibly expensive, but if they ever sell them with Linux drivers in a standard PCIe form factor, it would be absolutely sick. At 3 kW that seems unlikely, but for that kind of speed I bet I could find space in my cabinet and just let it rip. I just can't justify $300k, you know.
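On the 3 kW point, a quick back-of-envelope (the electricity rate is just an assumed figure, and a standard North American 15 A / 120 V circuit tops out at 1.8 kW anyway):

```python
# Back-of-envelope running cost for a 3 kW card left on 24/7.
# The $0.15/kWh rate is an assumption; substitute your own tariff.
power_kw = 3.0
rate_usd_per_kwh = 0.15

daily = power_kw * 24 * rate_usd_per_kwh  # 10.80
yearly = daily * 365                      # 3942.00
print(f"${daily:.2f}/day, about ${yearly:,.0f}/year")
```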
> Write me 10 sentences about your favorite Subway sandwich
Click button
Instant! It was so fast I started laughing. This kind of speed will really, really change things.
Which brings me to my second thing. We mostly frame the AI wars as OpenAI vs. Meta vs. Anthropic vs. Google and so on. But another take is the war between open, locally run models and SaaS models, which is really a fight over general-purpose computing. Maybe a business model like this is a great tool to help keep general computing in the fight.