1. https://apps.apple.com/gb/app/locally-ai-local-ai-chat/id674...
The phone got considerably hot while inferencing, though. Still, it's impressive performance, and I can't wait to try it myself in one of my personal apps.
Still, absolutely fabulous. What a time to be alive!
I’m sure very fast TPUs in desktops and phones are coming.
Honestly, I was extremely impressed by the speed and quality of the answers, considering this thing runs on a phone. It makes me want to sit down and spin up my own homegrown AI setup and go fully independent. Crazy.
> We collect information about your activity in our services
[1] https://github.com/google-ai-edge/gallery/blob/main/Android/...
[2] https://github.com/google-ai-edge/gallery/blob/main/Android/...
The design quality is still poor. But that's the new Apple. Design is no longer one of their core strengths.
If you just go to https://apps.apple.com/ it does look better, but I agree, still a bit "off".
On my iPhone it opens on the App Store app, so it looks fine to me.
Screenshot of the header: https://i.imgur.com/4abfGYF.png
Edit: Seems like mix-blend-mode: plus-lighter is bugged in Firefox on Windows https://jsfiddle.net/bjg24hk9/
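For anyone unfamiliar: plus-lighter sums the color channels of an element and its backdrop, which is commonly used for crossfades between two overlapping layers without a mid-transition brightness dip. A minimal sketch of the kind of rule involved (the class name is my own illustration, not taken from the App Store page):

```css
/* plus-lighter adds the pixel values of the element and its backdrop,
   clamping each channel at 1.0 — so two layers at 50% opacity sum back
   to a fully opaque result during a crossfade. */
.crossfade-layer {
  mix-blend-mode: plus-lighter;
}
```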
Anyone worked on hooking up OpenClaw to gemma4 running locally?