Posted by franze 2 days ago
https://github.com/ehamiter/afm
It's really handy for quick things like "what's the capital of country X?", but for coding I feel it's severely limited. With such a small context window, it's (currently) not great for complicated things.
Can you share a working example?
Trying to run openclaw with it in ultra token-saving mode totally did not work.
Great for shell scripts, though (my major use case now).
Apple does have an on-device RAG pipeline called the Semantic Index that feeds personal data like contacts, emails, calendar entries, and photos into the model context, but this is only available to Apple's own first-party features like Siri and system summaries.
It is not exposed through the FoundationModels API.
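For context, here's roughly what the public FoundationModels API surface looks like: you create a session and send it prompts, and there's no parameter for pulling in Semantic Index data. This is a minimal sketch assuming macOS 26+/iOS 26+; the instructions string and prompt are made up for illustration.

```swift
import FoundationModels

// A session wraps the on-device model; you can supply system-style
// instructions up front, but only text you provide yourself.
let session = LanguageModelSession(
    instructions: "You are a concise assistant."  // hypothetical instructions
)

// Each turn is just a prompt string; there is no hook for the
// Semantic Index or other personal-data context.
let response = try await session.respond(to: "What's the capital of France?")
print(response.content)
```

Anything personal (contacts, calendar, etc.) would have to be fetched by your own code and pasted into the prompt, which with the small context window fills up fast.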
The mic button requires a click to transcribe and start listening again, and the default voice is low-quality (I assume it can be configured).
In general I'm looking for a way to try the on-device hands-free voice mode.