
Posted by matthiaswh 9 hours ago

Ensu – Ente’s Local LLM app (ente.com)
304 points | 133 comments
dgb23 7 hours ago|
The (HN) title is misleading (unlike the actual title): it's an LLM _app_, not an LLM.
razvan_maftei 5 hours ago||
Looks like something spun up by Claude Code without thorough testing or design behind it sadly.
mkagenius 7 hours ago||
Had used Cactus before - https://news.ycombinator.com/item?id=44524544

Have now moved to PocketPal for local LLMs.

socalgal2 5 hours ago||
What's special about Ente?

How does it compare to Jan AI for example? or LM Studio? or ????

alterom 3 hours ago|
It's available on Android and iOS, and specifically on Play Store and Apple Store (so no "developer mode" hoops to jump through to install).
todotask2 4 hours ago||
The model's training data cuts off at Dec 2023; that's considered outdated.
FitchApps 7 hours ago||
Have you tried WebLLM? Or this wrapper: CodexLocal.com. Basically, you would have a rather simple but capable LLM right in your browser, using WebLLM and the GPU.
daikon899 6 hours ago||
The "What's next" section is more interesting than what shipped. A general-purpose chat wrapper around a 1-4B model occupies a crowded space — PocketPal, Jan, LMStudio, GPT4All all do similar things. But the ideas they gesture at (a persistent "second brain" note, an LLM-backed launcher, long-term memory that grows with you) are actually differentiated
imadch 7 hours ago||
What do you mean by AI on your device? Is it a local LLM? If yes, how many params, 4B or 8B? Device requirements aren't mentioned either.
kennywinker 6 hours ago|
Looks like it checks your device specs and downloads the best model that will work on them? On mine it’s using a 3.5B version of Llama.
vvilliamperez 6 hours ago||
I just use open claw as a local memory management system. Not sure from TFA what's new here.
sbassi 6 hours ago|
For local LLM there are Ollama and LM Studio. How is this different?
kennywinker 5 hours ago|
This seems like a great question to ask an offline LLM that runs on your mobile device. Do Ollama and LM Studio run on your mobile device?