
Posted by matthiaswh 13 hours ago

Ensu – Ente’s Local LLM app(ente.com)
324 points | 144 comments
chocks 12 hours ago||
This looks amazing! As I learn and experiment more with local LLMs, I'm becoming more of a fan of local/offline LLMs. I believe there's a huge gap between local LLM based apps and commercial models like Claude/ChatGPT. Excited to see more apps leveraging local LLMs.
juliushuijnk 12 hours ago|
I'm working on a rather simple idea: a WordPress plugin that lets you use a local LLM inside your WordPress CMS.

It requires a Firefox add-on to act as a bridge: https://addons.mozilla.org/en-US/firefox/addon/ai-s-that-hel...

There is honestly not much to test just yet, but feel free to check it out here, provide feedback on the idea: https://codeberg.org/Helpalot/ais-that-helpalot

The core of it works: I was able to have it generate a simple summary of CMS content. So next is making it do something useful, and making it clear how other plugins could use it.
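For anyone curious how such a bridge could work, here is a minimal sketch in Python. It only builds the two payloads involved: a request to Ollama's local `/api/generate` endpoint (a real Ollama API) and a WordPress REST call that would store the result as a post excerpt. The model name, site URL, and post ID are hypothetical placeholders; the actual plugin routes these calls through a Firefox add-on rather than Python.

```python
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_summary_request(post_content, model="llama3.2"):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    "stream": False asks Ollama for a single complete response
    instead of a token stream, which is simpler for a bridge to relay.
    """
    return {
        "model": model,
        "prompt": "Summarize this CMS post in two sentences:\n\n" + post_content,
        "stream": False,
    }


def build_wp_update(post_id, summary, site="https://example.com"):
    """Build the WordPress REST API call (URL + body) that would store
    the generated summary as the post's excerpt."""
    return (
        f"{site}/wp-json/wp/v2/posts/{post_id}",
        {"excerpt": summary},
    )


# Usage sketch (the actual HTTP calls are omitted; the add-on would
# send these with fetch(), or a script could use the requests library):
payload = build_summary_request("Hello world, my first post.")
url, body = build_wp_update(42, "A short greeting post.")
```

The point of splitting request-building from request-sending is that the browser add-on can stay a dumb relay: it forwards whatever payload the plugin hands it to localhost, so the LLM traffic never leaves the user's machine.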

HelloUsername 12 hours ago||
Spam? Ad?

Also: "Your AI agent can now create, edit, and manage content on WordPress.com" https://wordpress.com/blog/2026/03/20/ai-agent-manage-conten...

juliushuijnk 12 hours ago||
Spam for what? This is Hacker News; I'm "hacking something" to push more control to users.

I'm talking about connecting Ollama to your WordPress site.

Not via MCP or something that's complicated for a relatively normal user. But thanks for the link.

juliushuijnk 12 hours ago||
It seems your link about the WordPress variant validated my idea :).

If the new WordPress feature allowed connecting to Ollama, there would be no need for my plugin anymore. But I don't see that in the current documentation.

So for now, I see my solution as the better option for anyone who doesn't have a paid subscription but has a decent laptop and wants to use an LLM "for free" (apart from power usage), with 100% privacy, on their own website.

bilekas 12 hours ago||
> use a local LLM inside your wordpress CMS

For when WordPress doesn't have enough exploits and bugs as it is. Also, why bother with WordPress in the first place if you're already having an LLM spit out content for you?

juliushuijnk 11 hours ago||
What's your point? Don't use LLM for CMS content? That my code is buggy? Or that people shouldn't trust the LLM they run on their computer on their own website?

You can check the code for exploits yourself. Other than that, it's just your LLM talking to your own website.

> Also why bother with wordpress in the first place

Weird question, but sure: I use WordPress because I have a website that I want to run on a simple CMS that can also run my custom WordPress plugins.