Posted by nip 2 hours ago
I built SimplePDF Copilot: an AI assistant that can interact with the PDF editor. It fills fields, answers questions, focuses on a specific field, adds fields, deletes pages, and so on.
It's built on top of SimplePDF, a project I started 7 years ago pioneering privacy-respecting, client-side PDF editing, now used by 200k+ people every month.
As for the privacy model: the PDF itself never leaves the browser. Parsing, rendering, and field detection all run client-side.
The text the model needs (and your messages) goes to whatever LLM you point at. By default that's our demo proxy (DeepSeek V4 Flash, rate-capped), but you can BYOK and point it at any cloud provider, or go fully local (I've been testing with LM Studio).
Unlike the existing "Chat with PDF" tools that only retrieve the text/OCR layer, Copilot can act on the PDF: filling fields, adding fields (detected client-side using CommonForms by Joe Barrow [1], jbarrow on HN with some post-processing heuristics I added on top), focusing on fields, deleting pages, and so on.
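To give a flavor of what "post-processing heuristics" on top of a field detector can mean, here's a hypothetical example of one common cleanup step: dropping near-duplicate detected boxes via intersection-over-union. The box shape, threshold, and logic are my illustration, not the actual SimplePDF heuristics:

```typescript
// A detected field candidate: axis-aligned box plus detector confidence.
// (Illustrative shape; not the actual CommonForms output format.)
type Box = { x: number; y: number; w: number; h: number; score: number };

// Intersection-over-union of two boxes.
function iou(a: Box, b: Box): number {
  const ix = Math.max(0, Math.min(a.x + a.w, b.x + b.w) - Math.max(a.x, b.x));
  const iy = Math.max(0, Math.min(a.y + a.h, b.y + b.h) - Math.max(a.y, b.y));
  const inter = ix * iy;
  const union = a.w * a.h + b.w * b.h - inter;
  return union === 0 ? 0 : inter / union;
}

// Keep the highest-scoring box among heavily overlapping candidates
// (classic non-maximum suppression).
function dedupeBoxes(boxes: Box[], threshold = 0.5): Box[] {
  const sorted = [...boxes].sort((a, b) => b.score - a.score);
  const kept: Box[] = [];
  for (const box of sorted) {
    if (kept.every((k) => iou(k, box) < threshold)) kept.push(box);
  }
  return kept;
}
```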
I built this because SimplePDF is mostly used by healthcare customers where document privacy is paramount, and I wanted an AI experience that didn't require shipping PII to a third party. Stack is pretty standard:
- Tanstack Start
- AI SDK from Vercel
- Tailwind (I personally prefer CSS modules, I'm old-school, but since I'm open-sourcing this, I figured Tailwind would be a better fit)
The more interesting part is the client-side tool calling: events are passed back and forth via iframe postMessage.
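The bridge itself is just structured messages over `postMessage`. Here's a minimal sketch of what that wiring might look like; the message names and shapes are illustrative, not the actual SimplePDF protocol:

```typescript
// Illustrative tool-call messages sent from the host page to the editor iframe.
type EditorCommand =
  | { type: "FILL_FIELD"; fieldId: string; value: string }
  | { type: "FOCUS_FIELD"; fieldId: string }
  | { type: "DELETE_PAGE"; pageIndex: number };

// Minimal interface over window.postMessage so the wiring is testable.
interface MessagePort {
  postMessage(data: string, targetOrigin: string): void;
}

// Serialize and forward a command, pinning the target origin
// (never "*", so nothing leaks to an unexpected frame).
function sendToEditor(port: MessagePort, cmd: EditorCommand, editorOrigin: string): string {
  const data = JSON.stringify(cmd);
  port.postMessage(data, editorOrigin);
  return data;
}

// On the receiving side, drop any message that isn't from the expected origin.
function parseCommand(raw: string, origin: string, hostOrigin: string): EditorCommand | null {
  if (origin !== hostOrigin) return null;
  return JSON.parse(raw) as EditorCommand;
}
```

In an actual page, `port` would be `iframe.contentWindow` and `origin` comes from the `message` event; checking it on both sides is what keeps the conversation between the two frames only.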
If you're not familiar with "tool calling" and "client-side tool calling", a quick primer:
Tool calling is what LLMs use to take actions. When Claude runs grep or ls, or hits an MCP server, those are tool calls.
Client-side tool calling means the intent to call a tool comes from the LLM, but the execution happens in the browser.
That matters for two reasons: speed (you can't go faster than client-to-client operations) and control over what data you expose to the LLM. For the demo I do feed the document content to the LLM, but that connection could be severed as simply as removing the tool that exposes the content.
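Concretely, the loop is: the model returns a tool name plus JSON arguments, and the browser owns a registry that actually runs them. A minimal sketch with illustrative tool names (with Vercel's AI SDK you get a similar effect by defining a tool without an `execute` function and handling it client-side):

```typescript
// A tool-call intent as it comes back from the model.
type ToolCall = { name: string; args: Record<string, unknown> };

// Registry of client-side tools. The model only ever sees the names and
// argument schemas; execution stays in the browser. (Names are illustrative.)
const tools: Record<string, (args: any) => string> = {
  fill_field: ({ fieldId, value }) => `filled ${fieldId} with "${value}"`,
  focus_field: ({ fieldId }) => `focused ${fieldId}`,
  delete_page: ({ pageIndex }) => `deleted page ${pageIndex}`,
};

// Execute an intent locally; the returned string is what goes back to the
// model as the tool result.
function executeToolCall(call: ToolCall): string {
  const tool = tools[call.name];
  if (!tool) return `unknown tool: ${call.name}`;
  return tool(call.args);
}
```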
The demo is fully open source on GitHub [2], and the live version is the link of this post [3].
What's not open source is SimplePDF itself (loaded as the iframe).
I could go on and on about this, so let me know if you have any questions. Anything goes!
[1] https://github.com/jbarrow/commonforms
[2] https://github.com/SimplePDF/simplepdf-embed/tree/main/copil...
[3] https://copilot.simplepdf.com/?share=a7d00ad073c75a75d493228...
Anything you see missing in Copilot to achieve that?
Not sure if you noticed, but there's an architecture diagram in the info popup [1].
[1] https://copilot.simplepdf.com/?share=a7d00ad073c75a75d493228...
But you're right that it's not as evident as I wanted it to be; I'm making a small copy update to make it clearer: "Public demo. Your chat messages leave your device and are sent to the selected AI provider. Use sample data only."
(Since there's support for local models, the popup is only displayed when NOT using your own model)
Thanks!
EDIT: the copy update is live, thanks again!
Use cases include:
- Filling foreign-language forms
- Navigating a contract before signing: "can I trust ALL the clauses here?"
- Pre-filling repetitive forms from existing data sources (CRM, EHR, etc. via MCP/RAG)
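For that last case, the plumbing is essentially a mapping from a record (fetched via MCP/RAG or otherwise) to a batch of fill actions. A hypothetical sketch; the record keys and field ids are made up:

```typescript
// A record pulled from some external source (CRM, EHR, ...). Illustrative.
type SourceRecord = Record<string, string>;

// How source keys map onto PDF form field ids. Illustrative names.
const fieldMap: Record<string, string> = {
  firstName: "patient_first_name",
  lastName: "patient_last_name",
  dob: "date_of_birth",
};

// Turn a record into fill actions; keys the form doesn't know are skipped.
function toFillActions(record: SourceRecord): { fieldId: string; value: string }[] {
  return Object.entries(record)
    .filter(([key]) => key in fieldMap)
    .map(([key, value]) => ({ fieldId: fieldMap[key], value }));
}
```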
Copilot is designed to be embedded; our customers ship it white-labeled inside their own products.