Posted by cmbailey 13 hours ago
That said, the headline is misleading. This article is about a social engineering attack that requires the user to actively dismiss multiple safety warnings in Obsidian. As far as I know this is a proof of concept; I haven't seen any reports of users actually being affected by this attack.
In 2026, applications, third party or even first party, don't need full-disk access, and aren't given it by default. They see a jailed, sandboxed environment. I give full disk access to the terminal app and a handful of others. The other 90%? Nope.
At least that's the case on macOS, and I'm pretty sure Windows can do it too. Linux of course has had this capability forever, but on most distros you have to set it up manually.
Would love to enable this for all apps, and add exceptions for the ones that need more access.
I installed Lulu and BlockBlock recently, and want to do more to harden my Mac.
When an app tries to access something outside of its sandbox, you get a notification asking you to approve or deny. Full Disk Access, I think, has to be granted explicitly in System Settings (Privacy & Security -> Full Disk Access).
I have no idea how to do that in Windows though.
I use Obsidian because it does not treat me like a child. They can add more nags and banners for normies, but the capabilities should remain.
I think by that logic dangerously-skip-permissions and openclaw should've never been a thing. I agree that people use them too liberally, but I think at some point you have to find a balance between systemic safety risks and individual freedom.
One can reduce every tool to a toy and justify it with some hand-wavy security slop, but removing capabilities destroys use cases.
The ability to control your tools is good. You should be able to run anything on your devices. Therefore, those who propose the toyification of tools should carry the burden of justifying the change.
The same infantilization of users currently happens with Signal, where high-level decision makers are asked by strangers to share their deepest secrets. Since these strangers introduce themselves very nicely, users start blurting out their secrets. ... now everyone is pretending this is a Signal problem. It is not. The world is not a kindergarten and people have agency.
A good compromise is to set a safe mode as the default and include an option that lets users confirm they know what they are doing. Obsidian already does this. Given that, I do not understand why anyone would demand to make the entire tool worse.
I wonder: What level of user effort would make you comfortable with users exiting safe modes? Would you want users to be able to run software with full permissions at all?
If you mean the security of the app itself, without plugins, you can currently inspect the app's code in app.js and review the third-party audits:
I've been using open source alternatives for different purposes for some time.
Obsidian would've been a great choice as open source note taking software. As it is now, it's just one sale, one exploit or one corporate rug pull away from being turned into something else.
Third party audits are meaningless. They were done for one specific version of the code at one point in time. There's literally nothing preventing a malicious version of the software from being shipped. The same goes for plausible deniability on security vulnerabilities in the context of plugins (even with these alleged prompts that the user has to skip on purpose).
Is this like a popup? Most people accept those without blinking.
I think plugins/extensions should be a bit harder to run by default. I understand that extra hurdles add friction before users can run their plugins, but I don't think there is an actually safe way to execute arbitrary, unaudited code without sandboxing or other restrictions.
There's no protections beyond that, community plugins can do whatever they want. Thankfully, the vast majority of them are open-source.
IMO that's an issue in and of itself, but it doesn't read that way in the (very unclear) original article.
Plugins like Tasks do offer a Query functionality that lets me list e.g. weekly tasks on my daily template, replicating most of Noteplan's workflow. The exception is that Noteplan relies on being able to easily link those tasks into the daily template by drag-and-dropping them, which internally assigns a unique, hidden-by-default ID in ^129abz notation (https://help.noteplan.co/article/138-synced-blocks). The latter is already supported by Obsidian; it's just not as "clean" and, AFAIK, impossible to achieve by drag-and-dropping.
So I did the (imho) only sensible thing, and run Obsidian in a sandbox (bwrap). By doing so, I also made sure it runs in a separate networking namespace. For now, I disallow any internet access.
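For the curious, a minimal sketch of such a bwrap invocation (flags per bubblewrap's man page; the paths and vault location are examples, and on most desktops you'll need extra binds for the X11/Wayland and D-Bus sockets):

```shell
bwrap \
  --ro-bind /usr /usr \
  --ro-bind /etc /etc \
  --symlink usr/lib /lib \
  --symlink usr/lib64 /lib64 \
  --symlink usr/bin /bin \
  --proc /proc \
  --dev /dev \
  --tmpfs /tmp \
  --bind "$HOME/Vaults" "$HOME/Vaults" \
  --unshare-net \
  obsidian
```

`--unshare-net` is what puts the process in its own empty network namespace, so even a malicious plugin can't phone home.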
The amount of rage I see here is a bit strange, the whole attraction of Obsidian is that you can turn it into a Swiss army knife (that can hurt you too ofc).
@kepano: you would greatly help me if you could force plugin authors to list the URLs they want to access in the manifest, then let the user decide per URL whether to allow it. I still see some careless plugin authors download their assets from a CDN or a vague website, buried deep in their code. Making URL dependencies explicit would help firewall automation as a first step. Maybe you could even revoke direct network access from plugins, but I am not too knowledgeable about Electron.
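Something like this hypothetical manifest field is what I have in mind (the `allowedUrls` key does not exist in Obsidian's real manifest.json; the other fields mirror the real manifest format, and all values here are made up):

```json
{
  "id": "example-plugin",
  "name": "Example Plugin",
  "version": "1.0.0",
  "minAppVersion": "1.0.0",
  "description": "Illustration only.",
  "allowedUrls": [
    "https://api.example.com/",
    "https://cdn.example.com/assets/"
  ]
}
```

The app could then deny any fetch to a URL outside the declared list, and the settings UI could expose a per-URL toggle.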
I have to imagine the Obsidian team is going to respond seriously to this and I look forward to seeing what they do. They have my full confidence. I'm surprised the system was initially designed as it is without those better permissions and sandboxing, though.
Obsidian has the proper protections in place to prevent this type of attack, and the victims are being convinced to ignore them. This is just a successful social engineering event. I hate to see Obsidian dragged down by this headline, since this attack is not exploiting a vulnerability in it or its plugin system.
>Due to technical limitations, Obsidian cannot reliably restrict plugins to specific permissions or access levels. This means that plugins will inherit Obsidian's access levels. As a result, consider the following examples of what community plugins can do:
> Community plugins can access files on your computer.
> Community plugins can connect to the internet.
> Community plugins can install additional programs.
Obsidian has no protection at all. Installing a plugin gives it full access to your computer. This was only a matter of time, and honestly I think shipping a plugin system like this has been inexcusably negligent since about 2010 (or arguably much earlier).
Folks will reply "but I use it every day without plugins".
That position disregards software usability as a formal discipline, along with decades of UX research and standards.
It's horse hockey. Plenty of users use vanilla Obsidian.
> Folks will reply "but I use it every day without plugins".
Because they do. You're saying that they should lie about their usage to fit your narrative?
Because in general, "usable" means "people use it". And they do use Obsidian without community plugins, without issues.
All I want is a top-notch Markdown editor with a mobile app and trustworthy sync, and that's what Obsidian gives me. And if ever Obsidian goes away or is enshittified, I'll still have a perfectly good folder of Markdown documents that I can take elsewhere.
Seriously though, I agree with your sentiment that community plugin security can and needs to be improved, but how does someone saying they use it every day "disregard software usability as a formal discipline, along with decades of UX research and standards"?
Obsidian plugins are still incredibly vulnerable. A compromised plugin will essentially take over your machine. There's no sandboxing of any kind. It's even more insecure than browser extensions (which can steal your auth tokens, but at least don't have unfettered access to your filesystem).
This is really unfortunate. I love Obsidian and have been a paid subscriber for many years, but the community plugins need a security overhaul ASAP, before someone gets hurt.
- Home folder read/write access
- System folder media
- System folder mnt
- Microphone access and audio playback
- And more...
The Obsidian snap [1] is installed with the --classic flag, which also grants access to the whole home directory, but at least you have to consciously specify the --classic flag to grant this permission.
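For what it's worth, Flatpak at least lets you inspect and tighten those grants per app (app ID as published on Flathub; verify it locally, and the vault path below is just an example):

```shell
# See what the package is granted out of the box
flatpak info --show-permissions md.obsidian.Obsidian

# Tighten it per user: drop blanket home access, allow only the vault
flatpak override --user \
  --nofilesystem=home \
  --filesystem=~/Vaults \
  md.obsidian.Obsidian
```

Note this confines the app, not individual plugins; a malicious plugin still sees everything Obsidian itself can reach.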
If you're running GNU/Linux, chances are you'll have hundreds, if not thousands, of pieces of software that run totally unsandboxed.
Yes, a very small minority of applications are unfortunately primarily distributed via flatpak or snap, and the distributors don't care about the user experience, so it's error-ridden and problem-ridden, but chances are you can get a "normal computer program" version of it unencumbered by such grossness.
Besides. They said "all software on your machine". That is trivially false, to a significant degree.
This combination of software relying on third parties without security seems to be untenable. Personally I've gotten rid of just about as many extensions as I can anywhere and switched to batteries included software.
If you install a dozen mini-apps from random developers you never heard about, you can't complain if one is malware.
Krita also has a plugin system based on Python. Any "plugin" has the same level of access as running a python script.
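To make that concrete, here's a plain-Python sketch (no Krita APIs involved, and the file path is just a stand-in) of the access level any such plugin script inherits:

```python
# Any Krita Python plugin runs in a full, unrestricted CPython
# interpreter. This shows what that means: arbitrary file and
# network I/O with the user's privileges.
import os
import socket
import tempfile

# Write and read a file anywhere the user can. A temp file stands in
# for any document in the home directory.
path = os.path.join(tempfile.gettempdir(), "plugin-access-demo.txt")
with open(path, "w") as f:
    f.write("a plugin can read and write arbitrary files")

with open(path) as f:
    contents = f.read()

# Create an arbitrary network socket. It is not connected here, but a
# malicious plugin could ship `contents` to any host it likes.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.close()

print(contents)
os.remove(path)
```

Nothing in the plugin loading path restricts any of this; the script runs with exactly the same privileges as the Krita process.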
Personally I blame operating systems for not providing a way to isolate how programs interact with user files.
There are of course complications, costs, and downsides associated with doing that. It might not be worth it currently, or performance costs might be too high, or the community might be overwhelmingly using abandoned plugins that won't be updated, etc. It's still a decision to remain complacent until forced by attacks though, it's well beyond common knowledge that these things happen so you can't really call it ignorance.
WoW's whole UI is built in the same Lua environment as add-ons, and Blizzard has implemented some interesting restrictions (like the taint system[0]) to prevent add-ons from completely automating gameplay.
0. https://wowpedia.fandom.com/wiki/Secure_Execution_and_Tainti...
0. https://warcraft.wiki.gg/wiki/Secure_Execution_and_Tainting
1. https://wowpedia.fandom.com/wiki/Wowpedia:About_the_wiki#Bac...
I'm somewhat convinced that taint-influenced capabilities are a good model to pursue in the future. Computers are fast; I'm fairly confident it could be done at whole-computer scale and still be reasonable... though probably not with a million Electron apps. Which is likely a good thing in aggregate (I say this as a fan of web tech and the very compelling features such things offer: great for minor projects or PoCs, not for major pieces of software).
You simply can't expect every piece of software that wants a plugin system to have the same security practices as the most-used software in the world.
In fact, there are many reasons why you might want a plugin to have full filesystem and internet access, such as batch processing or simply adding things directly from webpages. Sandboxing this will just make plugins less useful.
In the end it's a problem of trust. You're installing software from untrustworthy developers because you trust the name of the application those plugins are associated with.
You could fix the problem in Obsidian, but the same problem will happen in other software. Some of which simply can't justify bothering with sandboxing plugins. This is just the way plugins are.
I'm not saying that I think they should, or that I expect them to. I'm saying that it's one particular implementation of sandboxing that has a bunch of interesting properties, and that makes it worth studying.
If I side-load a camera app, it still has to ask for camera privileges the same way any Play store app does.
Is there something in your message I missed about how it relates to this article or is this just being uninformed about side-loading?
I agree with the claim of negligence. I think they were more than happy to reap the benefits of a thriving community plugin ecosystem, and were hoping this page would provide enough CYA when security breaches inevitably occurred.
> TIP: If you're working with sensitive data and wish to install a community plugin, we recommend that you perform an independent security audit on the plugin before using it.
I wonder just how many plugins received a security audit.
I don't think they meant it this way, but I honestly consider unsafe official plugin systems to be negligent to the point of being actively malicious. By releasing one, if you ever become successful you have explicitly chosen to screw over an unknown number of your users to save yourself a relatively small amount of work in the short term. It might be single digit users, or it might be septuple digit users - is it really worth it?
(Unsafe unofficial plugins, like most games? Mildly unfortunate but I think that's fine. Though a healthy modding community around your stuff should be a VERY STRONG sign that you should introduce a safe version to protect your users, if it won't cause you to implode (it definitely can)).
* Android's permissions model where the user must approve specific potentially undesirable classes of actions (separate from the 24H delay, etc controversy)
* Optional sandboxing
I think the value of this disclosure is more in spreading awareness about plugins, and demonstrating the vector. Where less sophisticated users may think, "Oh, this is just a collection of markdown files. I don't need to be too worried about malicious code."
Same way I run any other application that could potentially execute untrusted code.
> It enables malicious versions of legitimate Obsidian plugins ('Shell Commands' and 'Hider') that are present in the shared vault.
Much easier to just skip that part.
So yes, it’s too much work (in the sense that you need to have a security-focused leadership that understands that this is a lot of work but the right thing to do).
(I actually use LogSeq, but same idea applies).