Shameless self-plug, but my workout tracking app[1] uses a sync engine, and it has drastically simplified the complexities of retry logic, intermittent connectivity loss, the ability to work offline, etc.
Luckily this is a use case where conflict resolution is pretty straightforward (only you can update your workout data, so last-write-wins works).
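To illustrate, last-write-wins can be as small as this (a sketch with made-up field names, not my actual schema):

    // Each record carries a timestamp set by whichever device wrote it last.
    interface WorkoutRecord {
      id: string;
      reps: number;
      updatedAt: number; // epoch millis, stamped on every local edit
    }

    // Single-writer data means no real conflicts: the newer write wins outright.
    function mergeLww(local: WorkoutRecord, remote: WorkoutRecord): WorkoutRecord {
      return remote.updatedAt > local.updatedAt ? remote : local;
    }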
This is one way to look at it, but it ignores the fact that most users use third-party community plugins.
Obsidian has a truly terrible security model for plugins. As I realized while building my own, Obsidian plugins have full, unrestricted access to all files in the vault.
Obsidian could've instead opted to be more 'batteries-included', at the cost of more development effort, but instead leaves this to the community, which in turn increases the attack surface significantly.
Or it could have a browser-extension-like manifest that declares all permissions used by the plugin, where attempting to access a permission that's not granted gets blocked.
Both of these approaches would've provided more real security for end users than "we have few third-party dependencies".
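A permission manifest doesn't need to be complicated, either. A sketch of the idea (all names hypothetical):

    // The plugin declares what it needs up front...
    type Permission = "vault:read" | "vault:write" | "network";

    interface PluginManifest {
      name: string;
      permissions: Permission[];
    }

    // ...and the host checks every privileged API call against the grant list.
    function assertGranted(manifest: PluginManifest, needed: Permission): void {
      if (!manifest.permissions.includes(needed)) {
        throw new Error(`${manifest.name} was not granted "${needed}"`);
      }
    }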
When I was young there were a few luminaries in the software world who talked about how there is a steady if small flow of ideas from video game design into conventional software.
But I haven't heard anyone talk like that in quite some time (unless it's me parroting them). Which is quite unfortunate.
I think for example if someone from the old guard of Blizzard were to write a book or at least a novella that described how the plugin system for World of Warcraft functioned, particularly during the first ten years, where it broke, how they hardened it over time, and how the process worked of backporting features from plugins into the core library...
I think that would be a substantial net benefit to the greater software community.
Far too many ecosystems make ham-fisted, half-assed, hare-brained plugin systems. And the vast majority can be consistently described by at least two of the three.
Game dev also rewards people for applying the 80/20 rule effectively, and you see less of that in commercial software.
In each game generation there’s a game that would be easy to write on the next or subsequent generation of hardware and is damned difficult to implement on the current one. Cleverness and outright cheating make it work, after a fashion.
The game simulation will get more detailed/granular as aesthetics dial down in perceived value. You can always go bigger/wider/more procedural/more multiplayer.
This is also why every hard problem eventually shows up — games are just simulation + interaction, and eventually everything that can be simulated will have some attempted implementation out there, struggling along. (For some reason, this does not appear to stop at “interesting” things to simulate — see all the literal simulators on Steam)
The simulations have yet to reach photo-realism in lieu of event-perception, where simulation parallels reality, but that's not really playable as a game, only as a view.
I came to learn that even though in-process plugins are easier to implement and less resource-demanding, anyone serious about host stability and security can only allow plugins based on OS IPC.
And in general, it will take fewer hardware resources than the usual Electron stuff.
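The shape of it is roughly this (a Node/TypeScript sketch; the plugin path and message format are made up):

    import { spawn } from "node:child_process";

    // The plugin runs as a separate OS process and only ever sees what the
    // host chooses to send over stdio; a crash or hang can't take the host down.
    const plugin = spawn("node", ["plugins/example-plugin.js"], {
      stdio: ["pipe", "pipe", "inherit"], // stdin/stdout for messages, stderr for logs
    });

    // The host decides exactly which data crosses the boundary.
    plugin.stdin.write(JSON.stringify({ op: "transform", text: "hello" }) + "\n");

    plugin.stdout.on("data", (chunk: Buffer) => {
      console.log("plugin replied:", chunk.toString().trim());
    });
    plugin.on("exit", (code) => console.log(`plugin exited with code ${code}`));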
Kernel design is (to me) another one where ideas have flowed into other software fields - there were monolithic kernels, microkernels, and hybrid kernels, and they all need to work with third-party modules (drivers)
The lessons from all fields seem to be relearnt again and again in new fields :-)
Because learning how to make a proper one requires building your own broken one first.
It might be slightly sped up by reading up on theory and past experiences of others.
I am around midlife and I see how I can tell people stuff, I can point people to resources, but they still won’t learn until they hit the problem themselves and put their mind into figuring it out.
A lot of stuff we think is top shelf today was tried on mainframes in the late 80’s through the 90’s. Cloud computing is mostly recycled 90’s “fashion”.
See also people trying to bring Erlang back into fashion.
Having recently read through a handful of issues on their forums, they seem to brush aside a lot of things. It's a useful tool, but the mod / dev team they have working with the community could use some training.
If you're using a flatpak, that's not actually the case. It would have very restricted access, to the point where you would even have to explicitly give it access to your /home.
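If you want to check or tighten it yourself, something like this works (assuming the Flathub ID is md.obsidian.Obsidian):

    # see what the app can currently touch
    flatpak info --show-permissions md.obsidian.Obsidian

    # deny home access, then grant only the vault directory
    flatpak override --user --nofilesystem=home md.obsidian.Obsidian
    flatpak override --user --filesystem=~/Vaults md.obsidian.Obsidian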
I "love" such sandboxing defaults. Apps like Docker Desktop also share the whole home directory by default [1], which is pretty interesting when a big selling point is keeping stuff separated. No idea why node_packages need to have access to my tax returns :). Of course you can change that, but I bet many users keep the default paths intact.
Yeah, I forgot there’s the intermediate VM level, and user folders are shared there so that folders could be mounted to the individual containers using host paths.
Interesting, I thought I had to turn that on for Obsidian!
The first time I started installing flatpaks I ran into a bit of permission / device isolation trouble and ever since then, I use flatseal after installing an app to make sure it actually has access to things.
I'm not claiming it's a security feature of Obsidian, I'm saying it's a consequence of running a flatpak - and in this situation it could be advantageous for those interested.
I have been using firejail for most of these kinds of applications, be it Obsidian, Discord, or the browser I am using. I definitely recommend people start using it.
I feel like I should keep track of all my comments on HN because I remember writing a lengthy comment on firejail more than once. I cannot keep doing this. :D
For user-space, there is usually bubblewrap vs. firejail. I have not personally used bubblewrap, so I cannot comment on that, but firejail is great at what it does.
The last comment was about restricting clipboard access to either X11 or Wayland which is possible with firejail quite easily, so if you want that, you can have that.
So do you configure firejail to give each app their own separate, permanent home directories? Like "firejail --private=/home/user/firejails/discord discord", "firejail --private=/home/user/firejails/chromium chromium", and so on?
FWIW, once you start whitelisting, it will only have access to those directories and files only, so Discord has no access to anything other than its own directory and ${DOWNLOADS}, which I should probably change.
You should check out the default profiles for many programs / apps under directory "/etc/firejail".
[1] You run it via "firejail Discord" or "firejail ./Discord" if you name it "Discord.profile".
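For reference, a profile is just a plain-text list of directives. A rough sketch (example paths only; the stock profiles under /etc/firejail are far more thorough):

    # Discord.profile (sketch)
    include disable-common.inc

    # only these paths stay visible; the rest of $HOME disappears
    whitelist ${HOME}/sandboxes/discord
    whitelist ${DOWNLOADS}

    caps.drop all
    noroot
    seccomp
    private-dev
    private-tmp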
I treat LS as a privacy/anti-telemetry/anti-accident tool, not as anti malware.
Obviously it can detect malware if there’s a connection to some weird site, but it’s more like a bonus than a reliable test.
If you need to block FS access, then per-app containers or VMs are the way to go. The container/VM sandboxes your files, and Little Snitch can then manage external connectivity (you might still want to allow connections to some legit domains, but maybe not github.com, as that can be used to upload your data; I mean something like updates.someapp.com)
I believe LS has some protections against this. Never tried them, but there are config-related security options, incl. protection against synthetic events. So they definitely put some thought into that.
Is this true on Mac? Usually I am notified when programs request access outside the normal sandboxed or temp folders. Not sure how that works in any detail though.
Funny enough, I thought this earlier about Arch Linux and its derivatives. It was mentioned on Reddit that they operate on a small budget. A maintainer replied that they have very low overhead, and the first thought that popped into my mind was that most of the software I use and rely on comes from the AUR, which relies on the user to manage their own security.
If engineers can't even manage their own security, why are we expecting users to do so?
I'm shocked it is most of your software. I think I have under a dozen AUR packages. It has been that way for about a decade. I added a couple for gaming recently (mostly because Lutris just crashes for me), but nearly all of my software comes from the official repos.
I think this criticism is unfair because most common packages are covered by the core and extra repos which are maintained by Arch Linux. AUR is a collection of user build scripts and using it has a certain skill cliff such that I expect most users to have explicit knowledge of the security dangers. I understand your concern but it would be weird and out of scope for Arch to maintain or moderate AUR when what Arch is providing here amounts to little more than hosting. Instead Arch rightly gives the users tools to moderate it themselves through the votes and comments features. Also the most popular AUR packages are maintained by well known maintainers.
The derivatives are obviously completely separate from Arch and thus are not the responsibility of Arch maintainers.
Disagree. AUR isn’t any trickier than using pacman most of the time. Install a package manager like Yay or Paru and you basically use it the same way as the default package manager.
It’s still the same problem, relying on the community and trusted popular plugin developers to maintain their own security effectively.
I understood GP's point to be that because Obsidian leaves a lot of functionality to plugins, most people are going to use unverified third-party plugins. On Arch, however, most packages are in core or extra, so most people won't need to go to the AUR. They are more likely to install the flatpak or get the AppImage for apps not in the repos, as that's much easier.
yay or paru (or other AUR helpers afaik) are not in the repos. To install them, one needs to know how to use the AUR in the first place. If you are technical enough to do that, you should know about the security risks, since almost all tutorials for the AUR come with security warnings. It's also inconvenient enough that most people won't bother.
In Obsidian, plugins can seem central to the experience, so users might not think much of installing them; in Arch, the AUR is very much a non-essential component. At least that's how I understand it.
> It's also inconvenient enough that most people won't bother.
> in Arch, the AUR is very much a non-essential component.
While somewhat true, we are talking about a user who has installed Arch on their machine. If a user wanted to not bother with installation details, they would've installed Ubuntu.
> If engineers can't even manage their own security, why are we expecting users to do so?
This latest attack hit Crowdstrike as well. Imagine they had gotten inside Huntress, who opened up about how much they can abuse the access given: https://news.ycombinator.com/item?id=45183589
Security folks and companies think they are important. The C-suite sees them as a scapegoat WHEN the shit hits the fan, and most end users feel the same about security as they do about taking off their shoes at the airport (what is this nonsense for), and they mostly aren't wrong.
It's not that engineers can't take care of their own security. It's that we have made it a fight with an octopus rather than something that is seamless and second nature. Furthermore, security and privacy go hand in hand... Teaching users that is not to the benefit of a large portion of our industry.
> It's not that engineers can't take care of their own security.
I dunno. My computer has at least one hardware backdoor that I know of, and I just can't get hardware without an equivalent exploit.
My OS is developed with a set of tools that is known to make code review about as hard as possible. It provides the bare minimum of application isolation. And it is two orders of magnitude larger than any single person can read in their lifetime. It's also the usable OS out there with the best security guarantees; everything else is much worse or useless.
A browser is almost a complete new layer above the OS. And it's 10 times larger. Also written in a way that famously makes review impossible.
And then there are the applications, which is what everybody is focusing on today. Keeping them secure is close to useless if one doesn't fix all of the above.
I'm developing an Obsidian plugin commercially. I wish there was a higher tier of vetting available to a certain grade of plugin.
IMO they should do something like the AUR on Arch Linux: have a community-managed plugin repo and then a smaller, more vetted one. That would help with the plugin review time too.
The plugin is called Relay [0] -- it makes Obsidian more useful in a work setting by adding real-time collaboration.
One thing that makes our offering unique is the ability to self-host your Relay Server so that your docs are completely private (we can't read them). At the same time you can use our global identity system / control plane to collaborate with anyone in the world.
We have pretty solid growth, a healthy paid consumer base (a lot of students and D&D/TTRPG), and starting to get more traction with businesses and enterprise.
I think it's a matter of time until we see a notable plugin in the Obsidian space get caught exfiltrating data. I imagine then, after significant reputational harm, the team will start introducing safeguards. At a minimum, create some sort of verified publisher system.
Don’t most plugin models work this way? Does VSCode, Vim, Emacs, and friends do anything to segregate content? Gaming is the only area where I expect plugins have limited permissions.
Browser extensions also have a relatively robust permissions-based system.
If they wanted to, one would guess that browser-ish local apps based on stuff like Electron/node-webkit could probably figure out some way to limit extension permissions more granularly.
I would have thought, but it has been how many years, and as far as I know, there is still no segregation for VSCode extensions. Microsoft has all the money, and if they cannot be bothered, I'm not encouraged that smaller applications will be able to iron out the details.
I think it's just because supply-chain attacks are not common enough / their attack surfaces not large enough to be worth the dev time... yet...
Sneak in a malicious browser extension that breaks the permissions sandbox, and you have hundreds of thousands to millions of users as an attack surface.
Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs...
>Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs..
Attackers just have to hit one dev with commit rights to an app or library that gets distributed to millions of users. Devs are multipliers.
The time has come. The nx supply chain attack a couple weeks ago literally exfiltrated admin tokens from your local dev machine because the VS Code extension for nx always downloaded the latest version of nx from npm. And since nx is a monorepo tool, it’s more applicable to larger projects with more valuable tokens to steal.
The solution at my job is you can only install extensions vetted by IT and updates are significantly delayed. Works well enough but sucks if you want one that isn't available inside the firewall.
>Browser extensions also have a relatively robust permissions-based system.
Yeah and they suck now. We need a better security model where it's still possible to do powerful stuff on the whole machine (it's MY computer after all) without compromises.
>We need a better security model where it's still possible to do powerful stuff on the whole machine
That's not possible. If you can do powerful stuff on the whole machine, by definition you have no security. Security is always a question of where you create a perimeter. You can hand someone a well-defined box in which they can do what they want, or you can give someone broader access with fewer permissions, but whether vertically or horizontally, to have security is to exercise control and limit an attack surface.
That's even implicit in the statement that it's YOUR computer. The justification being that there's a dividing line between your computer and other computers. If you're part of a network, that logic ceases to hold. Same when it comes to components on your machine.
vim and emacs are over 30 years old and therefore living with an architecture created when most code was trusted. Encrypting network protocols was extremely rare, much less disks or secrets. I don't think anything about the security posture of vim and emacs should be emulated by modern software.
I would say VSCode has no excuse. It's based on a browser which does have capabilities to limit extensions. Huge miss on their part, and one that I wish drew more ire.
I'd love to see software adopt strong capabilities-based models that enforce boundaries even within parts of a program. That is, with the principle of least authority (POLA), code that you call is passed only the capabilities you wish (e.g. opening a file, or a network socket), and not everything that the current process has access to. Thomas Leonard's post (https://roscidus.com/blog/blog/2023/04/26/lambda-capabilitie...) covers this in great detail, and OCaml's newer Eio effect system has aspects of this too.
The Emily language (locked-down subset of OCaml) was also interesting for actively removing parts of the standard library to get rid of the escape hatches that would enable bypassing the controls.
Linux has seccomp, but I think that changes access for an entire process. The language-focused aspect seems useful to me, for the case where maybe I want access to something, but I don't want to pass that access on to all the code that I might call from a library.
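In everyday code the idea looks something like this (a TypeScript sketch with illustrative names; unlike a real capability language, nothing here stops the library from importing fs itself, which is exactly why language-level enforcement matters):

    import { readFile } from "node:fs/promises";

    type ReadFileCap = (path: string) => Promise<string>;

    // The callee receives a narrow capability, not ambient filesystem access.
    async function exportNotes(read: ReadFileCap, paths: string[]): Promise<string> {
      const parts = await Promise.all(paths.map(read));
      return parts.join("\n");
    }

    // The caller constructs the capability, scoped to one directory.
    const readVaultOnly: ReadFileCap = (p) => {
      if (p.includes("..")) throw new Error("path escape rejected");
      return readFile(`/home/user/vault/${p}`, "utf8");
    };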
You have to go off the beaten path to get plugins into Vim/Emacs. It's not difficult, but you don't have access to a marketplace open to the world from the get-go. I think Emacs has ELPA, but I would put that at the level of OS repos like Debian/Alpine.
iirc vscode has RCE by design when you use the remote editing feature (i.e. editing files on a server, which is obviously a bad idea anyway, but still a feature) and nobody gives a fuck.
> The code in Mods for Cities: Skylines is not executed in a sandbox.
> While we trust the gaming community to know how to behave and not upload malicious mods that will intentionally cause damage to users, what is uploaded on the Workshop cannot be controlled.
> Like with any files acquired from the internet, caution is recommended when something looks very suspicious.
I think they meant games that specifically come with a sandboxed scripting layer. Otherwise, I agree that most mods are indeed just untrusted patches for a native executable or .NET assembly.
I guess the intent behind Cities: Skylines' support for mods is just removing the need for a mod manager and enabling Steam Workshop support.
I was thinking more Lua/Luau, which make it trivial to restrict permissions. In general, the game client has access to a lot more information than it shares, so to prevent cheats from plugins, the developers have to be explicit about security boundaries.
One of the large dependencies they call out is an excellent example: pdf.js.
There is no reason for pdf.js to ever access anything other than the files you wish to export. The Export to PDF process could spawn a containerized subprocess with zero filesystem or network access and constrained CPU and memory limits. Files could be sent to the export process over stdin, and the resulting PDF could be streamed back over stdout, with stderr used for logging.
There are lots of plugin systems that work this way. I wish it were commoditized and universally available. AFAIK there's very little cross-platform tooling to help you solve this problem easily, and that's a pity.
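The plumbing is not much code. A sketch ("sandbox-run" below is a stand-in for whatever actually denies filesystem/network access, e.g. bwrap or a seccomp wrapper; the worker script name is made up):

    import { spawn } from "node:child_process";
    import { once } from "node:events";

    async function exportToPdf(markdown: string): Promise<Buffer> {
      const worker = spawn("sandbox-run", ["node", "pdf-worker.js"], {
        stdio: ["pipe", "pipe", "inherit"], // doc in, PDF out, logs to stderr
      });
      worker.stdin.end(markdown); // feed the document over stdin

      const chunks: Buffer[] = [];
      worker.stdout.on("data", (c: Buffer) => chunks.push(c));
      const [code] = await once(worker, "exit");
      if (code !== 0) throw new Error(`export worker failed with code ${code}`);
      return Buffer.concat(chunks); // the rendered PDF bytes
    }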
Another thought: what about severely sandboxing plugins, so that while they have access to your notes, they have no network or disk access and in general no way to exfiltrate your sensitive info? Might not be practical, but approaches like this appeal to me.
I use “Templater” and “Dataview” but now I am rethinking my usage; they were required for the daily template I use (found here on HN) but this is probably overkill.
As someone who started building Octarine specifically for this reason, I understand.
Having to rely on random devs for the most basic functionality and passing it off as `the community does what it wants` is weird. Either add it in yourselves, or accept the fact that, given your app requires external contributors to work at anything above a basic level, there are going to be security issues.
Writing a whole blog post and throwing shade on "other apps" that have far more dependencies than Obsidian is weird to me.
Anyway, it seems like you can't really criticize them, since there's a huge following that just comes at you, and that feels weird, because apparently they can throw shade but others can't talk back.
> could've instead opted to be more 'batteries-included', at the cost of more development effort, but instead leaves this to the community, which in turn increases the attack surface significantly.
This app deals with very critical, personal, and intimate data (personal notes and professional/work-related notes), yet is proudly an Electron app. That alone has seemed like a massive red flag to me.
There are better alternatives. It's just that people have convinced themselves they need the features Obsidian offers - because it makes them feel smart and important.
At the end of the day, you're just taking notes. If you write a journal, don't put it in something like Obsidian. Even Apple Notes is better (in security, privacy, etc.) in this regard.
Well, I’m pretty convinced I need Obsidian, because it’s just the best way to manage stuff and I hate overcomplicated stuff.
I use it to remember stuff and classify important information. For instance, I had issues for years with my government over closing my business. I made a note in Obsidian, « enterprise closure », and every time there was a mail, I would save it as PDF and import it into the note. Same for every letter: scan -> import. I could put out my thoughts, next steps, and so on in the note.
Because it doesn’t encrypt notes or attachments, Spotlight can index my notes and I can still search for the attachments.
And the backlinks are just a bonus, because you always end up having multiple side notes linked to a main note.
I have yet to see other software that lets me do this kind of thing. Apple Notes absolutely sucks at attachments.
Putting attachments in a folder alongside the notes is hard to search; things always get lost or duplicated.
I believe there are enterprise tools for that kind of thing, like « SERM », but Obsidian just works and it’s free.
Not much different from naming the PDF and the TXT file with the same title (but different extension) and writing your thoughts inside the TXT file. Also searchable in Spotlight.
Point is, you don't need Obsidian (or all of its plugin). People have been making do with Dropbox and plain text (.txt) files perfectly fine for years.
Wow I never knew I "can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem".
Plain-text folder on a cloud sharing service. Edit with notepad.exe or whatever editor you prefer. Others have been doing it with .doc files forever, or .rtf.
It's no worse than vscode. Sure there's permissions, but it's super common for an extension to start a process and that process can do anything it wants.
Because it is one of the most popular dev tools out there? If not the most popular. It also uses Electron, like Obsidian. Has thousands of plugins, like Obsidian.
Plus, VSCode is maintained by a company with thousands of devs; Obsidian is fewer than 10 people, which is amazing. As for plugins, why blame the product? Please check what you install on your machine instead.
My personal take is that the only way to be reasonably sure you're OK is to install as few apps as possible and then as few plugins as possible (and ideally stick to the bundled ones only). I don’t think it’s controversial, but for some reason this is not how many people think, even though in the real world you don’t give the keys to your place to everyone who says they’re cool :)
Among others, this is a big reason I want effect systems to gain more attention. After having seen them, the idea that in most languages, the only option is that any function can do anything without keeping track of what it affects in its type signature is bonkers to me.
I agree Obsidian plugins do nothing about safety. But I'm not sure "most users use plugins", that's not my impression from reading the subreddit. I wonder if there's any data on it?
Is this true? Is there any source on how many Obsidian users use third-party plugins? For one, I don't. Moreover, Obsidian by default runs in "restricted mode", which does not allow community plugins. You have to specifically enable them to be able to install community plugins, hence I assume somebody who does that understands the risks involved. How many people even get as far as enabling that?
For me it is not even about security first and foremost; the whole appeal of Markdown is simplicity and interoperability. The more I depend on "plugins", the more I am locked into this specific platform.
Operating systems are different though, since their whole purpose is to host _other_ applications.
FWIW, macOS isn't any better or worse for security than any other desktop OS tbh...
I mean, macOS just had its "UAC" rollout not that long ago... and not sure about you, but I've encountered many times where someone had to hang up a Zoom or browser call because they updated the app or OS and had to re-grant screenshare permissions or something. So, not that different. (Pre-"UAC" versions of macOS didn't do any sandboxing when it came to user files / device access.)
Yes, on desktop, Obsidian plugins can access files on your system, unless you run it in a container. On iOS, iPadOS, and Android the app is sandboxed so plugins are more constrained.
This is not unique to Obsidian. VS Code (and Cursor) work the same way despite Microsoft being a multi-trillion dollar company. This is why Obsidian ships in restricted mode and there's a full-screen warning before you turn on community plugins.
VS Code and Obsidian have similar tradeoffs, both being powerful file-based tools on the Electron stack. This fear about plugins was raised on the Obsidian forums in 2020 when Obsidian was still new, and Licat explained[1] why it’s not possible to effectively sandbox plugins without making them useless.
So... what do you do?
The drastic option is to simply not use community plugins. You don't have to leave restricted mode. For businesses there are several ways to block network access and community plugins[2]. And we're currently planning to add more IT controls via a policy.json file[3].
The option of using Obsidian without plugins is more viable in 2025 than it was in 2020, as the app has become more full-featured. And we're now regularly doing third-party security audits[4].
But realistically, most people want to run community plugins, and don't have the technical skills to run Obsidian in a container, nor the ability and time to review the code for every plugin update.
So the solution that appeals to us most is similar to the "Marketplace protections"[5] that Microsoft gradually implemented for VS Code. For example, implementing a trusted developer program, and automated scanning of each new plugin update. We plan to significantly revamp the community directory over the coming year and this is part of it.
Note that Obsidian is a team of 7 people. We're 100% user-supported[6] and competing with massive companies like Microsoft, Apple, Google, etc. Security audits are not cheap. Building an entire infrastructure like the one I described above is not easy. We're committing to doing it, but it wouldn't be possible without our supporters.
I'm sorry but this is a gross oversimplification. You can also apply this to the human brain.
"<the human brain> cannot think, reason, comprehend anything it has not seen before. If you're getting answers, it has seen it elsewhere, or it is literally dumb, statistical luck."
Just to draw a parallel (not to insult this line of thinking in any way): “ Maybe it's because I only code for my own tools, but I still don't understand the benefit of relying on someone/something else to _compile_ your code and then reading it, understand it, fixing it, etc”
At a certain point you won’t have to read and understand every line of code it writes, you can trust that a “module” you ask it to build works exactly like you’d think it would, with a clearly defined interface to the rest of your handwritten code.
> At a certain point you won’t have to read and understand every line of code it writes, you can trust that a “module” you ask it to build works exactly like you’d think it would, with a clearly defined interface to the rest of your handwritten code.
"A certain point" is bearing a lot of load in this sentence... you're speculating about super-human capabilities (given that even human code can't be trusted, and we have code review processes, and other processes, to partially mitigate that risk). My impression was that the post you were replying to was discussing the current state of the art, not some dimly-sensed future.
It's not a title that suggests much subtlety or wisdom will be found in the article.
EDIT: ... which is unfortunate, bc it's actually a decent read w/ interesting points about the relationship between (programming preferences) and (emotions / psychology).
Didn't the parent comment compare Sonnet vs Codex with GPT5?