Warn HN: Monarch upgrade deletes all files in $HOME
114 points by divan on March 19, 2022 | 19 comments
There is a neat tool for Flutter development called Monarch.

Due to a bug, running 'monarch upgrade' deletes all files in $HOME :( https://github.com/Dropsource/monarch/issues/38



What a weakly worded changelog item they have published (https://monarchapp.io/blog/version-1.7.4):

> The monarch upgrade command used to delete old monarch files. Under certain conditions, a bug manifested where the command could delete non-monarch files.

The releases where this can happen should be immediately yanked from wherever they can be downloaded, and their (Dropsource's) social media should be filled with warnings not to run the `upgrade` command, if they truly feel bad about deleting people's most important files ($HOME compared to any other directory on disk).

It seems they have ownership of wherever the update is downloaded from ("https://d2dpq905ksf9xw.cloudfront.net"), so they could make it fail with an error message about being unavailable, at least to stop more people from deleting their files.

@divan: maybe rename this submission to "WARN HN: $title" or something to make it more clear what is happening.

On top of everything, it's not even clear what the problem was nor what the fix was. The authors just made a new release saying "It's fixed" without showing the patch itself. Not sure if Monarch is fully open source, but after spending 10 minutes digging through sources and releases, I can't find the place where it was adjusted to not delete all files in $HOME.
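
For what it's worth, the classic way this class of bug happens in shell-based install/upgrade scripts (purely illustrative; not necessarily what Monarch actually did, since the patch isn't visible) is an unguarded variable expansion:

    # If MONARCH_DIR happens to be empty or unset, this expands to
    # rm -rf "/home/user//" and wipes the whole home directory.
    rm -rf "$HOME/$MONARCH_DIR/"

    # Safer: abort with an error if either variable is empty or unset.
    rm -rf "${HOME:?}/${MONARCH_DIR:?}/"

MONARCH_DIR is a made-up name here; the point is just how little it takes to end up deleting a parent directory.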


They have just rolled out this change to the API.


I'm glad to hear they didn't wait longer, as they initially planned, to try to avoid wiping people's $HOME. Looking forward to seeing their release notes updated as well.


What's the 'best' way of protecting against this sort of thing?

Extremely frequent diff-based backups? Firejails?

The latter seems like the more appropriate/thorough solution, but it isn't that convenient; it seems most people use it only for select programs they already slightly distrust.


Use the package manager your OS ships with and stay away from custom install scripts from various 3rd parties. This is not the first time this has happened; the most famous one was probably that Steam bug some years ago that deleted everything for people.


OS packagers have bugs like this every so often, because they still run custom upgrade scripts behind the scenes. I'm pretty sure I remember one in Debian a few years back (sorry, the name escapes me), and it was not the first time I'd seen that.


I might just have been lucky then, as that has never happened to me in almost 30 years of using various distributions like Ubuntu, Debian, Arch, CentOS, Red Hat and more. Not a single time has this happened to me on either desktop or server. But as I said, I might just have been lucky. Looks like I will increase my backup frequency now.


The first time I installed Linux was around 2007. I have used it as my main OS since 2010.

I have only lost data due to hard drive failure, or at my own request (i.e. formatting over what I thought I had backed up).

I recently enabled TRIM on my full-disk-encrypted SSD system. No sign of data loss yet.

I have always used ext3/ext4.



> Extremely frequent diff-based backups?

My personal preference is to rsync to a backup host in a daily cron job, hourly for important directories. That backup host then runs rsnapshot [1] locally, and the snapshots are made read-only. Important files are then backed up off-site. rsnapshot is just a Perl script that creates multiple diff snapshots and uses hardlinks to save space. It is available for most distros; I believe it is also in MacPorts and Homebrew for macOS.

[1] - https://wiki.archlinux.org/title/rsnapshot
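
To make that concrete, here is roughly what the client side looks like (hostname and paths are made up; rsnapshot then runs on the backup host with a stock config):

    # /etc/cron.d/backup -- whole home daily, important dirs hourly
    0 2 * * *  user  rsync -a --delete /home/user/ backuphost:/backups/user/home/
    0 * * * *  user  rsync -a --delete /home/user/projects/ backuphost:/backups/user/projects/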


Periodic btrfs or zfs snapshots are an easy solution to this class of issue, and simpler than full off-site backups (which are also important but can be done less frequently).
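
As a rough sketch of what that looks like on btrfs (subvolume and snapshot paths are just examples), one read-only snapshot per day from cron is enough to survive a stray deletion of $HOME:

    # Assumes /home is a btrfs subvolume and /home/.snapshots exists
    btrfs subvolume snapshot -r /home /home/.snapshots/home-$(date +%F)

    # Restoring a file later is just a copy out of the snapshot
    cp /home/.snapshots/home-2022-03-19/user/important.txt /home/user/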


Years ago we used a simple 3rd-party cloud backup on all employee computers. It was very easy to find old versions of files. Might be useful for more companies.

Google Drive and OneDrive have similar solutions today, but they often only take the home folder; the one we used could take a lot more.


Use Qubes OS. It will help you in different ways. Firstly, if a problem happens, only the specific environment is affected (such as the project you're working on); secondly, any changes outside of the home directory will be reverted (this was not an issue in the example in question, but it can happen in other cases, such as a sudo make install deleting stuff from /usr/bin); finally, if something really bad happens you can always revert the specific VM where the problem happened.

A lot of information about Qubes talks about protection against attacks, but it's also useful against unintentional issues, and I'd argue that's a much more common case where Qubes shines.
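
As a concrete example (from memory, so check the Qubes docs, and the VM name is made up), rolling an AppVM's private volume back to its last snapshot after a mishap is a one-liner in dom0:

    # Revert the VM's private (home) storage to the most recent revision
    qvm-volume revert my-project-vm:private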


Use Docker for local development (it will protect files outside the project), version the project with git (that will mitigate the damage if you mount the source folder as a Docker volume), and make sure the .git folder does not end up in the container file system.
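
A minimal sketch of that setup (image name and paths are illustrative): mount only the source directory you actually need, and keep .git out of the build context with .dockerignore:

    # .dockerignore
    .git

    # Only ./src is visible and writable from inside the container
    docker run --rm -v "$(pwd)/src:/app/src" -w /app my-dev-image make build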


This is why I often say that the typical Linux user-space has zero security. Yet people like to question sandbox security policies, e.g. "why shouldn't the OSX terminal be able to enter the Documents folder?", etc. -- for exactly this reason.


Hey, at least this is an actual bug and not so-called "protestware".


On Dropsource's corporate website it says they have a "Patented Dev Process". https://www.dropsource.com/


The product is really called drop source? That sounds like something Little Bobby Tables would type maliciously. In this case though, it is doing what it says on the tin.


I'm bookmarking this one for the next "just trust me" discussion.



