If any modern software company were given a spec sheet for Ventrilo (https://www.ventrilo.com/about.php) and asked to develop it, you would get back a few-hundred-MB installer that needed to download additional components, sent telemetry to their servers all day, checked for updates, added background services to startup, and consumed half a gig of RAM while continuously using a percent or two of your CPU at idle, probably even tapping the GPU to do god knows what.
Meanwhile, Ventrilo 3 has a 5.4 MB installer, consumes 4 MB of RAM, and does none of those things. The newer, more bloated version has a 7.9 MB installer.
It should be noted that Ventrilo requires DirectX. That is part of where the "hundreds of megs" are hidden.
Though honestly, cross-platform software too often just bundles Electron rather than relying on the entire OS layer (on Mac and Windows, anyway) having all of this stuff built in. For bugs and reproducibility it's nice, but it really sucks for "downloading the same bytes over and over again". This is downstream of OS vendors historically just not fixing bugs; in an alternate universe people would have working OS stacks and we would use those.
Everyone moved to Discord, which does all the things the parent complains about, but somehow it's still a better product. As if most people don't really care about those things, for better or worse.
This is one of the luxuries of developing only in-house software for a business. You have none of the incentives to add consumerware bloat, and in exchange you get all that extra time to refactor and focus on making things svelte, performant, scalable, documented and concise.
As much as any company is willing to load their customer-facing or retail software with a million kinds of garbage, that's the last thing they want in their own toolkit.
One of my personal career mottos is to focus on a secondary objective, as I’m miserable with all the mess going on around the primary objective.
One of the things that inspired that was seeing how much better the mobile versions of websites often are than their apps. Thinking about Facebook, Twitter, Reddit, and similar: their apps don't have tabs, often drop state, and seem to occasionally favor shiny over usable. My local newspaper prevents you from copying any text in the app. The mobile websites, on the other hand, tend to have only an annoying "our app provides a better experience" banner to bypass; otherwise they're engineered to work, not to push the envelope. They're important enough to get some resources, but there doesn't seem to be quite as much noise during the development of the 'fallback version'. And I think that makes all the difference.
Highly dependent upon the app. Unfortunately the Instagram mobile website is atrocious. It's so bad I can only assume it's intentional on Meta's part because they'd rather drive us to their native app for tracking purposes.
Ventrilo is voice chatroom software (like Discord) that is self-hostable.
It's been around for 21 years, has a Wikipedia page [0], and is mentioned in at least one old meme-worthy DOTA song [1], so the OP likely didn't feel it needed an introduction.
OK, can you please explain in 2 sentences what the software is? Like, MS Word is software for composing documents, or KeePassXC is a password manager, for example. Is it too much to ask from an "About" page?
Under the headline (all at the top, next to the logo) it says "Surround sound voice communication software". So fancy Skype ;-)
But yeah, that page is not for people who have no clue! But maybe that's not so bad? I don't know how many people come to that page having no clue what it is about, but I would guess a very tiny percentage. Maybe it makes sense to assume that the vast majority ending up there already know the basics.
I didn't expect to see Ventrilo anywhere ever again after they torpedoed themselves off the market by no longer distributing their server binaries, so people could not self-host anymore.
I feel that we would have better tooling for writing performant GUI apps today if Electron had never taken off.
It certainly has its place, and I laud the authors for their efforts, but seeing how every startup is using Electron for their native applications, I have little hope for lean software.
At the end of the day, developers need to finance their projects. No other toolchain out there [1] is going to give you the flexibility, development speed, and freedom to develop beautiful looking desktop apps using the muscle memory you trained while writing webpages. Of course, you can write the same application in Qt, GLFW, whatever, but I don't think anyone will disagree that it's much slower to build and prototype responsive UIs with these tools.
[1] Wry and Tauri (https://tauri.app/) might be noteworthy, but I don't know how much of a difference they make, as the runtime is still JavaScript, HTML, and CSS.
I have another feeling that while what you say is likely true, in a world without Electron, we would have a lot less common software available on Linux with companies opting for native UI toolkits on Windows and Mac only. Qt, though better than electron, isn't that lean [1], and is available under only LGPL or an expensive commercial license. I also don't think Electron is entirely responsible for the slowness in many applications (Slack), but rather the bloated web design itself.
I have been using Linux for 20 years and have yet to use a single piece of Electron-based software.
There is no problem with LGPL. From your link: " QT is free to use as long as you release your code as GPL"
That is false. It is free as long as you don't statically link Qt and you distribute any changes you make to Qt itself (which aren't needed in most cases).
Of course if we spread misinformation, we might draw different conclusions, which is why it is important to start from correct non-made-up premises.
But I also believe that most devs are not comfortable with linking/packaging/dependency management, and therefore don't really know how to handle LGPL (or to say it in a more politically correct manner: "don't have time to handle it properly"). Which is obviously a pity.
> most devs are not comfortable with linking/packaging/dependency management
Using `macdeployqt`, `windeployqt` and `linuxdeployqt` takes care of that in mostly one click. I used to look at other open-source apps' workflows to figure it out. Now others can look at my app's workflows for a complete packaging experience for Windows, Linux, and macOS[1]. Huge credit to my OSS app's contributors for implementing and perfecting it.
> At the end of the day, developers need to finance their projects. No other toolchain out there [1] is going to give you the flexibility, development speed, and freedom to develop beautiful looking desktop apps using the muscle memory you trained while writing webpages.
Why do you restrict yourself to only knowing how to write webpages?
> Of course, you can write the same application in Qt, GLFW, whatever, but I don't think anyone will disagree that it's much slower to build and prototype responsive UIs with these tools.
Maybe because you don't have the "muscle memory" to write Qt?
But since you mentioned Qt, one of the greatest blows to cross-platform application development was the Nokia/Microsoft disaster.
Nokia bought Qt from Trolltech and made it LGPL, because their plan was to make money from the hardware not the software. Then they died, for reasons that have been commented on endlessly.
From the ashes of Nokia rose Digia or whatever it's called this week, a company that maintains Qt badly and thinks it's a good idea to threaten developers that download their LGPL product.
Web programmers massively outnumber systems programmers. The market has been this way for quite some time now. I'm not a web programmer myself, at least not professionally, but it is far easier to prototype with Electron/Electrino/Tauri/whatever than with any other toolchain out there. I've used Xamarin, Qt, ImGui (not the same league, I know), and several other lower-level rendering libraries like SDL, SFML, raylib, GLFW. Nothing comes close to the vast expanse of features that CSS exposes with, often, a single attribute like `transform` or `object-fit`. Of course you can do it with any other language/framework, but it'll be dozens more lines of code. More time. This is assuming equal proficiency in both tech stacks.
JavaScript might be a disaster of a language, but it is faster to make a UI with CSS. I can totally see why startups pick web programming to ship desktop apps.
Qt is definitely not dead; rather, it's alive and kicking. I've built my latest note-taking app[1] in Qt C++ and QML, and it's been one of my best decisions. What the Qt Company needs to focus on is hunting bugs (there are many of those) and reducing their license fee (at least for indie developers); it's just too expensive[2].
Have they given up on asking you three times whether you are absolutely, truly sure that you can comply with the license terms when you download the LGPL version?
With that attitude I can't recommend it to any entity that can't afford a full time legal team.
What's wrong with LGPL use? From what I gather, if you don't link Qt statically and don't modify Qt's source code, and your app's license doesn't impose restrictions that conflict with the LGPL, you are good to go. Most apps would fall under these criteria.
> Why do you restrict yourself to only knowing how to write webpages?
That’s just how the market is. If you want to build your app with Electron, you’ll find mountains of skilled developers everywhere. If you use Qt, you’ll either have to pay an absolute fortune for the 10 people that know it, or accept hiring people that have never used it before.
Possibly easier to use a framework that includes a lot of things, rather than a language with no standard library that requires a soon-to-be-compromised dependency for every string function that QString already offers.
We'd be in a better place if web APIs to access system resources didn't suck or weren't intentionally gimped by companies like Apple. Tauri seems like a way out, but can't say without having tried it.
The web is the best cross-platform environment we have as evidenced by the fact that developers flock to stuff like Electron at all, but then you end up needing to ship your own entire browser engine to achieve a reasonable level of control. If PWAs didn't threaten App Store business models we'd probably be in a better place RE web app distribution (just use the browser you're already using anyway).
This doesn't address the issue that web app development is flooded with bad choices and opinions that lead people to decide the whole ecosystem is overcomplicated and bloated, but that's not an opinion I hold very dear, as someone who feels pretty comfortable with web tech stacks and understands where they came from.
I do think the skill bar is too high right now, where most engineers are likely to do a bad job RE performance and security with what we have unfortunately. But I'm not confident your average engineer would do any better if it were a different ecosystem.
> The web is the best cross-platform environment we have as evidenced by the fact that developers flock to stuff like Electron at all
I suspect that web developers who only know web development flock to electron because they think it's easier than learning a new technology.
In my professional experience, it is perhaps easier for a web developer who only knows JS to get a prototype working, but when you want a nice application you end up having to re-implement a number of things that any regular widget toolkit would already offer. So in the long term I don't think it's cost-efficient at all, but at that point you've already spent resources on your Electron GUI, so you keep going forever.
Lack of toolchain is one problem, although I think that there are alternative toolchains out there that are quite feature-rich and could definitely achieve feature-parity with the web ecosystem, if only there was enough interest in using them.
JavaFX, even though it's outdated, is quite up to the job of replacing most Electron-based UIs. Qt is definitely extremely powerful, but has the drawback of being tied into the C++ ecosystem which seems rather dated now. Even some hobbyist efforts are worth mentioning in this category: AvaloniaUI (in the C# ecosystem), HaxeUI & FeathersUI (both in the Haxe ecosystem and building on game engines).
I think the bigger problem is sourcing developers. Web developers are comparatively cheap and abundant, so a commercial entity is always going to have trouble justifying hiring a comparatively expensive and difficult-to-find developer in the C#, ObjC/Swift, or Java ecosystem when the job can also be accomplished by a web developer.
You know, I thought I was a fool for writing my apps in JavaFX... but after trying many other things, it seems it's up there with the best of the bunch.
My non-trivial app can be built with jlink into a self-contained binary for every operating system, with no dependency on a pre-installed JVM... each of which sits at just 30 MB... and when run, it needs around 60 MB of RAM, which is a lot, but I have yet to find a multiplatform UI toolkit that delivers much less than that, except for some toy frameworks that can't really be used for real-world apps.
Telegram is quite good, but I honestly don’t notice it being any better than Discord. Telegram isn’t exactly light either; it frequently sits at about 400 MB of memory usage.
Telegram's apps, at least on Apple platforms, are not what one typically would call "native" since they roll their own widgets for basically everything.
I'm more cynical about this. I think another bloated foundation would have taken off in another language. A lot of it comes down to accessibility for the masses. JavaScript, for all of its horrors (and the self-fulfilling cycle of its horrors), has a low barrier to entry. Think of anything else that's similarly easy, and that's where I think effort and attention would have gone. So, if not JS, probably Python.
With Slint (https://slint.dev) we're trying to make a lightweight toolkit that doesn't use HTML/CSS, and that you can program either from low-level languages such as C++ or Rust, or from higher-level languages such as JavaScript; we want to extend to Python too.
We hope to make it easy to develop desktop UI without using HTML.
Hi there! I would love to see a demo on the website that doesn't look intended for embedded/enterprise devices, maybe a simple Todo app? Best of luck with Slint!
That’s normal and expected, since they both use the same browser. The difference shows when you have multiple apps open: all Tauri apps, along with your browser, share the same libraries and binaries loaded into memory, while every Electron app loads another browser into memory.
Of course I’m simplifying a bit: if you use Firefox as your browser while Tauri uses a Chromium-based webview, that’s two browsers’ worth of memory usage, but still not 5 or 10.
The software crisis was proclaimed to exist as early as 1969. We never solved it. But we did massively improve the scope of what software can do. The natural state of software is always to be almost but not quite at the breaking point. This is actually fine. Occasionally something goes wrong and then we fix it. And we move on. But the amount of stuff that works just fine is actually constantly growing.
The average software project sits on a ginormous mountain of existing software: libraries, components, tools, operating systems, etc. As a percentage of the overall source code, the tiny bit you add is vanishingly small. All this stuff exists, is being maintained by someone, and replacing it with something else has very low economic value. It adds negative value when it doesn't work, because then you have to fix it or deal with the problems it is causing. But if it works as advertised, it just levels the playing field, because everybody else is at that level as well.
Your attention as a software engineer should be focused mainly on things that others don't have that are valuable. It's always been like that. What has changed over time is the amount of stuff that you no longer have to build or worry about that much. That's the value of cloud based services. You get a lot of decent quality stuff that you pay a premium for that would be very expensive to match with in house development. Reinventing wheels like that is not lean but stupid.
This article is a bit hypocritical. The example is an image-sharing tool, but you don't resize images at all.
So your server binary is tiny, but you're serving full-size, unoptimized images, wasting bandwidth for every single user who visits the site, as well as for the uploader. Even thumbnails are served full size.
In this case you're optimizing for the wrong thing.
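For scale, resizing at upload time is only a few dozen lines in most ecosystems. Here is a minimal sketch in Go, assuming JPEG input and the golang.org/x/image/draw package (the project in the article is C++, so this is purely illustrative). As a bonus, decoding and re-encoding drops EXIF metadata, which matters for the privacy discussion below:

    package images

    import (
        "image"
        "image/jpeg" // registers the JPEG decoder; also used to encode
        "io"

        "golang.org/x/image/draw" // small extra dependency for scaling
    )

    // thumbnail decodes a JPEG from src, scales it to fit within
    // maxW x maxH (never upscaling), and writes a fresh JPEG to dst.
    // The re-encode carries over no EXIF, only pixels.
    func thumbnail(dst io.Writer, src io.Reader, maxW, maxH int) error {
        img, _, err := image.Decode(src)
        if err != nil {
            return err
        }
        b := img.Bounds()
        scale := float64(maxW) / float64(b.Dx())
        if s := float64(maxH) / float64(b.Dy()); s < scale {
            scale = s
        }
        if scale > 1 {
            scale = 1 // never upscale
        }
        out := image.NewRGBA(image.Rect(0, 0,
            int(float64(b.Dx())*scale), int(float64(b.Dy())*scale)))
        draw.CatmullRom.Scale(out, out.Bounds(), img, b, draw.Over, nil)
        return jpeg.Encode(dst, out, &jpeg.Options{Quality: 80})
    }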
> > In this post I briefly go over the terrible state of software security, and then spend some time on why it is so bad.... The security of software depends on two factors - the density of security issues in the source code, and the sheer amount of exposed code.... It is not just the amount of code that is worrying. It is also the quality, or put another way, the density of bugs.
> This article is a bit hypocritical. The example is an image-sharing tool, but you don't resize images at all.
Can you explain how serving full-sized images opens up additional security vulnerabilities?
I don't see the connection between your argument about bandwidth and the OP's argument about attack surface.
Attack surface is about all the ways in which your software might be used to harm you or your customers. It's more than just remote code execution or DOS attacks.
For many use cases stripping EXIF is a hard requirement for user privacy and security, and it's reasonable for OP to point out that cutting that out to cut lines of code would be inappropriate in many situations.
Privacy in that sense is security. Never heard of OSINT? EXIF tags are of course security relevant.
/e: to make it more obvious: if I know your neighbourhood, I can just blackmail you; I don't even have to hack you. I can gather information by finding out your identity, getting insight into security questions and how you might answer them, finding newspaper articles you were part of, etc.
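To make the OSINT point concrete, here is roughly all it takes to pull a GPS fix out of an unstripped photo. This sketch uses the third-party github.com/rwcarlsen/goexif library, named here only as one example of an EXIF reader:

    package main

    import (
        "fmt"
        "log"
        "os"

        "github.com/rwcarlsen/goexif/exif"
    )

    func main() {
        // e.g. a JPEG saved from someone's public profile
        f, err := os.Open(os.Args[1])
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        x, err := exif.Decode(f)
        if err != nil {
            log.Fatal("no EXIF found; the host probably stripped it")
        }
        if lat, long, err := x.LatLong(); err == nil {
            fmt.Printf("photo taken at %.6f, %.6f\n", lat, long)
        }
    }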
If you put a photo with your location information on the internet, that’s your problem. It’s not the responsibility of whatever website you’re putting it on to decide for you if you want to share your location.
CVEs are not the be-all-end-all of information security. CVEs are usually assigned to software that is distributed, not to web-based SaaS products, social media services, or similar, which are all the places where EXIF data leaks come into play.
For example, there was no CVE issued for the security flaw that leaked private information of 530 million Facebook users before 2019 [0], but that was obviously a significant security flaw.
Edit: Also, regarding "privacy is not the same as security"—the line is a lot fuzzier than you think. At my org the same team ("infosec") is responsible both for the security of our products and the enforcement of rules regarding PII, because they're tightly interrelated—the main concern with security incidents is that we might lose PII. There's a reason why one of the 7 data protection principles in the GDPR is security [1]—without it there is no privacy.
Which is not a security issue per se, is it? If the goal of the project is to self-host it and share it with family, then keeping the EXIF may be a feature.
It opens up the security vulnerability of your software not meeting requirements and thus users deciding instead to use other software, workarounds, and hacks.
Middlebrow dismissal. The author has an experimental sandbox for running imaging operations but hasn’t integrated it yet, a.k.a. not 100% finished. Details at the site/GitHub.
While I generally agree with the article, the "reality check" project seems somewhat forced: there are many widely used projects that are fairly lean. Many of the GNU projects are like that, not to mention explicitly minimalistic ones, and generally on a GNU/Linux system (either server or desktop) you have a few notable resource hogs, but the rest you would barely notice. Many of those can be built without Python build-time dependencies, too. And there are tens of thousands of proper packages in common distributions' repositories, so surely it is possible.

The mentioned Electron, Node.js, and SaaS examples probably happen more often in certain other settings; perhaps enterprise software tends to be like that. Likely its advocates would bring up speed and cost of development, and it may be argued that even security is improved with those, given the other constraints.

A more interesting "reality check" may be to take a few actual (and preferably somewhat widely used) bloated projects and implement leaner alternatives while fitting into the same constraints: similar profits for commercial projects, and maybe similar initial expertise levels, time spent, and perceived impact for non-commercial ones.
That aspect could be leaner, indeed, although all those processes put together can easily consume fewer resources than a single bloated program or a web page, especially if they do not include a particularly large DE. The systems I have in mind as references are Debian with Xfce (runs on an old Atom-based netbook, taking 600 MB of main memory altogether), or Debian on a server with a bunch of common services (web, email with related services, XMPP, Gopher, IRC bouncer, authoritative and caching DNS servers, etc), also consuming under 600 MB (under 400 without DNS cache), with CPU load coming mostly from fail2ban, but being pretty close to zero. On the other hand, there are KDE and GNOME, which would probably at least double that resource usage.
To be clear, I brought up GNU/Linux distributions as examples of container-free packaging, and as notable collections of relatively lean programs, but not necessarily as an example of the combined systems being particularly lean themselves. Though then again, compared to something like recent Windows versions, perhaps even the Linux-based systems with larger DEs would seem lean.
It really depends on the distro. But of course there is the same risk as ElectronJS: getting a huge Ubuntu distro that ships with everything is faster than building up from a lean distro.
Maybe we should drop the idealism and be realistic: for various reasons bloatware is not going away. If we were to admit something like that, the next question is what can be done? We need better tooling for generating and reasoning about software manifests and supply chains, and we need better tooling and training for at least lightweight formal methods in design / development phases. Industry adoption of such things is not impossible but it needs to be more accessible to devs without phds and it needs to be faster/cheaper.
It's not always that we need better tooling. Here I think we need better developers, as in "developers who care about this issue".
Developers are the ones including the bloat, right? If all the developers started working more slowly but including less bloat, what would the managers do? Probably the managers have no clue about the complexity of what their devs do: they just compare tasks with t-shirt sizes (using estimations that are generally completely wrong, but they still use them).
The problem is that if one developer works slowly writing less bloated code while their 3 coworkers keep adding bloat, then not only is the codebase still getting bloated, but the first dev appears less productive.
It's a kind of competition between developers, where those who do the better job lose.
Bloat is the only endgame as capabilities and complexity increase, so tooling is the only answer. Stacks upon stacks of software with millions of lines are the present and the future, so we have to get used to it, because it happens even in a world with perfect developers, or in a world without whatever languages you consider a problem.
Minimize the usage of JS-based libs/tools, perhaps?
Yes, I'm looking at my daily tools like VSCode, Postman, etc., which are Electron-based. Perhaps rewriting them in Go/C++/Pascal could shrink the bloat.
Mesa alone is like 50 MB of tightly optimized x86-64 machine code (Windows 95, by comparison, came on 8 x 1.4 MB floppies); and that's just to talk to the actual kernel drivers. You need that for the GPU to draw things on the screen (unless you want software rendering, i.e. idle silicon).
Does every program need all of that code? There are libraries in there to handle OpenGL (in half a million different versions), Vulkan, AMD, Intel, Nouveau, etc. Nope, you usually need just the tiny bits relevant to your application and hardware. But what's easier: figuring out which bits you don't need, or making the stack more portable and future-proof by always shipping everything?
A lot of complexity is accidental, but most of it comes from conscious choices to make life simpler for everyone. Of course taken to the logical extreme, we do end up with Electron, but where are we supposed to draw the line?
Even these "simple" operating systems managed to pack an incredible amount of complexity - again, just to deal with hardware, portability, different APIs, etc. Consumers - we, we demanded all of that.
If we really wanted/needed simpler software, OpenBSD is right around the corner. I've used it on and off as a daily driver for a bit, and it has an incredibly high ratio of code quality/readability to how practical it is for everyday things (while remaining very portable). But simplicity is an uphill battle.
"Sane" package managers is how you end up hiding the difference in complexity between installing a static Go executable, an Electron app, a C++ Gtk/Qt app, a Python app, etc.
Hiding the complexity is not necessarily bad, but that decision should be made consciously. Some complexity is inherent to the problem space, some is accidental - but even the latter must sometimes be tolerated. So again, how do we draw the line?
> How is a system package manager hiding the difference in complexity between Go executables, Electron apps, etc?
Consider:
apt-get install myapp
You have no way of knowing if this will install a bundled copy of Chrome or not.
CGO_ENABLED=0 go install example.com/myapp@latest
You can be 100% certain that it will produce exactly one static executable (or fail to build anything) and that all dependencies are pure Go; you can inspect the SBOM, read the entire source (including the compiler, which is reproducible), and so on. There are certain ways to sidestep these guarantees, but they involve very dark magic and are non-portable.
Also consider that the former command merely installs a pre-built artifact - try building Chromium from source. How is this not hiding complexity?
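The "inspect the SBOM" part is not hypothetical, by the way: every module-built Go binary embeds its full dependency list, readable externally with `go version -m <binary>` or from inside the program itself. A minimal sketch:

    package main

    import (
        "fmt"
        "runtime/debug"
    )

    func main() {
        info, ok := debug.ReadBuildInfo()
        if !ok {
            fmt.Println("no build info embedded")
            return
        }
        fmt.Println("main module:", info.Main.Path, info.Main.Version)
        for _, dep := range info.Deps {
            // path, pinned version, and checksum of every dependency
            fmt.Println(dep.Path, dep.Version, dep.Sum)
        }
    }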
> My point was that a sane system package manager has maintainers for the packages and should not accept to ship a full OS in a single package.
How do you make that choice? Where do you draw the line? Why does Emacs make the cut, but VS Codium doesn't?
Wait, you are defending multiple points at the same time: "only pure Go", "only open source", "reproducible builds", "static linking". Not that I disagree with all of that, but I don't think they all work as arguments against shared libraries (shared libraries can benefit from reproducible builds, for instance).
Also, you can go check the recipe of your package, and if you choose a distro that only ships open-source software, it gives you the sources that were used to build it. You can even build it yourself. But again, those are not points that I believe belong in a "static vs shared" debate. Or do you disagree with that?
> How do you make that choice? Where do you draw the line? Why does Emacs make the cut, but VS Codium doesn't?
That's a distro philosophy. That's all a distro does, and you choose the distro that you like. Some distros will ship everything they can, some will be minimal, some will ship only free software, etc.
IMO it is not the developer's decision how I want my software packaged. It's a distribution question. The developer should provide the code open source, and let package maintainers build it and ship it in their distribution. If the developer (and the language tooling) can make it easy for maintainers to make their choices, then that's good. On the contrary, when the language officially refuses shared libraries, I think it oversteps.
> Wait, you are defending multiple points at the same time: [...]
I've been asking all of this time: where to draw the line between striving for simplicity, and just shipping software? How much complexity is acceptable because it's absolutely necessary, and how much of it is acceptable because we're busy doing more interesting things?
I keep pulling counter-arguments from both sides, because I don't think this line is clear. The closest thing we have to an objective measure seems to be the patience of the humans who have to deal with said software.
> The developer should provide the code open source, and let package maintainers build it and ship it in their distribution.
We're going a bit off topic here, but this has proven to be a poor model for many applications - see Linus' lament on trying to ship Subsurface builds for Linux, and his endorsement for AppImage.
Also: every single proprietary app in existence, which is a many-trillion-dollars industry.
> On the contrary, when the language officially refuses shared libraries, I think it oversteps.
Well that's an example of the practical trade-off that Go has made: they'd probably prefer to live in an ideal world, where you ship a static binary 100% of the time and completely refuse non-Go code, but instead they built cgo, which made C interop easy and practical.
Well I agree: the line is not clear at all. It is a matter of use-case and also of preference. Hence my point: developers should let those who build and distribute the code decide what they want, and organize their code such that both work (usually it does not require a lot of effort from the developer).
> see Linus' lament on trying to ship Subsurface builds for Linux, and his endorsement for AppImage.
I would think that this is his preference. My preference is to be given the sources and to build it myself, which allows me to maintain a package for my preferred distro.
> but instead they built cgo, which made C interop easy and practical.
Yes, I think this is great! On the Rust side, is cargo-c an official project though? It doesn't seem like it...
The article doesn't mention the most basic underlying issue: motivations. One guy writes absurdly bloated and bug-prone code but creates workable components/products/whatever in a month. Another guy writes tight, elegant, rock-solid code for the same thing in 6 months. The first guy is getting promoted, and the second guy is going to end up having to explain his 'lack of productivity.'
It creates an incentive system to just get something working, or, even more cynically, to get something that merely looks like it's working. The half-dozen intermittently en vogue development acronyms further this mindset. I don't see how to overcome this issue, because it's something like a tragedy of the commons. Nobody (at the top) wants to reduce bloat, because it would likely reduce rather than increase profit on short time frames. Yet, at scale, it's leading to a complete enshittification of all software.
The funny thing to me about this is that Trifecta is still less lean than the minimal "image sharing site": copy the images to a web root (for drag and drop, use Samba, NFS, or your favorite remote-mounting protocol over an internal network) and have a minimally configured web server serve the images (perhaps with an HTML generator to make an actual page).
A while back, I wanted to host a pastebin for sharing bits of code and other text. Then I realized that about 10 lines of (compile-time) elisp gave me everything I needed to turn any webserver I could ssh to into a pastebin with no runtime dependencies aside from nginx:
https://fwoar.co/pastebin/3daaf7ce49ca221702c70b0d10ac5caec8...
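(The elisp is at that link; the same idea reduces to roughly the following, sketched here in Go rather than elisp, with a placeholder host and web root rather than my real ones:)

    package main

    import (
        "crypto/sha256"
        "fmt"
        "log"
        "os"
        "os/exec"
    )

    func main() {
        data, err := os.ReadFile(os.Args[1])
        if err != nil {
            log.Fatal(err)
        }
        // content-addressed name, like the hash in the URL above
        name := fmt.Sprintf("%x", sha256.Sum256(data))
        dest := "example.com:/var/www/paste/" + name // placeholders
        if out, err := exec.Command("scp", os.Args[1], dest).CombinedOutput(); err != nil {
            log.Fatalf("scp failed: %v: %s", err, out)
        }
        fmt.Println("https://example.com/paste/" + name)
    }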
I don't disagree, but few non-techies will be able to download an image off an FTP server or mount an NFS share. That's where such a service adds value.
I know my perspective is really skewed, but don't FileZilla and the like make ~FTP easy enough? Or failing that, even Windows must have the hooks to (implement an add-on to) mount SFTP and treat it like a thumb drive. Why is it harder to use FTP/SFTP than a fancy web frontend?
Because a lot of people nowadays just use a browser. That’s 99% of what a computer is to them.
“What’s a filezilla?”
“go here, drop files there” instead of “download this, put this here, connect, drop your files here” makes the difference between people using your service and not using your service.
> Because a lot of people nowadays just use a browser. That’s 99% of what a computer is to them.
That doesn't mean they couldn't learn otherwise. I think developers underestimating users is a problem. "They are too dumb to run a desktop app, so let's put everything in the browser" is a weak argument to me.
This is less true of software like the one mentioned in the article, which is intended to be self-hosted: I have to walk my users through using it anyway, so I can show them FileZilla, or install an SFTP filesystem on their computer, or set up syncthing/dropbox for them to handle uploads. There are lots of ways to avoid exposing a custom file-upload service for this sort of thing.
I’m struggling with this post because it seems to imply that software quality has gotten worse over time. Bluntly put: I think this is nonsense.
I remember using Windows 9x, the running jokes about poor quality and security of all MS products. Adobe’s formats came from those early days and are roundly mocked. Hell, I’ve built replacements for 90s software and, I can assure you, what I replaced was not high quality or robust at all.
On this very site, we discussed Horizon: a project started in the 90s and 2000s that was so badly built that it led to hundreds of innocent sub-postmasters being imprisoned or bankrupted, and a number committed suicide.
Is the author just romanticising the “good old days?”
I think there was a kind of "golden period" that goes in between.
In the 90s, the economics around software had already heated up to the point where there was an insatiable appetite for software engineering manpower, but the university system wasn't yet geared to churning out specialists in this field in such large numbers, so a lot of software engineers back then were people coming from other professions who picked it up autodidactically and were just not very good. At the same time, programming languages and tooling weren't yet at a point where they were good at guiding people towards good software engineering practice, and this led to a kind of software quality crisis.
But this situation changed fast. I would say from roundabout 2003 to roundabout 2013 there was a bit of a "golden period" where we had good reason to be optimistic about the future of software quality. The software quality crisis of the 90s was largely overcome through better education, better software engineering methodology, and better programming language ecosystems and toolchains. Back in those days we still had purpose-built tooling for doing things like desktop UIs. C#-based Windows Forms and Aqua-era Mac OS X GUI programming in ObjC were actually quite a good experience for both developers and users. We also had cross-platform ways of doing GUI programming, like Swing on Java.
In the next ten years, i.e. the ten years leading up to now, things took a decided turn for the worse. If I were to speculate about the reasons, I would say it was related to the rise of mobile, and the continued rise in the importance of the web platform over the desktop platform, meaning that application development now had to straddle web, mobile, and desktop as three distinct development targets. This created a need for truly cross-platform application development, while Apple and Microsoft continued to make plays to fortify their monopoly power instead of giving the world what it needed. Swing/JavaFX lost its footing when enterprises decided that web was all they really needed.
So, to answer your initial question: has software quality really gotten worse? I would say yes, over the last 10-15 years, definitely. If you compare now to the mid-90s, then maybe, maybe not.
> Has software quality really gotten worse? I would say, yes, over the last 10-15 years definitely.
By what metric?
Taking all your above examples, I (and many others) could argue that the move to the web brought new techniques that overall improved software for developers and users. That's not to say I'm right, or that you are, but to point out that everything you put forward is purely subjective.
What has objectively gotten worse in the past 10 years?
On the user's side: just pick any set of well-established best practices, such as Shneiderman's Eight Golden Rules or Nielsen & Molich's 10 Usability Heuristics, and then pick a typical 2024 Electron app that has an equivalent from the 2003-2013 era written with a typical UI technology of the time (such as Windows Forms), and compare the two UIs with respect to those best practices. -- I'm pretty sure you will find usability blunders in today's software that you simply couldn't commit back then, even if you tried. -- Essential UI elements being hidden away (with no indication that such hiding is taking place) based on viewport size, leaving the user unable to perform their task, is one thing that immediately comes to mind. Another example I happened to experience just yesterday: UI elements disappearing from underneath my mouse cursor when my mouse cursor starts to hover over them.
Also: Just look at the widget gallery in Windows Forms, providing intuitive metaphors for even quite subtle patterns of user interaction and check how many of those widgets you find implemented in modern web-based design languages and web component systems. ...usually you don't get much beyond input fields, buttons, and maybe tabbed-views if you're lucky. So today's software is relegated to using just those few things, where, 10 years ago, you had so many more widgets to pick and choose from to get it just right.
On the developer's side: Was JavaScript ever actually designed to do the things it's being used for today? Is dependency hell, especially in the web ecosystem, worse today than it was 10 years ago?
> Just pick any set of well-established best practices such as Shneiderman's Eight Golden Rules
Excellent, we have something objective to look at. Now, where's the studies, reports, etc. that this has declined in the past decade? I'm not asking for a double-blind, peer reviewed study, just something a bit more concrete than "stop the world, I want to get off."
> Was JavaScript ever actually designed to do the things it's being used for today?
> [...] Now, where's the studies, reports, etc. [...] "stop the world, I want to get off."
This argument is getting a bit tedious. It started with you offering an opinion. I offered a counter-opinion, while clearly marking my opinion as such using language like "I think ...", "I would say ...", "If I were to speculate ...".
I'm clearly not alone in my opinion (see the original post), and you're trying to undermine your opponent's credibility by getting ad hominem and pointing out that their position lacks the kind of research which you yourself did not provide either.
> > Was JavaScript ever actually designed to do the things it's being used for today?
> Was anything?
Hyperbole. Many things were designed to do the things they now do. Lua was designed as a language for embedding. SQL was designed as a language for querying databases.
...because I happened to come across it in my bookmarks just now, there's an article by Don Norman [1] that made the HN frontpage somewhat recently [2] sharing my pessimistic view about usability today. Admittedly, he has a conflict of interest, making money by telling people how bad their UIs are and how to make them better. But he definitely is very respected, and, in my opinion, deservedly so.
> [...] strong "kids today" vibes. [...] entire industry has forgotten how to do our jobs.
The OP seemed to be pessimistic, your initial point was "It was pretty bad in the mid-90s, and it's no worse today" which is really not very optimistic either, and my point was "Well, it was bad in the mid-90s, then got better, then got worse again". So, FWIW, I think that my point was actually somewhat more nuanced than the pessimistic context. I was also expressing optimism towards certain technologies while expressing pessimism towards others.
> your initial point was "It was pretty bad in the mid-90s, and it's no worse today" which is really not very optimistic either,
Partially, but I suspect things are better than everyone keeps saying. The whole "it used to be better" thing is a meme I see in all walks of life, and I want to see something that proves it beyond a bunch of opinions.
Do I think some things are worse? Yeah, probably, such is the way of life. Do I think the entire industry went to shit? That's something that even the most respected people will need to provide evidence for. It seems a bit strange that I somehow joined the industry in 2010 and spent my entire career getting shitter and shitter.
I've never seen a chat app taking gigabytes of RAM before Electron, for example.
I've extremely rarely seen applications go nuts, eating several CPU cores and draining my battery in 20 minutes, before Electron, for example. Now it's a weekly occurrence.
It's improved only for developers who only know web development. And we users pay for it in hardware costs, electricity costs etc.
> I've never seen a chat app taking gigabytes of RAM before Electron, for example.
Is that a general software problem or a problem specific to Electron? Is that a permanent problem or a problem right now because of the technology and your attitude towards it?
I say this because I do recall seeing complaints about Java being bloated in the 2000s. I briefly used Swing in my university days, and it was pretty awful compared to HTML at the time. In 2044, maybe I'll be shaking my fist at the new-fangled tech and telling everyone how nice Electron apps were in comparison.
> complaints about Java being bloated in the 2000s.
It was bloated in 2023 too. Last year I caught Android Studio (which I wasn't even using at the moment; I just had it open from a quickie fix a few days earlier) going over 4 GB of RAM. I had two projects open, under 20k lines of code total (OK, maybe I should count again).
But why bring Java in? We're talking about native applications vs applications that pull in a copy of Chrome and half of the npm registry. Java isn't native either.
> We're talking about native applications vs applications that pull in a copy of Chrome and half of the npm registry.
You might be but I'm not. I'm talking about the state of software in the 1990s, 2000s, 2010s and today and how a general "it's worse" isn't particularly useful (or probably even true).
Oh, and if you want problems specific to Electron... I'm pretty sure Discord was keeping all the cat pictures and memes it had ever displayed uncompressed, in RAM, for a long while. Memory usage of several gigabytes if you had the meme channel open, even if it displayed only the last 3 cats.
It's better these days, but it was a problem for years. And tbh I'm not sure whether they fixed it, ever realized or cared about the problem, or whether one of the 2498127 npm packages fixed it.
> Oh and... if you want problems specific to Electron...
It seems you don't get my point so let me be explicit:
Pointing out issues with a single framework that powers a subset of software does not mean there is a general decline in software quality across the industry.
It's 2024, why are we still blaming everything except the Operating Systems?
> simple products importing 1600 dependencies of unknown provenance.
Put yourself back in 1984... you've got an IBM XT with 2 floppy drives. You made write-protected copies of all your important disks, and even more copies of your boot disk.
You'd go to a computer show, or your user group, and come home with stacks of software of unknown provenance, and then just try everything out over the next few weeks.
You were safe because your system made it easy to know what you were risking when you ran a program. There was one simple rule that was easy to understand:
Only non-write-protected floppy disks in the drives were at risk.
That quite limited computer system was, in effect, a capability based security system. Crude, but extremely effective.
Here it is 40 years later, and the ability to just run code with abandon like we used to seems to be a fantasy to younger people. Because we don't expect our operating systems to be at least as safe as MS-DOS on an IBM-XT.
"I hope that this post provides some mental and moral support for suffering programmers and technologists who want to improve things. It is not just you, we are not merely suffering from nostalgia: software really is very weird today."
What about suffering software users?
"! want to end this post with some observations from Niklaus Wirth's 1995 paper.
"To Some, complexity equals power. (...) Increasingly, people seem to misinterpret complexity as sophistication, which is baffling - the incomprehensible should cause suspicion rather than admiration.""
Who were the "some people" to whom Wirth referred? A wild guess: software developers.
I know many people push for interoperability, but that is a very hard problem. Open APIs are easy; I should be able to write e.g. my own Slack client, for my specific platform.
It is not a problem that there exists an (official) ElectronJS Slack app. The problem is that I am forced to use it. And what does it bring Salesforce, except an opportunity to add telemetry to the app? With an open API, they would still make companies pay $5 per account per month.
I believe that open APIs would enable better clients for popular services.
That is why I've decided to develop all my apps in Qt C++ and QML. For example, I've created Plume[1], an alternative to the resource-hog Electron app Notion. It's the fastest block editor in my benchmarks (on the website), faster than the fastest comparable native app on macOS.
Yes, Qt is quite bloaty; the binary size is 139.2 MB currently, but I think with static linking and some trimming I can get it much lower.
If you use Qt under the GPL, you can statically link Qt with your application, but your application must also be licensed under the GPL. If not, then yes, you need to pay for a license. That's from what I know.
I already experience the change in perspective that the new legislation has caused. The new legislation requires software to be developed in compliance with standards that define detailed threat and risk analysis processes, and requires suppliers to offer updates to all their customers if a security breach is found in their software or in a dependency. This is often a big logistical challenge.
Both requirements have already led us to rethink which dependencies we want to include in our software.
But you can't have your cake and eat it too. We can't expect every company to maintain native apps for every platform plus a website on top of that, while not relying on third-party packages and instead writing everything themselves. Maintaining all that, they're bound to f-up at some point and expose vulnerabilities.
Yep, for some reason only poor indie gaming companies do that.
It's unrealistic to expect companies the size of Microsoft to take a break from putting spyware in your operating system and, for example, revert the piece of shit Skype has become back into a native app.
The grafana/grafana-oss Docker image has a bin/grafana static Go binary that takes up 180 MB. It compresses to 20% of that size, as does the long tail of .js.map files, each of which is 1 MB to 10 MB in size. Presumably these are only there for debugging JavaScript; I wonder why they were included.
The whole image is ~80MB, compressed. It is, indeed, impressively lean.
To commenters claiming this is nonsense: I can point at Firefox packaging for Linux, which is now a Snap, which is like a Docker container. A little version bump happens silently and eats up an __extra__ ~500 MB (old versions are kept!), whereas it used to be (just ~5 years ago) like 50 MB for the whole binary, which was replaced upon update.
And recently, I installed `clickhouse-client` (the CLI client for ClickHouse, a newer SQL database), which needs almost 900 MB for just a client! Absolutely insane!
I use QGIS, an open-source alternative to ArcGIS, and a non-IT friend asked for something to draw maps and view imagery. I recommended QGIS, and he wrote: "1 GIG download? WTF IS THAT?" Oops. We didn't notice the little alternative open-source app had turned into such a behemoth. (https://download.osgeo.org/qgis/windows/weekly/?C=M&O=D -- actually, since last year, it has grown by 20%!)
The reason for this kind of bloat seems to me to be the race for version updates. It probably did make sense in the late '00s, when you could claim the Linux ecosystem was underdeveloped. But 15 years later, it's still here. Every package is updated at a high pace, breaking stuff downstream, and now, instead of settling on compatibility, everybody has just started to ship Docker containers.
> I've installed `clickhouse-client` (a new SQL database), which needs almost 900 MB for just a CLI client
clickhouse-client is just a symlink to the main ClickHouse binary. That binary also includes the server and a lot of useful utilities.
It's large, yes, but it's on purpose: it's super useful when you need a single binary that serves as server, client, ZooKeeper server, ZooKeeper client, local data analysis tool, etc.
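For anyone unfamiliar with the pattern, it's the busybox-style multi-call binary: one executable dispatches on the name it was invoked under, so each tool can be a plain symlink. A toy sketch (the mode names are illustrative, not ClickHouse's actual dispatch code):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // stubs standing in for the real entry points
    func runClient() { fmt.Println("client mode") }
    func runServer() { fmt.Println("server mode") }

    func main() {
        switch filepath.Base(os.Args[0]) { // argv[0] = name we were invoked as
        case "clickhouse-client":
            runClient()
        case "clickhouse-server":
            runServer()
        default:
            fmt.Println("usage: symlink me as clickhouse-client or clickhouse-server")
        }
    }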
Google are the worst. The dependencies for something as simple as the official cloud storage client are horrendous. I just want to download a file from a bucket; why do I need all this cruft?
I just downloaded a freshly minted app from Notion called Notion Calendar. It looks beautiful, and its download size is 84 MB! Most likely an Electron app.
I think his point was mostly about "writing software". As in you won't need specific software anymore and the computer/network would do what you tell it without specific "apps".