I think it was the inability of GUIs to stay stable. Gnome went off the rails, Unity completely ruined Ubuntu[1], and KDE always felt very cheap and hard to navigate. My opinion, yes, but I always felt the GUI changes in Linux were extremely gratuitous after Gnome 2.
Couple this with the fact that for a decade, the actual GUI was tied very, very heavily to the apps. So you had to use Gedit in Gnome, or whatever the KDE version of a fucking text editor is. Seemed really, really stupid. Can't we just have a relatively standard GUI text editor across platforms? Or were we all just expected to use vi and emacs?
When the windowing layers stopped moving around so much, and Greg Kroah-Hartman and co. had gotten the driver problems under control, we suddenly had a bunch of different Linuxes that were no longer compatible with each other due to packaging systems. So, now users had to download a deb, an RPM, or a Gentoo ebuild... It was all very confusing for desktop users, especially beginners.
My quote to sum this up would be: "Why are these packages in my repository if they won't work after I install them because I'm using KDE/Gnome/Cinnamon?"
[1]: Unity took Ubuntu from being the default Linux I could put on anything, to a bloated, slow moving sack of crap. I could put Ubuntu on a 2001 laptop and be completely fine, fast enough, usable enough to do just about everything I needed.
Put Unity Ubuntu on that same machine, and it's unusable garbage. I can't even hit alt-F5 to get to another terminal because Unity slows the whole rig down to an absolute crawl. Like, can't even move the mouse, 5 minutes between keypress and action taking place. Fuck Unity. I blame Unity about 40% for killing the Linux desktop.
I don't think it's the GUI. I'm a relatively savvy user who's been running Debian on a Thinkpad for more than five years now. Printing is still touch and go. I spent countless hours getting sleep to work correctly. It is broken now and I don't have the energy to fix it, knowing that nothing I do is guaranteed to last through the next apt update. Wifi will mysteriously refuse to connect to some routers (a daily struggle at the uni).
Most of my frustrations come down to these three basic things: printing, power management, and wifi.
I was excited to delve deep into OS internals to troubleshoot in the beginning, but now I'm just annoyed.
Printing still sucks. Out of the box Ubuntu with my WF-3540...
Add Printer shows the printer. Great! I hit Add and it says "Here's the OpenPrinting driver for it". Great!
I click "Use this driver". It spins. And then acts like nothing has happened. I click it again. Same. I hit Cancel. Oh. Now my printer is installed... with the Generic Text Driver.
So I go to OpenPrinting.org directly and download the driver myself. Oh, now I see the issue. A whole bunch of dependencies are prompted for here that weren't prompted for or installed by the printing dialog.
Okay, let's try this again.
Printer Found. Great. Add it. Okay, my driver is in the driver list. Progress. Select, and Add.
"Success!".
I hit Print Test Page. Nothing comes out. "Processing..." Nothing.
Then I look.
And despite having found my printer on the network, Add Printer has added it "connected to localhost". WTF?
Alright. Delete printer.
Add printer. Ignore the found printer, find my printer's IP address manually, and add it that way. Go through things and now I am actually able to print, and everything works great.
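For anyone stuck at the same point, the whole discovery dance can be bypassed from the command line with CUPS's own tools. A sketch, assuming the printer answers IPP at a made-up address (192.168.1.50 and the queue name are placeholders, and `-m everywhere` only works if the printer supports IPP Everywhere, which not every older model does):

```shell
# See what queues the GUI left behind, and delete the broken one
lpstat -p
sudo lpadmin -x WF-3540

# Re-add the printer by IP; "everywhere" asks the printer to
# describe its own capabilities over IPP, so no driver download
sudo lpadmin -p WF-3540 -E -v ipp://192.168.1.50/ipp/print -m everywhere

# Send something trivial to confirm it actually prints
lp -d WF-3540 /etc/hostname
```

Not a fix for the broken dialog, but at least it's deterministic.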
But holy crap.
It's 2017.
It's Ubuntu.
How the hell is something this simple this broken?
I have the same thing happen when I download the .deb file for Chrome from the site and open it in the Software Installer (or whatever it's called): it puts an icon in my tray and sits there forever, doing nothing. It's not until I run a sudo dpkg -i that I learn dependencies are missing and that's why it won't install. Explain that to a newbie.
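For the record, the two-step dance that eventually works is roughly this (the .deb filename is just Chrome's usual one):

```shell
# dpkg installs the package but does NOT resolve dependencies,
# so it fails loudly and names what's missing...
sudo dpkg -i google-chrome-stable_current_amd64.deb

# ...and apt then pulls in the missing dependencies and finishes
# configuring the half-installed package
sudo apt-get install -f
```

Newer apt can do this in one step (`sudo apt install ./google-chrome-stable_current_amd64.deb` resolves dependencies itself), but none of that helps the newbie staring at a silent tray icon.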
Is it just me or did they forget to add CSS? Nothing wrong with an HTML-based backend. But for Tux's sake, get some fancy frontend dev to make it look nice out of the box.
Sleep has never been a problem for me on any Thinkpad with Ubuntu. Debian, on the other hand, was nothing but pain for bluetooth, wifi, and suspend. And don't get me started about the UMTS dongle I once bought that was useless under Debian but worked out of the box (even with a wizard) on Ubuntu. Then again, that dongle never really worked under Windows either (the drivers it auto-installed caused blue screens, and when they didn't, the abominably ugly software that came with it never found the dongle).
Just as a counterpoint, I use Debian on a Thinkpad as well, and while I don't print, wifi and sleep work without problems, and so did a SIM card I put into it for 3G - it 'just worked', and the only config I needed to do was pick which one of my ISP's "AP" names to use. That was surprisingly easy.
As another counter-counterpoint: I use Debian on a T460, and sound and wifi don't work out of the box. While it may be OOTB for some people that doesn't mean it's everyone. I think I've only had one laptop in the past 10 years (~4 laptops) that has worked OOTB with Linux.
I've had an IBM Thinkpad (2003-2009), a Lenovo netbook (2009-2011), a Sager-branded Clevo (2011-2016), and an HP Envy 15t (current). The Thinkpad...well, 2003 was still a rough time. Especially early on, the basics worked, but everything else was a fight. The netbook had to be connected to ethernet to grab the Broadcom wireless firmware. The Sager had Intel+Nvidia dual GPUs before that was a well-supported use case. The HP is the first machine I can think of where everything "just worked". I improved battery life by tweaking some things, but it was a longer fight to get Windows 7 working on it.
Comparing those to Windows installs on the same hardware at the time I bought them, the HP was lovely, the Lenovo was about equivalent, and the IBM and Sager stank (it'd be cheating to say that loading modern distros on them would be much nicer ;-) )
Oh, I just mean that some things work really smoothly - it's not just a basket case. I didn't have to install or modify anything with the 3G modem and the network manager just took care of it (and prefers a wifi connection if one is live), which was surprising. This is an option that the biggest prestige laptop maker doesn't even offer.
Debian in particular won't work OOTB on most computers because their philosophy excludes proprietary drivers in the base install... and wifi loves proprietary drivers. That's a 'Debian-philosophy-specific' issue, not a 'linux' issue.
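The usual workaround is a few minutes at an ethernet cable. A sketch for a chip that needs non-free firmware; the firmware package depends entirely on the chip, and `firmware-iwlwifi` below is just the common Intel case (Broadcom, Realtek, etc. have their own `firmware-*` packages):

```shell
# Enable contrib and non-free alongside main in sources.list
# (recent Debian releases split this into a separate
# "non-free-firmware" component instead)
sudo sed -i 's/ main$/ main contrib non-free/' /etc/apt/sources.list
sudo apt-get update

# Install the firmware blob for the chip, then reload the driver
sudo apt-get install firmware-iwlwifi
sudo modprobe -r iwlwifi && sudo modprobe iwlwifi
```

Trivial for anyone in this thread; a complete showstopper for a first-time user whose only network card is the one that doesn't work.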
> Put Unity Ubuntu on that same machine, and it's unusable garbage. I can't even hit alt-F5 to get to another terminal because Unity slows the whole rig down to an absolute crawl. Like, can't even move the mouse, 5 minutes between keypress and action taking place. Fuck Unity. I blame Unity about 40% for killing the Linux desktop.
Have you tried Xubuntu? It fills the gap that Ubuntu used to, and it's my go-to desktop OS now (apart from on my MacBook because only OS X is usable on high DPI displays). I've also installed it for non-technical friends and family and they've found it much more familiar than recent Windows versions.
In an age when the family computer is just as likely to be running Android or iOS, I think the Linux desktop has a niche and we need to start evangelizing it again. Even for the typical home user it feels more powerful than ever when you compare it to such "appliance" operating systems, which Windows and OS X seem determined to become.
Like the OP, I absolutely despise Unity and tried to find a different route. I came across xubuntu - tried it and absolutely love the simplicity. It's minimal, gets out of the way when you need it; but, best of all, it's USABLE. Can't praise it enough.
I've tried so many Linux distros over the years, and Xubuntu impressed me enough that I've abandoned all other distros for around 5 years now. It's got the perfect balance of trade-offs that make it fast, usable, and low-maintenance.
Thumbs up to Xubuntu - best GUI across vendors. The latest Unity desktop is a vast improvement over the initial rollouts (this is my current work environment).
> I always felt the GUI changes in Linux were extremely gratuitous after Gnome 2.
Yes! Ironically this was what a lot of people hated about Microsoft, and why the XP->Vista and 7->8->10 changes were so unpopular. Everything moves around and works in a different way, imposing a re-learning cost. But the developers like it because they're bored with the old system and think the new one is better.
Exactly! Those keyboard shortcuts and menu positions are part of my daily WORKFLOW! Stop moving them around and changing them! It is utterly infuriating to have to relearn your desktop GUI behavior.
Why is there no such thing as LTS for GUI interface paradigms? There should be an LTS interface version of every GUI.
I installed LibreOffice 5.3 on my Ubuntu 16.04 LTS yesterday, from their PPA. I kept up to date with LO for 4 years in the same way when I was on 12.04. Same for PostgreSQL. There are plenty of PPAs maintained by the developers of single applications, and they tend to support the LTS releases for a while.
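For reference, pinning a newer LibreOffice onto an LTS this way is just three commands (ppa:libreoffice/ppa is the one the LibreOffice packagers maintain):

```shell
# Register the PPA and refresh the package index
sudo add-apt-repository ppa:libreoffice/ppa
sudo apt-get update

# apt now sees the PPA's newer packages and upgrades in place
sudo apt-get install libreoffice
```

The trade-off, as ever, is trusting the PPA maintainer with root on your machine.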
I would be fine with not getting new OS features. The problem is that the repo system ties application software versions to OS versions.
I want to install SomeApp. My options are:
* `apt-get install someapp` and get v0.6.0.1abzrhotfix2 from 5 years ago
* Google to see if a third party repackaging repo exists, decide whether I trust the maintainer of that repo with root access to my machine, `apt-add-repo blah blah`, `apt-get update && apt-get install someapp`, hope that apt works and doesn't spit out a bunch of "Depends: blah blah but it is not going to be installed, you have held broken packages"
* Go to someapp.com, download someapp.deb if it exists
* Try to install it with the GUI, get an error, use dpkg -i instead, find out it has a bunch of dependencies, install them and hope they're abi compatible, also hope I don't get "Errors while processing:"
* Download source code, learn the flavour-of-the-month C++ build system, try to compile, fail, try to install dependencies, some fail because my distro only provides incompatible forked versions for political reasons, have to download the source for those too, have to either build statically or use LD_PRELOAD hacks
Meanwhile on other OSes, I can download the latest someapp.dmg/someapp.exe and a nice GUI guides me through installing it. It will probably work even if I have a rather old version of the OS, it almost certainly won't conflict with anything else I have installed, it will include everything it needs and handle updates in whatever way it thinks is most appropriate, and I definitely won't need to google any command line flags. And when I do upgrade my OS, I probably won't be forced to upgrade the app too if I've found a specific version that works exactly how I need it to.
> Meanwhile on other OSes, I can download the latest someapp.dmg/someapp.exe and a nice GUI guides me through installing it.
I much prefer the repo system to the Windows "I wonder if I can trust this site to deliver a safe .exe, and which of the four 'download' buttons is the real one and not just a malicious ad?"
> it will include everything it needs and handle updates in whatever way it thinks is most appropriate,
Yes, because every application needs its own phone-home updater (with questionable security), with bonus "value-add" crapware popping up to remind me it exists.
>I wonder if I can trust this site to deliver a safe .exe
Your first point is only applicable if you are downloading unknown, untrustworthy software from tophotfreeawesomedownloadsnow.com.
>and which of the four 'download' buttons is the real one and not just a malicious ad
I agree that this is a severe problem for non-technical users. For those users, there is a lot to be said for a walled-garden app store (which could take the form of a GUI apt wrapper, I suppose). More experienced users know what the link and download should look like, rendering those ads a mere annoyance. I doubt anyone reading Hacker News has fallen for those things.
>Yes, because every application needs its own phone-home updater (with questionable security)
For what it's worth, I usually disable auto-updaters for any desktop app that isn't network-reliant. I manually update if I find a bug or want a new feature, knowing that I can always re-install the old version if I find the new one is worse. I appreciate that this has its own downsides (especially if the user doesn't keep themselves updated on security advisories), but the point is that I /can/ do this whereas it's pretty much impossible on desktop Linux. I would have to choose between several undesirable alternatives, as I detailed above.
>with bonus "value-add" crapware popping up to remind me it exists
I don't have this issue with software I use. If I noticed it, I would try the following steps in order:
* Block network, if the reminders are fetched from the web
* Disable any reminder features in the app's own settings menu
* Revert to an older version of the app which didn't do this
* Consider alternatives, of which there are probably several
Your first reaction is probably "none of that should be necessary and there is generally no crapware on desktop Linux", and this is true. But as soon as you try to use anything that isn't the version in your distro's repo, you are in for a world of painful incompatibilities and package manager hell. For every app you want to do this with.
You have /less/ freedom to set up the system how you want it than you do on Windows/OSX, because even though you technically /can/ install whatever you want, it is horribly time-consuming and aggravating to actually do so.
It's no coincidence. Libre software devs saw what MS and Apple were doing and went "oh no, we have to do the same to stay relevant!" Couple that with the rampant egos typical of an OSS project lead and you have today's situation. Of the big 3 (or is it 4 now) DEs, I feel KDE has stayed the most sane and provides the same desktop metaphors. Then again, they mostly bet on the "semantic desktop" direction, which was thankfully stillborn and didn't impact the actual GUI much.
I feel like Xfce and LXDE actually do a decent job here, too (respectively). Both have stayed pretty consistent over the last decade from what I've seen.
openSUSE + Xfce is my go-to for installing a desktop system for a new Linux user. While openSUSE moves a couple things around relative to a typical Xfce installation, they tend to feel a bit more natural to Windows XP users (which, amazingly enough, still exist). Users can typically be relatively productive after some basic education to cover the day-to-day usage quirks ("use Writer instead of Word; use Firefox instead of Internet Explorer; etc.").
The only thing I significantly dislike about it for non-technical users is the scary warning that pops up if you drag a menu item to the desktop and try to run it; sure, it's easy to fix (click on "Mark Executable"), but that's still very unusual and unintuitive, and your average user doesn't know or care what "Mark Executable" means (they might know what "Launch Anyway and Don't Ask Me Again" means, though).
Both have stayed pretty consistent over the last decade from what I've seen.
I used LXDE until they removed the tree panel in the file manager, many years ago. Having left KDE because of another update that killed my workflow, and having read that GNOME was no better in this regard, that was what made me abandon the Linux desktop completely, after two years of exclusive use.
I wonder if they brought the tree back, but... too late.
Having worked at Microsoft, I'm convinced that, internally, running applications through the Microsoft menu jumbler is the main justification for most upgrades.
So it's like Calculus textbooks. 13th edition versus 12th edition: let's add 8 pages of new contents, mostly about the early life of Leibniz, but most importantly, let's scramble all the exercise numbers.
Comments like these really make me believe that it is people like you who are killing the Linux desktop.
Of all the window managers and desktop environments I have tried the last years on Linux, Unity, Gnome3, and Cinnamon are the only ones that don't look and behave as if they are from fucking 2003. Those others might be enough for those people who have stopped adapting and like to be left alone under their rock, but for me it's just frustrating having to use Desktops that are basically some half baked variation of Windows XP.
Let me put it this way: if the desktop environments you so eloquently despise wouldn't exist, I would be writing this comment on a Mac and not on Linux.
> Of all the window managers and desktop environments I have tried the last years on Linux, Unity, Gnome3, and Cinnamon are the only ones that don't look and behave as if they are from fucking 2003.
As opposed to, for example, macOS, whose UI has changed almost nothing since OS X launched in 2001, and even then was not a dramatic departure from Classic which dates to the 1980s.
Because it's not broken and change for the sake of change is not an improvement.
There is nothing wrong with creating UI experiments, the problem is that they keep discontinuing support for the existing one which is not broken and people want to continue using. (Microsoft is even worse than Linux in this regard). If you want to create a new UI, create it independent of the old one and let people choose. If more people continue to choose the old one, you have lost. Try again until you have something which is enough better that it actually justifies relearning everything.
An example of this is that a few years ago everyone seems to have decided that 3D transitions are inherently necessary. Not "nice to have but you can turn it off if it causes trouble," but "if you don't have working 3D hardware you can't run this anymore."
Which would be almost excusable if the state of Linux 3D drivers wasn't notoriously bad and installing Linux on older PCs wasn't extremely common.
And they justify these things by pointing to Microsoft and Apple, but those companies have ulterior motives. Microsoft rearranges everything on a regular basis to pressure people to upgrade. Apple increases the hardware requirements for newer versions of their OS because they want you to buy a new Mac. This is not behavior worthy of being emulated.
I think the problem here is if people were happy with what they had in 2003, why change it?
Not everything UI-wise in 7 or later is better than in XP I'd say. And apparently many people agree.
Actually I am still using 7, and all the UI stuff that's nicer than XP or even 2000 is just eye candy, nothing that makes me say "I would totally never go back". Because I would - if the only stable Windows came with XP's UI, so be it.
FWIW, I noticed recently that I've been using xmonad for 5 years (+/- 2 weeks) and still 100% happy. It lets me work efficiently and nothing has changed.
If people were happy with Windows XP in 2003, why change it and install Linux? We deviated from what was good enough 15 years ago! Let's put XP back on the clients and Solaris back on the servers! /s
When I started using Linux, it was about thinking out of the box and about change. It had many exciting window managers that put everything in Windows to shame when it came to customizing and style.
Nowadays I have the feeling Linux has become the resort of the grumpy and disenfranchised, clinging to whatever was cool when they were young. The levelheaded and progressive jumped ship to OSX and are now probably contemplating switching to Windows 10 after Apple dropped the ball for them. This leaves all Linux progress, like Gnome and systemd, maintained by inept pompous hipsters.
If XP were secure, performant on modern hardware, stable, and supported by new programs, I would still be using it. More than that, the reason I quit Windows 98 was that Warcraft 3 didn't work on it. Windows 7's configuration menus are god-awful and Windows 10's are far worse than that.
You are making a fallacy in saying that old equals bad/ugly. XFCE is keeping it oldschool and can still look amazing (xubuntu looks great IMO).
There's much to talk about when talking about GUI things. Much indeed. Many things are even quantifiable, instead of being pure personal preference. Just remember that Microsoft put a lot of effort into the studies that culminated in Windows 95, while many other projects did not (the study that ended up being Gnome 3 is a joke in comparison).
Honestly, Gnome Shell is the greatest thing ever. My focus and productivity skyrocketed. I see how people who would prefer to run XP don't like it. But that's no reason to badmouth it.
Xfce and KDE (possibly Unity, maybe Enlightenment) can do the former. Dmenu (probably Unity, KDE, and other standalone programs) and Enlightenment can do the latter (Enlightenment's Everything launcher is amazing).
There are actually issues with Windows XP, the only thing is that IMO people learned how to work around those issues.
I think UIs should change only gradually and only in ways that are genuinely better, while also ensuring that options aren't removed entirely; at the very least, if an option is obscure, à la the options removed in Gnome 2, it should be kept so that someone can create a tweaking tool to change it.
If a huge change is to be made, like the Gnome Shell, then it should be done in a development release and ALL of the compatible UI should be kept on the desktop, with an easy UI option to switch between the modes. It turns out that Gnome Shell could do this - but only after a time.
UX and UIs are one of those things you really should be incredibly careful about foisting on anyone. If you do it, you need to do it somewhat gently.
I'm not sure about the feature you're talking about, but maybe have a look at the accessibility settings of macOS. They put a lot of advanced settings in there targeted at various kinds of disability. However, these features can also be used to improve productivity.
Some developers even use the accessibility interface to add custom overlays & controls (Steam, for instance).
> people were happy with what they had in 2003, why change it?
Because the people who were willing to run a Linux desktop in 2003 are not in any way a representative sample of the population. That's like saying "if assembly was good enough for the first batch of programmers, why would we need higher level languages?"
KDE has always tried to play catch-up with Windows regarding style, leaving users in some kind of uncanny valley. And I'm not very impressed with what I've seen lately in KDE. That said, I'm hopeful something useful will come out of it.
I know this sounds ridiculous but KDE/Plasma5 is great if you manage to change the style to something else. Numix + icons is a good candidate. You can choose QtCurve for KDE/Qt5/Qt4 apps and skin that with Numix + Numix Colors and GTK2/GTK3 have native Numix widgets.
I've made peace with Breeze Dark at the moment, it's not so bad.
But optics aside KDE5 is mighty powerful. Dolphin is a great filemanager and I'm missing it on Windows. On the fly ssh/sftp/webdav is also a great feature.
The Plasma NM applet for NetworkManager is also great - especially if you often change settings for interfaces.
The menu works fine, searching is almost instant. I'm using the current KDE from Arch Linux and it's pretty stable for me.
Overall I'm happy and I'm not looking at Unity - that also has its merits, but the whole Nautilus/gvfs ecosystem was kind of unstable for me. YMMV.
What I dislike about KDE are the widgets - besides the essential stuff like time and battery, everything feels clunky and slow, and the system for adding them is also not so great. I've also never used 'Activities'; the whole concept feels pretty much out of place. I'm also setting my desktop up as a plain background and ignoring the "folder in a plasmoid widget" story, which is bullshit IMHO.
I'm also avoiding akonadi/kmail/korganizer and so on - these might have improved, but they still felt clumsy last time I looked.
IMHO it's getting better but it's no silver bullet. For most users the default presentation and UI and settings are just confusing and ugly IMHO - if you look past that you will find a lot of good, stable and well thought ideas.
I'm running KDE now on all my Desktop Linux boxes and I can't remember the last time I had some hassle with it. It does what I want, it goes mostly out of my way and it doesn't look too bad.
For all the stuff it only uses 300-400MB RSS memory after a fresh reboot with a few Konsole windows running.
And really don't forget that KDE actually has a file picker that lets you browse by thumbnail, something that has been missing in Gnome since a bug about it was filed in 2005, and it's still not there (yet).
Plasma existed before there was anything similar on Windows.
But to be honest, my perfect Linux environment was KDE 3.5. And I don't think "regular" people had any problems with the uncanny valley. When I left for university I installed KDE 3.5 on my parents' and my sister's computers. My sister switched to Windows 8 years later because of some apps she needs for work; my parents still use Linux (but recently I installed KDE 5 for them).
> "don't look and behave as if they are from fucking 2003"
The 2003 desktop environment was fine. Actually, there weren't significant benefits after Windows 95. On the contrary, the modifications simply confuse users. I'm a big LXDE & MATE fan for this reason. If it works, don't fix it.
Can someone please explain to me why everyone says that Gnome went off the rails? Gnome 3 to me was, except for leaving behind a lot of legacy systems, an extremely usable and downright pretty UI. Menu, search, windows, and virtual desktops all available from one keypress or one hotspot. I thought that was a stroke of genius myself.
Note the wording of the bug - they aren't just asking the app to adapt to a GNOME 3 change, they're asking it to do so by removing a feature that remains fully supported on other DEs. And if you read the comments, the justification is, "I guess you have to decide if you are a GNOME app, an Ubuntu app, or an XFCE app unfortunately".
It was that incident that made me conclude that GNOME is a destructive force in the Linux (and, broadly, Unix-like) ecosystem these days, at least if you don't like monocultures.
I think the general reason is that it's a lot 'more' with not many benefits. Things seem to be blown up and large, almost garishly so. The window title areas are large, and on systems with bad or no graphics acceleration it will complain and make you use Gnome Shell, which is ironically, in the opinion of many, nicer than Gnome 3 default UI.
I think more than anything it's good old nostalgia. The applications menu is no more, instead it's a grid where you have to search for what you want or whatever. There's a massive screen overlay instead of a nice little dropdown.
Multiple desktops was a good feature that's come with Gnome 3, though.
If I recall, perhaps some indication of the intentions of Gnome 3 can be seen by the fact that you are prompted to drag up (really?! on a desktop?!) in order to unlock after you've entered screensaver or something. So there could be some of the same reasons why people initially disliked the Win 8 view.
But mostly, Gnome 2 was just so simple and familiar, though I don't know if that's my hindsight or whatever. Point and click to what you want to run, put things on your desktop, do whatever. No BS.
I think part of the issue with GNOME 3 is that the instructions on screen for how to use the UI hide the most efficient ways to do so.
The application launcher for me is universally used by hitting super, typing what I want, and hitting enter. I never actually click on anything. Same with swiping up on the login screen, you can do that on a touch screen, but you can also just start typing your password.
It was a big change in workflow, and the developers gave off a paternalistic vibe (that I guess had been there in Gnome development for years, but Gnome 3 was the first time that I felt like it really impacted me).
I used Gnome 2 because it did what I wanted. Gnome 3 felt like style over substance (I'll agree wholeheartedly on "downright pretty UI"), and it couldn't be configured to act in a way I liked. I felt like I wasn't the kind of user they were targeting, so I switched to Mate when it came out.
Depending on who you ask, Gnome 3 was either a long-overdue redesign with great results or a dependable tool ditching its legacy to chase after some UX designer's dream.
It's been enough time. Might be time for me to look at it again. My patterns have shifted some, and it's possible that it would be a better fit for me now than it was then.
I agree. In the past I was always a kde guy but I've been running gnome for the last few weeks with a few tweaks and the dark theme on and I find it really beautiful and simple, especially the fonts. I'm a programmer who normally wouldn't notice if I was using a serif font or not, but whatever gnome uses by default is very pretty.
I used it for a few months when it came out, then switched to Mate. My biggest problem was that for a lot of things that I wanted to change or configure, the official how-to was (paraphrased) "Actually, you don't want to configure that. We designed the system not to work that way." I decided to "vote with my feet".
Curiously, my experience of Unity on Ubuntu 16.04 is the exact opposite. It's by far the fastest Linux desktop I have tried, and I have been tinkering with Linux desktops since the early 2000s. My experience has been so dramatically different from some of the negative reviews I had read online that I can only conclude these are from older buggy versions, or perhaps misconfigured or non-accelerated graphics drivers.
For me it's been much faster and smoother than my 13 retina Macbook Pro and Window 7 Desktop and this on an ancient i3 2100 system with 4GB ram and old 6770 Radeon card.
My only regret is I didn't try it earlier because of all the negativity floating around about it. I suggest anyone interested in Linux desktops ignore the negativity and try Unity on Ubuntu 16.04 with an open mind. It's fast, lightweight, and reasonably sleek and minimal.
This! The negativity regarding Unity is hurting Linux. So many people I know who tried a switch in the last couple of years were put off Linux because their Linux-friends pushed them to Mate or Xfce, running on Arch, or something similar. Those friends heavily warned against Unity and Ubuntu as if they were the devil. Showing those who were interested in a switch what a modern Linux desktop looks and feels like was a revelation to them.
2001 laptop? Even contemporary systems (Win XP) would be slow on that.
> Inside, it had plenty of power, an 850MHz Pentium III-M, 256MB of RAM, and 802.11b for wireless networking (if you could find a wireless network, that is). Plus it had 20GB of storage, FireWire and even one USB port.
LOL, I use this very laptop with XP all the time. Yes, it is slow for modern tasks. But the XP OS is responsive! I can move the mouse, navigate menus, open the text editor. I can do this on this 2001 machine using Debian with no GUI, but not with Unity over Ubuntu.
This was an issue because where I used to work, we installed old computers with Ubuntu and gave them to poor families and schools. The SECOND Unity arrived, we were fucked. We had to go and build our own image with Gnome as the default and install that. It was really stupid, and I never saw our users have any easier a time with Unity than they did with Gnome. It was a gratuitous change for change's sake.
I think Microsoft's low-level HW engineering skills are underappreciated. Sure, iOS and Android rule the world, but the responsiveness of a Surface tablet kills them both on that one axis.
That said, IMO the problem is not so much that Unity exists, but that they didn't have the resources to get real end-user feedback that might have told them how much a stinking pile of poo it is on many machines. To be fair, what Google and Apple do to mobile devices more than a year old or so is every bit as bad IMO so I can cut them some slack.
What this has achieved for me is a migration back to desktop machines from iPad and Android tablets. My 2013 iPad Air is now pretty much a brick with iOS 9, and my Nexus 7 from 2013 is unusable. Compare and contrast with my 2009 Core 2 Quad desktop that's still going strong.
Finally, consider the hideous electronic waste we generate by effectively forcing people to upgrade HW every year or two.
Agreed on the Unity stuff. I think Unity could be fine and dandy on its own, but it should NOT be the default UI for a system designed and distributed under the auspices of being the super inclusive Linux of the 3rd world. Which is kinda what Ubuntu was trying to be.
If you want to be inclusive and spread technology to those that don't have it and can't afford it, you HAVE to support very, very low grade hardware.
> they didn't have the resources to get real end-user feedback that might have told them what a stinking pile of poo it is on many machines.
They actually did; they just ignored the feedback and insisted that Daddy Mark knows best, completely throwing away all of the reasons that Ubuntu is actually named Ubuntu (namely: they stopped listening to the community).
I was willing to forgive Unity, though (hell, I kinda liked it, at least on systems that could actually run it). It was the combination of repeated awesome-looking but inexplicably stifled projects (Ubuntu TV, Ubuntu for Android) and the Amazon Lens that resulted in me swearing off the community of which I was once happy to be a part.
Indeed. Decent responsive GUIs have been available for decades, and it's one of the great failings of the industry that we can't keep them that way, either in open source or the commercial world.
Of course it works fine with XP, that was the target specs in 2001 when that OS was released. Maybe try a Linux distribution from 2001 and you'd have a fair comparison.
I remember how XP was received. It was slow as hell compared to Win98 and even to Windows 2000. And 256MB of RAM is too little for XP; W2K would probably be fine.
I have successfully run XP on 96MB of RAM. It ran surprisingly okay on a fresh install. Multitasking, Firefox and everything. As it aged though it declined badly. XP never was very good at self-maintenance.
I have ironically had similar-to-worse results with running 2015 Android on 2011 hardware, which has 512MB of RAM. The cheek!
Unity needs graphics acceleration. Of course it doesn't work on something like that. Use Lubuntu and it will work just fine.
XP on such a machine is not responsive. Maybe it could be usable if you uninstall half of the operating system and replace most programs (IE with Opera 12.x...). Better yet, use Win2k.
No! A lightweight Linux distro is your best bet to revive such junk today.
EDIT: fixed typo; I restored a few netbooks in EE, but whatever.
I've run relatively recent Linux distros on hardware with far tighter constraints (like on my Compaq Presario 1210; may you rest in peace, old friend). Damn Small Linux was king here, but distros like Tiny Core and Puppy are also designed for this exact use case.
256MB RAM is actually pretty generous for Windows XP, as is the Pentium III; neither were unheard of in "Made For Windows XP™" systems.
Microsoft's specified minimum is 64MB, recommended 128MB. XP RTM with 128MB of RAM was perfectly usable; obviously it was not a good idea to run lots of crap in the background.
> Gnome went off the rails, Unity completely ruined Ubuntu[1], and KDE always felt very cheap and hard to navigate.
Completely agree, and as a user of desktop Linux for a very long time, I always felt that Linux GUIs have been inferior to the options available on Mac and Windows.
It's not for everyone, but I found a personal solution with tiling window managers (xmonad, i3) and related tools (suckless's dmenu). They restore sanity by providing lightweight, shortcut-focused GUI systems that don't intrude but provide enough functionality to get things done.
For anyone searching for and testing GUIs like XFCE: don't dismiss tiling window managers as being too complicated or a bridge too far. My experience is that they work wonderfully in practice.
That's not my experience. I switched from XP to Ubuntu 8.04 and I remember how much better the DE felt. I could do everything with half the clicks; Windows control panel dialogs were so clunky in comparison. Four years later I discovered that KDE shares that clicks problem with Windows, and I went back to Gnome after a few months. I'm on Gnome Flashback now, specifically a Gnome 3 tweaked to look like the Gnome 2 of 10.04, plus Compiz for the 3D desktop transitions (the cube).
I used Windows 8 and 10 on friends' laptops and I'm not impressed. BTW, everybody hated 8 because of the changes in the UI. They tend to be OK with 10, but it looks so old to me, like a window manager for some 16-color home computer from the 80s (I used them back then) at 1080p resolution. Control Panel's dialogs are in most cases still the same as XP's, and I can't understand why they didn't redesign them. They feel more and more out of place in the UI.
Mac's UI is somewhat similar to Gnome's (actually the other way around, I think), but I never liked the menu at the top or the top bar. That's why I removed both from my Gnome setup, and why I never bought a Mac from the very beginning (I moved everything from the top bar to the bottom one). Mac's UI feels zippier though; that's the advantage of building both the hardware and the software.
I agree about Windows 8 and 10. Around the time of the Gnome-versus-KDE wars, I felt Windows XP, 7, and Server 2000/2003 were clean, simple GUIs. I messed about with Compiz as well and thought it very neat. I was happiest on Linux with the simplicity of fvwm though, and I think modern tiling WMs replicate that experience for me.
I personally find that movable-window designs are a UI failure, requiring too much user interaction. A tiling WM, OTOH, enables me to do what I want: work with one tool or another, and work with multiple tools when that's what I want (which, frankly, is rare: other than consulting a browser or info window, it's not often that I need more than one window on my screen at a time).
And this is what killed the Linux desktop. 99% of the population does NOT want to try a gazillion desktops (whose installation is error-prone) just to find out what "works best for you".
You have always been able to mix and match apps from a different desktop.
Vim and emacs are awesome.
Package formats have never ever been compatible; not even all distros that use the same format are compatible, in fact. For open source software this is less of an issue than it seems: producing a package is 1/100000 of the effort required to produce the program itself, and it can even be done by individuals far less skilled than the original dev.
This means with only a relatively tiny additional input of labor a program can be packaged for all the most common distros.
I don't prefer Unity, but I think you bizarrely overstate the case.
I see an awful lot of posts like OP here where they're for some reason thinking that using a specific distro, or even DE, locks them into using specific packages.
You can literally install anything on any Linux distro. Some things might take a bit of work, sure, but they're all just a Linux system underneath.
Install the right dependencies, get the app somehow (through a package manager, or from source), put the files in the right place, and Bob's your uncle.
I'm tending to agree, for the masses at least. For me, CDE was and still is a great desktop, at least for somebody who just wants to pull up lots of terminals and have multi-desktop panels: it just works. Simple, basic, and sure, it lacks eye candy beyond xeyes, but then I own a lava lamp.
Yes, mass basic users like their eye candy, and the diversity that is a strength in many areas is equally a weakness when it comes to desktops. GUIs and their overhead add so many complications to the code that if they keep changing, they tend to get ignored or badly utilized, and that gives the user a mixed, half-baked experience.
This, and they tend to be the epitome of feature creep.
> I could put Ubuntu on a 2001 laptop and be completely fine, fast enough, usable enough to do just about everything I needed.
This use case is an edge case, and you could still serve it by using a non-default desktop (like lubuntu). Laptops from 2001 have super-weak cpus and gpus, and aren't a representative sample of desktop users.
At one of the two companies I work for, more than half the developers use Ubuntu, and all-but-one (me) use Unity and like it. I use KDE there because of old-school configurability, but after a couple of years of dealing with KDE's bugginess, it's time to move on, methinks.
What killed the Linux desktop? Not being preinstalled. For a bit of context from PC history, this is also what killed the GEM desktop, killed the OS/2 desktop, and damn near killed the Windows desktop until Microsoft put pressure on vendors to make a 386SX with VGA the entry-level option and ship Windows 3.0 with every unit.
Preinstalls are an OS's golden ticket to adoption.
If one Linux with one desktop was widely preinstalled, that would help. That would have made that environment the target for desktop developers, which in turn would have added the pressure for backwards compatibility. Now, as a desktop developer I'm surely not targeting Linux desktop because it's not even a single target.
For all practical purposes, desktop Linux is a single target. Use Qt, statically link everything or install to a folder in /opt, and then the only thing you depend on from the OS is libc... which has been plenty stable for a very long time.
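A minimal sketch of what that deployment looks like in practice (the app name "myapp" and the layout are made up for illustration; a temp dir stands in for /opt so the sketch is self-contained): ship the binary plus its bundled libraries under one directory, with a tiny launcher, so the only thing resolved from the host is libc.

```shell
# Hypothetical layout: everything the app needs lives under one
# directory; a launcher points the dynamic linker at the bundled libs.
APPDIR="$(mktemp -d)/myapp"
mkdir -p "$APPDIR/bin" "$APPDIR/lib"
# Stand-in for the real (Qt) binary, so this runs anywhere.
printf '#!/bin/sh\necho running\n' > "$APPDIR/bin/myapp"
chmod +x "$APPDIR/bin/myapp"
cat > "$APPDIR/run.sh" <<'EOF'
#!/bin/sh
DIR="$(dirname "$(readlink -f "$0")")"
# Bundled .so files resolve from ./lib; only libc comes from the host.
LD_LIBRARY_PATH="$DIR/lib" exec "$DIR/bin/myapp" "$@"
EOF
chmod +x "$APPDIR/run.sh"
"$APPDIR/run.sh"
```

In a real install the directory would be `/opt/myapp`; an rpath of `$ORIGIN/../lib` baked into the binary achieves the same thing without the launcher.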
That helps with compatibility, but it does not help deployment. Flatpak-like projects try to solve this, but there won't be a "good" way of targeting Linux until one of those solutions is widespread enough that you can just show three download links on your page (Windows, Linux, OS X) and it all just works.
As a developer I have zero interest in trying to get my application into the package managers of various distributions.
Um, what? Phones differ in size and power, but a particular version of Android is a particular version of Android everywhere.
Android developers don't have a Debian vs. Ubuntu vs. CentOS vs. Gentoo vs. Arch type choice to make, only a "how backwards compatible do I want to be" choice.
Libraries are just part of the picture; then come the locations of files, init systems, and the lack of standard frameworks for all the layers of a modern desktop.
On OS X, in spite of CUPS, I can use the AppKit printing API. On Linux I have to resort to somehow generating PostScript and forwarding it to the printer driver.
Yes. Of the two problems (APIs and deployment), APIs are actually the easier one. It's not solved by any means (OS services are just too dang hard to figure out because desktop Linux isn't one OS; how do I do desktop notifications, printing, service init, etc. in a way that works for all distros?), but at least that's doable.
Android took Linux and solved the problems: it created an ivory-tower set of APIs for applications and it provided an app distribution model that isn't insane.
These days you can get computers with Linux preinstalled. What you can't get is a computer that has both Windows and Linux preinstalled in some kind of dual boot or VM setup. That you must still do yourself. You can get the choice to preinstall Windows, or preinstall Linux; never both.
> These days you can get computers with Linux preinstalled.
If you look for them specifically, and not from most manufacturers. Try getting one from Dell (any model, not one specific variant), Lenovo, etc. Good luck with that.
> If you look for them specifically, and not from most manufacturers. Try getting one from Dell (any model, not one specific variant), Lenovo, etc. Good luck with that.
Before you complain about that, consider that every GNU/Linux nerd has strong feelings about their preferred GNU/Linux distribution (or version: LTS, stable, bleeding edge, etc.). So they will probably not be satisfied with the distribution that is installed and will install their preferred one. Also, there are only one or two Windows versions that customers want preinstalled at any given time, and only every few years (much less often than a new version of a typical GNU/Linux distribution gets released) is there a need to offer a new Windows version.
So if you are dreaming of PCs preinstalled with GNU/Linux for the mainstream market, first standardize on one or two distributions for, say, 95% of the market.
> Before you complain about that, consider that every GNU/Linux nerd has strong feelings about their preferred GNU/Linux distribution (or version: LTS, stable, bleeding edge, etc.). So they will probably not be satisfied with the distribution that is installed and will install their preferred one
Those who have such preferences have zero problems installing their own distro. The fact that it doesn't come pre-installed doesn't bother such users, as long as they aren't forced to pay the nasty Windows tax.
We are talking about what's slowing mass Linux adoption (by those who aren't likely to ever install any OS to begin with).
> as long as they aren't forced to pay the nasty Windows tax.
It is much simpler just not to ship any OS to customers who want to avoid the Microsoft tax (indeed, some vendors sell PCs without any installed OS, in particular for enterprise customers). This has the advantage that they don't have to provide support to customers who are using the installed GNU/Linux distribution.
But note that avoiding the "Microsoft tax" will save you less than you might think, since Microsoft has contracts with large PC vendors that get them Windows licenses far cheaper than even the "System Builder" version. Slightly exaggerated: the Windows license they install is "nearly free" for them. So even if they offer a PC without an OS, the price will not drop by much (if at all), for the simple reason that large PC producers hardly pay any "Microsoft tax".
An extra point is that OEMs charge to distribute third-party software, which is usually called crapware. If you install a lot of crapware, you can reduce the cost of Windows close to zero. That's why OEMs do it even though users don't like it.
If one of your users subscribes to the bundled AV software then you can make a recurring income from Windows that can be larger than the profit you made on the hardware.
Offering Linux as well as Windows just increases costs. There's the extra hardware qualification, sourcing drivers etc, extra stock-keeping and distribution costs, and extra advertising costs. You may have to do that across dozens of languages for over 100 countries.
Ad costs can be significant because Microsoft and Intel provide advertising support, discounts or kickbacks for including phrases like "MightyCorp recommends Windows 10" in your adverts. Microsoft isn't going to pay out if your ad includes Linux.
If you survive that, you have to survive the cost of support and/or returns. When your user base has a vague idea how to use Windows, and zero idea of how to use Linux, that's a killer. One "support incident" kills any profits from at least five sales.
What, like the Android monopoly, the Google monopoly, the Facebook monopoly, and so on? Please look up power laws...
Microsoft won the market fair and square (against an even more evil and much larger monopolist), and Linux has been built on the market created by Microsoft's de facto standards and the economies of scale that Windows has generated.
Did you read the article I linked to? It wasn't fair and square. MS gave the PC makers a discount with strings attached: don't sell PCs that can dual-boot into another OS. I think you can agree this is pretty anti-competitive behavior.
No, I certainly don't agree with that. Anybody can dual boot their PC, if they install the second OS themselves. Apple took exactly the same view of BeOS, which had failed to sell BeBoxes.
All discounts have strings attached. You either want the deal or you don't.
"Anti-competitive" in this case simply means "not giving the competition a free ride".
Companies tried to sell Linux and Unix PCs in competition with Windows, and they failed. Some went out of business. IBM tried to sell OS/2 systems and failed -- it ended up paying to have OS/2 taken off its PCs and Windows installed, even though OS/2 ran Windows. Apple ended up getting bailed out by Microsoft just to keep it alive.
You can lie to yourself for as long as you like, but it would be better to face the fact that people wanted Windows, and they didn't want the alternatives available at the time.
One of the conditions imposed by the US Justice Department in the anti-trust settlement was that Microsoft was prevented from doing separate deals with all the main manufacturers. That was probably in place for a dozen years. Did you see what a difference it made?
You didn't, so I'll tell you: it pushed up the average price of Windows and made Microsoft more profitable.
> it is much simpler just not to ship any OS to customers that want to avoid the Microsoft tax
Simpler, yes, but it would undermine MS's monopolistic control. That's why MS forces manufacturers not to sell such computers. Their leverage is simple: if they sold such machines, MS would raise the price of Windows for them. Most give in to the blackmail.
Is that still true today? I think these days it's possible to buy a computer with no OS preinstalled from some manufacturers. It might even be slightly cheaper than the identical model with Windows. (Of course, only select models, etc.)
Where are they on their site? Besides, why isn't it an option on all their models? Admittedly, Dell has some models with Linux pre-installed, but since it's only a small subset, it doesn't help boost Linux adoption.
It's not about the amount. It's a more idealistic value (for me at least): I simply don't want Windows, and I will buy worse hardware for more money if it means I can avoid owning Windows.
I work on Win 10, I run Ubuntu Server and some Desktop VM, I have an iMac at home.
Due to a recent lifestyle change I'll soon buy a laptop, and I was seriously considering buying a Linux laptop because I will mostly use it for coding.
After a bit of searching I noticed that the most cited laptop choice was an ugly old Thinkpad. I really don't want to spend 200 hours checking whether the hardware and drivers will behave properly, only to still end up with something that isn't energy efficient. So guess what, I'll just go for a MacBook and run Linux in a VM if needed...
The people I know who use desktop Linux love their Linux rigs a whole lot more than my Windows and Mac acquaintances enjoy their computers--which in many cases just borders on tolerance, especially with Windows 10. Maybe software needs something beyond market share, sort of like what Bhutan did with Gross National Happiness. We'll rank products based on how much people enjoy them and then see who's the dominant desktop.
I enjoy having a machine that I ultimately control. You just can't get the same level of control with Windows, Mac, Android or iPhone. You constantly have to wrestle your wants against the wants of the platform vendors.
A key area where this is most apparent these days is privacy. If you want privacy you have to keep a really close eye on Microsoft, Apple and Google every update.
With a Linux Server/Desktop I'm free from their power games.
Do you trust everyone who contributes to, reviews, and maintains the package management infrastructure you use?
Do you ever find the feeling of control challenged when you want to upgrade an application you use without upgrading the entire OS in lockstep, or vice versa?
I'm confident the distribution I use isn't trying to collect, catalogue and sell my private info, or lock me into a service or product. So far it doesn't interfere with what I want in any way.
Part of that comes from the sense of ownership and the love a tinkerer has for their projects. Not that it's a bad thing, but I think it's easy to see how that love would arise.
That's because it takes more work and knowledge to get a Linux machine running smoothly, so in the end you feel a sense of accomplishment, when in the end all that was done was just getting a local desktop OS running the way a computer should be used in the first place. It's all smoke and mirrors.
That's the whole point. You can't achieve this with Windows, because it runs an update whenever it wants and will just give you a restart at a random time. It happened to me at work during a presentation, when I turned on a VM after a few days. And then you have the same privacy settings scattered around the OS trying to scam you into giving your data to Microsoft, which turn back on by default after every goddamn update.
I would never have moved to Linux if Microsoft didn't shove their crap down users' throats every time it updates.
I now have complete control over everything, and I'm very happy using a Fedora workstation on my personal PC/laptop. I'm not into gaming, and even if I were, I would rather sacrifice my Windows games just to support an OS that is fairer to its users and doesn't steal their privacy.
That might be true for some users, who are into the tweaking. But I don't think "OS as a challenge" characterises Linux users generally.
All my Ubuntu laptop installs have been pretty much plug-and-play, and I leave out the "last 1%" like installing 3rd-party drivers for the fingerprint reader. The last time I spent real effort was when setting up dual boot on a MacBook, and all the effort was on the macOS side: shrinking encrypted partitions using the CLI tools, etc.
(Of course I research Ubuntu compatibility beforehand and select the laptop based on that)
The original sin of the Linux desktop was the kde/gnome split over licensing. Had Linux had one good desktop, like the other operating systems, it could have had a future on the desktop.
Instead it has had a number of almost-good desktops that are still fragmented by hate and the desire to thwart interoperability. For instance, the GTK team won't fix certain bugs that affect running in a rootless X server because they can't stand the idea that you'd run Eclipse on Linux and view it on Cygwin/X.
DEs are one thing, but the biggest issue was a split over programming interfaces, one that has yet to be rectified even with the efforts of Freedesktop (in large part hijacked by Gnome, Icaza's creation in the DE schism).
In the end the problem is likely the same as we are seeing on the web: people love to code, not write specifications and stick to them.
Another problem here is that Unix DEs in general (and this includes even proprietary ones like CDE and 4Dwm) tended to solve a problem that was cool in the mid-90s but then stopped being relevant: having a consistent, cross-application, object-oriented model for documents and preferably everything else the user wants to manipulate. When you have multiple competing frameworks implementing this (and KDE and Gnome each went through at least one backward-incompatible reimplementation of theirs), you end up with applications that work only as part of the same framework. The whole OLE/COM/ActiveX stack on Windows is a mechanism meant for exactly this, and it is used almost everywhere (an Explorer window is a COM component that displays COM components, and the whole thing has exactly zero concept of files and directories).
This whole mechanism is in fact one great waste of time, because probably the only meaningful user-facing application for it is embedding Excel charts in Word documents. Even there it is too generic and has some surprising behaviors that confuse users (I assume that what happens when you open a document with an embedded OLE object for which there is no installed application has at least something to do with why Microsoft started selling Office instead of separate components), and it could perfectly well be implemented entirely inside the office suite; LibreOffice has an internal system for this and nobody seems to care. One would assume that the useful use case is more general, like "embedding picture-like things into Word documents", but I've never seen a Word document that is actually useful and embeds something other than an Excel chart or maybe a worksheet; in particular, I've never seen a Visio diagram in a Word document as an OLE object instead of an image.
Currently KDE has its KParts, which are actually used in the implementation of KDE and KOffice, but Gnome seems to have completely dropped any such idea (which probably makes sense, because their implementation, which involved CORBA and could at least in theory work between X clients running on different physical computers, was horribly over-engineered and probably never used for anything that both worked reliably and was actually useful for the end user).
I think he was well meaning. And at some point he just gave up.
As a long time Linux user I agree with him. FOSS is just not conducive to the kind of development needed for a high quality, long term stable desktop experience.
Why do you assume he wrote these kinds of articles in order to be hired by Microsoft? Isn't it much more likely that he held this point of view all along, but it never became accepted in the GNU/Linux world? On the other hand, Microsoft wanted to expand their .NET ecosystem to Linux and mobile devices, so they bought Xamarin and hired Miguel de Icaza.
How common is this in Linux development? Is it better to have two competing solutions and "may the best one win"? Or would it be more effective for those two teams to work together on one project?
Part of the problem about KDE vs GTK is that it comes down to C++ vs C, which is a pretty big dividing line in the Linux programmer community. (You can have API wrappers, but the "native" language of the desktop environment influences the core abstractions, and so it'll probably seem more "natural" to use the same language the DE was written in.)
Maybe someone will come along and write a new desktop environment in Rust or something that's better than the other two. My fear is that then we'll have three desktop environments.
Another option is to accept that these old debates will never be resolved and to start over with something new that isn't Linux. A feeling that that's the best option has been growing on me for the last few years. It's an enormous amount of work, though.
Of course it would be more effective, but it's hard to coordinate individual ambitions and goals within one project, so much so that it will cause splits or over-engineering of features just to please developers.
I don't know if GNU for example can get around that by having Stallman there to give the final say, or the Linux kernel with Torvalds there. Perhaps a similar approach would be good for DEs.
Nothing killed the Linux desktop. At the end of the day, it all comes down to the fact that the masses only have two options: Windows or Mac. People become set in their ways using a particular system either at home or work and there is basically no reason that they will ever be exposed to a Linux desktop, and if they are, they will find it just as impossible to use as any other OS that isn't the one they've used exclusively for 10 years.
Anecdotally, prior to the ubiquity of tablets, I've observed several computer illiterate relatives thrive on Ubuntu systems (gnome 2 and unity) because they needed a PC and I passed on an older machine with a fresh copy of Ubuntu. My uncle who had never used a PC more than 30 minutes at a time before I gave him an Ubuntu laptop half a decade ago barely uses his 2015 MacBook that my aunt bought him for his 62nd birthday because he is so comfortable with his Ubuntu machine.
I always have a laugh when I see these types of articles. Most of the time it's because the author had some sort of driver problem that wasn't fixed in 2 minutes, or because a GUI item is not exactly how they want it.
I've been running Linux on the desktop since around '95, and it's certainly not dead for me. Like you, I have also installed Ubuntu on my mother's and a few computer-illiterate friends' computers and it works just fine.
The GUI might change a bit from time to time, but I'm not too old to learn new things yet :)
In addition to that, only some CS students at some schools are exposed to the Linux command line, and even fewer are exposed to Linux desktop environments, because most CS students have grown up using Windows or Mac and the curriculum is nearly always structured in such a way that it is unnecessary for the student to install a Linux DE in order to do their work.
Finally, I'm not willing to blindly accept your claim that "very few of them still choose it", if you have some way to substantiate that claim I'd be interested in the details, but my experience says quite a few of them do choose it, though I'd certainly be willing to yield my anecdotes to some actual data.
For what it's worth, I have a counter anecdote. I go to a university where for some computer science courses, your labs are on computers with a Linux desktop environment. Other students seem to do fine in these labs. Even so, when I look around the classroom during lectures, almost all the open laptops are Windows or Mac. Perhaps these students don't get enough exposure over the couple of courses or don't get shown enough of the reasons to switch to Linux.
When I went to the university we had UNIX everywhere, not GNU/Linux, that started to be common around my 2nd - 3rd year.
We were using DG/UX and AIX before the department started adopting GNU/Linux in their recently built new department building.
Windows systems were only used for graphics programming classes and for the first year students introductory courses.
Back then our degrees were 5 years long, with daily UNIX use for project assignments; everyone had some form of fvwm, AfterStep, Window Maker, Enlightenment, Blackbox, GNOME, KDE, or CDE configured as their environment of choice.
The majority of us, a few decades later, are mostly coding on Windows and Mac OS X systems.
"The attitude of our community was one of engineering excellence: we do not want deprecated code in our source trees, we do not want to keep broken designs around, we want pure and beautiful designs and we want to eliminate all traces of bad or poorly implemented ideas from our source code trees.
And we did.
We deprecated APIs, because there was a better way."
The Linux kernel only removes/changes internal kernel interfaces all the time; the kernel developers take care to keep the external kernel interface (syscalls) backward-compatible.
The Linux kernel is hard-core about not breaking backwards compatibility. Two thirds of Linus's rants targeting a kernel developer tend to be because they broke userspace.
On the other hand, there is approximately zero chance that a dynamically linked executable from 10 years ago will run on my current system because the system libraries break backwards compatibility all the time. Drepper takes a lot of flak for this, but glibc is not even remotely the only library at fault for this.
Many core userspace libraries regularly broke not only binary compatibility but often source compatibility. Inspect a Steam install sometime: it ships dozens of bundled shared libraries just so games will have a stable base to work from.
Also, have I just gotten lucky or something? This is the second time that someone mentions audio not working on linux as the norm, but from the mainlining of ALSA in 2.6 on, I've had zero issues with audio working for all but the most obscure audio devices.
These days when people say "audio broke" it is most likely the pulseaudio layer that is acting up (and their devs blaming bugs in alsa for the breakage).
> Drepper takes a lot of flak for this, but glibc is not even remotely the only library at fault for this.
The reason Drepper takes a lot of flak for this is not that glibc constantly breaks ABI compatibility, but that he set the tone of the dynamic-linking discussions back in the day, and he is responsible for the static-linking aversion of glibc & the GNU toolchain and the pure-dynamic-linking culture of mainstream Linux distributions.
Also my experience, but people complain about sound. I know Ubuntu adopted PA way too early, so perhaps that's what some people are thinking of. Also, back in the day, dmix was not enabled in setups where it really ought to have been, so you had the "some program is blocking everyone else from using the sound card" problem.
I've had about 50/50 when it comes to sound. I don't know if it's certain chips or manufacturers or what, but I never really expect it to work for me OOTB.
The author makes a good point, namely that the Linux desktop space is too fast-moving a target. The Linux kernel is exceptionally stable in its userspace-facing APIs (the most important rule Linus holds up is "we never break userspace"). And userspace has been pretty much stable for a long time as well on the API and ABI side. But everything involved in making desktop environments work - the UI toolkits, configuration APIs, essential DE programs, all that jazz - is moving at breakneck speed, and more often than not even "minor" version bumps introduce regressions and subtle incompatibilities.
But this problem is not going to be resolved by coalescing all the various Linux distributions into a single-vendor model. If anything, that would just accelerate the troublemaking. What's really required is infusing a culture of high-quality engineering, forward thinking and consequence estimation into the community. Focus on fixing bugs and regressions instead of implementing new features.
How many Linux distros and their many different GUI/UI variations have been listed here???
No wonder the Linux desktop was dead on arrival. Too many flavors of developers to satisfy. But it wasn't about developers; it was about end users who just wanted something simple. No matter the changes Windows made from 95 to 10, for end users the changes caused relatively minor disruption, especially with everything moving to the web. Switch to Linux? Which one? Which packages to install, and why? Which GUI to use? Etc., etc., etc... Linux was the beginning of developers creating something for themselves to control and expecting the rest of the world to simply fall in love with it.
I say all that because I genuinely like Linux and its purpose, but not for desktop purposes. Maybe mobile???
That's the dichotomy right there: the "I can fix everything" versus the "I shouldn't have to" user bases. "I shouldn't have to" is much, much larger. There does come a point where freedom's just another word for nothing left to lose.
Nothing killed the Linux Desktop. Windows already won that war and it was good enough. Heck, even Apple and its billions couldn't get more than 5% market share worldwide.
The next "war" moved to mobile and there linux is not doing that bad.
The Linux kernel is doing well on mobile but GNU/Linux and open source in general is doing terrible. The status quo right now is that mobile operating systems are super locked down and almost all the software you install on it is proprietary stuff you get from an app store.
> The status quo right now is that mobile operating systems are super locked down and almost all the software you install on it is proprietary stuff you get from an app store.
I think there's been a shift from developers. As a collective we used to value open source a lot more, now the focus seems to be on getting venture capital for private services and the great mobile land grab.
Agreed. I'm currently looking for a new phone, and everything looks like a locked-down POS. I'd love an open, hackable, "actually linux" phone, but I don't know if one is available.
I'm hoping the fairphone (https://shop.fairphone.com/en/) works out well myself, and that I'll be able to install something like ubuntu or sailfish on it.
No, but I think it's meant to be open enough that other people can provide alternative OSes. At least I hope it is; I don't care for the ethical side of things much. ETA is late May.
The biggest problem I have with the current alternative OSes is that they only seem to support Nexus devices, which I'm not interested in. It would be nice if this or something like it became the PC of smartphones.
They just need to provide these libraries, and if you try to link to some private library, as many apps used to do, the app will get terminated, starting with Android 7.
Windows solved many of the GUI problems 20 years ago that Linux still struggles with today. Windows was and is ahead and this is why it was and is more successful. It is that simple. I would take Windows 95's window manager over Ubuntu's Unity every day of the week.
I think Unity (and KDE) were good enough to replace Windows, but Windows won 10 years before Gnome and KDE became a thing. And yes, Windows 95 was not only good enough but great, as you pointed out.
The lack of hardware support was because it made no sense for hardware companies to support an OS with less than 1% market share and same for the gigantic ecosystem of software that ran on windows which had a very long tail.
My point is that Linux didn't lose the desktop because of any mistakes the community made (and god knows there were many), it lost because Windows had already won this platform for a loooong time and the incentives to switch weren't great (OS are boring, most people don't even know what an OS is).
I agree. Many mistakes would have been fine if there was a compelling reason for users to switch. Linux has been successful when it found a greenfield space: Android, Chromebook, x86 servers. The open source aspect isn't magic, it won't make "clone knockoff in an existing space" into an interesting product.
This was the root cause, "a new desktop" presupposes the existing product category "desktop" which was a solved problem and a mature market. Runaway marketshare required NEW categories.
ABI compat and hardware support are hygiene features; they can slow adoption, but solving them doesn't motivate adoption. Linux needed the motivator; that would have then funded solving the hygiene stuff - just as it has, at least well enough, for x86 servers, Android, and Chromebook.
I don't say this with 20/20 hindsight either. I made this point loudly both inside GNOME and inside Red Hat back in the day. We even had chromebook-type proposals. But the Linux companies at that time were too small and server-focused to undertake such things, and once ipad/chromebook/android were out, there wasn't an obvious opportunity anymore for "Linux" proper.
Still, "Linux desktop" continues to work well for me and millions of other developers daily and I think it's very good for a dev workstation, as long as you buy hardware with OEM-developed open drivers (which mostly means Intel parts).
A lot of people who find it doesn't work for them are doing the equivalent of running OS X on non-Apple hardware with hacks to modify the finder, and surprise it's buggy. Granted, in the Linux world it isn't so clear what the "supported configuration" is. But think boring: default config of a major distribution with all open source drivers...
Even if the Windows APIs are kind of painful to use, you can always use them... even if there is a new hotness, the old stuff you wrote will still build and run without modification.
> I would take Windows 95's window manager over Ubuntu's Unity every day of the week.
And I'd take StumpWM over Windows 10. Honestly, I'd take Windows 95's UI over Windows 10.
I don't actually mind what the Unity guys are trying to do — it's definitely not for me, and is part of why I left Ubuntu for Debian, but I can see that they are trying to do something good.
Icaza has no right to complain when he instigated the DE feud by starting Gnome just as KDE was on the rise.
Ts'o on the other hand, in the comments, puts the spotlight on how it is userspace that is the problem. Userspace devs keep breaking APIs and ABIs at whim, effectively playing out "perfection is the enemy of good" in real time.
As a programmer, I find Fedora with the Gnome Classic theme and the No Topleft Hot Corner extension[0] to be an excellent, clean, effective desktop UI. But in order to get this working properly, I had to go through package hell for a few hours to get the right graphics drivers and tweak a bunch of settings. I've heard Linux Mint is good for non-programmers, but I've never tried. Unity in Ubuntu is the worst desktop experience out of all I've tried.
I moved to Fedora almost 2 years ago and couldn't be happier. It has been one of the best experiences ever after the initial setup and learning curve.
I have all my settings in git now, and whenever I do a new install of any Linux I just pull those settings and boom.
I am a very happy Linux user and support the philosophy. It's far fairer than any other OS whose main purpose is to rip off the people who support it by selling their private data, etc.
I don't remember the details, but I do recall being flabbergasted that the latest KDE couldn't run the eclipse IDE properly. Dialogs were missing entire fields. It seemed so odd that made me doubt myself and check the same dialog on OS X to be sure.
Something to do about Gnome overhauling its style system to use CSS, which affected some other library in KDE that implements SWT via some gnome library, or ... gah who even cares.
The teams in charge of the Linux desktop environments seem to behave as if backwards compatibility were no big deal.
Say what you will about Microsoft, but they do make an effort to maintain backwards compatibility with some truly ancient applications.
This was probably fallout from the switch to GTK3 and is unlikely to be the fault of KDE (which only configures the GTK theme and nothing more). Eclipse on GTK3 was so broken (is it better now? I switched to IntelliJ) that almost everyone recommended setting a flag to get back to GTK2.
> Clearly there is some confusion over the title of this blog post, so I wanted to post a quick follow-up.
What I mean with the title is that Linux on the Desktop lost the race for a consumer operating system. It will continue to be a great engineering workstation (that is why I am replacing the hard disk in my system at home) and yes, I am aware that many of my friends use Linux on the desktop and love it.
Was the author aware that perhaps the desire to have a 'consumer operating system' was not in the hearts and even minds of many developers of the "Linux" ecosystem?
Granted, the state of linux on desktop was much different in 2012 than it is today. Back then UEFI support was garbage and you had to fiddle around and use things like rEFInd to get it to even boot.
Something killed the Linux desktop? Its usage is growing, while MacOS stagnation causes people to abandon it, yes you guessed it - often for Linux.
UPDATE: Oh, after noticing the date and the author, I realized, my suspicions were right. I expected the piece like this to be written by Miguel de Icaza. And here you go, it's indeed his rant. From 4.5 years ago...
Just skip it and read something more to the point, which discusses Linux desktop today.
Well this "pile of shit" is the only reliable way I have to auickly open and export csv files from various sources, so I'd like to thank all contributors for this, especially considering that their multi-millions dollars concurrents are unable to read/produce a text file with delimiters.
Agree. Libreoffice works quite well, and although their presentation software isn't as pretty as powerpoint, I actually prefer their slideshow mode on multiple monitors. Quite sexy, in fact.
There's no reason for any government office or public school system at any level to be paying for office licenses IMHO. We can communicate and teach quite well with L.O.- and donate a share of the saved budget to FOSS.
Where do you draw the line for how much you can cut costs and sacrifice a little bit of productivity or retraining? For example, I often see programmers talk about having proper tools to do their job, whether that be high quality keyboards, faster computers with more memory, or subscription IDEs like Intellij. Should government programmers be denied these things because it is possible to program quite well without them?
Great question. I was thinking of the people who use office for communiqués and presentations and such. You don't need Microsoft Word to write an interoffice memo, nor do you need it to write a request for proposal.
That would free more money so that the people who did need or could truly benefit from premium tools could get them.
Same here. I am also happy they did not throw away the confusing but well-working UI for that big-icon bullshit. Using MS Office is a very depressing experience for me these days.
Office compatibility is still a problem for Macs in 2017. Either through FUD or actual functionality, Office prevents people from switching from Win10 to macOS.
Two people asked me about moving to Macs. They repeatedly asked me about Office compatibility (the FUD). For one of the users, I wasn't sure if the functionality would be OK. She is a lawyer (i.e. a power user of Word), and I know Excel on the Mac is not usable for me.
Neither of them needed to buy new hardware (they are fed up with Windows, not the hardware), but I couldn't even mention Linux as an alternative for them.
The features might be 1 to 1, but the compatibility problems I found show up when you use Office on Windows at work, then work on the same file on a Mac at home, then go back to Windows at work. The formatting can change. As an example, in Excel charts the colors will be different and the alignment will be off.
Also, I find Excel to be unbearably slow on a Mac, when the same file works OK in Windows. And yes, the VBA editor in the Mac versions isn't on par with the one in Windows.
When I bring work home, it's usually because of a deadline and these things will slow me down (including different keyboard mappings).
Do you want to share any constructive criticism of Libre Office, or tell us why you feel that Microsoft Office is better, rather than slander something that people have volunteered thousands of hours of development time to?
I have never had any problems with the word processor and spreadsheet in Libre Office (the only two in the suite that I've used). Sincere thanks to the developers.
True. And I do agree about LibreOffice; it's the first thing I uninstall on Ubuntu.
But you can try WPS Office. It's made by a Chinese company; it's fast and it is compatible with Office files.
I had one of my Linux epiphanies with WPS. I got a PowerPoint presentation by email, plugged the projector's HDMI adapter into my laptop, pressed Fn-F8 to switch from mirroring to two separate displays, opened the PowerPoint file with WPS Presentation, and it worked beautifully.
The presentation was fullscreen in the external display, and I had the notes in the laptop. The presentation was flawless.
I had never had any other software work without issues for me at the very first try and I was very happy that it happened while running Linux.
It's not enough to be able to open 99% of files properly. The main problem that makes LibreOffice/OpenOffice unusable for me is that if I open an MS Office Word or Excel file, change a single character, and send it back, the file is likely to have other unintended changes in formatting or layout.
Is that really such a high bar that we have to concede "nothing can be really compatible"?
It comes down to a simple test that can be easily automated - take a test file (created in MS Office), run the load-from-MS-format operation, run the save-to-MS-format operation, verify that the resulting file is identical to the source file. There might be some differences in metadata (e.g. timestamps) but the document file content should be byte-for-byte identical. If not, then the software isn't MS Office compatible yet.
It does mean that your internal in-memory document structure must be able to hold all of the information that MS Office files contain, and that the conversions between MS Office data structures and your internal data structures cannot be lossy, so your internal structure can't be so conceptually different in any aspect that you can't unambiguously convert to it and back - but that is required for interoperability, so that multiple people can do long term work on the same document with different tools.
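The round-trip check described above is easy to automate. Below is a minimal sketch; the `convert` callable is a stand-in for any real load-then-save pipeline (e.g. driving an office suite headlessly), and the demo uses the identity function since no office software is assumed to be installed.

```python
# Round-trip fidelity check: load a document, save it back out, and
# demand the result is byte-for-byte identical to the source.
import hashlib
import os
import tempfile
from pathlib import Path

def digest(path):
    """Byte-for-byte fingerprint of a file."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def round_trip_ok(source, convert):
    """`convert` loads `source`, writes it back out, and returns the
    path of the result; any formatting drift changes the digest."""
    return digest(source) == digest(convert(source))

# Demo with a perfectly lossless (identity) "converter"; a real test
# would invoke the suite's load/save machinery here instead.
fd, doc = tempfile.mkstemp(suffix=".docx")
os.write(fd, b"pretend this is a Word document")
os.close(fd)
print(round_trip_ok(doc, lambda p: p))  # a lossless converter passes
```

In practice, as conceded above, modern Office containers are zip archives with embedded timestamps, so a real harness would canonicalize that metadata before comparing; the document content itself should still come back byte-for-byte identical.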
I certainly experienced such situations back in MS office 95, 97 and 2001, but that has been a long time ago - the last few versions of MS office don't result in layout corruption like Open/Libre office does, even when interchanging the documents between windows and mac MS office versions which are quite a bit different.
I have just installed Fedora Linux 25. So far so good. Only thing stopping me from switching completely to Linux is MS Office. If Microsoft can create a Linux version of Office, I am switching completely. Until then, I will keep on going back to Windows whenever I want to compile documents.
I have lots of Office documents. Alternatives are just awful. If focus of the open source community can be diverted to making LibreOffice better, Linux will win some decent Desktop market. Having 100 different distros won't help.
For me the question is a bit to the side of the article. If Adobe or any independent developer were releasing a Linux desktop app today, what would they target?
What UI? [edit]Also, what type of process goes into converting a binary shipped by someone like Adobe? We are talking just changing the package format to install the binaries correctly, right?[/edit]
No change to the binary is needed. Often just the file layout needs modification, and dependencies need to be installed and fixed; the AUR from Arch is a good example of that. The UI toolkit is not relevant: most machines have both Qt and GTK.
Quote: "The only way to fix Linux is to take one distro, one set of components as a baseline, abadone everything else and everyone should just contribute to this single Linux."
Essentially the article's point is that technical totalitarianism would make Linux succeed on the consumer desktop, using the successful models of Windows and OSX -- one source, one model, no disagreement. Hate to acknowledge it, but it might be true.
This is why systemd's natural dominance of system orchestration and device initialization, and PulseAudio's natural dominance of consumer audio configuration could not be more welcome.
The big threat right now, as ever I suppose, is fragmentation in consumer desktop systems; the worst problem being Canonical's decision to develop Mir for reasons they later admitted were fabricated. The point is that all major toolkits now support Wayland quite well, but Canonical has to fight to maintain Mir support in just a subset of toolkits.
If we can get all of the system services standardized in a way which doesn't completely alienate the community, then we can move on to having standards for consumers.
The pattern is... you edit the default configurations of system daemons until they don't work, then complain on the internet?
Wow! it's terrible that your init system has been fighting you! Let's figure out what custom unit you wrote which is causing the problem.
Unless your ALSA devices are broken, I find it unlikely that PulseAudio 10 will do anything untoward; if it does, well then, time to fix a bug. Good thing there are so many people working on PulseAudio.
Avahi... well idunno. Avahi tends to be true to the name of ZeroConf networking. I have never written an avahi configuration in my life, and yet it works for me. Maybe you have a malicious mDNS peer on your network? I'm sorry you have to experience that! If you could fire up wireshark and pass the hail mary instead of letting it drop to the green, that would be great.
I've said this many times, but the only real problem I have is the sheer pain of getting things to work and getting uniform styling across programs.
I find myself just using terminal applications because GParted's UI is SO different from Wireshark, Firefox, etc. There is no consistency. There is no standard written down. There is nothing. It's just a hodge-podge of "this looks good to me!"
>I find myself just using terminal applications because GParted's UI is SO different from Wireshark, Firefox, etc. There is no consistency. There is no standard written down. There is nothing. It's just a hodge-podge of "this looks good to me!"
And GParted vs. Wireshark? Of course they look different, they're completely different tools - and most non-technical users never need to look at either one anyway. But here's Wireshark vs. macOS's disk utility, for comparison: https://imgur.com/a/V9zZC
All icons have text. There is a uniform icon for Hard Drive. There is a uniform color scheme. There is a uniform font size. There is a uniform spacing between elements.
In the macOS window you see descriptions where needed like under "First Aid"
You get human-friendly labels such as "Capacity", "Used", etc. It is all presented elegantly and simply.
On the left of your screen, all I see are random strings of text with squiggly lines next to them. Are those buttons? Can I click them? I know you have to from using Wireshark before, but the average person sees that and says "...well, shit".
Presentation is a very sensitive art. HCI should be much more appreciated, and people majoring in it shouldn't be relegated to VR; there are still many problems in desktop UI that need to be fixed.
>Look at that top bar. What do those mean? How do I un-gray them? What do I click to start capturing packets?
I agree, yes. But notice that my screenshot was on macOS, not desktop Linux.
My point was that there are design guidelines for desktop Linux apps. GNOME has them[0] and KDE has them[1]. And then there are applications that don't follow those guidelines - like Wireshark, one of the ones you referenced - regardless of platform. It's not the platform's fault, it's the app developer's decision to either make their UI conform or not.
The lack of (enforced but painfully overrideable) standards was a fundamental contributor to fragmenting the "Linux Desktop", if there was such a thing to begin with.
This coupled with the community philosophy of "one man's turd is another's lollipop", leads to a thousand flowers blooming, with none representative of a "Linux Desktop" and thus fundamentally confusing to new users in the target audience.
I quote from the article: "To sum up: (a) First dimension: things change too quickly, breaking both open source and proprietary software alike; (b) incompatibility across Linux distributions"
I must say, this is the type of thing I have been talking about for years. Linux has always had its evangelists and champions in the server community (I would reckon Red Hat, Ubuntu, and Debian/Slackware are the root of what upended Unix servers in the 2000s, but that is for another day and another time to get in-depth).
Yet there was no foundation, no company, no project - until maybe the last year or two, with elementary OS and some of the standardization coming out of the Fedora project - where you even see some evangelism in the Linux desktop space (let alone mobile; I am not counting Android). The problem as I see it is that Linux is the programmer's playground - which is great! - but since you can fork, edit, re-create, spin, and otherwise modify the code as we programmers see fit, you get no standardization, no harmonized quality control on fundamentals, and no one evangelizing the Linux desktop as a platform you can make apps for (which I still contend is a big issue for Linux as a desktop for even the semi-mainstream user; most developers notwithstanding). It was never an inclusive platform.
Some might say this was the point. They don't want, nor do we need, Linux to be the same as macOS or Windows. Perhaps the point of the communities that sprang up around the Linux-on-the-desktop movement is that there is an ethos that having these fundamentals is bad, or that evangelizing a platform is 'selling out'. I've seen this argument many times over the years. To me, the quote from the article above sums it all up nicely. The platform never had any fundamental evangelists pushing for a harmonized Linux experience. There is always chaos, sometimes a little (the nvidia driver breaks again because of a point update) or a lot (well, today we have no GUI because Gnome moved to Wayland, and for some reason my distro has some component it doesn't support yet, so when it went to apply the update it failed).
Until idealism stops running the show and there is some consolidation in the fundamentals of Linux on mobile and Linux on the desktop, neither will be a way forward if you value access to the latest software, consistency in the fundamentals, stability, and out of the box hardware support from manufacturers.
<rant>
Ubuntu eases the transition because I don't have to think (too much) about partitioning the disk. As a "user", I really don't want to ever see disk partitioning software, ever! Although, now I wish /home was in a separate partition -- why wasn't that the default?
I continually trip over the Super key which brings up the Unity version of Spotlight. I killed Spotlight on my Mac a long time ago. I killed whatever Unity thingy the Alt key used to do. Was it the Alt key it hijacked? I don't remember. I don't even remember what weird UI it was trying to foist on me. It was annoying.
I killed the Guest user, but it reappeared with subsequent updates. I still sometimes get logged in as the Guest user, just annoying. And why does the login screen forget the state of the NumLock key? When we type in numbers that might be in our password, we get cursor keyed to the next or previous user in the list. Fun times.
I stopped using LibreOffice. I so much wish I could kill the office ecosystem, it is dreadful. Stop sending me .doc files. My solution is to use an ancient version of Word on an old Mac, and it prompts me so that I can avoid running those Word Macros.
My production server runs Linux (for years now), and I am surprised by how robust it is. A desktop is much more complicated to do well, to satisfy everyone's whims, never mind all of my strange choices.
Not even particularly a desktop issue, but: init cron rc rc.local upstart systemd runit launchd daemontools srvscanner inetd xinetd StartupItems grub2 SystemStarter (I am sure there are many more.) Do I care? No. Ah yes Linux, all of those choices, doesn't even have to be Linux, flavor of the week of BSD anyone? And if I don't like them ... it's open source, so I can "simply" fork my own turd. Not.
</rant>
I feel you there. When I first started dipping my toes into Linux at age ~16, I remember the Ubuntu installer asking me all kinds of questions I didn't know the answer to, like "Would you like to repartition your disk?" "Would you like a logical partition or a physical partition?" "What disk would you like to install your bootloader on?" etc. I had no idea how to answer any of these and I just sort of went with the options that sounded safe to me.
It wasn't until several years and versions of Ubuntu later that I decided I wanted to try a different distro and realized how convenient it would have been if I had put /home on its own partition. The whole concept of "you can swap out your OS and leave your files alone" was incomprehensible to me at that point as a lifetime Windows user. Even if it had been the default to split up the partitions, I probably would have set it to stick with just one, because that's how Windows did it.
(Incidentally, I have vivid memories of my grandfather's Windows hard drive, which he had inexplicably partitioned up to at least G:\ and I think L:\. I still don't know why he did this and since he has passed, I probably never will. Our best guess is that it was a habit he picked up from some pre-Windows OS - he was an early tech adopter, but opinionated.)
>all those startup service managers
I haven't used desktop Ubuntu in a while, but after investing some time into grokking Arch and systemd, I'm pretty happy with my idling process list. I'm pretty sure I know what each of them are and how they got there. I'm young enough that I don't miss init, though, so maybe that's just naivete on my part and malicious gluttony on the part of systemd :)
> It wasn't until several years and versions of Ubuntu later that I decided I wanted to try a different distro and realized how convenient it would have been if I had put /home on its own partition.
It's a great idea in theory, but my /home is full of dotfile configs that belong to specific applications - which is great, I can take my configs with me! Except that not all distros have these in the same place. For a given app, some distros will have the config in ~, in ~/.config/, in /etc/, and so on.
You reap what you sow - in this case by having people confused over a needlessly controversial choice of title for a blog post.
Personally I'm not invested in operating systems, so I do not care about their market share or anything of that sort. I have ended up using the dreaded Linux Desktop solely for the last decade, because it's what I'm most comfortable with (also, I can install it on my machine unlike OSX, which is why it never had a chance to compete).
(I can understand that people who use laptops have many more compatibility issues, but I can't say anything else on that matter due to lack of experience. I had essentially equal amounts of hardware compatibility issues with both Linux and Windows in the 2000s, adding up to very little in total indeed. For me personally, the last major issue with the Linux Desktop was purely aesthetic, one which not many considered a problem to begin with, and one I found a solution to about a decade ago.)
Nobody else seems to have mentioned it, but with 2x4K screens, Ubuntu Gnome (16.04) with its Hi-DPI capabilities is the only thing even close to usable for me at the moment.
I've tried switching to XFCE and a couple of other lightweight DEs, but when I can actually get both displays to output at once, the text and UI are so tiny or messed up it's unbearable.
That said, Windows 10 does an almost equally horrific job (scaling everything up to a horribly pixelated size, but at least a size I can see sitting 2-3 feet away). I only use Windows 10 for gaming, so it doesn't bother me.
My go-to for both home and work is Ubuntu Gnome 16.04 right now. Sadly, even Kali is horrific to use when I do occasionally need it, but Ubuntu with Gnome has worked close to flawlessly and looks stunning with its crisp, clear text and UI elements at 4K.
I think one of the problems is that there's no centralized leadership from an encompassing GUI perspective. It's tribal. The kernel and the RMS contributions, of course, have centralized leadership and are much more successful and coherent.
For anyone who skipped the comment section of the article, I found the following comment by "has" to be very insightful. Specifically the way that the community would rather divide its resources to keep multiple mediocre implementations of a given thing around than consolidate them, and how the Unix Philosophy has been ignored:
@Miguel: You are right about developer culture being a huge factor. Linux geeks all too often see the OS as an end in itself, whereas the rest of the world knows the OS is merely the means to an end. The OS is the least important component in the ecosystem: what actually matters is the applications and services people can use to get things done. Either it enables that or it obstructs it.
You also mentioned excessive fragmentation as being one of those obstructions. Now, I do believe the 'let a thousand flowers bloom' approach of the OSS world is valuable: where it completely falls down is in its abject failure to asset-strip the less successful variants for whatever merits they have unveiled, then put them wholesale to the sword. Evolution doesn't succeed by ideology or sentimentality; it works through merciless competition culling the weak so the strongest may dominate.
The best immediate remedy for Linux's desktop ills would be to put 80% of current DEs to the axe. Really you only need three mainstream distros: one to cover general users (Ubuntu; sorry Gnome 3), one to cover the inveterate tinkerers (KDE), and one to cover the reactionary conservatives (one of the Gnome 2 clones/derivatives). Retain a few specialised distros for niches such as very elderly/low-powered machines (hi, Pi!) and security work. Anything else is a research project to produce new ideas that can then be stolen by the mainstream distros, or just a leech on the body Linux that should be salted forthwith so that the rest may grow stronger.
...
However, I think you completely missed one other valuable - and uncomfortable - point; arguably the most essential of them all.
While addressing the excessive dilution of manpower and message may help in the short term, there is a far more fundamental cultural problem: the Linux desktop world (and even the kernel world beneath it) has completely and utterly forgotten its roots. Unix Philosophy isn't merely a neat marketing phrase: it describes a very specific way to construct large, complex systems. Not by erecting vast imposing monoliths, ego-gratifying as that may be, but by assembling a rich ecosystem of small, simple, plug-n-play components that can be linked together in whatever arrangement best suits a given problem.
By mimicking the Apple and Microsoft tactic of constructing vast monolithic environments and applications, you have all unwittingly been playing to their strengths, not yours. Such enormous proprietary companies can afford such brute-force strategies because they have vast financial and manpower resources to draw on. Indeed, it works in their favour to do so because vast monolithic applications help to create user lock-in: look at Adobe Creative Suite; look at Microsoft Office. Nobody can truly compete with them because to assemble comparably featured applications takes at least a decade: until then, any competing applications are fewer featured and far more susceptible to being excluded by the vast user-side network effect that the big boys have formed around themselves.
Conversely, look at what has happened when the above vendors have tried to take a more component-oriented approach. For instance, Apple's attempt to implement OpenDoc failed not because it was fundamentally, fatally flawed at the technical level. (It may not have been perfect, but what is? It was still a good and promising platform in itself.) It failed because the business model it proposed - lots of small, cheap, single-purpose components from many vendors that users could purchase and mix-and-match however they liked - was utterly disruptive and utterly incompatible with the business model used by the very application vendors that Mac OS relied on to survive. Adobe's control of the market was predicated on it being the 800-pound gorilla in the room; it was never going to give that up by choice.
Whereas the Linux business model has no such requirements for maintaining artificial scarcity; indeed, given its far more limited development resources, it should be pouring every ounce of its strength into finding ways to work smart, not hard, like this. Unix Philosophy was a reaction to the inescapable hardware limitations of the day; now that those limitations are no longer enforced, *nix developers have gotten flabby and soft. They build these vast, inflexible edifices simply because they are not required to find a more ingeniously efficient way, and because as a short-term strategy diving straight in and copying how everyone else already does it is inevitably the easiest, laziest approach available. As a long-term strategy, however, it's an absolute disaster. Projects like Gnome and Open Office become like our banking industries: vast, baroque, impossible to regulate effectively, and cripplingly expensive to maintain.
The result is: vast projects that are far too big to fail. The thought of axing, say, Gnome 3 - not on technical merit but simply because it consumes too much resource from Linux as a whole - becomes unthinkable. So rather than killing it and folding its best bits into other, fewer distros, even more manpower must be poured into keeping it going and looking like it actually serves a critical purpose instead of acting as yet another boat anchor on the whole show. Manpower that should've been invested in finding ingenious ways to play to Linux/Unix's unique strengths, not to its competition's.
Apple didn't go from virtually dead husk to #1 in the whole damned industry by continuing to play Microsoft's game by Microsoft's rules. It did it by looking at what MS and all its other competitors *weren't* doing, or weren't doing well, in order to meet consumers' wants and needs, then devising a cunning plan to do a complete end-run around the lot of them: redefining the entire game to suit their own strengths and allow them to define their own rules. Even if it meant burning their own traditional platform to get there. It was an absolute masterstroke, and a prime reminder that if you want to understand how this game is really played, you don't read Slashdot, you read Sun Tzu.
...
TL;DR: Windows didn't kill the Linux desktop and neither did OS X. The Linux desktop killed itself, by playing on their terms instead of its own. The best thing it can do now - once denial and recrimination are done with - is turn the killing process itself into a virtue, and slice not only the DE/application mess but also the cultural one right back down to the bone and start rebuilding from there.
I loved reading this comment. Thank you for this long quote, which contains a few familiar thoughts. I shared similar points during one of my presentations a few years back.
Yet years have passed and we in the FLOSS world seem to be stuck in the same place. We keep creating more and more overbloated, I-do-it-all-better software rather than improving small drop-in solutions. The only integration efforts we see are replacements of small, good tools with crappy rewrites, not the composition and redistribution of the original small-but-good software, following the good old UNIX pipeline philosophy.
Sure, there are projects like suckless keeping up the good work, but there is no connection with more mainstream solutions like Ubuntu or KDE, which, instead of contributing their experience with UX and such, do their own rewrites against their own APIs.
Writing independent GUI applications becomes more and more of a chore. There are so many different ways of doing simple things to support, like multimedia keys or even a tray icon. Try to do it for multiple operating systems: you handle macOS and Windows, and when it comes to Linux there are Unity, Gnome, KDE, and, I think, still at least the standard X-window way of doing things for everything else, which should work fine for other UNIX-based open operating systems.
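To make the fragmentation concrete: even choosing how to show a tray icon on Linux typically means sniffing the desktop environment first. A minimal sketch of that dispatch logic, assuming the usual environment hints; the backend names and helper are illustrative, not a real library API (real apps would go through libappindicator, the StatusNotifierItem D-Bus spec, or the legacy XEmbed protocol):

```python
import os

def pick_tray_backend(env=None):
    """Hypothetical helper: map environment hints to a tray-icon strategy.

    The returned names are illustrative labels, not real APIs.
    """
    env = os.environ if env is None else env
    desktop = env.get("XDG_CURRENT_DESKTOP", "").lower()
    if "unity" in desktop or "kde" in desktop:
        # Unity and modern KDE speak StatusNotifierItem over D-Bus.
        return "statusnotifier"
    if "gnome" in desktop:
        # Stock GNOME 3 needs a shell extension for legacy tray icons.
        return "statusnotifier-via-extension"
    if env.get("DISPLAY"):
        # Fall back to the old freedesktop XEmbed system tray on plain X11.
        return "xembed"
    return "none"
```

Each branch implies a different dependency and a different failure mode, which is exactly the per-desktop matrix the comment is complaining about.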
"The Linux desktop killed itself, by playing on their terms instead of its own. The best thing it can do now - once denial and recrimination are done with - is turn the killing process itself into a virtue, and slice not only the DE/application mess but also the cultural one right back down to the bone and start rebuilding from there."
One of the key features of Linux is the way it was always supposed to run noticeably better than Windows on much older and/or lesser hardware.
If everyone involved had at least prioritized this one feature without compromise (I know, I know: open source was infiltrated by those who felt the need to kill this advantage), that would still be part of the playing field that other OSes would have to contend with.
The comment you quoted was right; people don't care about the OS, they care about their apps. Since Windows is king on the desktop, it would make sense to have an open source "clone" of Windows that can run all the Win32 apps people love. This is exactly what ReactOS is! Sadly, the project never got the resources and support it deserved. Instead, we have 200+ Linux distros, multiple package managers, DEs, etc...
Ironically, the only "distro" that has a real chance of success is Chrome OS, because everything is moving to the web, but also because Google made it ultra simple, secure and zero maintenance. It can also run Android apps, so that's interesting.
And in both cases, Chrome OS and Android, the fact that Linux is running underneath has zero value for the Web and Java developers targeting the respective OS.
Google can replace it by something else and they won't notice.
This is just a Pyrrhic victory, but it seems geeks haven't realized it yet.
The Linux desktop was killed by desktop developers. We had good desktops with GNOME 2 and KDE 3; they only needed evolution (KDE 4, it's true, was an evolution, but Plasma 5 is a buggy, horribly slow thing). Instead, nerds decided the cool thing was transparencies and spinning cubes, and then came KDE 4 with its unstable first versions, GNOME 3 with big icons trying to look like a Mac, and Unity.
I was a KDE user, but with Plasma 5 the KDE people spat on their users again with a new buggy version that broke everything, just as with KDE 4.
I use MATE now, and I'm happy. Simple menus, a file browser, no problems with the graphics card from drivers incompatible with the desktop...
The desktop is dead anyway - people use phones and tablets now. For the uses my son puts the PC to - YouTube, Agar.io and Counter-Strike - he uses Mac or Linux without making any distinction.
It's the difference between riding a hand-tuned motorcycle and driving a car. The latter is low maintenance, always works, etc., but ask a biker and he'd tell you how passionate he is about maintaining his beast and how well it performs; a car is no comparison to it.
Now, how many bikes do you see as opposed to cars? The premium option will always be the smaller one, only for those who want it and are willing to sweat for it.
So which is Linux in your comparison? I think many users, especially developers or "power users", feel things just work on Linux compared to more frequent twiddling required on macOS / Windows.
You read this with Chrome. They could port that to BSD or even their new OS Magenta and you wouldn't notice. The Linux part of that is tightly constrained. Honestly, that's why it works.
I liked this essay, but I also wanted to rant for a bit about how the state of the Linux Desktop is just one particularly glaring example of a bigger problem, which is that instead of building something that works for most use cases, it seems to be popular to build things that work really well for one use case and not so well for others. Which in some cases is fine, but when we're talking about infrastructure like operating systems or programming environments or APIs, it's a big problem for anyone who wants to build on top of this infrastructure because then they have to pick a technology stack that limits who will use the thing they write because it was built on top of stuff that only works well in its little niche.
I write software. Suppose I want to write a program and have lots of other people use it, and want good confidence that those people will continue to be able to use it for a long time into the future. This is a pretty basic thing to want to do; how do I do it?
The first question is what kind of software is it? If it's a desktop application, then I'd need to write it for Windows and/or MacOS. If it's a server application, then Linux would be a good choice. If it's mobile, then iOS or Android.
Writing in Java might be a good choice for OS portability, but then I'd have to educate all my users on how to run java apps on their platform.
So, what we have is a lot of walled gardens and the un-walled wilderness of Linux, and these are all incompatible with each other for various reasons and I can't really say that any one of them is what I would call a good general-purpose platform for writing general-purpose programs.
It seems like something that's missing in the open source community (and something that would combat the walled-garden balkanization of user communities in both proprietary and open-source software) is an effort to create portable, carefully designed binary formats and APIs that we intend to be stable and usable for the next fifty years or so, and that "just work" for almost any common use case, whether it's a desktop app, a mobile app, a server app, an enterprise app, an embedded systems app, whatever, and then we make sure that there are tools available on every open platform to run these portable binaries.
(Java did attempt this before, but their early efforts were slowed by being a proprietary single-vendor platform; once they opened up, Java was kind of old and less exciting. It should be possible to do better by creating a new platform that isn't part of some maneuver by one tech behemoth to take market share from some other tech behemoth, and by building from a more modern foundation.)
You mean package the JVM with the application, in case it isn't already installed? The details of how to do that vary by OS, and making sure it works properly on all recent and future versions of Windows, MacOS, Ubuntu, Android, iOS, RHEL, SLES, Raspbian, Arch, the various BSDs, VxWorks, Hurd, and any other platform that exists now or will exist in the next few decades isn't something I care to do. I just want to be able to write software and have users be able to use it without knowing or caring what platform they're using -- which is what Java was meant to solve, but it never had a broad enough user base that you could truly hand someone a Java binary and trust that it would "just work" for everyone.
(I admit that I'm not very knowledgeable about how Java is used in 2017; there may be some cross-platform solution for distributing Java programs to people who might or might not already have the JVM installed of which I am unaware.)
> As for myself, I had fallen in love with the iPhone, so using a Mac on a day-to-day basis was a must.
I don't understand this. Why do you need a Mac if you use an iPhone? I have not used iTunes to manage my iPhone in half a decade I think. Is there any connection left between iOS and macOS? From a user point of view?
The high level of integration in Apple's stack is the one thing I really miss after moving back to Linux. If you have a Mac and an iPhone the two exist almost as one thing - you can answer phone calls on your computer, have a shared message history, and these days copy and paste between devices or use the presence of your watch to unlock your computer.
While I agree with many of the author's points, I don't think "lost" or "killed" are valid descriptions. It's never too late to dig in and make the Linux desktop experience better. It's not like Linux disappeared and doesn't exist anymore.
Why not create a GNOME equivalent of containers? It won't make past compatibility issues go away, but containers for UI applications could help going forward.
That's kind of what Flatpak and company are trying to do. It's something Apple got right a long time ago, and it's also what Microsoft is moving toward with the .appx package format. Everything should be in a bundle that is installed and removed as a unit.
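For a sense of how Flatpak approaches this: an app is described by a small manifest that pins the runtime it builds against, so the bundle carries its platform with it instead of leaning on whatever the distro ships. A minimal sketch, with the app id and module contents made up for illustration (the field names follow the flatpak-builder manifest format):

```json
{
  "app-id": "org.example.Hello",
  "runtime": "org.freedesktop.Platform",
  "runtime-version": "1.6",
  "sdk": "org.freedesktop.Sdk",
  "command": "hello",
  "modules": [
    {
      "name": "hello",
      "buildsystem": "simple",
      "build-commands": [
        "install -Dm755 hello.sh /app/bin/hello"
      ],
      "sources": [
        { "type": "file", "path": "hello.sh" }
      ]
    }
  ]
}
```

The pinned runtime-version is what makes the result install and remove as a unit: the app never depends on the host's library versions, only on the named runtime.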
The Linux Desktop didn't die, because you simply can't kill something that never had life to begin with. Linux (as in the kernel) succeeds because it's free, flexible, stable and secure. But none of those words are "pretty", "aesthetic" or "powerful", and the DEs associated with Linux reflect this at every level. OSX became the uber tech junkie's OS of choice, but with Apple's recent shenanigans these savvy enthusiasts and devs are also starting to use Windows again. And in the end it doesn't matter; the internet has taken over the desktop OS anyway.
> OSX became the uber tech junkie's OS of choice, but with Apple's recent shenanigans these savvy enthusiasts and devs are also starting to use Windows again.
I believe that hipsters/nerds migrated to Apple for two things: build quality (most laptops these days are STILL made of plastic, and as someone who often eats near his machine or operates it with pizza-greasy fingers, I can say that an Apple laptop just doesn't give a sh.t while all my Windows laptops end up looking very indecent after a year), and battery life.
The latter is the most important: Apple, with its ultra-tight control down to the tiniest chip on the motherboard, can tune performance in ways that cannot ever be achieved by either Windows or Linux, with 6+ hour usage times being the norm. Windows laptops with this usage time usually are way more expensive than MacBooks, are heavier than a piece of granite, or a combination of both.
> The latter is the most important: Apple, with its ultra-tight control down to the tiniest chip on the motherboard, can tune performance in ways that cannot ever be achieved by either Windows or Linux, with 6+ hour usage times being the norm. Windows laptops with this usage time usually are way more expensive than MacBooks, are heavier than a piece of granite, or a combination of both.
This definitely used to be the case (and was one of the main reasons I used macbooks over windows laptops, the other being the touchpad) but this is no longer true (outside of cheapo laptops) imo.
High quality ultrabooks like the Dell XPS 13 rival the MacBook in build quality and battery life, and afaik are not more expensive than MacBooks.
> Linux (as in the kernel) succeeds because it's free, flexible, stable and secure. But none of those words are "pretty", "aesthetic" or "powerful".
I think that StumpWM, i3 & dwm are all powerful, æsthetically-pleasing environments. I think that emacs is a powerful, æsthetically-pleasing environment.
> OSX became the uber tech junkie's OS of choice, but with Apple's recent shenanigans these savvy enthusiasts and devs are also starting to use Windows again.
I think anyone who chooses to use Apple doesn't care about his freedom, and anyone who chooses to use Windows doesn't care about æsthetics. With Linux, I have both.
> The ecosystem that has sprung to life with Apple's OSX AppStore is just impossible to achieve with Linux today.
And yet Apple breaks compatibility basically every release; and the App Store itself often fails to operate at all, even for Apple's own packages!
Also, if computing is suddenly all about The Open Web; then the point is moot since all three major desktop platforms have identical support for The Open Web. Then in praising the Mac App Store, Miguel goes full circle and affirms that native software is still important. I'd countenance either train of thought, but you can't multi-track drift your ideas like this.
I really don't like this new hipster post-mortem style of writing which makes heaps of unfounded assertions, and mixes them in with indisputable facts. I wish Miguel would stop trying to convince people to leave the ecosystem he helped create just because he no longer prefers it to the alternatives. Why drag your old friends down with slam pieces on WIRED when you could just let things shake out?