Wayland Isn't Going to Save the Linux Desktop (2022) (dudemanguy.github.io)
99 points by ms47jdu on Dec 3, 2023 | 167 comments


Wayland didn't save the Linux Desktop for me, but I'd say that dropping nvidia + Wayland in favour of amdgpu + Wayland did.

Wayland is simply unusable with Nvidia's junky blobs. It frustrated me to the point of selling my founders ed 3090 (that I got from a friend at Nvidia at launch) and replacing it with a 7900 XT. It was like night and day. Not a single dropped frame (at 120hz), no tearing, not a single crash in 12 months. Just some minor issues with Electron to do with variations in fractional scaling across monitors. My 7840u 90hz oled Thinkpad is the same - not a single crash, not a single dropped frame. An absolute joy to use.

Wayland isn't perfect, but it certainly isn't the problem. At least it wasn't for me.


To my eye, Nvidia not playing ball seems to be at the root of some of the most significant issues with day to day Wayland usage. Unfortunately, with the degree of lock-in they’ve achieved with CUDA they don’t have to care a whole lot.


At this point nVidia is an AI and HPC accelerator company that happens to attach video outputs to the accelerator cards for that niche use case.

They are still good video cards but that’s a side effect of their main focus.


That's true today but it's a really recent development that doesn't justify the current situation with their proprietary blobs.

I vaguely remember they made some announcement about planning to push things to the mainline kernel but I've stopped following most of the drama.


CUDA is their main moat, so presumably they don't want to risk having open source drivers provide outsiders an insight to how their hardware works...


Before that was the case, you could just say "gaming" instead. There is nothing new here from the perspective of Linux users.


Yet said company has a 90%+ market share in its 'side hustle' of gaming GPUs, and makes the SoC for the world's best selling console (the Switch).


They sell their non-consumer hardware at several times the profit margin. As LLM usage keeps increasing, I would not be shocked if Nvidia exited the consumer GPU market as it becomes a smaller and smaller part of its business.


I feel like this is a perilous outlook - LLMs and other AI might not turn out to be the success stories we want them to be - we are still banking on their future potential instead of their present capabilities.

With AGI, we are at a similar point to where we were with self-driving cars circa 2016(?). Back then it was predicted that each car would need a super powerful AI brain to be able to drive itself. We didn't get self-driving, and I'd bet these fancy driving assistants that we have in cars today don't justify the technological expense that went into them.

Gaming is a solid, established business and has been the backbone of NVIDIA's profits. The AI hype is very high now, and it's not clear where it's going to go in the future, but it's not going to be stable.


Same arguments apply to crypto it seems, but that didn’t stop nvidia basically leaving the gaming market high and dry while the crypto market vacuumed up all of their cards.

I agree that’s a perilous approach for nvidia… but based on past behaviour that doesn’t seem to bother them.


to such a point that people will buy an Nvidia GPU for compute and an AMD CPU for the integrated graphics


This criticism of Wayland with NVIDIA drivers surprises me every time I see it, because I've been on Wayland a few years with NVIDIA hardware (both very old and very new) and it's all cool.

But I'm on Fedora. Are you on a different distribution? Ubuntu?


Last time I tried the 535 driver, GPU copies from XWayland and Wayland were broken, as well as Vulkan in Wayland. Pretty much the only thing that works is GL applications in Wayland. I found that both of these had been previously reported by others: https://gitlab.freedesktop.org/xorg/xserver/-/issues/1444 https://github.com/NVIDIA/egl-wayland/issues/72

I'm curious what kind of workloads you were running that you have had no issues. I wouldn't expect XWayland to have worked at all before Nvidia added GBM support.


> Wayland is simply unusable with Nvidia's junky blobs.

That's a lot less true as of their 545 branch. It still has a rather serious VRR sync issue but otherwise they're almost there. And aside from that sync issue, it feels better than X does. It's close.


545 made my Steam games unplayable. Had to roll back to 535


> dropping nvidia + Wayland in favour of amdgpu + Wayland did.

This also dropped my laptop's power usage by 2.5W.


My problem with Wayland is that it is (or at least seems to be) so much more closed compared to X11. Linux is not just great because the code is open source, but also because the whole ecosystem is open.

This meant that X11 was open to applications to interact with it and implement functionalities that otherwise X11 didn't offer. Also, X11 allowed functionality to be implemented by multiple, small programs with one purpose: window manager, compositor, screen recorder, screen locker, screensaver daemon, clipboard manager, etc.

Wayland is closed in the sense that all the functionality needs to be incorporated in the compositor, since otherwise the protocol lacks the required interfaces for other applications to communicate.

Sure, from a security/privacy perspective it's good that any application can't see or interact with the content of other windows. But, we are not on macOS or Windows! Us, and with us I mean us Linux users, trust 100% the software that we run on our machine, because we trust the open source community! Thus I don't care if any application may access the screen, because I only run trusted software.

I prefer that an application may do so, since by doing so it can implement functionalities that are not possible on proprietary operating systems! This to me means being open, not just having the code available, but also being open for interaction between any piece of software on the computer.

To me it seems that Linux is going in the wrong direction because it makes assumptions that make sense only in the context of proprietary operating systems like macOS/Windows. We chose to ditch those systems for an open OS precisely so we wouldn't have these issues! I don't want the stupidity of those OSes ported to Linux...


> I mean us Linux users, trust 100% the software that we run on our machine

I don't think pretending everything is just a trusted system (in the trusted computing sense, where a trusted system is one whose failure would break a security policy) helps.

One of the reasons why system stability overall got better since the days of cooperative multitasking OSes is that you can no longer simply steal CPU forever or write to memory that isn't yours. You will get preempted. You'll get a segfault. The OS doesn't trust, and doesn't need to trust, that all the applications running in the system are written to a quality higher than the OS itself to "just not crash all the time".

On the note of absolute trust, I have less than 100% trust in any proprietary crap that I run.

I also absolutely have less than 100% trust in any random site that I visit. I doubt anyone runs a browser that isn't under some sort of namespace, sandbox or MAC. So those layers of security are valuable, and I hope they continue to exist.

> To me it seems that Linux is going in the wrong direction because it makes assumptions that make sense only in the context of proprietary operating systems like macOS/Windows. We chose to ditch those systems for an open OS precisely so we wouldn't have these issues! I don't want the stupidity of those OSes ported to Linux...

Aren't you in fact trying to shoehorn the Windows model (Just let me run this binary blob from the internet that does arbitrary side effects on my system) in Linux?


> trust 100% the software that we run on our machine

Wow! I’m sorry, this is a very naive and short-sighted position, especially for a Linux user. Malware has been known to evade open-source scrutiny, and even a few days of exposure would be enough to do significant harm to certain users. Your data may not be that valuable or sensitive, but that’s not universally true. Most users have no idea what software is running on their Linux machines given the very complex dependency chains that exist on modern systems. Also, companies’ motivations may not be evil, but that doesn’t mean they’re in the best interest of their users; for example, Canonical and their ad deal with Amazon. While I’m in favor of convenience and by no means a free software absolutist, you can’t blame the Wayland developers for trying to hold a zero-trust or trust-minimization position.


You can, because it's pointless. If you run a malicious program, it can read other processes' memory and all the files owned by your user. The fact that it can't read another GUI window is security theater. You've made everything a lot less useful for the user without providing any security.


>it can read other process memory and all the files owned by your user

Tools like flatpak and bwrap exist. A process in a bwrap sandbox can't read other processes' memory or any files that I didn't bind into the sandbox.

If I want to run an X11 GUI app in a sandbox, it can trivially escape. Wayland makes it possible to do this without a trivial escape.

Also, processes cannot read other processes' memory without being a parent process (or similar), or having root, and with bwrap they can't even see other processes running on the system at all.
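To illustrate, a rough sketch of such an invocation (not a hardened profile; paths and the socket name are examples): the app gets the Wayland socket and little else, and the X11 socket simply isn't there to abuse.

    bwrap \
      --ro-bind /usr /usr \
      --symlink usr/bin /bin --symlink usr/lib64 /lib64 \
      --proc /proc --dev /dev --tmpfs /tmp \
      --unshare-pid --unshare-ipc \
      --bind "$XDG_RUNTIME_DIR/wayland-0" "$XDG_RUNTIME_DIR/wayland-0" \
      --setenv WAYLAND_DISPLAY wayland-0 \
      some-gui-app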


With this kind of logic, you don't fix any security hole, because you can't fix them all at once.


You can use Linux namespaces or a tool like bubblewrap to sandbox the process, which makes it reasonably secure. However, if you're running a GUI app that renders to an X11 server, all your security gains go out the window.


> Us, and with us I mean us Linux users, trust 100% the software that we run on our machine, because we trust the open source community! Thus I don't care if any application may access the screen, because I only run trusted software.

The problem is that this mindset is incompatible with widespread adoption. The "year of linux desktop" will never come if this continues. You may be fine with this, but many other people aren't.


If the year of Linux on the desktop ever comes, we will all hate it because it'll be just like windows or macOS. Mainstream users don't want to be empowered. They want someone to hold their hand and make choices for them.

After all, look at the one Linux on the desktop option that you can buy in any store near you. It's called ChromeOS, and every decision in it is made to further Google's commercial interests. It's a less open ecosystem than even macOS and Windows (at least those can still be used without a vendor account). There is nothing there that we want. Being a niche OS is a strength, not a weakness, because we Linux fans are a niche.

I agree though that 100% trusting FOSS is a fallacy. Besides the possibility of introducing malware to the repo, applications can also be exploited by malicious content like video files exploiting vulnerabilities in VLC.


Passionate Linux fans are not representative of Linux users in general.

Linux has a huge number of people who installed it on anything from grandma's old computer to a corporate server, who have exactly zero interest in any kind of tinkering with system services.

We just want something free, that won't break in an auto-update, and has access to the ecosystem of things like PipeWire and DBus.


> Mainstream users don't want to be empowered. They want someone to hold their hand and make choices for them.

I think that is deeply untrue. They want to be able to change stuff, it just turns out to be hard. I bet there's a heap of things you want to change on your car and you haven't: because you are a mainstream user.

I use Ubuntu because every time I've tried to change distro, I've been burnt.

I've got as far as finding the lines of code I need to change in "Linux", but the next steps are not simple - especially maintenance over the years. Even if one has the technical skills necessary we may choose the consumer friendly version - and it isn't because we want our hands held. I'm currently using an iPhone - nasty bugs and other terrible compromises - but for the moment the upsides are beating the downsides.

Sweeping negative stereotypes are seldom useful.


> bet there's a heap of things you want to change on your car and you haven't: because you are a mainstream user

No. I want it to turn on and go. The less involved my car is in my life, and I with its existence, the better.


> If the year of Linux on the desktop ever comes, we will all hate it because it'll be just like windows or macOS.

This is exactly why I don't want that to happen. We're already much of the way there, though, and the trajectory is clear. That's why I'm looking into abandoning Linux in favor of BSD.


I feel the same way and it is why I moved to FreeBSD as daily driver. Very happy with it in fact.

I like it because there's much less corporate influence over it. And less need to constantly change how things work.


> Sure, from a security/privacy perspective it's good that any application can't see or interact with the content of other windows.

I got tired of hearing this, so for my current desktop I went for something a bit special. I cannot log in as root, nor "sudo" nor "su", on the desktop itself. So, from Xorg, it's not possible to have, for example, a terminal running stuff as root (so a bad actor in control of my Xorg --not something I'd like, mind you-- wouldn't have an easy way to run everything he wants as root).

The way I set it up is simple: I've got a laptop, without WiFi (well, it has WiFi but all the drivers have been removed), on a separate LAN, that I use as my "root console". The only way to log in as root on my desktop is through SSH, from the laptop, using a Yubikey: so I modified SSH to allow root login, but not with passwords.

That laptop doesn't have Internet access: it's only connected to the desktop.

I also hardened the kernel running on the desktop a bit: non-root users can only see their own processes, stuff like that.

I don't know how secure it is but it was fun (and easy) to set up. I've been running that setup for eight months now: it's smooth sailing.
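For anyone wanting to replicate the server-side bits, the stock knobs look roughly like this (a sketch; the SSH part is plain sshd_config, no patching needed):

    # /etc/ssh/sshd_config on the desktop: root may log in, but only with keys
    PermitRootLogin prohibit-password
    PasswordAuthentication no

    # /etc/fstab: the "non-root users only see their own processes" hardening
    proc  /proc  proc  defaults,hidepid=2  0  0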

Besides that... I've been using X since way before the XFree86/Xorg fork and I do actually use wmctrl and a few other tricks.

From TFA:

> The problem is that they took out many features the users depend on with no real replacement in sight.

This. I never understood the Wayland approach of telling their users, early on, "You don't want to take a screenshot of your entire screen". When I did.

I don't even know if all the Xorg features I'm using do or do not work nowadays with Wayland but I don't really care: I understood very early on that Wayland developers were actually hostile towards Xorg users.

> Xorg could have no development for another 10 years and still be more functional.

Ouch.


For me, I switched to Linux because it's what runs on the Raspberry Pi, it doesn't force auto-updates, and it's free.

For a lot of users, modularity is not a desirable feature. If you want things to work predictably every time, tinkering with stuff isn't the best plan.

I don't trust the open source community that much. Mistakes slip by all the time. For the most part, everything is fine, but people's entire lives are on computers. You get the home folder, you can probably get someone's browser password store, then you get access to someone's entire life, including not uncommonly physical access via smart locks.

Some people (not me) even have cryptocurrency that can't be recovered.

Trusting the community fully doesn't make as much sense if we want to be able to have a digital first world.


I would say that this is not really true. One of the biggest complaints with Wayland has been the lack of network transparency. Waypipe came along and added that outside the compositor. The article itself mentions pipewire adding functionality. You can still write small focussed tools.

With compositor libraries like wlroots, swc, and Louvre appearing, it is no harder to write a Wayland compositor than an X11 window manager. It looks like it will not be long before it is easier.

And while few ever attempted making alternatives to Xorg, multiple parties have created their own Wayland compositors. You can say that nobody bothered because Xorg was great but clearly writing a Wayland compositor is far easier.

The article says multiple compositors is a problem. I see that as a benefit, although I certainly get that it will be a pain for app devs. At least, it will be that way until the ecosystem matures and the bad actors stand out more.

Compare the logic of what is being said here with the web. Should we all celebrate when Chrome wins the browser wars? Clients will finally have just one environment to target instead of having to deal with how multiple implementations interpret the standard. Oh wait, over there we think that multiple browsers are a good thing. Why not on the desktop?


No matter how shortsighted this comment is, I get the sentiment. I'll eternally miss cool x11 gimmicks that could be added to any desktop like compiz, xpenguins, xsnow and xneko


I don't think it's impossible to do that, it's just not written by anyone yet. There could be an extension to the compositor that would give your application the ability to add those gimmicks if the user chooses to allow that.


I'll just echo here what I say whenever Wayland/X11 comes up: multiple monitors and screen tearing.

Wayland rocks those two challenges. No screen tearing, I can pick scaling for individual monitors on the fly, and the colors look gorgeous.

There were features that it was missing in the previous releases -- I have to use a pretty recent version of Fedora to have all the bugs that I need rattled out of KDE so that I can use Wayland -- but it's really going well now. Zoom works thanks to PipeWire, copy and paste works. I hope KFonts gets working soon, but I've been able to work around that with fc-cache.


> multiple monitors

You are talking about a very rare niche case where one monitor runs with VRR and the other doesn't. Everything else is handled just fine with X.

> screen tearing.

"No screen tearing" just means forced vsync which is easily possible on X11 with a configuration switch or by using a compositor. Actually forced vsync is one of the great disadvantages of Wayland because it always comes at the cost of higher latency.

> Wayland rocks those two challenges.

And it sucks in every other challenge. Most importantly standardization and development. The Wayland API ecosystem is ultra developer unfriendly and complicated and will pose serious harm to the FOSS landscape which thrives on hobbyist and niche applications. It's so bad it wouldn't be far fetched to call it sabotage. (E.g. look at the hello world: https://github.com/emersion/hello-wayland/blob/master/main.c it's a complete mess)


Not really that niche when you have a 4K laptop hooked up to the 1080p monitors that you were able to buy because it was on your own dime and you wanted to go the affordable route. Every time I do that on X, the only thing I can do is downscale my 4K laptop screen to 1440; I can't set a fractional scale for it while still preserving reasonable performance.
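(For reference, the usual X11 recipe for fractional scaling is to integer-scale apps up and then stretch the framebuffer with xrandr; something like the lines below, with the output name and factors varying per machine. It is exactly the blurry, costly path being complained about.)

    # apps render at 2x; a 1.33x framebuffer makes the net effect ~1.5x
    GDK_SCALE=2 QT_SCALE_FACTOR=2 some-app &
    xrandr --output eDP-1 --scale 1.33x1.33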

And no one cares about standardization and development. They just want to use their blasted laptop. I don't know about any of that stuff, I just know Wayland works better out of the box.


How is that interface unfriendly?

The basic idea here is that you provide some callback functions that handle events, set up a framebuffer with mmap, and register your surface and framebuffer with the compositor. There is nothing complicated or messy going on here.

This is also the low level interface; most people will not be using this directly and instead will be using a GUI library like GTK or whatever.

I don't know very much C and I have been able to get a "hello world" wayland window setup and displaying some silly graphics with cairo in a few hours or less.
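For a sense of scale, the whole "connect and see what the compositor offers" part is about this much C. A minimal sketch along the lines of the hello-wayland example, not anyone's production code (build with: cc demo.c -lwayland-client):

    #include <stdio.h>
    #include <wayland-client.h>

    /* Called once per global the compositor advertises
       (wl_compositor, wl_shm, xdg_wm_base, ...). */
    static void on_global(void *data, struct wl_registry *reg,
                          uint32_t name, const char *interface,
                          uint32_t version)
    {
        printf("global %u: %s (version %u)\n", name, interface, version);
    }

    static void on_global_remove(void *data, struct wl_registry *reg,
                                 uint32_t name)
    {
    }

    static const struct wl_registry_listener reg_listener = {
        .global = on_global,
        .global_remove = on_global_remove,
    };

    int main(void)
    {
        /* NULL means "use $WAYLAND_DISPLAY". */
        struct wl_display *display = wl_display_connect(NULL);
        if (!display)
            return 1;
        struct wl_registry *registry = wl_display_get_registry(display);
        wl_registry_add_listener(registry, &reg_listener, NULL);
        wl_display_roundtrip(display); /* wait for the initial globals */
        wl_display_disconnect(display);
        return 0;
    }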


For my personal taste it's already a mess, but for the average masochist it's maybe bearable. The real mess starts from here though:

- Wayland can not render strings. You need something external just to render any text other than "hello world". And depending on the library, text rendering will look different for each application.

- Wayland can and will block your render loop for arbitrary reasons (like being minimized), potentially indefinitely. In order to make this application useful you have to put everything concerning rendering in its own thread, which complicates things hugely.

- If you want basic functionality like capturing the screen (formerly a simple XGetImage() call) you have to talk to D-Bus and PipeWire, which pull in all kinds of dependencies and require loads of parallel infrastructure running. And then you still have no guarantee that it works on every compositor.


To me these are all advantages, not disadvantages.

Wayland is not X11 and doesn't want to be, it doesn't implement even a fraction of what X11 did and that is an intentional design choice!

Wayland does not provide any graphics drawing primitives at all. It only sets up a handle to some shared memory and expects the client to do all drawing.

I think the point here is that you shouldn't be using the display protocol to render text in the first place. If you want to draw text you use a library that is designed for that task rather than shoving that functionality into the compositor and into the Wayland protocol itself.

>- Wayland can and will block your render loop for arbitrary reasons (like being minimized) potentially indefinitely. In order to make this application useful you have to put everything concerning rendering in it's own thread which complicates things hugely.

You don't have to use a thread to do this! You aren't forced to block and wait for the callbacks, you can implement your own event loop and check for messages when you want to!
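libwayland-client even documents a pattern for exactly this. Roughly (a sketch; error handling omitted):

    #include <poll.h>
    #include <wayland-client.h>

    /* One non-blocking iteration: dispatch what's queued, read new events
       only if the socket has any, then go back to rendering. */
    static void pump_wayland(struct wl_display *display)
    {
        while (wl_display_prepare_read(display) != 0)
            wl_display_dispatch_pending(display);
        wl_display_flush(display);

        struct pollfd pfd = { wl_display_get_fd(display), POLLIN, 0 };
        if (poll(&pfd, 1, 0) > 0)
            wl_display_read_events(display);  /* queue newly arrived events */
        else
            wl_display_cancel_read(display);  /* nothing there, release */

        wl_display_dispatch_pending(display); /* run callbacks; never blocks */
        /* ...render the frame here, minimized or not */
    }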

>- If you want to have basic functionality like capturing the screen (former simple XGetImage() call) you have to talk to dbus and pipwire which pull all kind of dependencies and require loads parallel infrastructure running. And then you still have no guarantee that it works on every compositor.

I think it's good that clients can't capture the screen without permission, and I think using dbus for IPC is very reasonable, but I don't know anything about pipewire or the compatibility stuff.

Also I think it's fair to not like some of these design choices for sure, but it would probably be better for people who like X11's design to continue to use X11 rather than trying to force Wayland into an X11 clone. X11 will be around for a very long time so I don't think anybody will be forced to stop using it any time soon, and people can pick it up if development stops!


> Wayland can not render strings.

Wayland is a display server. Display servers in general do not render strings. They render bitmaps passed in by the clients. Clients are welcome to use their favorite string render libraries into those bitmaps.

> Wayland can and will block your render loop for arbitrary reasons (like being minimized) potentially indefinitely.

Do you realize that all wayland calls are async? On the client side, you don't have to block waiting on an empty socket if you do not want to.

A few comments above, someone complained that a basic wayland client is complicated. The complicated thing was... setting up an event loop.

> If you want to have basic functionality like capturing the screen (former simple XGetImage() call) you have to talk to dbus and pipwire which pull all kind of dependencies and require loads parallel infrastructure running.

Yes; but these dependencies are in different processes, not yours. From the POV of your process, it is just structured IPC.

XGetImage() is not that simple; it only coincidentally worked due to an implementation detail. Having a global framebuffer is not mandatory, it just happened that at the time all the PCs had one. Nowadays, even PC hardware is getting overlays, so there might not be a buffer that represents what's on the display anymore.

Another issue is that it is impossible to implement zero-copy screen casting with XGetImage(). It is possible, and has been done, with wayland and dmabuf: feeding the screencasted surface into a hardware video encoder (without the buffer bouncing between system and GPU RAM several times), with userspace getting an already-compressed video stream.

A final issue is that XGetImage() is not gated by user permission and does not provide indicators that the screen was grabbed or is being casted. Wayland does.


Last I knew, multimonitor with mixed DPIs (an increasingly common situation with laptops standardizing around HiDPI displays) was messy under X, with xorg.conf fiddling required to get things working well.


This is not a problem with X. DPI and pitch information is accessible via the xrandr extension. It's GNOME and GTK that arbitrarily choose to ignore that information.


The problem with X is that all X11 screens have to have the same DPI (and a few other things, like color depth). Which is quite a problem when you have mixed-DPI displays.

Now, separate X11 displays do not have this limitation, but they have other ones: like not being able to move your window from one to another without restarting the app. Historically, there was only one app that was capable of changing the display without a restart: XEmacs. For all the others, I have my doubts that users would accept such limitations.

Now, both Xinerama and Xrandr standardized on using a single X11 screen for multi-monitor setups for exactly this reason. With displays of roughly the same DPI, and graphics becoming truecolor, it wasn't really a problem. But nowadays, it is.

That is, with mixed-DPI hardware. Another issue with your argument: yes, it is true that "DPI and pitch information is accessible via the xrandr extension". The problem is that it is a one-way street. The client can read that, and it can respect it and adjust its rendering... but it doesn't have to. The client is free to ignore all of it. Then the display server has a problem: it doesn't know what the client decided. Should it upscale the client, or not? Nobody knows; all the display server has is an opaque bitmap.

With wayland, the client must explicitly set the scale and communicate it to the display server, so the compositor knows how to handle that surface. It can even transparently downscale the surface when the user moves it from a HiDPI to a normal-DPI display. Something impossible on X11.
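For the curious, on the client side that is just a couple of calls (a sketch; "surface" and "buffer" created elsewhere, and set_buffer_scale needs wl_compositor version 3+):

    /* Tell the compositor the attached buffer is 2x; it can then
       downscale the surface itself on a 1x monitor. */
    wl_surface_set_buffer_scale(surface, 2);
    wl_surface_attach(surface, buffer, 0, 0);
    wl_surface_commit(surface);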


To be fair, I actually ran a system recently (Surface tablet with integrated Intel graphics) where the screen tearing on X just wasn't solvable. Neither with settings in xorg.conf nor with a compositor could I get Youtube videos to not tear on fast movements. If Wayland really solves that for those machines - while being possible to disable where it counts - that would be a big plus even for me.


Currently you can't turn off vsync on wayland, except in some limited cases involving full screen apps with direct scanout, which may not even be implemented in your compositor or may not be possible due to a missing implementation of async page flips with atomic modesetting.

Due to design issues relating to implicit sync with wayland, any misbehaving app can cause your entire desktop to drop frames, so if you're a multi-monitor user, expect stutters when the browser you have open in the background is a little too slow. Or worse, if an app is badly broken [1]

The good news is that all of this is fixable. Windows has forced compositing for most cases and it performs much more consistently. A browser taking a while to render can't hang unrelated windows' composition. Wayland compositors can get there too eventually by moving to explicit sync APIs [2].

For now if you do care about these issues staying on compositorless X is the least bad option.

[1]: https://github.com/ascent12/compositor-killer

[2]: https://www.collabora.com/news-and-blog/blog/2022/06/09/brid...


Oh my goodness. Is that’s what’s been going on? I’ve noticed that when I open one particular browser on my desktop that the entire display freezes for roughly a second before it becomes responsive again.


I think that's the issue. People use LTS distros running ancient software versions and declare Wayland sucks


Did you read the article? The author shows many concrete and serious ways that Wayland is not good enough, both in protocol and implementation.


Wayland was never going to be fully Xorg compatible. It's not just that it would be a massive effort, but it conflicts with core Wayland concepts relating to isolation and security. If keystroke access and window properties were still a free-for-all like on x11, we'd be back around to building on an imperfect protocol. Distros and desktops can build around those insecure concepts if they want (KDE has options for some of them), but it doesn't make sense to include it as part of the protocol.

The discussion is circular, and it ultimately amounts to a lot of dissatisfaction on either side. People should use what they want to use and support what they're capable of supporting. Neither x11 nor Wayland is going away, so we need less of the "make Wayland more like x11" and "make x11 more like Wayland" bandwagoning.


I have to admit that I find the isolation and security design to be rather strange. Isolating graphical applications requires a lot of pieces, one of which (what Wayland did) is preventing them from poking at other applications via the GUI. It requires isolating them in the backend (out of scope for Wayland, but Flatpak is at least trying). But it also requires preventing them from spoofing each other and thus deliberately confusing the user. This seems like it needs UI enforced on top of the isolated applications, which means drawing them in a box (like a nested compositor, at which point none of what Wayland did to isolate applications matters) or enforcing informative window decorations. And that part seems like it requires server-side decorations, but Wayland is allergic to SSD.

So I don’t get it. How exactly is the core Wayland protocol a good base for the GUI parts of isolation?


You can render decorations server-side, it's just not guaranteed that the client will respect it. If you really want a Qubes-style SSD desktop, it's attainable in Wayland although it will look incredibly ugly and be highly redundant. Good luck pitching that to GNOME and KDE devs as a default.

So... I don't see how the isolation design is strange. Wayland makes sure that windows are individually isolated, and Flatpak/Bubblewrap isolates the backend and provides interaction portals. It's not a perfect solution, but it does stop your timer application from being a secret keylogger. If your biggest concern is a Trojan horse attack, it sounds to me like Wayland did what it set out to do.


> Wayland was never going to be fully Xorg compatible.

Maybe not, but the fact that Wayland doesn't support an important use case for me is why, regardless of any benefits Wayland may have over X, I won't be using Wayland.

> Neither x11 nor Wayland is going away

I sure hope this is the case. I don't care if Wayland exists, I worry that it will become the only realistic option.


> regardless of any benefits Wayland may have over X, I won't be using Wayland.

That's fine. I don't really know what your 'mystery feature' is, but I feel pretty certain it's on a Wayland roadmap somewhere. The same cannot be said for new features in x11.

> I don't care if Wayland exists, I worry that it will become the only realistic option.

It is the only realistic option, if you care about security and isolation. x11 is very flexible and fun, but it's not surprising that the people taking the Linux desktop seriously are pushing for Wayland. It sucks that you're unable to use it for whatever reason, but people aren't going to reallocate development resources to give a dying protocol new features.


> people aren't going to reallocate development resources to give a dying protocol new features.

Which is actually fine by me. I don't need it to have new features.

My point isn't that X is better or worse than Wayland -- clearly the answer to that question is "it depends". I was just expressing the concern that Wayland may eventually become mandatory.


I respect their internals discussion, but they had only three user facing issues & I think all of them have been either fully or partially addressed. Screen recording for example has been reasonably well addressed for well over a year, and protocols for it have existed for a while.

Also, it is against the rules to backhandedly ask "Did you read the article?" Please read the guidelines.


Except if you’re an nVidia user which I’m guessing you’re not. Multiple monitors (even single monitor) and graphical glitches and tearing all over the place.


As a developer with a Linux laptop (ThinkPads, etc.) since forever, I have been avoiding nVidia like the plague since switcheroo became a thing more than a decade ago. Every time I tried, the user experience with (discrete) nVidia cards, as someone who isn't a single-screen gamer, has been a horrible waste of battery. Intel and AMD GPUs getting so much better has been a total blessing. Being able to just connect an external monitor/projector with different scaling without a hassle is very liberating in a professional setting.


The latest 545 Nvidia driver is so broken that I had to downgrade back to 535. A huge amount of Linux desktop issues can be traced to Nvidia and their drivers.


Maybe although I haven’t experienced issues that appear to be Nvidia driver issues per se (ie I’ve installed older ones and newer ones and all issues remain). I’m on a desktop 2080 so that may be it. Haven’t noticed anything especially bad with 545. What kinds of issues have you noticed?


There's an example right above you of NVIDIA's explicit-sync patches being rejected because AMD and Intel aren't ready to move on that piece of functionality yet, despite implicit sync resulting in higher latency and a worse experience for users, and being generally agreed to be undesirable and not the direction development should go; explicit sync, which NVIDIA has implemented, is generally agreed to be that direction. And this follows the overall hostility over EGLStreams or whatever it was.

They're doing the code and releasing the patches, and they largely capitulated on whatever the EGLStreams issue was supposed to be about; they are leading in what everyone involved seems to agree is "the right direction" and the linux community still can't find their way to "yes". At some point you have to start looking at it as not being entirely NVIDIA's fault - "fuck you, NVIDIA" is no longer an expression of how difficult they are to work with, but rather a generalized expression of hostility from the linux community over perceived historical slights etc.

Like, if there's some big history of warfare between IBM and the kernel team, are you just going to reject any patch coming from IBM because you don't like the company, even when it's generally agreed that the patch is generally right and going in the consensus direction, just because you don't like the name on the signoff? That seems to be where the linux community is at these days.

The whole "fuck you, nvidia!" thing has always been noxious, that's an example of linus's personal toxicity that a certain segment of the linux community has eagerly celebrated. At best it is a thought-terminating cliche that contributes nothing and drags down the debate, and at worst it's given covering-fire for that segment to act out in their own immature, negative ways, to the detriment of the users.

It's not a good look, it's just immature, and at this point it's practically expected from the kernel team. Look who's in charge. And it's not just linus, Greg KH seems pretty bad too, from a casual observer every time I see his name come up it's some childish troublemaking like the symbols-relicensing warfare thing. He's personally made my life worse on multiple occasions, breaking shit that is working for reasons I don't really care about. And I know people think it's for a good cause, but at the end of the day users don't really care about that shit.

The BSD/Unix community is generally a lot more adult and mature, and tbh avoiding linux hi-jinx is a great reason to go with a macbook these days. It doesn't have to be this way, the personality-driven shit doesn't exist in most other spaces and they are better for it. The Linux kernel team is almost uniquely bad about this, and it's largely due to poor leadership and toxic individuals at the top that then bleeds into these adjacent spaces and venues. The wayland people feel empowered to act out when the head honcho publically tells a major partner to go fuck themselves, and so on.

It kinda is what it is - as long as there are children at the helm, there are gonna be issues in linux-land.


My setup:

    * 3 monitors
      * 1x 2560x1440@60Hz
      * 2x 1920x1200@75Hz
    * X99 - Broadwell chip
    * Fedora Workstation 38/39
    * GTX 1070
      * Driver 545 from RPM Fusion
      * Currently on 545.29.06
Workloads:

    * Excessive terminal use
    * Code authoring (VSCode/Vim)
    * Web browsing/app usage in Firefox/Google Chrome
    * Screen sharing and recording (browser/OBS)
    * Electron apps including Slack, Element, and Discord
    * Occasional use of Krita^
^ Krita has a known issue with (X)Wayland, canvas acceleration, and tablet input.

I decided to give Wayland on my workstation another go when the 545 series dropped. Hilariously, the main reason was the underlying features added that enabled support for GNOME's Night Light.

Other than the first day of going through my apps and making sure things worked, I haven't had any real issues. For the work I do, Wayland more than covers my needs.

I have gotten a smidgen of cursor glitching after upgrading from the released 545.09.02 NFB driver to the beta-NFB 545.09.06 version. It only happens when GNOME switches to its "loading" cursor, but it's mostly been ironed out by now. I updated the driver as part of the system upgrade from Fedora 38 to 39, so that probably had a hand in causing it.


On my nvidia system (gtx a2000) with an ultra wide monitor and Gnome, Chrome seems to be glitching if I do the following.

- open a new Chrome window

- snap the new window to fill half the screen by dragging it to either left or right edge of the screen

- result: solitaire-like window glitches

- if I resize the window a bit before snapping it, it works.

I haven't seen this glitch on any other app.


Can't say I've seen that one. I have seen Chrome have some windowing issues, but nothing that would drive me crazy. I've also added "--gtk-version=4" to my launcher to support alternate inputs (e.g. typing booster).


I think it should be a point of pride for the linux developer community that nvidia cards work as well as they do seeing as the company is actually hostile.

The only solution for the end user is to vote with their wallets if running linux and/or wayland is important to them.

Bringing it as an argument _against_ wayland, however, is missing the point entirely in my opinion.


Well the bug that I’m experiencing is specific to Xwayland and seems to be a bug where the wayland developers are refusing to take Nvidia’s patches to fix the issue. So maybe it is wayland and not the drivers? Nvidia and Google seem to have a decent relationship working together to resolve issues for Linux Chrome Wayland.


What patches, and why? The other two major GPU companies (Intel and AMD) don't need to send any patches to make their GPUs work perfectly on Wayland, so why does Nvidia? Isn't it their fault that their driver stack is flawed?


Today Xwayland relies on implicit synchronization. That's apparently a legacy mechanism. Nvidia wants to add explicit synchronization support, which its hardware has, as it sounds like the future, whereas AMD and Intel aren't ready for that. Xwayland devs claim the ecosystem isn't ready, and yet if a simple patch fixes the issue for Nvidia, and the Xwayland folks agree that eventually GPUs will use explicit synchronization, it seems like the Wayland folks are the ones dragging their feet.

https://indico.freedesktop.org/event/2/contributions/78/atta...

https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests...

https://wayland.app/protocols/linux-explicit-synchronization...


On what basis do you make the assertion that intel and AMD don’t contribute patches to wayland or compositors? That seems like an extraordinary claim.


Just a personal anecdote here. I use Wayland and Pipewire on Slackware 15, which is in absolutely zero sense a cutting edge distribution. While I do have a newer kernel (6.6) than what ships in Slackware, I haven't really modified any other underlying components. KDE Plasma is running great. My audio is the best I've ever experienced, and Wayland (using the "full Wayland" session) is wonderful. It's highly responsive, and very stable in my usage of it. Now, regarding screen recording, I tested with Pulse and with Pipewire, and surprise surprise, it works with Pipewire enabled and running. The mesa issue mentioned is a thing. My performance in gaming did go down slightly by switching to Wayland, but I am also using libraries that are a few versions behind as I run Slackware 15.

For reference, my machine is Ryzen + Radeon, so I cannot speak to any issues with NVIDIA. My wife uses Intel + ARC with the same software stack and she too has zero issues. My son, otoh, runs Windows 10 and has rather frequent issues every single time there's a Windows update. His machine is Intel + NVIDIA 3070 FE.

One household's experience doesn't reflect the state of the industry, but in the Linux-forum-guy tradition: "it works for me"


The main complaints against Wayland are:

* It forces me to use a desktop environment, I know about sway, but I like fvwm. From what I have read, creating window managers on wayland is very difficult.

* X-forwarding. Yes, I know about waypipe, but it does not work when you want to execute an X application on AIX and display it on Linux. AIX is still in use in many large companies. And yes, no performance issues for me doing that.

* Many Linuxisms that need to be ported to the BSDs, and in my opinion these corrupt the design of a sane architecture.

* Will probably obsolete many old computers due to resource usage. Similar to what Microsoft is doing with TPM2. Luckily NetBSD is around for old systems. One selling point of Linux was it kept old systems "live".


>From what I have read, creating window managers on wayland is very difficult.

Only because the wayland community insist on bundling everything into the same binary. Someone could make a compositor that had a swappable window manager component.

>when you want to execute an X application on AIX and display it on Linux.

Don't forget XWayland exists, so it should just work. waypipe is only needed when the remote application itself speaks Wayland.
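e.g. something like this still works unchanged, with the remote X client drawing into your local XWayland (host name made up):

    ssh -X youruser@aix-host xclock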

>Many Linxisms that need to be ported to the BSDs

Wayland supports FreeBSD so it must not be that bad.

>Will probably obsolete many old computers due to resource usage.

Why would Wayland use more resources? What work is it doing extra compared to xorg?


> Wayland supports FreeBSD so it must not be that bad.

FreeBSD has adopted evdev/libinput and, if I recall correctly, mostly comparable KMS/DRM APIs to Linux, which makes it the only BSD currently reasonably capable of running real Wayland stacks that are actually useful to day-to-day users.

Outside of FreeBSD, the story deteriorates rapidly: see this article about what it's taking to get even the absolute basics, like Weston (which ~nobody actually uses as a daily driver), running on OpenBSD: https://xenocara.org/Wayland_on_OpenBSD.html


> Why would Wayland use more resources

Memory usage, I do not know what sway uses, but I expect it to be much more than the corresponding WM under X

https://itvision.altervista.org/linux-desktop-environments-s...


    ~ cat /proc/2276/cmdline
    sway
    ~ sudo grep -E '^[RP]ss:' /proc/2276/smaps_rollup
    Rss:              523864 kB
    Pss:              267873 kB


Based on your source Wayland vs Xorg memory usage looks to be within the error margin and I don't know which computer you're running where an additional 100MB claimed make the system unusable.


iceWM will not run under wayland. Using a DE on an old computer will not work.


Sure, but purely based on the comparisons between KDE and Gnome under Xorg and Wayland, I don't see any indication that Wayland would inherently use more memory.

It's simply a matter of using one of the lighter Wayland compositors, none of which have been tested in the comparison you linked.

https://wiki.archlinux.org/title/Wayland#Compositors


I think you are missing something. On an old system, which probably has at most 1 gig of memory (maybe 2 if you are lucky), can I run Wayland KDE or GNOME? Seems to me 1487 is > 1000 :)

So, Wayland will kill off using GUIs under Linux on old systems. Even with 2G memory, if running Wayland, you will hit swap a lot unless you are just staring at an empty screen.

Now I do not know what sway uses, but my guess is it must be at best a bit less than 1G.

Under Linux, the only option is X on old systems. But as I mentioned, NetBSD along with OpenBSD will work quite well on these old systems.

So it is my belief, Wayland will push retro computing people off of Linux to one of the BSDs.


My point is simply that I can't see any evidence of Wayland using more memory than Xorg in the scenarios we can compare.

You're free to find or create benchmarks of lighter Wayland compositors and prove me wrong, of course.


You might want to investigate what others are doing to ensure a way forward for legacy X WMs on Wayland. Here's one I have bookmarked, mentioned in another HN thread recently:

https://github.com/puppylinux-woof-CE/woof-CE/pull/2265


Bold titles like this are misleading - Wayland is a necessary replacement for X.org that allows it to evolve beyond what is currently possible.

"The Linux desktop" is an equally misleading phrase as it leads others to believe that it's a focal point of Linux, or a product that the open source community is trying to push to compete with macOS and Windows (where the graphical desktop is often seen as a defining feature). A Linux desktop is merely a Linux distribution with at least one of several different graphical desktop environments and related components. That is all it is, and the traditional X.org windowing system used by many of these graphical desktop environments is fast being replaced by a more modern version.

It's not uncommon for people to resist change or critique features of software as it evolves (as the author has done), but in the end, Wayland really is the Way forward for X.org and the points made by the author will likely be moot in a short while.


Necessary for what?

X11 has always worked well, never got in the way and never broke anything I relied on for daily work


Overall, this is a refreshing change from the normal crop of anti-Wayland coverage. There are some important issues raised.

By the end though, he kind of lost me. I sometimes lament that too much choice hurts the success of Linux. I rarely think that it hurts the quality of the ecosystem in the long run though. If the Xorg mono-culture was a good thing, I hope we are all reading this in a Chrome browser. Wayland compositors are to the Linux desktop what web browsers are for the web. In the long run, more is better. All the same arguments can be made for the dev convenience of only having a single implementation. In the short run, a single implementation can be higher quality and innovate faster. Over time though, a diverse ecosystem with competing implementations delivers better results.

Also, while Wayland progress has been slow, the idea that it will never improve does not ring true. The article has had to be updated with "surprise" progress. The same is true of "someday" features like HDR that are already upon us. NVIDIA has been a long road but there has been a lot of recent progress.

Overall, the idea that Wayland adoption will stagnate from here may have resonated a couple of years ago, but heading into 2024, it seems clearly wrong. In just a few months, KDE is going Wayland only. RHEL is going Wayland only ( and hence GNOME realistically ). Wine is adding Wayland support at a rapid pace. I expect the Steam Deck to go Wayland only at some point which will drag the Linux gaming scene with it. All the significant Desktop Environments have near-term Wayland plans ( Cinnamon just dropped a preview recently ) and even many niche choices are looking at it or have Wayland equivalents.

Again, the article raises some great points. Things could have been easier. However, the overall thesis is not one I can get behind.


KDE is not going Wayland only with the upcoming Plasma 6.0. It's merely switching the default selection, while continuing to support X11 for much longer (no end date announced at this point).

I agree with the remainder of the parent comment, and prominent distributions dropping X11 in 2024 will definitely accelerate the quest for the last bunch of Wayland features that common users feel on a day-to-day basis.


Steam Deck uses gamescope, and it is Wayland only; the only things not Wayland there are Proton (which uses Wine) and KDE (still on KDE 5).


> the only things not Wayland there are Proton (which uses Wine) and KDE (still on KDE 5)

meaning

> the only things not Wayland there are 99% of games and desktop mode

So basically, everything is using X and XWayland.


Agreed, but this will change in the future when Wine and later Proton get native Wayland support.


> Was it really worth it?

I mean, yeah, I'd say so. Browsers and terminal emulators are working, so that will cover 90% of the Linux userbase's needs. XWayland runs fine for non-Wayland apps and supported GPUs run smooth as butter. I have no reason to not run it on all my machines, currently.

It's not great that x11 reached the state of stagnation it did, but moving on was our only option. In hindsight, I think we'll be happy Wayland spent so long in the design phase so we didn't end up with another x11-type project.


I have one system that I use for photo, video, audio, and development stuff. I have two other systems that are pretty much consciously browser only plus the very odd app like Zoom.


To the section about lack of features, highlighting screen sharing: I completely ran into this and it soured me on Wayland even more (as my experience with Pulseaudio had before, for big desktop changes). On an upgrade, Ubuntu switched to Wayland as the default with the Gnome DE, which I used at the time. I didn't even notice, until it was next meeting time and I was supposed to hold a presentation for the big boss. Sharing my screen from the browser: it stayed completely black. It was humiliating, frankly. And completely the fault of Wayland.

I won't trust that project ever.


One of the frustrating parts of this discourse is that people come to it assuming all the responsibilities fall in the same rough "bins" that they used to, and place their blame accordingly. Wayland made the explicit choice that information about other applications should be inaccessible by default, and that it is the compositor's responsibility to give access in particular cases when appropriate. Your distro decided to switch Gnome to default-to-wayland when it didn't yet have screensharing protocols in place or your videoconferencing software didn't yet support those protocols. You can argue that those protocols should be built in to wayland and backward compatible with the ones from X if you like, that's a perfectly coherent view. But the fact is that they aren't, and the gnome devs knew that all along, so blaming wayland that that functionality wasn't ready is really misunderstanding the issue.


It is at least 90% the fault of the Wayland devs and community for not making it obscenely clear that they only replace ~30% of the functionality of X, and if people try to replace X with Wayland they lose the remaining ~70% of functionality.

Pretty much everyone I know that knows about Wayland assumes that it is a "drop-in" replacement.


Oh, you are not wrong. It's perfectly valid to also blame Gnome and Ubuntu for this. The Wayland session should not have been the default in either project before such a basic functionality was working. Still, the result is the same: I will move to Wayland only when there is absolutely no other choice, as I don't trust the project, or at least not the people pushing the project. Same as with systemd and Pulseaudio, both never got a place on my main system (and Pulseaudio definitely won't, now that Pipewire is here).

Btw, it wasn't special videoconferencing software, it was something like Zoom or Teams in the browser, so completely standard functionality.


> people come to it assuming all the responsibilities fall in the same rough "bins" that they used to, and place their blame accordingly.

This is fair, though, isn't it? Wayland is being sold as an X replacement, after all. Expecting it to have the same responsibilities/capabilities is not unreasonable if you're looking at it from that standpoint.


I don't think it's fair. Maybe the communication has been poor to the broader community, but to anybody interested in the technical consequences I think it's always been clear that a big part of the point of superseding X was that X had taken on a set of responsibilities that didn't make sense to put together (in the view of the Wayland developers). "Wayland is being sold as an X replacement" is a drastic oversimplification that you might take away if you're in the habit of reading just the headline and first paragraph of blog posts about it.


This was refreshing to read. From a user standpoint the author echoes my experience. I fired up a very recent distro (Fedora 38, I believe) that shipped Wayland as the default and immediately found dealbreakers:

1. Screen sharing from a web browser (Google Meet) required a multi-step approval process. It seems like first Wayland (or something) makes me choose/approve what I want to share and only then do I get to make the same choice within the browser. Very odd, confusing, overwrought. I'm actually surprised I don't see this complaint more often.

2. Flameshot (which is a godsend for taking screenshots quickly and mocking them up) was comically broken, especially in multi-monitor. They have a 'Wayland issues' page but it doesn't capture the full scope of issues that people report via Github issues. Reading this article you can tell that the job of fixing Flameshot for not just Wayland but for every different compositor is just not something they are going to prioritize.

3. In my Chromium-based browsers I had to find and change an obscure key in ://flags to make them use Wayland correctly otherwise they would consume massive amounts of CPU.
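(If anyone hits the same thing: the setting meant here is most likely the Ozone platform hint, either under #ozone-platform-hint in chrome://flags or on the command line:)

    chromium --ozone-platform-hint=auto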


"Was it really worth it?" - is better than nothing, at least something happens. But things like only providing "just core" protocol and then functionality depends on compositor I would say is less than ideal architecture.

There is some more criticism at https://itvision.altervista.org/why.linux.is.not.ready.for.t... under "Wayland and its compositors", which I think is pretty accurate for wayland (sorry, I use the xfce environment on X11 and do not use wayland every day).

Probably nothing new from me, but I imagine that a replacement should come from a team with the same attitude as the one that created pipewire (obviously nothing is perfect, but for me it really felt like a modern replacement), and that thought about the following functionality:

* multi-monitor, 30bpc, variable refresh rate, fractional scaling, etc.;

* an architecture that foresees RDP and accessibility use cases;

* requiring session manager to implement common desktop services: clipboard, hook on screensaver/lock, global keyboard shortcut integration, light/dark whatever theme controller, battery saving mode and everything that today is required for common desktop (maybe mobile) functionality (I would take inspiration from Android, IOS, Windows);

* have minimal widget and dialog set (it can be session manager requirement): message box, file open/save, simple window with buttons;

* allow server side widget processing/display – obviously it would decouple GUI interfaces even further, but with smart RPC programming it will have seamless RDP integration and in limited cases allows to have modern widgets displayed with only server side glue library.


Ctrl+F "color" "HDR" "gamut" - no results, both in the article and comments. Which is puzzling, considering that wide gamut and HDR displays are the default now, starting from the mid-range. Surely some commenters here have one?

And yes I know that the color management pipeline is in the works for Wayland. It's been more than 3 years. What state is it in? Is it usable?


Ctrl+F "HDR" in the article produces:

> That said, who knows what the future holds. Maybe Wayland will manage to get a decent HDR solution that outstrips what's currently available on Xorg (i.e. nothing). There's a protocol in the works for this which has been in development hell at least.


I don't love the headline, but the technical criticisms are really good.


What is going to save the Linux desktop is:

- sovereignty over your own computer

- no forced ads, telemetry or surveillance

- the altruistic forces in this world working against evil


No, this is less important than ease of use and the feeling of using a good premium product. Apple has demonstrated this in spades. What you present here is a personal opinion, but consumers in general want premium tools that work. All the side stuff like telemetry and morality is orthogonal to that goal.


Whatever "ease of use" mean to you, you got it somehow.

To me if is HiDPI, which on macOS is intentionally sabotaged so you buy ultra expensive monitors.

If it is the global menu, or some other UI style, it is there too.


To me, it is far easier to use than Windows and macOS.

macOS is usable for the beginner, but it is inconvenient for the power user, to the point that it is not ergonomic.


The attitude that software freedom is what users crave above all else is the single most damaging thing about the desktop Linux movement.


But this is what hobbyist developers care about, which is what ultimately matters.


Most web developers use a Mac. Only the most hardcore ones use Linux. Game development is pretty much majority Windows.

Developers go with what works. Linux works for web servers, so it's an option. It also works for desktop, but why aren't developers all using PCs with Linux installed? Same reason why ordinary people don't use linux. They want the premium feel. They don't like the jankiness that comes with linux.

>which is what ultimately matters.

This isn't true. What developers like is not what ultimately matters. We are a small portion of the overall population. It is the population of people that dictates what matters and if a developer wants to change the world they have to target the population. Why is javascript popular? Because most of the population uses javascript in their browsers. It had nothing to do with developers loving javascript. In fact, javascript is a pretty poorly designed language.


I didn't know I'm hardcore for using Linux for webdev. TIL


There is no reason to use macOS as a web developer unless you do graphic design.


You don't get out much do you? macOS is the overwhelming choice for web development, both front end and backend. Linux is a huge minority.

I know your type. You're the type of guy who likes to characterize himself as an ultra-rational automaton. And you carry yourself that way, pretending you don't understand why people would irrationally use anything other than Linux for web development. Do you exclusively use vim too? No need for graphics or a GUI or a mouse, all you need is letters? Yeah, I can see it.

I think the reality of it all is that you're just trying to be different, trying to act like you're better. But deep down you totally get why someone who's not a graphic designer would use a mac.

I think there's one thing you don't know anymore though. The whole "Macs are only for graphic designers" thing is two decades out of date. Macs are mainstream now. Go anywhere in Silicon Valley and people will more likely be using a Mac than Windows or Linux; this goes for both developers and office workers. "Graphic design" isn't even a term that's used anymore; they call themselves UI/UX designers.


Leaving all your creative fiction aside, that's not the reason I use Linux.

macOS is the overwhelming choice for those who do not want to know how things work, and are OK with being prevented from seeing how things work.

macOS is the overwhelming choice for those who are OK with planned obsolescence, being denied their right to repair, and not having control over their property.

macOS is the overwhelming choice for those that got into this for the money, not because of passion. It's the reason you need a supercomputer to run eyecandy versions of the same software you ran on a Pentium 100 MHz with 16 MB of RAM in the 90s.

If you want to become a software engineer, and you don't want to know how things work: you are a commodity and generative AI will get your job in the next 3 years. If AI doesn't do it, then one of the millions of people learning how to code on YouTube will.

Everything that you see as special about your code, will be offered for a fraction of a cent per token by an actual automaton.

Most of the macOS users in Silicon Valley (and elsewhere) run Linux in Docker, which runs slower on macOS than on Linux. As a macOS user, that performance tax will apply to everything you try to do as a developer.

They also run Linux on all their production environments. So they're just making their life harder for no reason while thinking they're doing the opposite. If your production environment is Linux, you will still need to learn how Linux works, and on top of that you will have to learn how macOS works. Did you really simplify your life doing that?


None of it was fiction. Is vim your editor of choice? Or emacs? If so, then take a deep look at yourself. You're a walking cliche.

There are negatives to using macOS, there's no denying that. Everything you said was true and obvious. Everyone knows about it. The problem here isn't that. The problem is you.

You don't understand why people use macOS despite being 100% aware of all the negatives you mentioned. Instead you have to view it through the lens of superiority. You think you're better. This is not a fiction. You actually do think you're better; just read what you wrote.

The cold reality is, there are developers who use Macs, who joined the field for the money, who are more effective, more skilled, more knowledgeable, and have done greater things than you or I ever have or ever will. That's just reality. There's no point in attacking everyone who doesn't follow your stringent philosophy and insisting you're superior. Everyone has their own reason for doing software; it's not invalid if someone joined for the money.


You like mentioning vim and emacs for some reason. I wonder why that's important to you? Many macOS users nowadays use VSCode or a JetBrains IDE, which are also available on Linux. And, others use editors like vim and emacs.

If you work with remote development environments, at some point you'll have to use a terminal-based editor, and even if your primary editor choice is not vim or emacs you might find yourself using them, because they come either preinstalled or packaged on many systems.

Well, what can be said about the rest? The most skilled, knowledgeable developers who have done "greater" things than most, and make more money than most, rarely fall into the description of "web developers". And if what you seek is money, there are more efficient ways to become wealthy than web development.


>You like mentioning vim and emacs for some reason. I wonder why that's important to you? Many macOS users nowadays use VSCode or a JetBrains IDE, which are also available on Linux. And, others use editors like vim and emacs.

It's important because you fit a stereotypical archetype in which that person displays exclusive usage of these editors. At the very extreme, these people rarely use the mouse; their entire interface is a terminal. You don't have to tell me certain attributes about yourself; I can likely guess them, because you fit a very typical profile. Predictable is the best way to describe it.

Although cliche, people who fit this profile make up a small minority of the overall developer population. It's cliche because people fall for similar traps all the time. It's basically the same psychological profile as a religious fundamentalist. It's quite obvious what your religion is.

>If you work with remote development environments at some point you'll have to use a terminal-based editor, and even if your primary editor choice is not vim or emacs you might yourself using them, because they come either preinstalled or packaged on many systems.

Most developers don't do this. They use the remote development features of vscode. Or they just build locally and the local build is isomorphic enough to the remote build that they can just do most development locally. Vim at most is used for quick edits.

>Well, what can be said about the rest? The most skilled, knowledgeable developers who have done "greater" things than most, and make more money than most, rarely fall into the description of "web developers". And if what you seek is money, there are more efficient ways to become wealthy than web development.

I use linux, I got into development because I liked it. But I'm not delusional about the reality of how most people operate.

Now instead of deriding someone who doesn't use your preferred operating system, you're looking down on "web developers". 99.99% of all development is web based now. The highest paid jobs are basically web dev. FAANG is basically an acronym for the highest-paying companies, and most of their developers basically do web dev. Apple is the one exception to that.


Here's some unsolicited advice: You get more out of your interactions with people if you stay curious about them, rather than projecting your ignorance.


I'd rather not with this guy. If you look at his opening remarks, he literally just insulted anyone who uses macOS. That's why I decided to say this stuff to him.

Talk about being rude and closed-minded. So there's no point in being charitable with this guy; I'll make the narrowest stereotype and see if it sticks. It likely does... that's the sad part.

My advice for you is to learn to read subtlety in this conversation. The guy I'm responding to hates macOS with a passion and he thinks anyone who doesn't agree with him is stupid.

Or maybe you're that way too? I hope not. Then my advice for you would be the same as my advice to him. Understand why the majority of developers use MacBooks and stop looking down on other developers like you're some kind of superior asshole.


Linux is an altruistic force in this world. Being against it is like being against the Salvation Army.

If I had had the money to purchase a Mac growing up, that would probably have been my choice. Instead, I assembled my own computer from parts and upgraded it incrementally by adding and replacing components.

For you, if you don't have the money to buy a Mac, you are no longer a human being worthy of respect. That tells me all I need to know.


Nobody is against Linux. The problem is worshipping Linux like some kind of god. And then stepping on those who don't agree with your viewpoint. Linux is one force, among many.

>For you, if you don't have money to buy a mac, you are no longer a human being worth of respect. That tells me all I need to know.

This is just random, and a bit too far. It has nothing to do with someone's humanity. It does for you though, because, like I said, you fit that stereotype.

I mainly use nixOS with hyprland. I also own a mac and a windows PC.

>If I had the money to purchase a Mac growing up, probably that would have been my choice. Instead, I assembled my own computer from parts, and upgrading the computer incrementally by adding and replacing components.

Usually the profile I'm talking about fits someone who desperately wants to be seen as smart and identifies with that. Your zealotry has a different origin, but that doesn't excuse it. If you're no longer poor you should break out of that mold. Software developers need to get paid. The money needs to come from somewhere. Free time costs money, as it's someone's salary that pays for the leisure time to work on open source software. If you follow that trail you will see that something like Linux would not exist if it weren't for closed source software.


Ah, the stereotypes again. You are angry at a person that only exists on your mind, the reviled snobbish vim and Linux user that looks down on you each time you use a mouse. Since that person only exists in your mind, try talking to that imaginary person and figure your problems out.

Also, here's a game for you that you'll surely enjoy: https://vim-adventures.com/


It's you, is it not? I think I'm spot on, as you haven't said anything otherwise.

My description fits you perfectly.

I have nothing against vim. I use it on the daily. I have vim bindings on my ide too.

The imaginary person is the one you're making up. You think I hate Linux and you think I use macOS exclusively.

I'm primarily a Linux user.


OK, so: you use Linux, you use vim, you look down on others... the person you have been hating throughout this thread is no other than yourself.

I do not fit the description of the person you are insulting.


Like I said, I'm past the point of believing anything you say. I think you do fit, I don't think you're honest.

I didn't like you because of your attitude towards people who use operating systems other than Linux. It's as simple as that.

As a linux user I despise the holier than thou attitude from certain other people who use linux.


Don't see any subtlety with two broad brushes painting in angry colors. When you accuse others of arrogance, don't let yours run rampant.


I don't even use vim as my preferred editor. I use it when it's the default diff tool for git and I haven't yet configured it, or when accessing a remote system where installing an editor is not an option. So your intuition here is not correct.

You can use VS Code on Linux, install XFCE and skin it as macOS, and install Tilix instead of iTerm2. Now it is the same thing.


If it's not vim then it's likely emacs. If not emacs then prob some other console based text editor.

It's unlikely that you'd use VSCode or JetBrains; the performance profiles on those are too slow for you, plus they're closed source. But at this point it doesn't matter what you say, because I wouldn't think you're being honest.

>You can use VS Code on Linux, install XFCE and skin it as macOS, and install Tilix instead of iTerm2. Now it is the same thing.

Yeah I can see you saying that. Just reskin Linux and you can have any operating system you want. Windows, Android, macOS... You name it. Maybe you can reskin some Linux distro into openBSD then from there reskin that into macOS. Oh wait a minute.

You can solve all the problems with this technique. Just reskin Linux into anything and then the world will only need one operating system. Makes perfect sense. Not.


There are desktop experiences for Linux catering to every choice of preferences. Some distros even come with everything preinstalled.

If a macOS style experience is your choice, you can try this: https://elementary.io/


I use nixos with hyprland.

I'm already a Linux user. That's your main problem here: your inability to view the issue as something other than black or white.


Fedora and Ubuntu would like to object


Where does Wayland fit in that picture?


That’s their point. It doesn’t.


wayland brings in a new security model. x11 is wild -- any client can snarf data from any other client at will.

any game on steam can screenshot your bank details when it sees a window open with the right title
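To make that concrete, here is a minimal sketch (assuming the third-party python-xlib package; any X client library would show the same thing) of an ordinary, unprivileged client enumerating every other client's windows and reading their titles, with no permission prompt anywhere:

    # pip install python-xlib
    from Xlib import display

    d = display.Display()        # connect to the X server via $DISPLAY
    root = d.screen().root       # the root window parents everything else

    def walk(window, depth=0):
        # query_tree() hands any client the full window hierarchy
        for child in window.query_tree().children:
            name = child.get_wm_name()  # another application's window title
            if name:
                print("  " * depth + str(name))
            walk(child, depth + 1)

    walk(root)

The same connection can read pixels with GetImage and inject input via XTEST, which is why screenshot and automation tools were trivial to write on X11, and also why sandboxing anything on it is hopeless.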


I assume by providing a better user experience compared to X11. I've not yet used Wayland so I can't comment about if it does.


Presumably to prevent people from being instantly pwned when trying to run the latest "ad free" version of some software or compiling some random person's code.


I don't have much knowledge in this space, except that I used sway for ~3 years, and now I'm using Hyprland.

But isn't wlroots the "strong reference implementation" now? I guess Gnome and KDE did their own thing, but that was somewhat expected.


A reference implementation, sure. But a strong one? I don't know if I agree there. It's not API-stable, and almost every release comes with breaking changes, which makes it tedious and annoying to use.


> Hyprland

https://github.com/hyprwm/Hyprland

I loved the Special Thanks section.


The author won me around to his point of view. If 15 years later this is the state of things, it's really time to cut your losses and admit the idea wasn't that great in the first place.

As an aside I am not sure the linux desktop needs "saving".


15 years later we are getting the Steam Deck (Wayland only) and Chrome OS switching to Wayland, RHEL deprecating Xorg, etc., and Xorg has been dead on the development front since then. That adoption took 15 years is not the fault of Wayland.


Even though the Steam Deck uses Wayland, all of the software on it runs on X (via XWayland), including the entire desktop mode. The fact that you're bringing up a platform that exclusively uses X for its desktop functionality, as an argument in favor of Wayland, is ironic.


Just to say it, running an all Wayland machine works fine on steam deck.

I'm doing it with NixOS.



I spend the vast majority of my computer time reading text. System76's COSMIC DE will finally have good subpixel font rendering. That is one thing that will convert me from Windows.


I don’t know, subpixel rendering in Gnome works well enough for me (when you turn it on).


Save it from what? The only desktop I've used recently that seems to be getting less and less usable by the day is Windows. My slackware i3 system runs smoothly, my wife's Macbook runs smoothly; it's my kid's Windows 11 laptop that seems to always be needing something, and takes forever to do much of anything on (boot up, install a gpu driver update, etc).

Even gaming on Linux has been quite smooth in recent years thanks to Lutris, though I will admit, not being willing to drop more on a graphics card than on the entire rest of my system does mean I'm not playing new AAA games. Seems to me PC gaming isn't about the PC much anymore; the PC is more of an add-on for the graphics card, which is all but a standalone console at this point, given what they've grown into.


Bold to assume the Linux desktop needs saving in the first place.


I can't trust anyone penning their posts grey on black.


Better NVidia support is the only thing that keeps me on X. Well, that and the fact that my distro (Pop) still uses X.


Guess which one is getting deprecated and which one is going to have HDR support.


Wayland isn’t going to save the Linux desktop, but the Steam Deck might.


The shift from X11 to Wayland was a natural disaster that the fragile Linux ecosystem couldn't afford, and failed to deal with. It's set desktop Linux back more years than anything else I know of, although hard science-y numbers on that are pretty hard to come by.

I don't want to come across like I am complaining about the developers and their work, because sooooo much great work went into Wayland, and some subset of it is great, and it's great if you just need a simple desktop with one or two displays and don't need any of the (long) list of things that don't work.

But the underlying problem is so, so complicated — far beyond like "which distro do you use, Fedora or Ubuntu or Arch or...?" or "which desktop environment do you use GNOME or KDE or Mint or...?"

For "desktop" computing, X11 and Wayland (and all the shit that relates to them) are a huge part of the foundation of all of those. Like... possibly as complicated as the kernel itself, when you factor in the need to thread so many needles in terms of getting widely disparate teams and contributors on the same page and the absolutely enormous number of things that X11 does, in horrifyingly insecure and now-mostly-unmaintained ways, which are still relied upon by so many many pieces of critical software on Linux.

All of that needed to be not only rewritten, but completely redesigned and re-architected, by volunteers, and volunteers who don't necessarily agree on anything.

The process has been just as agonizing and glacially slow as you'd expect.

I've written before on this website about my own multi-monitor setup, which effectively required me to adopt Wayland a few years ago. X11 just can't handle the number of monitors (usually about 5). Those also have different resolutions and scaling factors.

It would get too long to go into why I need those and what I do with them here; let it suffice that I had to do it, and thus I had to adopt Wayland earlier than I otherwise would have, because it broke so many things:

- Remote Desktop (we used both xrdp and AnyDesk — neither of those, nor anything close to them, works on Wayland — there is a fairly new, extremely limited RDP server in Fedora 39's version of GNOME, but it's still an incredibly kludgey implementation that requires you to log in to the GUI first (presumably, many times that is the part you wanted to do remotely))

- sharing screen, remote presentation in online meetings ... is just a clusterfuck of brokenness (although I haven't tried everything out there, I've tried a lot over the past 5 years and it's by and large completely broken)

- Synergy, a really useful piece of niche software to control Linux alongside Mac and Windows with a single mouse and keyboard — seemed to try really hard to support Wayland, for years, and failed (as did all others in that niche, I believe)

- tons of useful end-user text macro and scripting software (Autokey, etc) doesn't work; basically most of it, although the one I use, Espanso, luckily does have a mostly-working Wayland edition

- NVIDIA (but also: fuck NVIDIA; like many others in this thread, I switched my hardware to AMD, and that was long overdue even without (but especially with) Wayland)

- screen recording, essential for many jobs, doesn't work

- even the many things that do kinda work, or partly work, on Wayland have tons of bugs or missing edge cases and feel just very alpha/beta ... but have for many years

I'm just an end user, not a developer that works on any part of Wayland, but I feel like the switch to Wayland failed. We're now in some post-failure loss-cutting phase where we (the community I mean, but that definitely includes users) are trying to figure out what losses we can cut, what workflows we can move to e.g. the web, or do in some other way other than a desktop computer GUI.

I think X11 is like the Windows XP of Linux, and Wayland is the Mac OS X 10.2 of Linux. One is crappy and janky but covers all the use cases, except maybe the new cutting-edge ones, just like it has for decades. The other one is new and shiny, does a few things well, but fails at many basic things, and won't be "finally good" for several more years.


I'm tired of these "saving the Linux Desktop" ideas. The Linux Desktop doesn't really need saving, it's doing fine. Sure, if you're coming from MacOS or Windows you'll need to adapt, just like when you switch to any other OS. But it works, it works well, it doesn't get in your way, and it really doesn't matter so much if you're using XOrg or Wayland, they're both fine.

This whole Wayland hate makes it seem like something is severely broken, where in fact it's more of a storm in a teacup.


Q: I just installed a new version of Linux, how do I re-enable hibernate?

A: Google "linux hibernate", read an article from ArchLinux wiki and several forum posts, decide which one to follow (because they'll talk about several conflicting approaches and some of them won't work for you), find out the UUID of your swap partition, add the UUID to the correct place in /boot/grub/grub.cfg (you should have an editor installed and know what "sudo" is), reboot, try running "sudo systemctl hibernate" and see if it works. Easy peasy!!! Oh, and locations of some files will be different from what you've Googled, depending on what distro you're using, so you'll have to hunt them down. Good luck!

(True story from a few weeks ago.)

(No, didn't have time to figure out how I can enable hibernation from the "Start menu" - that's for another time...)

Yeah... the Linux Desktop is really doing fine, I can totally see it.


Yeah, hibernate is weird on linux. Even after getting it working, there are a few warts with amdgpu for example.

I really appreciate distros that add an ext4 label to your root partition, so that you can just add resume=LABEL=Root to your kernel cmdline and get hibernation ready, nothing else needed. I do a custom arch install and make sure to do that. You can do it later from a live usb as well, iirc, e2label is the command.
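For reference, a rough sketch of what that looks like on a GRUB-based distro (paths and tools vary; this assumes a dedicated swap partition, with <your-swap-uuid> as a placeholder):

    # /etc/default/grub
    GRUB_CMDLINE_LINUX_DEFAULT="quiet resume=UUID=<your-swap-uuid>"

    # then regenerate the config (update-grub on Debian/Ubuntu):
    sudo grub-mkconfig -o /boot/grub/grub.cfg

With a swap file instead of a partition you additionally need resume_offset, which is where a lot of the Googling pain comes from.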


Do people still use hibernate? IMO hibernate is obsolete.

Most software nowadays has its own session management, so its session is preserved both across software restarts and reboots, without OS hibernation.

Also, one of the main reasons to reboot your computer nowadays is to reload updated libraries, which hibernation skips.

And with SSDs, most computers take at most a dozen seconds to boot.

Hibernation is hacky on Linux because hibernation is hacky, and it doesn't provide much value when you can already boot your machine in a couple of seconds and get all of your programs back in their previous sessions, without the hackiness.


"IMO hibernate is obsolete."

Uh, no! Both Windows and AFAIK macOS (although they call it standby; not 100% sure they still do it on the ARM machines) use hibernate as a deeper sleep state for machines that have been sleeping for some length of time.

So, use case: unplug a laptop from its dock at work on Friday, close the lid, throw it in the bag, return on Monday (or maybe two weeks later after a holiday/PTO), open the lid, and it's precisely where it was on the Friday when it was undocked. (On-topic bonus: using Xorg, all the windows move back to their previous locations when the machine is plugged into the dock, while with an nvidia optimus setup, just positioning windows where they were isn't something Wayland seems able to do consistently in my experience.)

I don't think I've found an S2idle machine that can manage resuming two days later on Linux, and I've got a pile of them. Most of them will burn a percent or two an hour. S3 standby machines (real ones, not the ones faking it) can usually do a few days, but they also have issues with week-long timeframes.
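(If you're not sure which kind of machine you have, the kernel exposes it: cat /sys/power/mem_sleep prints something like "s2idle [deep]", with the bracketed entry being the active mode.)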

This is solved with hibernate, which is configurable in Linux, and the machine I woke to type this on has been in hibernate and is at 99% battery.

One of the nice things about KDE is that the system settings panel has reasonable configuration options for hybrid sleep/hibernate. Like on Windows, it shows the option "While asleep, hibernate after a period of inactivity", or you can just use hybrid sleep (which writes a hibernate image to disk before sleeping, so if there is a power loss it can still resume, which is useful on desktops that are sleeping).

So, no. Some distros (e.g. Fedora) removed the option during install and decided that compressing swap to RAM was a better plan, because they think people have slow eMMC storage and limited RAM rather than fast NVMe disks and laptops with 32+ GB of RAM. They are still getting regular issues with OOM etc., and now it's an extra step if you actually want a chance of opening the lid on a laptop and picking up where you left off.

So, no, hibernate is not obsolete just because there is a subset of users who think their narrow use case means everyone uses computers like them. Nor is it any more hacky than any other suspend/idle mode in linux.

And no, session management isn't the same, and most apps don't know what to do (particularly on Linux) when the machine needs to shut down due to low battery. And plenty of applications don't consider part of their data to be persistent: scrollback buffers in the terminal apps I use, for example.


Funnily enough, your list of instructions unironically sounds easy to many of us.


Just reminds me of all the systemd hate.

I like that phrase though "storm in a teacup"


I really like the phrase "storm in a teacup" and I'd never heard it before. Is that common in your place of origin?


"Storm in a teacup," "tempest in a teapot," and other variations is an idiom in many languages, including English: https://en.wikipedia.org/wiki/Tempest_in_a_teapot


haha, it is, I thought it's also a phrase in English... Hebrew is my mother tongue.


> I thought it's also a phrase in English

It is.


It's also a saying in Polish, and I also didn't realize it's not originally English.


Where I'm from we say "una tormenta en un vaso de agua" (which could be translated as "a storm in a glass of water").


I wasn't aware the Linux desktop needed saving. TIL


1px fonts aren’t going to save anyone.



