Hacker News

I have no experience with variable refresh, but I have been using up to three monitors with different resolutions, up to 4k, under X11, for almost a decade, without any problems whatsoever (mostly with NVIDIA GPUs, where their Settings utility simplifies the configuration of a multi-monitor layout).

X11 is very far from an ideal graphics system and I would like to see it replaced by a better system, which would still have to also implement the X protocol for the legacy applications.

Nevertheless, I have not seen any argument yet indicating that Wayland is the appropriate replacement for X11. On the contrary, some of the ideas on which Wayland was originally based were definitely wrong, and they show that the Wayland developers lacked experience with the many ways computers are actually used.

Even if some of the initial mistakes have since been patched, that lack of vision at Wayland's origin makes me skeptical even about the quality of the Wayland parts about which I know nothing.




Not defending wayland, but X11 has serious downsides for me with multiple monitors.

Are you using fractional scaling (or different scaling values per monitor)? Does vsync work? How about the scrolling stutter at high refresh rates? And if you want to use variable refresh, you have to disable/unplug all the other monitors and reboot.


All my monitors are fixed 60 Hz, so I have never used variable refresh and there is no scrolling stutter.

Vsync works, but I must choose which of the monitors it applies to.

For about a decade, I have used only 4k monitors, even starting with the early models that were seen by the computer as multiple tiled monitors, because neither HDMI nor DisplayPort could carry 4k @ 60 Hz on a single link at that time.

Nevertheless, I still cannot understand why anyone would want to use any kind of "scaling" in relation to a monitor.

Any kind of "scaling" is guaranteed to generate sub-optimal images.

The correct way to deal with monitor resolutions is to set for each monitor the appropriate dots-per-inch value, depending on the monitor size and resolution.
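The appropriate value follows directly from the panel's geometry: diagonal resolution in pixels divided by diagonal size in inches. As a quick sketch, for a hypothetical 27-inch 4k panel:

```shell
# Native DPI = diagonal pixels / diagonal inches.
# Example: a 27-inch 3840x2160 panel.
awk 'BEGIN { printf "%.0f\n", sqrt(3840^2 + 2160^2) / 27 }'
# prints 163
```

(In practice, values like 96 or 192 are often chosen instead of the exact native figure, because many toolkits only handle integer multiples of 96 gracefully.)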

With the right DPI value, all typefaces and vector drawings will be rendered beautifully. Scaling is never needed per monitor, but only per window: either for windows containing bitmap images, i.e. pictures or movies, or for windows whose GUIs were implemented by incompetent programmers in Java, with typefaces sized in pixels instead of scalable typefaces sized in points, like any decent (non-Java) GUI.

I did not need to set different DPI values for each monitor belonging to a multi-monitor layout, so I do not know if that is possible in X11.
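For what it's worth, the usual knob under X11 is a single global font DPI, not a per-monitor one; a sketch (file locations and behavior vary by distribution and toolkit):

```
! ~/.Xresources -- global font DPI for Xft-based toolkits
! Load with: xrdb -merge ~/.Xresources
! (xrandr --dpi 192 similarly sets the value the X server reports)
Xft.dpi: 192
```

This is a configuration fragment, not a complete recipe; some drivers and desktop environments override it with their own DPI handling.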


There is one DPI setting per X server, I think, and I use 192 because I have two 4k screens. That forces me to xrandr the 1440p display up to 2160p and then output a blurry 1080p to match the 4k screens at 2x scale.
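The workaround described (rendering the 1440p panel oversized so it matches 2x-scaled 4k neighbors, then letting it downscale) looks roughly like this with xrandr; the output name HDMI-1 is an assumption, check `xrandr --listmonitors` for yours:

```shell
# Render the 2560x1440 panel into a 3840x2160 framebuffer
# (1.5x in each dimension: 2560*1.5 = 3840, 1440*1.5 = 2160),
# so it lines up with 4k screens running at 2x scale.
# The panel then displays a downscaled -- i.e. slightly blurry -- image.
xrandr --output HDMI-1 --mode 2560x1440 --scale 1.5x1.5
```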

As for why use scaling: I can't read anything at 4k native, and the UI elements don't scale correctly with the font DPI. Windows has fully functional 150% and 175% options (one of the few things Windows still does well), and macOS has really nice global super-sampling options between 1080p and 1440p HiDPI.


I am also using 192 dpi with a pair of 4k monitors, a 27-inch and a 24-inch, both at the native resolution of 2160p, so everything is crisp.

With the default font size set at 12 points, everything is very easy to read. Even 10-point or 9-point fonts are still comfortable, despite the fact that otherwise I need glasses for reading small printed books.

Under XFCE, almost everything scales fine at 4k and 192 dpi. The only exception is that some scrollbars remain rather narrow, but they are still usable.


> On the contrary, some of the ideas on which Wayland was originally based were definitely wrong and they have shown that the Wayland developers lacked experience about how many computers are used.

Like what?


Wayland's "just don't send callbacks to hidden windows" approach is completely backwards and should have been replaced by the same event-based visibility notification every other GUI uses. The scaling model was wrong, but they've finally admitted that and real fractional scaling (as opposed to over-rendering+downscaling) is close. Wayland blandly dragged X11's biggest technical debt -- implicit synchronization -- along with it, even though every other modern GUI synchronizes explicitly (i.e. moves a lot of work out of the critical path of drawing a frame).

Most significantly, the "only the focused app should be able to read input" is wildly, fantastically wrong, mind-bendingly deviated from the norm on literally every other graphical user interface in the history of humanity, and utterly incomprehensible to anyone who is not an outright "cybersecurity" fetishist. Imagine a windowing system where you are playing a video game with a USB controller, you mouse over to a window to send a text message and your video game loses the ability to process the controller input. This is clearly surprising behavior, if not outright user-hostile, and only Wayland gets this wrong. Rather than fixing it, Wayland devs gaslight users and developers into believing anyone who opposes this bad behavior is anti-security and therefore pro-badguy, or something. So, each separate desktop environment tribe is now having to produce their own bespoke utilities to mux input clientside to enable normal use of computers, because the Wayland protocols are empowered by divine right instead of technical merit.

I want to add that despite the above litany I do regard Wayland as a marked improvement over X11 and I have no interest in going back to the bad old days. None of the problems with Wayland are unfixable in a 2.0 or so bad that it's worth going back to X11. The actual big problem with Wayland is the unwillingness of the core developers to take advice, but that was a problem back when they were the X11 core developers, and I don't see it changing. For those of us who don't have to interact with them, it's a non-issue.


> The actual big problem with Wayland is the unwillingness of the core developers to take advice, but that was a problem back when they were the X11 core developers, and I don't see it changing.

Is there a solution to this? Any alternative at all? Like, I agree with this problem and it makes me almost irrationally angry that fundamental software on what should be a flexible and open platform is built by people who are so hostile to basic settings or extensible functionality that they are even more egotistical than Apple engineers :/.


The fundamental problem is this kind of stuff is hard work and to a first approximation, nobody wants to do it. So if the X11 core developers and the Wayland developers (who are generally the same people) want to do it their way, and their way is better than nothing, we kind of have to deal with their way, don't we?

I mean, I can run X11 as long as possible, but sooner or later, I'm probably going to have to deal with Wayland. Especially since I'm also not willing to consider Mac because I've suffered enough, and Windows 11 looks like it might piss me off enough to go back to an opensource desktop.

Alternatives probably look like borrowing from other projects that have managed to wrangle things. Android doesn't use X or Wayland, afaik, but I don't know that it makes a good base for a desktop. I believe ChromeOS uses X11 (EDIT: I'm probably wrong, looks like they use Wayland) and their own window management etc, that doesn't help if you don't like X11/Wayland.

Otherwise, maybe it's possible to build on top of Windows apis. There's NDISWrapper, maybe someome crazy could build something to use GPU drivers for Windows, and run wine or something. If you look around, you'll see articles about how the portable executable file format for Linux GUIs is windows PE, and it sort of makes sense, a little. That'd be a big transition in expectations though.


I really like the idea of window isolation - I can run something in a container and not care about it misbehaving unless it’s leveraging a zero day, but I also agree that screen sharing / recording is a fundamental need. Surely there’s a middle-ground, like policy based access?


Making a GUI that supports isolating a client is a great innovation. Making a GUI that force-isolates ALL clients is obnoxious. I really think the functionality should be behind a gate like OpenBSD's pledge/unveil tools -- a process (or cgroup etc) should declare that it should be isolated, then launch whatever. Otherwise it should continue to work the way people expect computers to work! But this and other suggestions were disregarded because security.


The one that has been most frequently discussed was the lack of support in the beginning for screen sharing and the like, but there are many others.

X11 is such a central part of the software required to use a computer that it is not acceptable for a substitute to implement only a subset of its functions.

For any X11 replacement, the architecture should have been conceived from the beginning to enable the implementation of all X11 functions, even if, for those expected to be deprecated eventually, the lower performance caused by interposed compatibility layers is acceptable.

It should have been obvious to the Wayland developers that some of the X11 features will never be deprecated, e.g. screen snapshots, screen sharing or remote desktop access.

Instead of trying to remove such features, they should have tried from the beginning to find better solutions for them than those in X11.

While a network protocol should be implemented by a program distinct from the graphics system, when designing any modern graphics API one should ensure that it has good compatibility with something like the Remote Desktop Protocol.

In general, the Wayland designers have attempted to minimize the work they had to do by claiming that various functions belong in other programs, but their arguments are not convincing.

Most, if not all, functions of a window manager should be implemented in the same process as the graphics system, even if the window manager should be some kind of replaceable plugin. That includes the window decorations. A GUI application must be concerned only with the client area of a window.

Moreover, it is inexcusable that Wayland was not designed with up-to-date color management from the beginning.


> While a network protocol should be implemented by a program distinct from the graphics system

Who uses just one computer anymore? We shouldn’t make architectural choices that assume one GPU local to an app server, because in real life GPUs are found attached to each user’s display.


The Wayland folks were, for the most part, the same folks who worked on Xorg, so that doesn't make sense.



