You maybe wouldn’t expect it, but in this regard Windows is far superior to macOS. It will output whatever resolution your monitor has, and then you can just set a “scaling factor” which will make the interface exactly the size you like while still being pixel perfect.
Given how simple most UI is, especially these days (circles filled with gradients, roundrects) and given how many different screen sizes and resolutions are used even within Apple’s first party displays, it’s almost insane macOS isn’t resolution independent.
Android is like that too. Some devices have non-integer pixel densities (multipliers or device pixel ratios or scaling factors or whatever term you prefer), especially 1.5x, aka "hdpi", which was popular around 2011. You can provide separate resources for each pixel density if you still use bitmap graphics for some reason. Oh, and there's also "ldpi", which is something like 0.75x, though there were very few devices with it.
Given how advanced the macOS graphics stack is, I don't understand why Apple doesn't do this, instead insisting on integer multipliers. Some iPhone models, too, render the UI at 3x and then downscale it to fit the screen. Even more curious is the fact that the API has returned the scale factor as a float for as long as retina displays have been a thing: https://developer.apple.com/documentation/appkit/nsscreen/13...
edit: one important difference I forgot to mention. On Android, all draw calls on the Canvas, and all view dimensions and other things like text sizes, always take physical pixels. It's your job to multiply and round everything correctly. On macOS and iOS, all graphics APIs take "points", which are those logical pixels you get after the scaling is applied, "density-independent pixels" as Android calls them. I guess Apple thought it would make it easier to adapt existing apps to retina displays. Android has always supported UI scaling, so there was no problem designing the API the way it is.
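To make that concrete, here's a rough Kotlin sketch of what this looks like from the app's side (illustrative only, though the API names are real):

```kotlin
import android.content.res.Resources
import android.graphics.Canvas
import android.graphics.Paint
import kotlin.math.roundToInt

// "density" is the scale factor: 0.75 for ldpi, 1.0 for mdpi,
// 1.5 for hdpi, and 2.0/3.0/4.0 for the x*hdpi buckets.
fun dpToPx(dp: Float, resources: Resources): Int {
    val density = resources.displayMetrics.density
    return (dp * density).roundToInt() // the rounding is your job
}

fun drawBox(canvas: Canvas, paint: Paint, resources: Resources) {
    val size = dpToPx(25f, resources).toFloat() // "25dp", rounded once
    canvas.drawRect(0f, 0f, size, size, paint)  // Canvas wants physical px
}
```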
I worked on the Cocoa frameworks back when the hope was arbitrary scaling. The short answer to your question is "rounding." At 1.5x, a box of length 3 occupies 4.5 physical pixels. Either one side of the box will be blurry, or the box must get resized to align with physical pixels, which can cause mis-centering of text or icons, or unexpected clipping or truncation.
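A toy sketch of the dilemma (my own illustrative numbers, not framework code):

```kotlin
import kotlin.math.ceil
import kotlin.math.floor

// A 3-pt box at 1.5x covers 4.5 physical pixels; every option loses.
fun main() {
    val physical = 3f * 1.5f // 4.5 px
    println("exact: $physical px (one edge lands mid-pixel, so it's blurry)")
    println("floor: ${floor(physical)} px (box shrinks; content may clip or truncate)")
    println("ceil:  ${ceil(physical)} px (box grows; centered content drifts)")
}
```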
There was also a lot of custom layout code in existing apps, both first and third party, which just didn't work at arbitrary scale factors because it was never tested.
Yet another complication is that apps are expected to handle scale changes, e.g. if the user drags a window from one display to another. Android doesn't have to deal with this.
After years of trying to make it work, Apple basically gave up and did the scaling in WindowServer.
The thing I love about HackerNews is how you can casually run into people who worked on things you've been using every day for the last 10 years.
> The short answer to your question is "rounding."
Yeah, on Android, you're supposed to round yourself. View dimensions are integers, so there you have to; graphics APIs take floats, so you don't strictly have to, but you want to anyway. Even though there's no such thing as pixel-perfect on AMOLED displays, thanks to their subpixel layout. When you use XML and specify `25dp`, the system does the rounding for you.
> Android doesn't have to deal with this.
Oh it does! Though it deals with this in a lame way: your activity with all its views gets destroyed and recreated for a "configuration change" unless you've explicitly specified that you can handle this specific kind of change. Density doesn't change often, but the possibility is certainly there.
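For the curious, the opt-in looks roughly like this (a sketch; on newer API levels you'd also declare `android:configChanges="density|screenSize"` on the activity in the manifest):

```kotlin
import android.app.Activity
import android.content.res.Configuration

class DockableActivity : Activity() {
    // Only called for changes declared in android:configChanges;
    // for anything else the activity is destroyed and recreated.
    override fun onConfigurationChanged(newConfig: Configuration) {
        super.onConfigurationChanged(newConfig)
        relayoutFor(newConfig.densityDpi) // e.g. 240 (hdpi) -> 320 (xhdpi)
    }

    private fun relayoutFor(densityDpi: Int) {
        // placeholder: re-measure views, reload density-specific bitmaps, etc.
    }
}
```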
I remember how around 2013 there was this bizarre device. It was a phone, but it came with a tablet you could dock it into. The tablet contained only a screen and a battery. The screens on the phone and on the tablet had different densities. There were bug reports about my app not handling these transitions correctly, presumably because I was dumb back then. I can imagine something similar happening with the new hotness, foldable devices, where the external and internal screens could have different densities.
And while I have you, one question. Why does macOS have the coordinate system origin in the bottom left? I've always found that bizarre, especially when iOS does use the usual top left.
Ah yes, the Hi-DPI days. I left Apple before all of that was sorted out. I assume all of the code dealing with scaling was just removed from the frameworks?
I agree, resolution independence should be a higher priority for macOS. They've been able to dance around the subject by just doubling everything and calling it retina, but true accessibility would allow everyone to set the scale that works best for them.
Resolution control on my Mac mini + 3rd party 4K monitor works just great: you can choose native resolution, 'retina' (i.e. 2x scaling), or a few steps in between the two, so there's plenty of flexibility.
Plus, the way it works means that (as I understand it), programs don't really need to do anything to support all these variations. Behind the scenes, the OS renders everything at a very high resolution, then uses the graphics hardware to smoothly scale everything down to your chosen size.
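The arithmetic, as I understand it (an illustrative sketch, with numbers for a 4K panel set to "looks like 2560×1440"):

```kotlin
// Scaled mode as I understand it: render at 2x the logical size,
// then let the GPU downsample to the physical panel.
fun main() {
    val logicalW = 2560; val logicalH = 1440  // the "looks like" resolution
    val backingW = logicalW * 2               // 5120: apps draw at a clean 2x
    val backingH = logicalH * 2               // 2880
    val panelW = 3840; val panelH = 2160      // the physical 4K panel
    val downscale = panelW.toDouble() / backingW
    println("render ${backingW}x${backingH}, downscale by $downscale to ${panelW}x${panelH}")
    // => 0.75: programs only ever see integer 2x; the fractional step
    //    happens once, in hardware, on the final image.
}
```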
My eyes aren't good enough to use the native resolution, and using the 'retina' setting wastes too much screen real estate (I mean, that's what you buy a big monitor for, right?), so having somewhere between the two is essential. I was concerned that choosing a non-simple scaling (neither doubled nor native) would make things blurry or slow, but it just works, and works well. Even dragging windows between a high-DPI and normal DPI display is seamless.
As an aside, after years of using a low-res monitor, with tiny fonts to squeeze as many terminals and emacs windows onto my screen, high-DPI is a godsend. For a long time after getting the 4k monitor, I'd just play around with the mac's screen zoom accessibility function (uses the mouse wheel to smoothly zoom into an area of the screen) - it was astounding to me just how much you can zoom before individual characters become blocky. They used to be drawn as 8x8 pixels and now, to my aging eyes, each letter looks as smooth and detailed as printed text!
It's not as simple as fixing the display setting code. For example, WebKit is not happy with non-integer DPRs: it will leave gaps here and there. The cases where those gaps happen are probably rare enough that it's not a show-stopper, but I suspect lots of other macOS software has similar issues.
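My mental model of where those gaps come from, as a toy example (made-up numbers):

```kotlin
import kotlin.math.roundToInt

// Three adjacent 7-CSS-px boxes at DPR 1.5: each is 10.5 physical px.
// If each box rounds its own width, the row stops matching its container.
fun main() {
    val dpr = 1.5
    val widths = listOf(7.0, 7.0, 7.0)
    val containerPx = (widths.sum() * dpr).roundToInt()               // 32
    val boxesPx = widths.sumOf { (it * dpr).roundToInt().toDouble() } // 33
    println("container: $containerPx px, boxes: ${boxesPx.toInt()} px")
    // 32 vs 33: a one-pixel seam or overlap somewhere in the row
}
```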
Another tangentially related issue: how do you display a pixel-perfect image on a non-integer DPR? For example, let's say you want to show 8-bit Mario in HTML at exactly 16 (or some multiple of 16) pixels, so that there's no ugly fractional scaling. It's quite a pain: it requires JS, and it requires a flexible page layout, since you won't be able to use "CSS pixels" to decide the size to display it at. Possibly that's an argument for never having non-integer DPRs.
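The arithmetic you end up doing, sketched in Kotlin for consistency with the rest of the thread (in a real page you'd read `window.devicePixelRatio` from JS and write the result back as the element's CSS width):

```kotlin
// To show a 16-px sprite at an exact integer physical scale m,
// the CSS size must be (16 * m) / dpr, which is usually a fractional
// number of CSS px -- hence no fixed "CSS pixel" size can express it.
fun cssSizeForSprite(spritePx: Int, physicalScale: Int, dpr: Double): Double =
    spritePx * physicalScale / dpr

fun main() {
    // e.g. DPR 1.5: to get a crisp 2x (32 device px), ask for ~21.33 CSS px
    println(cssSizeForSprite(16, 2, 1.5)) // 21.333...
}
```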
You're right, it's not simple; it would require a refresh of UIKit and a transition period for developers to learn how to opt in or opt out. There are existing UI frameworks, like what Android provides, that show how to define UI elements, strokes, padding, margins, and text using either scaled pixels or fixed pixels.
Leave everything as fixed pixels for backwards compatibility and add scaled pixels in for new development. I can guarantee that if Apple ported the main apps like Finder, Mail, Safari, Calendar, Music to support resolution independence that other developers would jump on board quickly to keep up to date.
I don't use an external monitor, so I'm not familiar with the issue here, but are you saying this person created a solution for a problem that doesn't exist? Is this one of those RTFM moments?
I use a dual-monitor setup with my MBA-M1. My external monitor is 1080p native and connected through an Anker USB-C dock. When I connected the monitor, macOS on the external display looked blurry as hell, as if a strange resolution had been imposed on my 1080p monitor that ended up looking worse. I tried to change the resolution through the Display settings and it generally didn't help. So the "secret menu" in Displays didn't help at all. macOS is doing something to my external monitor: it was crisp and clear in Windows and Linux, whereas in macOS it is not.
I would not ever say that; I do not have an M1 (which is specifically mentioned in the readme).
I am responding to the parent because indeed there is a way to set a resolution as desired rather than just the "pick from these 4 scaled images" approach.
I've been running Win 10 with 150% scaling for a year or so, with fairly light use because I spend easily >10x more time in OSX (where scaling works correctly nigh-universally).
Win10 scaling is..... very hit or miss. Various programs are of course all over the place, but some system utilities don't even render correctly. Window size and control placement is mostly correct everywhere, but text? Images? Tooltips? Ooooh boy are those a mess. Maybe like 20% of apps I've ever touched don't render something correctly. That's A LOT of problems.
And that's before I start talking about running two screens at different densities. Then it's borderline hilarious how broken things are.
The worst thing is that scaling isn't applied consistently at all. You can have a taskbar icon with correct scaling that has a tooltip that isn't scaled at all, and a context menu that is scaled correctly, except the scaling was also applied to its position, so now it floats 50 pixels in the wrong direction.
It's an inconsistent mess. Great for people who think inconsistent messes are beneficial for them, not great for other people.
Taskbar stuff! I totally forgot about that. Yeah, those are very frequently wildly screwed up. I've had their icons squashed weirdly several times, context menus in particular have no rhyme or reason I can discern about how they break (they just do everything imaginable), and a good number just never show their popup content at all. Maybe it's appearing off-screen? It's impossible to tell!
Not that stuff in the taskbar is at all stable normally, of course. About half of the time I'm trying some new application out, it'll disappear from the taskbar as I mouse over it, apparently because that portion crashed or something and for some reason Windows doesn't detect that until it's moused over. Or sometimes they'll double or triple up for no apparent reason, and mousing over will clear out the duplicates. I can't imagine what leads dozens of applications to have those same kinds of problems, so I'm forced to assume it's at least partly because of Windows.
I'm on Windows 10 right now, running a 1440p display, and a 4k display at a logical resolution of 1440p. Lots of stuff misbehaves on the scaled 4k display.
For a couple of examples: The nVidia Control Panel ignores the scaling on the 4k display and renders tiny text as if the display was set to 4k. GPU-Z looks like a blurry mess.
Tell that to my Windows 10 install, because it's displaying a hodge-podge of non-uniformly scaled bullshit.
Solved would mean I'm not hacking a scaling factor into the perforce config file to get its window to match the rest of the UI. Solved would mean that Windows handled it for everything drawing windows.
The last time I used Windows was last night. Unlike Mac where I hold back on updates until they force my hand, I update my Win 10 as soon as it hits the public channel. I can assure you, the problem is far from solved. Whereas on MacOS, not a single app misbehaves.
Just this weekend I ran into Keyshot pretty much ignoring the scaling. Then the Rainway installer rendered at some weird scale factor. Then Modern CSV had issues all over the UI. It is far from solved…
I always hate this response. "I don't have this problem, so no one else does either." This all depends on what tools you use on a daily basis. I have some data tools that have scaling problems and they're unique depending on whether you use Windows Server or Windows 10/11. It's one of the cons of Windows having such great backwards compatibility. Some tools just aren't updated in basic ways despite receiving plenty of updates.
Sure, I'm not saying the problem is completely solved. But for most people, using the most commonly used tools, the problem is solved. There will always be holdouts or legacy programs that are not HiDPI aware. But if you asked 100 people to list their 10 most used tools, I'd wager that 95% of the unique entries on that list are not blurry when used scaled.
Man, what're you talking about? If you hook a Windows 10 machine up to a pair of monitors with different scaling percentages, then try to drag a Google Chrome window across them, you get sent straight to Resolution Hell. Even the first-party apps that "work" go crazy for a while right after they're dragged across.
I ended up having to buy a new monitor to work around the issue, since I couldn't handle having apps explode on me any more
"[M]ost used tools" are not the things that really annoy people. It's the "least used critical tools" -- the things that you absolutely need to do, but only once every few months. They often involve loading three layers of progressively older control panels.
This is a failure brought by the overreliance on data and telemetry to show what is "most used," without regard for what is most important.
yeah, Microsoft loves to give an inferior experience to 1% of its users in favor of polishing the experience for the 99%
The problem is that everyone is a member of a different 1%. "Oh, only one percent of users have that monitor setup" "Oh, only one percent of users run that app" "Oh, only one percent of users change that setting in any given month" "Oh, that app only crashes once per hundred uses"
If you do 100 things with your computer every month, and every one of them has a 1/100 chance of crapping out, you're looking at a 64% chance of something crapping out on you over the course of the month
> If you do 100 things with your computer every month, and every one of them has a 1/100 chance of crapping out, you're looking at a 64% chance of something crapping out on you over the course of the month
Good point (1 - 0.99 ^ 100 ~= 0.63397), and this also relates to the idea that the higher the number of dimensions (in this case features used by a given user), the more volume there is away from the middle. Our spatial intuitions for distributions in two or three dimensions do not prepare us to handle distributions in 100 dimensions.
And that's what I'm saying is not the case. We have at least 10 Windows machines in my office and everyone has different scaling issues with different apps, even things as ubiquitous as Google Chrome. It's mostly an issue when there are multiple monitors of different resolutions. I have 2 of the exact same monitor so my issues only pop up with legacy software where the UI is still bitmap scaled but every person here has some kind of situation where Windows scaling just craps out.
The only apps that don't really have issues are Microsoft's apps. VNC and RDP mostly function correctly so that's the direct response to this main post but to say that it's "mostly solved" is not accurate, in my opinion. Searching Google for Windows HiDPI issues and filtering to the last year still yields support docs from Dell and other monitor manufacturers (from this year) that describe these issues.
If you just use Chrome and Visual Studio, you won’t run into half of those apps. But ya, any old app that renders directly to a bitmap would have problems if you need to use them.
I used to write a lot of WPF apps and the resolution independence was a real thing. I guess I just got lucky in that I never used any legacy apps.
The issue is with lossy scaling. I also have a 4k monitor but you can clearly see a difference in clarity if you select anything other than 3840x2160 (1x, everything is tiny) or 1920x1080 (2x, everything is huge).
Windows UIs can draw natively at non-integer scales. This put them a good decade behind macOS for software support because 3rd party products all needed tons of work to support it, and would all get drawn at 1x with pixelated upscaling in the meantime.
But now that it's more widely supported, it's really the more efficient and more precise solution.
I had a Surface Pro 3 with a default 150% scale and the experience in 2014 was great as long as you lived in OneNote.
What I mean is Windows UIs can draw natively at non-integer scales, not that they all actually do it. But it's certainly more than it used to be, especially consumer-focused stuff.
In contrast to Mac where your UI draws at 2x and then gets downscaled to fit on screen, unless the user has picked the one resolution option where it's actually native 2x mode, which I never see anyone do because it's not enough space.
I imagine any niche business targeted software on Windows still doesn't do scaling right, but that's the nature of the Windows ecosystem. You can run your software that hasn't had meaningful updates since 1995, and someone will happily keep selling you their software that hasn't had meaningful updates since 1995.
Windows has had support for variable DPI since win95. They've had to hide or rework it over the years because it hasn't always held up to problematic dev practices like fixed pixel dimensioning but the fundamental graphics subsystem has supported it.
Now if Windows could just make my laptop not use half the battery overnight with the lid closed.
(This is frustrating me to the point that I am on the verge of getting a macbook for the first time now even though my laptop is otherwise great and only a year old, so it's good (for my pocketbook anyway) to know the other side has its own issues).
> Now if Windows could just make my laptop not use half the battery overnight with the lid closed.
Eh, I have a MacBook and various apps will cause the laptop to not go to sleep. Web browsers are the largest offenders in my experience. I've often enough come into my office to a laptop that was running at 100% CPU all night.
I don't get how idle power use is still so bad on competing products. I understand Android, at least, has improved somewhat, but it spent years being laughably bad compared to iOS, even on tablet hardware with no cell radio to worry about. Dead after three days in a drawer, while the iPad that'd been in there for three weeks still had plenty of charge left. WTF.
Check your manufacturer's website for updates and your power scheme settings. I have an early-2021 Dell XPS 13 (the 9310) and I can close it, forget it in my bag for two weeks, open it and it has 94% battery left because it auto-hibernated.
It's been a while but I think I had that working too, except wifi would be totally dorked up after waking. I'd have to manually disconnect and reconnect each time (xps 9700). That was more annoying than being dead 1/3 of the time, so I left it like this. Are you experiencing similar?
That said, yeah I'll check if there are updates I'm missing.
FWIW I disabled "Allow network connectivity during connected-standby" in group policy and it seems to be better (fingers-crossed).
Apparently I'd set that to "enabled" last year when trying to deal with the wifi flakiness after wakeup. However setting it to disabled now, I am not getting the flakiness so far, so maybe a software update fixed that problem. (Again, fingers crossed).
Two days and so far it's worked. Drops 1% power overnight (compared to 8% per hour pre-change), and connects straight back to wifi upon wakeup.
Note hibernation is still disabled. This is just sleep. And I didn't hack anything AFAIK to disable the connected-standby that Dell/Windows forces. So, pretty happy! (Of course, was looking forward to trying out a Mac, but oh well).
For reference, here's what `powercfg /a` reports now:

```
The following sleep states are available on this system:
    Standby (S0 Low Power Idle) Network Disconnected

The following sleep states are not available on this system:
    Standby (S1)
        The system firmware does not support this standby state.
        This standby state is disabled when S0 low power idle is supported.

    Standby (S2)
        The system firmware does not support this standby state.
        This standby state is disabled when S0 low power idle is supported.

    Standby (S3)
        This standby state is disabled when S0 low power idle is supported.

    Hibernate
        Hibernation has not been enabled.

    Hybrid Sleep
        Standby (S3) is not available.
        Hibernation is not available.
        The hypervisor does not support this standby state.

    Fast Startup
        Hibernation is not available.

    Standby (S0 Low Power Idle) Network Connected
        Connectivity in standby is disabled by policy.
```
Try looking into the Power Profile and Power Management settings; likely you are on the high-performance profile, or the profile was set incorrectly. Also look into Device Manager and change the power setting for each device; some devices can wake the computer and keep it awake throughout the night. Even apps can (I managed to find out why my computer kept waking up in the middle of the night: a particular app that had no reason to use the wake-timer function).
As someone who has to do UI/UX work on Windows, I prefer the way macOS handles it. The issue is that if I use a non-HiDPI-aware app on Windows, I have no idea what my UI elements will look like; on macOS they may show up blurry, but I know exactly what they'll look like. Another issue is that Windows does UI scaling on non-HiDPI screens by default.
I think it embodies the philosophical difference between Microsoft and Apple. And I’m not making a judgement here.
Windows: give users choice, including the choice to end up with blurry apps with ill aligned UI.
Mac: we cannot let you stray outside these boundaries, otherwise your UI experience will deviate too much from the experience we determine to be the best.
For me, scaling on non-HiDPI screens is a feature. I like big text and big buttons, and Windows gives me that. I'll gladly accept a perfect big UI 99% of the time if the cost is an ugly UI 1% of the time.
I agree, and wish to add that the new graphics stack on Linux is also far superior to MacOS in the same way. By "new graphics stack" I mean Wayland without XWayland.
In fact, of the 3 desktop operating systems, I distinctly prefer the details of how my screen looks on Linux: on Windows, small (plus or minus 25%) changes in scaling factor tend to cause large changes in the details of the rendering of text (e.g., the average width of lines and curves can seem to halve or double, or the text can seem to change font), which I find a little distracting.
MacOS is by far the worst of the three. Not only does it look horribly blurry to me (on a normal old monitor -- I do not own any HiDPI monitors) when a non-integer scaling factor is applied (via the Displays pane of System Preferences), but even at the native resolution of the display, it is blurrier than I like because of a decision by Apple long ago to optimize for how closely text on the screen matches how the same text looks when printed out. (At least one of the terms "hinting" and "anti-aliasing" is relevant here, but I don't know the details.)
Windows is further along than Linux in the rollout of a truly resolution-independent graphics stack: on Linux, I have to pass certain flags to Chrome to get it to circumvent XWayland and not be blurry -- and then there are a few bugs -- but bugs I can definitely live with. Also, on Linux, I have to use a special branch of Emacs (named feature/pgtk) to get Emacs to be non-blurry when a non-integer scaling factor is set in Gnome Settings, feature/pgtk has a bad bug (freezing at random times) which I learned to work around (by starting a second "sacrificial" Emacs instance, which luckily would always be the first to freeze, and once frozen would somehow prevent the first instance from freezing).
I was a MacOS user for 10 years, and would still be a MacOS user today if I hadn't spent time on Windows 10 and had my eyes opened to the painfulness of the two aforementioned sources of blurriness (namely, Apple's decision to optimize for fidelity to hardcopy and MacOS's use of a scaling algorithm when the scaling factor is not an exact integer). I always knew during those 10 years that MacOS was too blurry for me at non-integer scaling factors, but I thought it was OK because the 2 individual apps I spent the most time in (Emacs and my browser) have app-specific scaling factors that don't introduce blurriness, and it wasn't until I spent time on a different OS that I understood how sub-optimal MacOS was for my pattern of use and my particular visual cortex.
If you are on a Mac, a good way to experience what I am talking about is to install Google Chrome, then operate the "Zoom" control in the menu behind the 3 vertical dots. Note how every element in the viewport instantly changes size. PNG and JPG images (e.g., the white "Y" in the top left corner of this page, but not the white rectangle surrounding it) might become blurry (or stop being blurry) because unlike essentially everything else on a modern web page, PNGs and JPGs are not stored as resolution-independent mathematical descriptions of curves. Well, Windows has a control in its Settings app that has the same effect on every visual element on the OS (including the mouse cursor). And on my Linux box, Gnome Settings has a control that does the same thing. (Safari and Firefox might have the same "zoomability" as Chrome does; I haven't used Firefox on a Mac in years; back when I did, its zoom control zoomed only the text, but not the images; I no longer have access to a Mac, so cannot experiment with Safari.)
This is a major factor. macOS has aggressively optimized for these in recent years, often at the expense of classic 1x display experience. I use 5K 27" and 4K 24" monitors at 2x scaling (this is the default), and the result is excellent.
Maybe so, but the blurriness caused by decisions around antialiasing and hinting was present already in Snow Leopard, which predates the first retina Mac.
That was my first thought. All this crap Mac users have to go through if they want to go off the "happy path" even slightly with an aftermarket monitor.
I worked for Apple more than two decades ago, but now I'm a happy Windows 11 user.
I have an aftermarket monitor, runs at 4K, has a scaling factor applied so that I can make everything bigger. I don't feel like I had to go through any crap, either, it's the simplest of UIs.
I agree that it is simple (i.e., the Displays pane of System Preferences) but the result IMHO is unpleasantly blurry.
These days, fonts, SVG images, CSS, etc. are stored as mathematical descriptions of curves that can be rendered at any scaling factor. Any use of a scaling algorithm, like the one macOS applies when the scaling factor is not an integer multiple of the display's true resolution, is an unnecessary source of blurriness. A HiDPI display makes the blurriness less noticeable, but does not solve the basic problem, which is that the scaling algorithm reduces the "effective resolution" of the display.
(I have never used a HiDPI display with a desktop OS, but another participant here on HN has and reported that he notices MacOS's blurriness relative to Windows even on a HiDPI display.)