Hacker News

That's not enough. What if the background is solid white, or white with few details? Blurring just leads to white-on-white.


The primary color turns dark when the background is too bright; see "Meet Liquid Glass" or "What's new in SwiftUI" from WWDC25.
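The behaviour described above can be illustrated with a toy luminance threshold. This is a hypothetical sketch, not Apple's actual heuristic: the function name and the 0.6 cutoff are my assumptions, and real implementations sample and smooth the region behind the glass rather than taking a single value.

```python
def adaptive_primary(bg_luminance: float, threshold: float = 0.6) -> str:
    """Pick a glyph tint for a sampled background luminance in [0, 1].

    Hypothetical sketch: the 0.6 cutoff is an assumption, not a
    documented Apple value.
    """
    return "dark" if bg_luminance > threshold else "light"

bright = adaptive_primary(0.9)  # bright background -> dark glyphs
dark = adaptive_primary(0.2)    # dark background -> light glyphs
```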


(a) Changing the primary colour past a certain threshold is not a solution, in my experience: there are backgrounds where the contrast is too low, no matter what you do.
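The claim in (a) can be checked numerically with the WCAG contrast formula: on a mid-gray background, neither black nor white text reaches the 7:1 AAA contrast threshold, no matter which you pick. A minimal sketch in plain Python (nothing Apple-specific; the #757575 background is just a convenient worst-case example):

```python
def srgb_to_linear(c8):
    """Convert one 8-bit sRGB channel to linear light (WCAG 2.x formula)."""
    c = c8 / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def luminance(r, g, b):
    """Relative luminance of an sRGB color."""
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

def contrast(l1, l2):
    """WCAG contrast ratio between two relative luminances."""
    hi, lo = max(l1, l2), min(l1, l2)
    return (hi + 0.05) / (lo + 0.05)

bg = luminance(117, 117, 117)       # mid-gray background, #757575
best = max(contrast(bg, 1.0),       # white text
           contrast(bg, 0.0))       # black text
# best is about 4.6: above the 4.5:1 AA floor, but no text
# color can reach the 7:1 AAA threshold on this background.
```

The worst case is a background near relative luminance 0.18, where the best achievable ratio tops out around 4.6:1 regardless of text color, so darkening the primary color past a threshold genuinely can't save every background.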

(b) Even if it were, Apple's implementation doesn't work. https://furry.engineer/@cendyne/114660612541978921

I've been playing with such UIs for years, and they've never left my personal setup, because – while very pretty – they're not worth the hassle. However, Apple's implementation here is particularly bad. (I don't see why anyone takes these companies seriously any more.)


(a) Fair point, but edge cases exist everywhere.

(b) This is the developer seed; do you think they will ship it like this?


(a) Looking around me, I'm struggling to find a photo I could take that wouldn't trigger this "edge-case" – so it can't be that much of an edge-case.

(b) I'd expect them to patch this particular issue now that it's been widely publicised, but Apple has pushed bigger defects to prod over the last few years (e.g. https://www.applevis.com/bugs/ios/when-editing-text-using-ha...). I'd be extremely surprised if this feature shipped in a properly usable state, especially given that their marketing shows several fundamental design flaws are working as intended.


That's not hard to solve; you just need a good blending algorithm. For example: https://leloctai.com/asset/translucentimage/image/flatten.we...


It would be more grey on white than you think. It's not just a Gaussian blur; they also have noise layers in the shader.
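The blur-plus-noise point can be sketched in plain Python: blur a flat white strip, then add a small noise layer, and the result dips slightly below pure white, reading as faint grey rather than white-on-white. The radius and noise amplitude here are made-up illustration values, not Apple's shader parameters, and a box blur stands in for the real Gaussian.

```python
import random

def blur_and_dither(pixels, radius=2, noise_amp=0.03, seed=42):
    """Box-blur a 1-D strip of luminance values in [0, 1], then add a
    noise layer so flat regions don't read as pure white.
    Toy stand-in for a blur+noise shader; parameters are assumptions."""
    rng = random.Random(seed)
    out = []
    n = len(pixels)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        avg = sum(pixels[lo:hi]) / (hi - lo)           # box blur
        noisy = avg + rng.uniform(-noise_amp, noise_amp)  # noise layer
        out.append(min(1.0, max(0.0, noisy)))          # clamp to [0, 1]
    return out

white = [1.0] * 10
result = blur_and_dither(white)
# Blurring alone leaves pure white untouched; the noise layer pulls
# some samples a few percent below 1.0, i.e. toward grey.
```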



