
The fact is that Intel did the work, and WebGL 2.0 Compute was about to ship in Chrome. As interesting as your explanation is, it wasn't the official reason the Chrome team gave for dropping support.

Rather, the stated reasons were WebGPU and Apple's OpenGL implementation lacking support for compute shaders.

That rationale became irrelevant the moment Chrome decided to run WebGL on top of Metal via ANGLE, just as it does with DirectX on Windows.



The official deprecation [1] cites "some technical barriers, including ... [macOS support]". I can't speak for Chrome, but my understanding is that these technical barriers included serious challenges in making compute support safe. That's an area where a significant amount of engineering effort went into WebGPU from the beginning.

[1]: https://issues.chromium.org/issues/40150444


The reality is that device-specific render output signatures are already demonstrably unique.

So it is likely impossible to make these GPU APIs anonymous, and thus they can never really be considered "secure".
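
To make that concrete, here is a minimal TypeScript sketch of the render-and-hash technique being described, assuming a browser context; the function name renderFingerprint and the shader constants are illustrative, not taken from any real fingerprinting library.

  // Render a small shader offscreen, read the pixels back, and hash them.
  // Floating-point rounding in sin/cos/pow differs subtly across GPU and
  // driver stacks, so the digest acts as a device-specific signature.
  async function renderFingerprint(): Promise<string> {
    const canvas = document.createElement("canvas");
    canvas.width = canvas.height = 64;
    const gl = canvas.getContext("webgl", { preserveDrawingBuffer: true });
    if (!gl) throw new Error("WebGL unavailable");

    const compile = (type: number, src: string): WebGLShader => {
      const s = gl.createShader(type)!;
      gl.shaderSource(s, src);
      gl.compileShader(s);
      return s;
    };

    const prog = gl.createProgram()!;
    gl.attachShader(prog, compile(gl.VERTEX_SHADER,
      "attribute vec2 p; void main(){ gl_Position = vec4(p, 0.0, 1.0); }"));
    gl.attachShader(prog, compile(gl.FRAGMENT_SHADER,
      "precision highp float;" +
      "void main(){ vec2 c = gl_FragCoord.xy / 64.0;" +
      "  gl_FragColor = vec4(sin(c.x * 97.0), cos(c.y * 61.0)," +
      "                      pow(c.x * c.y, 0.7), 1.0); }"));
    gl.linkProgram(prog);
    gl.useProgram(prog);

    // One triangle large enough to cover the whole viewport.
    gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
    gl.bufferData(gl.ARRAY_BUFFER,
      new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
    const loc = gl.getAttribLocation(prog, "p");
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
    gl.drawArrays(gl.TRIANGLES, 0, 3);

    // Matching hashes across visits re-identify the same GPU/driver
    // combination without cookies or any other stored state.
    const pixels = new Uint8Array(64 * 64 * 4);
    gl.readPixels(0, 0, 64, 64, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
    const digest = await crypto.subtle.digest("SHA-256", pixels);
    return [...new Uint8Array(digest)]
      .map((b) => b.toString(16).padStart(2, "0")).join("");
  }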

Have a nice day, =3



