Don't get me wrong, this is supremely cool... but I wish the W3C and browsers solved more real-world problems.
Just think how many JS kBs and CPU cycles would be saved globally if browsers could do data binding and mutate the DOM (e.g. morphdom, VDOM, etc.) natively. And the emissions that come with them.
Edit:
For example, just consider how many billions of users are downloading and executing JS implementations of a VDOM every single day.
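To make that concrete, here is roughly the pattern such libraries ship to every client today. The two-argument `morphdom(fromNode, toNode)` call is the library's real API; the `#app` id and the items are just illustrative:

    import morphdom from 'morphdom';

    // Userland today: the diffing algorithm itself is shipped as JS and
    // executed on every client before a single node gets patched.
    // Assumes a <div id="app"> already exists in the page.
    const items = ['a', 'b', 'c'];
    morphdom(
      document.querySelector('#app'),
      `<div id="app"><ul>${items.map(i => `<li>${i}</li>`).join('')}</ul></div>`
    );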
What would that mean? And which version of "mutate the DOM via data binding" would win, given that there are at least three different approaches in use across JS libraries?
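Roughly, the three families I mean (the `h` helper is a stand-in for generic hyperscript, not any particular library's API):

    // Three styles of "mutate the DOM via data binding" found in the wild.
    const label = 'hello';

    // 1. VDOM diffing (React, Inferno): build a virtual tree each render,
    //    diff it against the previous tree, patch the real DOM with the delta.
    const h = (tag, props, text) => ({ tag, props, text }); // stand-in hyperscript
    const vnode = h('li', { class: 'item' }, label);

    // 2. Compiled fine-grained updates (Svelte, Solid): no runtime diff;
    //    the compiler emits direct assignments to the nodes it created.
    const li = document.createElement('li');
    li.textContent = label;

    // 3. DOM-to-DOM morphing (morphdom): diff two real DOM trees in place,
    //    e.g. morphdom(oldNode, newNode) patches oldNode to match newNode.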
Accessing the GPU in this way is something that can't be done without browser-level API support. You're describing a problem already solved in JS. Different category entirely.
Honestly no idea, but any native implementation would be more useful than a userland JS implementation.
> Accessing the GPU in this way is something that can't be done without browser-level API support.
That's true and it will open the door to many use cases. But still, mutating the DOM as efficiently as possible without a userland JS implementation is orders of magnitude more common and relevant to the web as it is today.
> Just think how many JS kBs and CPU cycles would be saved globally if browsers could do data binding and mutate the DOM (e.g. morphdom, VDOM, etc.) natively. And the emissions that come with them.
Browsers are solving these real-world problems. With modern JS engines, frameworks are nearly as efficient as a native implementation would be.
And with web components, shadow DOM and template literals, all you need is a very thin convenience layer like Lit/lit-html[1] to build clean and modern web applications without VDOM or other legacy technology.
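For example, a minimal Lit component (the element name is made up, but the `static properties`/`render` pattern is Lit's actual API):

    import { LitElement, html } from 'lit';

    class GreetingCard extends LitElement {
      static properties = { name: { type: String } };

      render() {
        // html`` is a tagged template literal; lit-html re-renders by
        // updating only the bound values, not the whole subtree.
        return html`<p>Hello, ${this.name}!</p>`;
      }
    }
    customElements.define('greeting-card', GreetingCard);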
> frameworks are nearly as efficient as a native implementation would be
I have a very hard time believing that a C++ implementation of e.g. a VDOM would not be significantly more efficient than a JS one. I've been running some benchmarks, and even the fastest solutions like Inferno get seriously bottlenecked once you try to mutate more than about 2,000 DOM elements per frame.
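A minimal sketch of the kind of stress test I mean (illustrative, not my actual benchmark code):

    // Mutate N nodes on every animation frame and watch the frame rate.
    const N = 2000;
    const nodes = Array.from({ length: N }, () => {
      const el = document.createElement('div');
      document.body.appendChild(el);
      return el;
    });

    let frame = 0;
    (function tick() {
      frame++;
      for (const el of nodes) el.textContent = String(frame); // N mutations/frame
      requestAnimationFrame(tick);
    })();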
And even if the performance was similar, what about the downloaded kBs? How many billions of users download React, Angular, Vue, etc., every single day? Probably multiple times.
VDOMs were created precisely because the native DOM was too slow.
This is no longer the case: a tiny web component framework like Lit significantly outperforms[1] React while relying entirely on the browser DOM and template literals for re-rendering... so what you're asking for has already happened :-)
But even the big frameworks are really fast thanks to modern JIT JS engines.
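The mechanism that makes this possible, in a nutshell: a tagged template's strings array is the same object on every evaluation of the same call site, so a library can parse the static HTML once and diff only the dynamic values:

    const seen = new Set();
    const tag = (strings, ...values) => { seen.add(strings); return values; };

    for (let i = 0; i < 3; i++) tag`count is ${i}`;

    // The frozen strings array is identity-stable per call site, so:
    console.log(seen.size); // 1
    // lit-html exploits exactly this to cache the template and update
    // only the ${...} slots on re-render.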
Yes, I know Lit exists, and yes it's fast. But I think you're missing the point.
I'm not talking about one JS implementation vs another or if JS solutions are fast enough from the end user perspective.
The fastest JS solution (VDOM, Svelte, Lit, whatever) will still be bottlenecked by the browser. If JS could issue a single function call to mutate/morph a chunk of the DOM, with the diffing done natively, we'd see massive efficiency wins. At the scale of the web, with billions of users, this would surely have an impact on the emissions from the electricity needed to run all that JS.
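Something like this, where the `morph` method is purely hypothetical and exists in no browser today:

    // HYPOTHETICAL: a native, engine-level morph. Not a real API.
    const app = document.querySelector('#app');
    app.morph(`<div id="app"><ul><li>updated</li></ul></div>`);
    // One call crosses the JS/DOM boundary once; the diffing runs in
    // optimized engine code instead of downloaded userland JS.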
And (again) even if no perf improvements were possible, you're avoiding the JS size argument. Lit can still be improved; Svelte generally produces smaller bundles than Lit. E.g. see: