It's not "pulling the trigger" on JS—per the FAQ, it's fully polyfillable and typically implemented inside the JS engine, as essentially a separate interface to all the implementation work that's already there. This is a crucial difference from everything that's been tried before (including PNaCl). JS isn't going anywhere and continues to be improved.
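For what it's worth, here's a rough sketch of what "fully polyfillable" means in practice, in a browser with no native support: fetch the binary, let a JS library translate it to asm.js, and run the result on the engine that's already there. `wasmPolyfill` below is a made-up stand-in for the prototype polyfill, and "module.wasm" / "main" are assumed names, not real APIs.

    // Sketch only: wasmPolyfill is hypothetical. The idea is that a polyfill
    // decodes the binary format and compiles it to asm.js, so the existing
    // JS engine still does all of the real work.
    declare const wasmPolyfill: {
      instantiate(bytes: Uint8Array): { exports: Record<string, Function> };
    };

    fetch("module.wasm")
      .then(r => r.arrayBuffer())
      .then(buf => {
        const { exports } = wasmPolyfill.instantiate(new Uint8Array(buf));
        console.log(exports.main());
      });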
Sure it is. Step 1: get all browsers to support WebAssembly, so that sites can drop the polyfills. Step 2 (specifically noted in the article): make WebAssembly more full-featured than JavaScript; once browsers support it natively, new features no longer have to stay polyfillable. Step 3 (potentially doable before step 2): implement JavaScript in WebAssembly, so it becomes just one of many possible languages and its implementation is no longer beholden to browser quirks.
That'll make JavaScript a better language for its proponents (latest ECMAScript features everywhere), and entirely avoidable for people who prefer other languages.
No browser is going to implement JS in WebAssembly anytime soon. You have no idea how tight the performance margins are on SunSpider and the V8 benchmarks, for example. The entire compilation pipeline has to be lightning fast; adding another IR will kill you.
Web sites move to new technology at a glacial pace. You're posting this comment on a Web site that, by and large, hasn't even adopted CSS 1.0 for layout yet.
Dream on. JS is "obligate, not facultative", as biologists say, for browser implementors. It lives in source form on the web and must load super-fast.
We can wager about when it might actually die (I said it might, given something like wasm, at Strange Loop 2012), but that's beyond the horizon by years if not decades. Gary Bernhardt knows!
People build polyfills for the latest ECMAScript features today; future implementations of that kind will likely target WebAssembly.
And anything caring about performance can start targeting WebAssembly directly rather than JavaScript. CPU-bound JavaScript performance will start mattering a lot less. (DOM-access performance will still be critical, but it will be for WebAssembly too.)
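Concretely, "targeting WebAssembly directly" just means the hot, CPU-bound work lives in a precompiled module and the surrounding JS only drives it through the WebAssembly JS API. A sketch, where "kernel.wasm" and its exported "sum" function are names I'm assuming:

    // The numeric kernel is compiled ahead of time; the browser instantiates
    // bytecode instead of parsing and JITing source. Names are assumed.
    async function runKernel(): Promise<number> {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("kernel.wasm") // no imports needed for a pure numeric kernel
      );
      const memory = instance.exports.memory as WebAssembly.Memory;
      const sum = instance.exports.sum as (ptr: number, len: number) => number;

      // Write the input into the module's linear memory, then call into wasm.
      const input = new Float64Array(memory.buffer, 0, 1024);
      input.fill(1.5);
      return sum(0, input.length);
    }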
I predict that, five years from now, people will still care a great deal about JavaScript's CPU-bound performance. Even if all new code switched en masse to WebAssembly tomorrow (which won't happen), people would still want existing content to run as fast as possible.
> You moved the goal post from dropping JS (again, dream on) to (in the future) using wasm as compiler and polyfill target.
I'm not suggesting dropping JS; that's too much to hope for. If nothing else, backward compatibility with existing sites will require supporting it approximately forever. I only meant that WebAssembly becomes the interoperable baseline, with JavaScript and other languages as peers on top of it.
I disagreed with the statement that "JS isn't going anywhere"; I don't think it's going away, but it's clearly going to evolve and go new places.
This assumes WebAssembly will have virtually no performance overhead when compared to C/C++, and still doesn't address the fact that JavaScript "binaries" will be much larger than today's scripts that rely on a JavaScript interpreter being present inside the browser.
> This assumes WebAssembly will have virtually no performance overhead when compared to C/C++,
It won't have zero performance overhead, since unfortunately it will still require translation to native code. But it'll be far higher-performance than asm.js, and precompiling JavaScript to WebAssembly could produce higher performance than a JavaScript JIT.
> and still doesn't address the fact that JavaScript "binaries" will be much larger than today's scripts that rely on a JavaScript interpreter being present inside the browser.
You don't need to do the compilation in the browser; do the compilation ahead of time and ship wasm bytecode. The DOM and all other web APIs will still be provided by the browser, so I don't see any obvious reason why wasm needs to have substantial size overhead compared to JavaScript.
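To make that concrete: the module ships only its own compiled code, and anything that needs the DOM or other web APIs reaches them through the import object the page passes in. Another sketch, with made-up names ("app.wasm", "env.set_title", "main"):

    // app.wasm is produced ahead of time by whatever toolchain you like;
    // no interpreter or source-language runtime ships alongside it.
    let memory: WebAssembly.Memory;

    const imports = {
      env: {
        // wasm can't touch the DOM directly, so a thin JS shim is imported
        set_title: (ptr: number, len: number): void => {
          const bytes = new Uint8Array(memory.buffer, ptr, len);
          document.title = new TextDecoder().decode(bytes);
        },
      },
    };

    WebAssembly.instantiateStreaming(fetch("app.wasm"), imports)
      .then(({ instance }) => {
        memory = instance.exports.memory as WebAssembly.Memory;
        (instance.exports.main as () => void)();
      });

The shim is the only part that has to stay in JS; the bulk of what goes over the wire is just the compiled module.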
So much in this thread suggests that influence will be exerted to permanently weld wasm's semantics to JavaScript's from day one, so that there will never be a viable compile target that isn't just JavaScript. In that case, wasm will be nothing but a CoffeeScript-like facade over JavaScript semantics, anyone targeting wasm will really just be writing JavaScript in an inefficient way, and JavaScript will remain the only first-class language, forever.
If wasm is already intended to be crippled relative to JavaScript, that really calls into question why wasm should exist at all. When the response to eventually implementing JavaScript on top of browser wasm support is "dream on" because JavaScript on wasm would leave too much performance on the table, it raises the question of why building on top of wasm is supposed to be acceptable for any other language. If building on wasm is unacceptable for JavaScript, why isn't it unacceptable for everyone else?
Maybe it also calls into question whether the discussions supposedly shaping this new cross-vendor standard are really in good faith, if the reality is that nothing new gets made beyond a new spelling of JavaScript: wasm forever steered by JavaScript, and incapable of meaningfully hosting a JavaScript implementation because it has been deliberately crippled to require one.
Anyone who wants wasm to become a real thing has to watch that it isn't crippled specifically to keep it from ever being capable of replacing JavaScript. Whether it actually does replace JavaScript is a separate question, and one that can only be asked at all if wasm hasn't already been sabotaged to make it incapable of that.