Javascript engines gave up on interpreters too quickly. JITs have been a huge source of security holes, and the language is so huge now that verifying the correctness of JS optimizations is extremely hard. JS was never meant to be a high performance language. Plus all the heroic work on exotic optimization has just resulted in induced demand. Web pages have just grown to contain so much Javascript that they're even slower than they were when JS was slow.

Browser vendors should agree to make JS slow and safe again like it used to be, forcing web developers to make their pages smaller and better for users. For the unusual cases like browser-based games, WebAssembly is ok (it's much easier to verify the correctness of a WASM compiler), and it should be behind a dialog box that says something like "This web page would like to use extra battery power to play a game, is that ok?"



> Browser vendors should agree to make JS slow and safe again

Not gonna happen.

There's one browser vendor in particular who has 2/3 of the browser market, 96% of the ad network market, 87% of mobile, and a similar lock on online office software, email, mapping/navigation, etc. etc. They have every incentive to use their commanding position, providing both the services and the means of access to those services, to consolidate their control over the world's information resources. And, as the key way in which all these different components are implemented and interact with each other, JavaScript is their most effective means of maintaining that stranglehold.


And their own drive to make JS that good is what got them there.


> JITs have been a huge source of security holes, and the language is so huge now that verifying the correctness of JS optimizations is extremely hard.

Do you have numbers to back that up?

There have certainly been one or more security holes in JITs, but AFAICT most of the browser vulnerabilities have more to do with bad (new) APIs.

The level where a JIT operates really has nothing to do with the surface syntax of JS, so adding "syntactic sugar" features to JS should have very little impact on JITs. (I'm thinking of things like the class syntax, lexical scope for function literals, etc. Maybe there's a class of additions that I'm missing.)
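For instance, the class syntax is essentially sugar over the prototype-based functions engines already optimize. A rough sketch (ignoring details like non-enumerable methods and the constructor-call restriction):

    // ES2015 class syntax...
    class Point {
      constructor(x, y) { this.x = x; this.y = y; }
      norm() { return Math.hypot(this.x, this.y); }
    }

    // ...corresponds roughly to the older prototype pattern:
    function PointOld(x, y) { this.x = x; this.y = y; }
    PointOld.prototype.norm = function () {
      return Math.hypot(this.x, this.y);
    };

Either way, by the time the optimizing tiers run, they're looking at object shapes and property accesses, not the surface syntax.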


> Do you have numbers to back that up?

Hm, it's hard to come up with a number that shows JS optimizations are hard, but you can peruse a collection of JavaScript engine CVEs: https://github.com/tunz/js-vuln-db

Notice how many are JIT or optimization issues, or are in esoteric features like async generators or the spread operator.


That's fair and I now know more. I'm not convinced that this is the biggest issue with JS engines/browsers, but I certainly have more evidence against me :).

It's interesting how many of those are labeled OOB. Does that mean we're talking about JIT flaws that allow OOB access to memory? Is it actually tricking the JIT itself into allowing OOB access, or is it actually OOB'ing the JIT?

I wonder what the performance impact of all JIT code being forced to do bounds-checking would be...


> Is it actually tricking the JIT itself into allowing OOB access, or is it actually OOB'ing the JIT?

What's the difference between the two? Many JavaScript exploits abuse the interaction between strange features of the language to get around bounds checks (often because a length was checked but invalidated by later JavaScript executing in an unexpected way, or because a bound wasn't foreseen as needing a check), leading to an out-of-bounds access. And I'm assuming many of these are heap corruptions where someone messes with a length that lets them get out of bounds.
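A minimal sketch of that bug class (harmless as written on a spec-correct engine): a length is observed up front, then a user callback shrinks the array mid-operation.

    let arr = new Array(8).fill(1.1);

    const evil = {
      valueOf() {
        arr.length = 1;  // runs in the middle of the fill() below
        return 7;
      },
    };

    // fill() captures the length first, then coerces its end argument,
    // which invokes the valueOf above and can run arbitrary JS. A correct
    // engine just re-grows the array; the historical bugs came from fast
    // paths that kept using the length or backing-store pointer captured
    // before the callback, turning the writes into an out-of-bounds access.
    arr.fill(2.2, 0, evil);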


It's not much of a difference in terms of consequences, but always-on bounds checking could alleviate OOB'ing the JIT at least.
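Roughly, with plain JS standing in for what the optimizer emits (real engines work on their IR, not source), the difference is whether a bound is assumed from an earlier proof or re-validated at each access:

    function sumTrusting(buf, n) {
      let s = 0;
      // n was "proven" in-bounds once up front; if that proof is later
      // invalidated, every access below inherits the mistake
      for (let i = 0; i < n; i++) s += buf[i];
      return s;
    }

    function sumChecked(buf, n) {
      let s = 0;
      for (let i = 0; i < n; i++) {
        if (i >= buf.length) break;  // re-check against the live length
        s += buf[i];
      }
      return s;
    }

The always-on version pays an extra compare and (usually well-predicted) branch per access, which is exactly the cost engines try to hoist or eliminate.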


I am more familiar with Java, where the runtime implementations have gone back and forth through many iterations: Jazelle and other ways to accelerate Java on ARM, the various Android implementations, etc.

What people think is the best choice of tiers to use is always evolving.

One factor against JITs is that modern chips and OSes want to set the NX (no-execute) bit on the stack and the heap, which at least forces attackers into return-oriented programming. To JIT you have to at least partially disable that protection.


I mostly disagree, but I appreciate your opinion - it adds a valuable viewpoint that needs to be considered.

Opinions on this matter may arise from the dichotomy Martin Fowler describes between an "enabling attitude" and a "directing attitude" in software development: https://martinfowler.com/bliki/SoftwareDevelopmentAttitude.h...

You're right that web apps have become extremely JS- and framework-heavy. Just like adding lanes to a freeway increases traffic, adding JS performance has increased demand for it. But faster JS execution does translate to more headroom for developers (regardless of whether they abuse it), which enables new scenarios that wouldn't be possible otherwise.

An enabling attitude gives top developers the freedom to rise higher than ever before, whereas a directing attitude helps to improve those who would otherwise perform poorly (by preventing stupid decisions) but places artificial blockades in the way of the best performers.


The best performers can use WASM, I suppose.


Unfortunately it isn't as simple as saying "just use WASM for intensive apps". There are still huge hurdles to getting WASM modules to understand and interface with the page around them. Maybe one day that will change, but not in the near future.
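As a minimal sketch of what that interfacing looks like today (module name, export names, and element id are hypothetical), a WASM module has no direct DOM access, so anything it needs from the page must be passed in as JS imports and shuttled through linear memory:

    // in a <script type="module">, so top-level await is available
    const imports = {
      env: {
        // the module can only touch the page through shims like this one
        setText: (ptr, len) => {
          const bytes = new Uint8Array(instance.exports.memory.buffer, ptr, len);
          document.getElementById("out").textContent =
            new TextDecoder().decode(bytes);
        },
      },
    };

    const { instance } = await WebAssembly.instantiateStreaming(
      fetch("app.wasm"),  // hypothetical module
      imports
    );

    instance.exports.run();  // all DOM work still funnels through the shim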


Pandora's box has been opened, so what you say is not going to happen, ever. Get used to it: JITed JavaScript is not going away anytime soon.



