It is interesting to follow how the community mindset around TypeScript/JS/... has developed over the years. Many in the JavaScript community frowned upon GWT, Closure Compiler, Dart and other typed compile-to-JS tools, most of which predate TypeScript and provided state-of-the-art tooling at the time. "Types are overrated, everything that is non-JS must go away" was a common mentality.
Now some of these people have started to see some value in tool-based refactoring, and suddenly realize that types are not always bad. TypeScript undeniably improves the situation over JavaScript, with the caveat that it won't stop you from shooting yourself in the foot with the bad parts of JS. So now types are good, but JS must still stay.
I wonder how many years it will take for people to lobby for a clear break of JS backwards-compatibility, to use something that is not ad hoc, has good tooling and a great base API, etc. Something where sorting numbers doesn't need extra care.
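(For context, the sorting remark presumably refers to Array.prototype.sort comparing elements as strings by default; a quick illustration:)

    // Default sort converts elements to strings, so numbers order lexicographically.
    [10, 1, 2].sort();                // [1, 10, 2]
    // The "extra care": pass a numeric comparator.
    [10, 1, 2].sort((a, b) => a - b); // [1, 2, 10]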
These usually start to matter when the team size goes beyond 10 people working on the same part of the codebase. It will be interesting to see how this plays out over the years. I bet the earliest "clear break of bad JS" will happen around 2019, with mostly linters and data flow analysis improving until they hit their eventual limits.
Uninteresting times, in a certain sense. I'm glad we have other tools to compete with.
GWT came with a language (Java) whose type system is a poor fit for JavaScript's needs.
Closure Compiler types are/were too verbose. It didn't support most module systems until this year.
Dart didn't provide an easy interface to existing JS libraries. Everything goes through a very cumbersome FFI. Also the only "tooling" was Eclipse.
TypeScript comes with a structural type system (with generics) designed to fit existing JavaScript code, a type definition scheme (.d.ts files) for existing code, and support for all current JS module systems (ES6, CommonJS, AMD). The tooling also comes in the form of an embeddable language service.
It succeeded because it's a type system for JS, and not a compile-to-JS language for another ecosystem.
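A minimal sketch of what "structural, designed to fit existing JS" means in practice (the names here are made up for illustration):

    // A type can describe an existing JS value without modifying it, and any
    // object with the right shape is accepted -- no class hierarchy needed.
    interface Point { x: number; y: number; }

    function distance(a: Point, b: Point): number {
      return Math.hypot(a.x - b.x, a.y - b.y);
    }

    // A plain object literal, as found everywhere in existing JS code, matches structurally.
    const d = distance({ x: 0, y: 0 }, { x: 3, y: 4 }); // 5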
The difference is about the length of an additional "/** @type */" and the fact that the annotation lives in a comment, not in code. That had the benefit that all code for Closure was valid JavaScript, which was key in the days before basically all production JavaScript went through a transpilation step.
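Roughly the same annotation in both styles, to make the comment-vs-syntax difference concrete (illustrative snippet):

    // Closure Compiler: the types live in JSDoc comments, so the file stays
    // plain JavaScript that any engine can run without a build step.
    /** @param {number} a
     *  @param {number} b
     *  @return {number} */
    function add(a, b) { return a + b; }

    // TypeScript: the annotations are part of the syntax and are erased on
    // compile, so the source has to go through the compiler first.
    function add2(a: number, b: number): number { return a + b; }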
> It didn't support most module systems until this year.
ES6 modules maybe, but it's supported CommonJS and AMD modules since 2011[1] and its own module format before that.
The real reasons I think it didn't catch on are that it came out too early for much of the JS world, that the Closure library is written verbosely/awkwardly (though there's no need to touch it when using the compiler; they're only associated by name), and that many thought you had to annotate all existing code to use it instead of adding annotations incrementally.
Software development is always about tradeoffs, in an always-changing environment, at a given time. Tools evolve, and so have almost all of the things you mention as drawbacks.
TypeScript hasn't succeeded yet, but it is emerging with great press coverage. It will be interesting to see people's preferences change once they get a feel for how type-safe programming helps not only productivity but also longer-term maintenance.
And in my opinion, the ecosystem doesn't matter that much. Otherwise Java should have won with its coverage of open source libraries... or Perl should have won earlier... or... you get the point. It matters in the beginning, but not in the long run.
I think at this point we just have to wait for good WebAssembly support in all major browsers. Then everyone can have their own ""isomorphic"" stack, and we can finally stop these wars about how to modify JavaScript and which features to introduce, because we will all finally have other alternatives to choose from.
JavaScript currently doesn't provide viable concurrency, which makes translating languages that do have it a problem.
Parsing JavaScript can also be a performance bottleneck for some applications, as JavaScript parsing is somewhat expensive compared with WebAssembly.
Performance and size. You can't get better performance or smaller size than a manually written JS solution, and wasm is supposed to give you both (better performance and smaller payload size). I'm wondering how many people will see value in JS once wasm and compile-to-wasm languages are first-class citizens. I doubt there will be many (excluding JS developers, who may be biased).
> You can't get better performance or smaller size than a manually written JS solution.
Except in cases where the handwritten solution must be transformed to a different pattern (doing essentially the same thing). For example, the handwritten solution may write:
element.innerHTML = '<div><span>x</span></div>';
While the compiler will unpack that, using document.createElement() calls and appending the nodes together. The example may be lame, but most people would prefer a short string over 4+ lines of createElement() calls in their handwritten version. (Same old story as assembly vs. compiled C code.)
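For illustration, roughly what the unpacked output might look like (a sketch, assuming element starts out empty):

    // The DOM-API form a compiler might emit instead of the innerHTML one-liner above.
    const div = document.createElement('div');
    const span = document.createElement('span');
    span.textContent = 'x';
    div.appendChild(span);
    element.appendChild(div);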
A similar pattern may arise for various parts of the JS execution profile and/or browser runtime quirks.
I don't think it's had much of an effect, to be honest. Angular 2 (which has TypeScript going on) isn't even stable yet and seems to have lost a lot of mindshare compared to React (which doesn't have TypeScript going on).