Very few languages operate under the same constraints as js. When you ship js you can't guarantee the version of ECMAScript that the client will be running, or the standard library of DOM functions that will be available (which differ slightly from browser to browser), so you end up transpiling your code to the least common denominator.
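To make that concrete, here's roughly what that transpilation looks like (a hand-written simplified sketch, not actual Babel output; real tools emit temp variables and more guards):

    // Modern source: optional chaining + nullish coalescing (ES2020)
    const name = user?.profile?.name ?? "anonymous";

    // Roughly what a transpiler emits for ES5-only engines
    var _ref = (user == null || user.profile == null) ? undefined : user.profile.name;
    var name = (_ref == null) ? "anonymous" : _ref;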

You also have completely different performance requirements compared to most other languages. If I ship a python app I don't have to worry about reducing the length of variable names to shave off a few bytes, or bundling multiple files together to reduce the number of http requests. Other languages don't need to dynamically load code via http requests, they generally run under the assumption that all of the code is available before execution.
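For anyone who hasn't watched a minifier work, this is the kind of transform being described (hand-written illustration in the style of Terser output, not real output):

    // Before: readable source
    function calculateTotalPrice(itemPrices, taxRate) {
      const subtotal = itemPrices.reduce(function (sum, price) {
        return sum + price;
      }, 0);
      return subtotal * (1 + taxRate);
    }

    // After: same behavior, a fraction of the bytes
    function c(t,e){return t.reduce(function(n,r){return n+r},0)*(1+e)}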

The closest comparison outside of the browser would be to the container ecosystem, which also runs code in an environment agnostic way, and there's plenty of complexity and volatility there (podman, buildah, docker, nerdctl, k8s, microk8s, k3s, k0s, nomad, docker swarm, docker compose, podman compose, et cetera).



> The closest comparison outside of the browser would be to the container ecosystem

And as someone who has worked on both, I can tell you that the container ecosystem is way better and way more deterministic. `Dockerfile` from 10 years back would work today as well. Any non-trivial package.json written even a few years ago would have half the packages deprecated in a non-backward-compatible way!

There is another similar ecosystem of mobile apps. That's also way superior in terms of the developer experience.

> Other languages don't need to dynamically load code via http requests, they generally run under the assumption that all of the code is available before execution.

And that's not what I am objecting to. My concern is that the core JS specification is so barebones that it fragments right from the start.

1. There isn't a standard project format.
2. There isn't a single framework that's backward compatible for 5+ years.
3. There isn't even an agreement on the right build tools (npm vs yarn vs pnpm...).
4. There isn't an agreement on how to do multi-threaded async work.

You make different choices and soon every single JS project looks drastically different from every other project.

Compare this to Java (older than JS!) or Go (newer than JS but highly opinionated). People writing code in Java or Go don't expect their builds to fail ~1-5% of the time. Nor do the frameworks break backward compatibility every few years.


> `Dockerfile` from 10 years back would work today as well.

I highly doubt that any Dockerfile from back then would work if it runs `apt-get` (as many do), as the mirrors for the old distribution versions aren't online anymore.

Dockerfiles can be made to be quite deterministic, but many use `FROM` with unpinned tags and install from URLs that can and do go away.
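For concreteness, the difference looks like this (the digest and package version below are placeholders, not real values):

    # Unpinned: resolves to different things depending on the day you build
    FROM debian:stable
    RUN apt-get update && apt-get install -y curl

    # Pinned: base image fixed by digest, package version fixed explicitly
    FROM debian@sha256:<placeholder-digest>
    RUN apt-get update && apt-get install -y curl=<exact-version>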


Exactly! Dockerfiles are not deterministic. The build artifacts that they produce (images) are, but the same could be said of js build artifacts (which would be a set of compiled and bundled js files).


Having worked on package management in all the verticals you've mentioned, I can tell you that none of what you said is true.

Packages in most ecosystems are fetched over HTTP and those packages disappear. If you're lucky, those packages are stored in a centrally maintained repository like npm, distro repos, etc. If you're unlucky, it's a decentralized system like early Go where anyone can host their own repo. Anyone running builds at scale has caches in place to deal with ecosystem weirdness; otherwise your builds stop working randomly throughout the day.

Re: Go, good luck getting a Go package from 10 years back to compile; imports directly addressed the repository the code lived in! This was a major problem for large projects that literally failed and were abandoned halfway through the dev cycle because their dependencies disappeared.

Re: Docker - Good luck with rerunning a glorified series of shell scripts every build. There's a reason we stopped doing Ansible. When you run simple shell scripts locally they seem infallible. Run that same script over 1000s of consecutive builds and you'll find all sorts of glorious edge cases. Docker fakes reproducibility by using snapshots at every step, but those are extremely fragile when you need to update any layer. You'll go to rebake an image from a year ago to update the OS and find out the Dockerfile won't build anymore.

Apt is a glorified tarball (ar-chive) with a manifest and shell scripts. Pkg too. Each with risks of misplacing files. *nix systems in general all share a global namespace and YOLO unpack an archive followed by running scripts, with the risk of irreversibly borking your system during an update. We have all sorts of snapshotting flows to deal with this duct tape and popsicle stick approach to package management.
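You can see that structure directly (the file name here is hypothetical, and the compression suffixes vary by era: .gz, .xz, .zst):

    # a .deb really is an ar archive: listing its members shows the
    # metadata-and-scripts tarball next to the payload tarball
    $ ar t hello_2.10-2_amd64.deb
    debian-binary
    control.tar.xz
    data.tar.xz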

That package management in pretty much any ecosystem works well enough to keep the industry chugging along is nothing short of a miracle. And by miracle I mean many many human lifetimes wasted pulling hair out over these systems misbehaving.

Go back and read the last two decades of LISA papers and they're all rehashing the same problems of maintaining packages across large systems deployments, with little real innovation until the Nix paper.


> And as someone who has worked on both, I can tell you that the container ecosystem is way better and way more deterministic. `Dockerfile` from 10 years back would work today as well. Any non-trivial package.json written even a few years ago would have half the packages deprecated in a non-backward-compatible way!

As I wrote elsewhere [1], Dockerfiles are not deterministic. The build artifacts that they produce are deterministic, but that would be comparing a build artifact to a build system.

> There is another similar ecosystem of mobile apps. That's also way superior in terms of the developer experience.

Mobile app users have different performance expectations. No one bats an eye if a mobile app takes several minutes to download/update, but a website that does so would be considered an atrocity.

> And that's not what I am objecting to. My concern is that the core JS specification is so barebones that it fragments right from the start.

JS is actually really well specified by ECMA. There are so many languages where the formal specification is "whatever the most popular compiler outputs".

> You make different choices and soon every single JS project looks drastically different from every other project.

The same could be said of any other moderately complex project written in a different language. Look at the TechEmpower benchmarks for Java, and tell me those projects all look identical [2].

> 1. There isn't a standard project format.
> 2. There isn't a single framework that's backward compatible for 5+ years.
> 3. There isn't even an agreement on the right build tools (npm vs yarn vs pnpm...).
> 4. There isn't an agreement on how to do multi-threaded async work.

A lot of the complexity you're describing stems from running in the browser. A server-side js project that returns plain html with a standard templating language is remarkably stable. Express has been on version 4.x.x for literally 9 years [3]. Package.json is supported by yarn, npm, and pnpm. As long as you have a valid lock file and install dependencies using npm ci, you really shouldn't have too many issues running most js projects.

I'm not sure what issues you've had with multi-threaded async. The standard for multi-threading in js is web workers (which are called worker threads in node). The js ecosystem is not like Scala or Rust, where there's tokio and akka. JS uses promises for concurrency, and workers for parallelism.
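For reference, a minimal worker_threads sketch (file names are mine; assumes a reasonably recent node):

    // main.js - spawn a worker and receive its result
    const { Worker } = require("worker_threads");

    const worker = new Worker("./worker.js", { workerData: [1, 2, 3, 4] });
    worker.on("message", (sum) => console.log("sum from worker:", sum));
    worker.on("error", (err) => console.error(err));

    // worker.js - runs on a separate thread, posts its result back
    const { parentPort, workerData } = require("worker_threads");
    parentPort.postMessage(workerData.reduce((a, b) => a + b, 0));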

[1] https://news.ycombinator.com/item?id=35002815

[2] https://github.com/TechEmpower/FrameworkBenchmarks/tree/9844...

[3] https://www.npmjs.com/package/express/v/4.0.0


>Mobile app users have different performance expectations. No one bats an eye if a mobile app takes several minutes to download/update, but a website that does so would be considered an atrocity.

Well if it updates in my face I'd be pretty annoyed. The mobile app thing only works when they update in the background/transparently.


Well yeah, if you had to wait for apps to update before you could use them you'd definitely be annoyed, but the beauty of mobile (and desktop) apps is that users don't expect to constantly be running the latest version of a given app, which means you can slowly update large apps in the background.

When you visit a website you expect to always be running the latest version of that website. In fact, most users aren't even consciously aware of the fact that websites have versions at all.


> When you ship js you can't guarantee the version of ECMAScript that the client will be running, or the standard library of DOM functions that will be available (which differ slightly from browser to browser), so you end up transpiling your code to the least common denominator.

Isn't that the same as shipping native binaries? You don't know what OS version or libraries it will run on. That's why you do stuff like link with the oldest glibc you want to support.
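One common way to do that, sketched (the image tag is just an example): build inside an old-distro container so the binary links against that distro's older glibc and runs anywhere newer.

    docker run --rm -v "$PWD:/src" -w /src gcc:4.9 \
        gcc -o myapp myapp.c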


The main difference between shipping a binary and a js file is that users don't expect binaries to be small, which means you can usually ship an entire runtime with your binary. If you shipped every single js polyfill with your website, performance would tank. You also generally differentiate between downloading a binary and running it, and users will tolerate a loading spinner while a massive binary downloads. Webpack will emit a warning if any of your build artifacts are larger than 244KB, whereas a 244KB binary would be considered anemic.
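Those thresholds are configurable via webpack's performance hints; the values below are the documented defaults (250000 bytes ≈ 244KB), spelled out explicitly as a sketch:

    // webpack.config.js
    module.exports = {
      performance: {
        hints: "warning",           // or "error" to fail the build instead
        maxAssetSize: 250000,       // per-asset budget, in bytes
        maxEntrypointSize: 250000,  // per-entrypoint budget, in bytes
      },
    };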


> users don't expect binaries to be small

That seems to only be a "modern times" thing.

Prior to that, minimising the size of shipped programs (binaries, images, doc files, etc.) was an important part of release management.


Binaries were definitely leaner in the past, but there's always been that dichotomy between downloading software and running it.

In the browser, users expect software to be available instantly, and that constrains how you build webapps. Users will tolerate the Google Maps app taking a few minutes to download, but they won't accept the Google Maps webapp taking several minutes to load in a browser.



