JSR: The JavaScript Registry (jsr.io)
245 points by slymax on March 1, 2024 | 178 comments



As a long-time front-end developer, I'm not seeing a strong value proposition here to justify the further fragmentation another package registry is going to cause.

> You publish TypeScript source, and JSR handles generating API docs, .d.ts files, and transpiling your code for cross-runtime compatibility.

This sounds like another build service I don't control. There's already too much magic in publishing transpiled TypeScript packages, but at least the current tools let me control exactly what artifacts get pushed out.

> web-standard ECMAScript modules

The "web standard" part is meaningless considering that most production websites will bundle the files together as part of their build/optimization process for size and loading speed, leaving only a giant chunk(s) that resembles nothing like the original ES modules.

> JSR isn't a replacement for the npm registry; it's a superset of npm.

A superset of npm means that the packages created for this registry will not work in the npm ecosystem. What is this if not a "replacement" of the npm registry and further fragmentation of the JS ecosystem?

> JSR modules can be used with any JavaScript package manager, and in any project with a node_modules folder.

This makes the already complicated module resolution logic in most bundler/packaging tools even more complicated as they must account for the intricacies of another package manager. A house of cards being stacked on another house of cards...

> Module authors can count on great editor support from strongly typed modules, without the need to transpile and distribute typings manually.

The website seems to insist that distributing typings is a complicated process when it is just the .d.ts files bundled with the published package and an additional entry in the package.json file.

> Easy publishing with a single command - the CLI will walk you through the rest

It seems like every benefit that this project offers can be fixed in the current ecosystem by better client-side tooling that enforces standards during the publishing process while keeping full backward compatibility with over a decade of packages published and without fragmenting the ecosystem.


In addition to some of the other answers already:

1. The focus on typescript and typings distribution is useful/interesting for the "promise" that all packages in the registry have typings. If you can trust that every package in JSR has types then you avoid/simplify that current cycle of: npm install a package, find out it doesn't have types, try to install a package of the same name with @types/ in front of it, find out it doesn't exist, look for an alternative library with types or write your own types or give up and `declare module "thatpackagename"` any-type it.

2. Just because "most" production websites do bundling doesn't mean that needs to still be the status quo in 2024. Bundling is increasingly a "premature optimization" and you might not need it. It can be worth investigating whether your site/app works well enough with just ESM delivered straight to browsers versus what you "gain" from bundling. You might be surprised at how well current browsers can load ESM and the optimizations they already do for you.

But even if you do still need bundling, ESM is a huge advantage over other formats (especially CommonJS), and every current bundler produces better, more optimized output the more ESM you provide. Their outputs also increasingly look like ESM, and if that isn't the current default it is the "soon" default. esbuild in particular (which also backs some of vite's tools and others) has an ESM bundle output that is delightful; it isn't yet the default ("soon") but is a simple "format" switch away. Those ESM bundles are great in today's browsers.

ESM matters a lot to bundlers, bundlers are increasingly ESM in their internals and output, and browsers are great with ESM. The "web-standard" part is pretty meaningful today and getting stronger.


Where can one read up on [2.]?


I got into some details on how I do light-weight apps (especially during development) without bundling in a nearby comment as well: https://news.ycombinator.com/item?id=39564986

To resummarize: sometimes it is fine to prune and ship node_modules to your web server, you can use import maps to handle "node-style 'bare' imports", and you may only need to "spot bundle" to ESM a single dependent library or two that doesn't ship ESM that will directly work in the browser (which should be a decreasing list).
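For example, an import map like this lets node-style "bare" specifiers resolve straight in the browser (a sketch; lodash-es is just an example of an ESM-ready package):

    <script type="importmap">
    {
      "imports": {
        "lodash-es": "/node_modules/lodash-es/lodash.js"
      }
    }
    </script>
    <script type="module">
      // the bare specifier now resolves with no bundler in the loop
      import { debounce } from "lodash-es";
    </script>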

After that it's a matter of using your browser's Dev Tools to tell you what your performance looks like and where your bottlenecks are. That can tell you a lot to help you determine if you really need to bundle more of your site/app or if you can spot-bundle/adjust an import map for just a specific sub-graph.

With HTTP/2 and HTTP/3 the per-file "penalty" is a lot smaller, though it still somewhat exists. Even on some properly configured HTTP/1.1 servers it is not always as bad as history tells us it was (properly configured HTTP/1.1 did support some connection sharing). About the only remaining thing a bundle might buy you on HTTP/2+ is maybe better behavior with compression on the bundle as a single whole, but even that isn't a given, because Brotli's dictionary was designed to work at the whole-connection level and Brotli is often the default compression in HTTP/2+.

But again, your conditions may vary based on your real code and the specific servers and browsers you need to support, so leverage your browser Dev Tools and check the real numbers.


> This makes the already complicated module resolution logic in most bundler/packaging tools even more complicated as they must account for the intricacies of another package manager. A house of cards being stacked on another house of cards...

JSR is a registry, not a package manager. npm allows you to add multiple registries and JSR provides an npm API endpoint. From npm's point of view packages coming from JSR are the same as those coming from the npm registry itself. The only difference is where the package is coming from, but all semver resolution, etc are done as usual by whatever package manager you use.

Disclaimer: I wrote a good portion of the npm tarball generation code.


I'm really curious how you handle typescript versions and options that affect emit


- `isolatedModules: true` is required
- Decorators: standard spec decorators only
- JSX: tsconfig options are lifted into JSX pragma comments on publish (configuration goes into the code so that consumers can still emit based on these options)

TypeScript has very very few options that affect emit, and effectively everyone has settled on one subset now.
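In tsconfig terms that subset looks roughly like this (a sketch, not an exact config):

    {
      "compilerOptions": {
        "isolatedModules": true,          // each file must be transpilable on its own
        "experimentalDecorators": false,  // standard spec decorators only
        "jsx": "react-jsx",               // lifted into a pragma comment on publish
        "jsxImportSource": "preact"       // e.g. /** @jsxImportSource preact */
      }
    }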


> TypeScript has very very few options that affect emit, and effectively everyone has settled on one subset now.

Very confused by this statement. There's lots of options that affect emit. And what do you mean a subset has been settled on? Take any 2 TypeScript projects and I'd wager there's a high chance they have options that emit code differently.

Config aside, TS emits .d.ts that is not always backwards compatible with older TS versions. Does jsr support `typesVersions` for such situations?


Another comment in the thread makes a very good point:

> Man, if only they'd debuted with this, but, hindsight and all. My only thing is that TypeScript has no spec, standard, or competing implementations so it seems a bit risky to treat it as a first-class citizen as far as publishing packages goes.

Add to that the frequent compilation issues that happen as TypeScript adds or removes types in the global imports, changes language conventions, and updates compilation checks whenever a new version gets released.


Look at the comment under the comment you mention - those issues are about the TS type system, NOT the TS syntax. TS syntax is highly stable, and is the only thing JSR relies on :)


How is that true if the npm compatibility layer serves compiled js and .d.ts files?


Both `.d.ts` and `.js` files can be created without understanding the TS type system (this is what JSR does). Creating both of those is just a TS syntax transform - one that is highly stable (esbuild and friends all do it already).


I know .js can be emitted as an AST transformation, but I'm surprised by .d.ts files. Even inferred types?


We do some basic inference, but that's why we require explicit return types in the public API, see https://jsr.io/docs/about-slow-types . TypeScript itself will ship with a new `isolatedDeclarations` option in the next release which is the same thing. The point is that no inference other than simple literal inference needs to be done to get the type of the public API. That way generating .d.ts files can be done by any tool by merely looking at the syntax without type inference.
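A quick sketch of the difference:

    // "slow type": the return type must be inferred, so every consumer
    // (and every .d.ts generator) needs the full type checker
    export function createClient(url: string) {
      return { url, fetch: () => fetch(url) };
    }

    // explicit boundary type: emitting the .d.ts is now a pure syntax transform
    export interface Client {
      url: string;
      fetch(): Promise<Response>;
    }
    export function createClient(url: string): Client {
      return { url, fetch: () => fetch(url) };
    }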


> The "web standard" part is meaningless considering that most production websites will bundle the files together as part of their build/optimization process for size and loading speed, leaving only a giant chunk(s) that resembles nothing like the original ES modules.

I mean, the code itself will be unreadable, but modern build output chunks use regular ESM modules, at least for Vite, Rollup, Parcel and so on. Webpack might be a different story, but still.


> This sounds like another build service I don't control. There's already too much magic in publishing transpiled TypeScript packages but at least the current tools let me control exactly what artifacts get pushed out.

For tools that natively support TypeScript, you can directly consume the TypeScript source code. No transpilation necessary. The complexities are only introduced (well not really introduced, just moved somewhere else) for tools that do not natively support TypeScript.

> The "web standard" part is meaningless considering that most production websites will bundle the files together as part of their build/optimization process for size and loading speed, leaving only a giant chunk(s) that resembles nothing like the original ES modules.

That doesn't matter. If everyone settles on one standard, even if you do post-process, tools become simpler. The idea is not that you must ship your code unbundled to the browser. It is to reduce the amount of complexity incurred in the process from source -> dist. I am sure you agree that the lower the difference between source and dist, the easier debugging and development in general are :)

> A superset of npm means that the packages created for this registry will not work in the npm ecosystem. What is this if not a "replacement" of the npm registry and further fragmentation of the JS ecosystem?

They do work in npm. In npm packages you can depend on JSR packages, and on JSR you can depend on npm packages.

> This makes the already complicated module resolution logic in most bundler/packaging tools even more complicated as they must account for the intricacies of another package manager. A house of cards being stacked on another house of cards...

Nope! JSR is not a package manager. JSR is only a registry. When you install into a node_modules folder, the layout in this folder is the same between JSR and npm packages. So any downstream tool does not even need to know JSR is in the loop - it just works. This is one of the key points to ensure easy adoption of JSR by anyone.

> It seems like every benefit that this project offers can be fixed in the current ecosystem by better client-side tooling that enforces standards during the publishing process while keeping full backward compatibility with over a decade of packages published and without fragmenting the ecosystem.

I understand this.

The problem is the following: npm is owned by GitHub/Microsoft, and GitHub/Microsoft does not care about npm. To GitHub, npm is a cost center. npm has no dedicated product manager at GitHub. Last I heard, npm has _two_ dedicated engineers working on it. The last real new feature npm shipped was package provenance (last year). It took npm 5 years to ship the file browser (because GitHub/Microsoft did not care about npm, and thus effectively froze all feature dev work on npm).

If we want innovation in this space, we need to do it in a way that a) keeps backwards compat with npm (which JSR does, both ways), but b) moves away from npm in the long term. JSR is an attempt at this (while trying to stay within the constraints posed).


Thank you. I stand corrected on some points. The messy ecosystem of npm, yarn, pnpm, JS, TS, ES, and ESM has cost me a good amount of white hair.

A good source of my confusion came with not immediately recognizing that a new package registry does not require a new package manager to work with. I think the "jsr" commands in the code samples and the jsr.json make it feel like there's a new package manager in play here. If the team has made it so that npm works with "jsr" and vice versa, kudos to them because that's a big win for the users.
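(For anyone else who was confused: the jsr.json appears to be just package metadata, something like this sketch with assumed values:)

    {
      "name": "@scope/my-package",
      "version": "0.1.0",
      "exports": "./mod.ts"
    }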


As far as I can tell, the "npx jsr add @luca/flag" command given for using JSR with Node itself calls "npm i @luca/flag@npm:@jsr/luca__flag" and is equivalent. You get a normal node_modules+package.json+package-lock.json output that the npm command is fully able to work with.

It looks like JSR re-publishes all of its packages translated into JS + .d.ts files onto npm under the @jsr scope. Regular npm packages have no trouble depending on JSR packages because they just depend on the npm-published version of them.


yup, we added the `jsr` tool mostly so that you don't have to be aware of the @jsr scope that's used under the hood. We're not publishing to npm though, but rather mapping the @jsr scope to the JSR registry in the project's .npmrc file.
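Concretely, after `npx jsr add @luca/flag` you end up with something like this (version illustrative):

    # .npmrc
    @jsr:registry=https://npm.jsr.io

    // package.json (npm alias syntax)
    "dependencies": {
      "@luca/flag": "npm:@jsr/luca__flag@^1.0.0"
    }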


CORRECTION:

An employee of GitHub has reached out to me to inform me that there are more than 2 engineers working on npm full time, including multiple full time support staff and full time staff to review possible malware. The engineering team working on npm is "substantially larger than 2". Additionally they informed me that a new product manager for npm will be starting tomorrow.

Even though I made no claim to the contrary, I would like to emphasise that there are obviously current product managers at GitHub that product manage npm as _one of_ the projects they manage.


You’re 110% on point.

Publishing is too complicated? Let’s add something else: https://xkcd.com/927/

This is all node and npm’s fault, they’re sinking billions of dollars of developer time worldwide and they should be called out on it.


The Deno announcement post of their new JavaScript registry: https://deno.com/blog/jsr_open_beta


> JSR: The JavaScript Registry

Then on the website:

> Made for TypeScript & ESM

> JSR is designed for TypeScript

> You publish TypeScript source

Seems the <title> tag needs an update to reflect what this really is :)


This is probably a marketing bug on our part. While we did want to design for TypeScript from the outset, you can definitely happily write and publish plain JavaScript code on JSR. We probably need to do a better job explaining and featuring this.


and the name... I suggested TSR (the TypeScript Registry)


The project is called "The JavaScript Registry"

> The JavaScript Registry (JSR) is a modern package registry for JavaScript and TypeScript.

https://jsr.io/docs/introduction


Yeah, this is what I'm complaining about.

If you're building something that is designed for TypeScript, made for TypeScript and publishes TypeScript sources, then I find it really hard to understand why you'd name it "The JavaScript Registry".


Presumably because they don't want to imply that it won't work if you don't use TypeScript.


I don't think it will, unless you use the npm compatibility layer.


An npm compatibility layer doesn't sound like it has anything to do with TypeScript? I'm assuming that you can use JavaScript anywhere you can also use TypeScript.


The jsr registry serves TypeScript, so you have to use TypeScript tooling; maybe tsc, but it could be esbuild, Deno, whatever.

The npm compatibility layer compiles typescript to JavaScript on the registry so you can use it without any typescript tooling.


Ah! Looks like it does indeed:

> Unlike with native JSR imports, you are not directly importing TypeScript code. Instead JSR transpiles the TypeScript code to JavaScript before it is installed into your node_modules directory. This generally means that your editor experience will suffer, because “Go to definition” and other features will link to transpiled JavaScript code, or to generated .d.ts files.

https://jsr.io/docs/npm-compatibility#limitations

That said, I'm assuming that if you use Deno, you can write plain JS as well (i.e. the TS library will still work with JS). So yes, you'll need to use the npm compatibility layer if you don't use Deno, but those are the only supported use cases.

So I think they still don't want to imply that you have to use TypeScript if you use JSR in the runtimes it's targeting.


Any valid JavaScript is automatically also valid TypeScript, though.


because TS doesn't exist without JS?


When would I want to use JSR? The "Why JSR?" section simply says JSR is a superset of NPM that can be used with any javascript package manager. But NPM itself can also be used with any javascript package manager as far as I know. It seems like the benefit they're trying to add is strict type support for packages, but the fact that it's a superset of NPM means all existing packages that lack types are still supported by JSR. So, what benefit is so great that it's worth segmenting the ecosystem?

Side rant: not at all a fan of the decision to punish libraries ("slow types") that use type inference by reducing their score.


In terms of "when you would use it" - it doesn't necessarily solve a different problem than npm. I think it arguably solves the same problem more effectively (though it is certainly early days and npm still does a lot of things JSR doesn't). I also don't know that we're segmenting the ecosystem, as JSR packages interoperate with npm packages, and you can use one from the other. JSR will end up being additive to npm.

For module authors, we're hoping JSR will be helpful in the following ways:

1.) You can develop and publish TypeScript source, and let JSR handle transpilation and generating .d.ts files for runtimes that don't natively support TypeScript. Especially nice if you are using Deno or Bun (that do natively support TypeScript), and don't have tsc in your workflow otherwise.

2.) JSR generates API docs for you on your package page based on your source code and comments.

3.) JSR has a great DX around publishing packages from GitHub Actions using OIDC (no juggling tokens)

4.) JSR automatically provides provenance for published versions of packages

For module consumers, it helps too:

1.) Compatible with both Deno and existing npm-based projects

2.) Package info and docs provided centrally on the jsr.io site

3.) Quality scores that encourage authors to make their packages fast and well documented

4.) Access to TypeScript source for packages (not just transpiled output)

Absolutely more to be done, but we're hoping that JSR will provide enough extra value to both module authors and consumers that both will prefer to use it when possible.


Sounds like a good value prop. One thing that I'm hoping is to reduce dependency on a single private company, npm inc., in publishing and distributing modules. But you didn't list this as a value prop. (I notice other posters are upset at the 'fragmentation' of module distribution, but I see this as a strength, not a weakness.)


> not at all a fan of the decision to punish libraries ("slow types") that use type inference by reducing their score.

Do you feel that using type inference doesn't actually reduce performance, or just that it's not a big enough problem to warrant a reduced score?


The performance of type inference matters for runtimes like Deno which work with TypeScript files directly, rather than using the transpiled .js + .d.ts files. This has several benefits in that we can jump to the source on "go to definition" rather than some random .d.ts file, among other things. Deno uses the original TS files for type checking as well, and that's why inference also affects type checking performance in some runtimes.

By ensuring explicit return types in the public API, the generation of .d.ts files is turned into a mere syntax transform, rather than requiring the tsc compiler. This is something TypeScript will ship with in the next release itself. They're adding a new `isolatedDeclarations` option which is the same thing.


Type inference in dependencies does not reduce performance for users, but just for JSR. Here they’re going “the Google way” by adding some random metric that benefits them, not users. Once generated, .d.ts files do not contain inference.


One thing I'm not sure about when reading their policy is what'll happen in another kik situation. If someone were to claim the scope Microsoft (for example) that had no relation to Microsoft and then Microsoft came along and wanted the scope what would happen?

If MS got the scope what would happen to the packages in that scope, or if MS didn't get it what is done to clearly communicate that this is an unofficial account (presumably) asking on MS' behalf? In this early on period I'd expect a lot of people to claim the scopes of notable companies and these companies might take issue with that if they choose to use jsr down the line.


We do intend to take a more editorial approach to scopes, and assign scopes to users in a way we think is more intuitive for end users of JSR. We have reserved some obvious scope names already, but in the future, we'd likely entertain requests to reassign ownership of scopes for the benefit of the broader user community (as in the case of a brand owner requesting ownership of their brand name).

So in the case that a user published "@cocacola/foo", previously published versions of "@cocacola/foo" would remain available indefinitely (unless they were found to be malicious), but we would likely be willing to assign ownership of the "@cocacola" scope to a representative from that brand/company if they asked for it and we could verify their identity. The original author of "@cocacola/foo" would need to publish the module going forward under a different scope.


> but in the future, we'd likely entertain requests to reassign ownership of scopes for the benefit of the broader user community (as in the case of a brand owner requesting ownership of their brand name).

So you're basically committing to repeating the Kik drama? For reference: https://en.wikipedia.org/wiki/Kik_Messenger#Open-source_modu...

It would be great to find a way of structuring these registries/repositories in a way so there wouldn't be any name collisions, and also avoid the built-in support for companies to take names away from individuals.


Thankfully JSR won't be capable of a left-pad situation where packages can be unpublished - published packages are immutable[1].

As for the potential for disagreements over whether or not a scope should be transferred, that is a big reason why we want to figure out community involvement in governance sooner rather than later. We are gathering potential volunteers who want to discuss becoming a community moderator - if anyone would be potentially interested, they can sign up to join that conversation[2].

[1] https://jsr.io/docs/immutability [2] https://jsr.io/go/moderator


There are only two ways to do it:

1. Use UUIDs.

2. Use some pre-existing centralized name registry (e.g. domain names).

I don't know why the JS ecosystem is so resistant to either solution. Both are proven options (#1 used by COM, #2 used by Java). They do mean longer package names, but surely that's a small price to pay for a resilient future-proof solution? And besides, who really cares about long dependency names and why?
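Concretely, the two shapes look like this (values illustrative):

    6fa459ea-ee8a-3ca4-894e-db77e160355e   <- #1: opaque UUID as the package id (COM-style)
    com.google.guava:guava                 <- #2: DNS-rooted coordinates (Java/Maven-style)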


In the example of "@cocacola/foo" would it allow for the "@cocacola/foo" package to be updated with new versions by the new owners? Or would the foo package essentially be archived and read-only from this point on?


The new scope owner would be able to update "@cocacola/foo" and publish new versions (previous versions would be unaffected).


I'd suggest they require a DNS entry to use a scope and all scopes have a one to one relationship with a domain.


What happens if the domain registration expires? Scope takeover?


Domain expiration is rare, and virtually only happens when the related projects are dead anyways; freeze existing packages (no more updates until the original key for the domain comes back online, with manual override by registry administrators for edge cases) and have a reasonable waiting period (a couple of months to a year) before allowing the new owner to use the namespace (with different project names within).


How do you deal with individuals that may not have a domain? What do you tie their scope to?


Use URN instead of just a domain name. A domain name (with some schema) is obviously a subset, but anyone who doesn't want or need one can then use uuid: URNs.

The XML ecosystem did it that way for namespaces, and I think that it is still the most flexible and the most future-proof approach, since you can always add new schemas as needed.
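e.g. two namespaces that can never collide (values illustrative):

    xmlns:a="urn:uuid:f81d4fae-7dec-11d0-a765-00a0c91e6bf6"
    xmlns:b="https://example.com/2024/schema"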


Subdomains on a service that will do authentication for cheap or free?

Your packages could be published for lucacasonato.imbroke.com


This is a joke: GitHub account name. If it is good enough for Golang, …


Doesn't this get us right back to where we started from? Where you don't want users to be able to register arbitrary names :)


Not if the system is built for DNS first. If a company/individual/organisation has a domain name, they are expected to use the domain. It is much less of an issue if someone has kik.users.registry.com when the company publishes packages under kik.com.


Correct. That's a big part of why it was a joke. (It's also a contributing factor in why I wonder if golang itself is in fact a prank.)


Love this idea, defer the IP problems to DNS where everyone already has expectations set.


Kind of jarring now that the standard library documentation on the Deno website just redirects to JSR.

Speaking of which, the new deno.com site looks like a marketing website designed by an AI. Lacks all of the pragmatic unix-y character that Deno originally had, and is just filled with meaningless numbers and metrics and way too much blank space. It's like they're selling a product... which I guess is what Deno is now. VC funding will do that to you.


I hate modern day landing pages. Even those single page websites where some animation plays while scrolling down were more interesting.


After reading the "Why JSR?" article I still don't understand why this exists.

It looks like it has a few niceties but also some limitations. To catch on you need to convince some significant segment of both package publishers and consumers to switch to it. That means there needs to be some killer feature they just can't get with the existing registries, and I'm not seeing anything like that.


> After reading the "Why JSR?" article I still don't understand why this exists

The cynical answer: they realized it's actually advantageous to have a centralized package registry instead of importing from random urls, and as a for-profit VC backed company they'd rather not have their entire ecosystem depend on the Microsoft-owned NPM.

The optimistic answer: having a second registry makes the entire JS/TS ecosystem less fragile by not having a single point of failure.


Kevin from the Deno team here - happy to answer any questions you have today!


I am not a user of Deno, but I know that HTTPS module imports were one of the original selling points of Deno compared to Node/NPM. Was that stance revised at some point, such that a package repository is needed? What's the back story?


HTTPS imports will continue to work and be supported in Deno. However, as we observed their usage in the wild, a couple problems became clear:

1.) Duplicated dependencies - projects would often download multiple versions of the same dependency, because there was no deduplication happening based on semantic versions.

2.) Disappearing dependencies - under some circumstances, an HTTPS import URL would be unavailable, causing code dependent on these modules to break.

A central package repository could solve for both of these problems. We considered (and very nearly chose) to just use npm, but it introduced functional and UX problems we wanted to solve for users. So we set about building JSR, and did so in a way that wasn't tightly coupled to Deno (JSR didn't need to be - it is useful in the context of any JavaScript runtime).
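To illustrate (1), these two files would each fetch and cache their own copy of the dependency, even though a semver-aware resolver could have served one (versions illustrative):

    // a.ts
    import { serve } from "https://deno.land/std@0.100.0/http/server.ts";

    // b.ts - a different pinned version, so a second full download
    import { serve } from "https://deno.land/std@0.101.0/http/server.ts";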


> a couple problems became clear: 1.) Duplicated dependencies

I’m sorry but that did not come up when designing the system? It’s the whole principle of semver ranges that deno decided to do away with. I cannot believe nobody saw this as a drawback. It’s the first thing I thought when I saw hardcoded versions.


This is a solid reason to build it. Makes sense to me.


In case you are not aware (maybe you are) node has experimental support for http imports. I personally think this feature is a disaster for many reasons, but if you want to use it in your toy apps it is there in node.


> I acknowledged that HTTPS module imports were one of the original selling points of Deno compared to Node/NPM

I imagine most Deno users considered this a weird wart or footgun; the rest of Deno looks interesting.


I’m curious what this means in regards to this existing “use Deno to also publish to npm” workflow (https://deno.com/blog/dnt-oak) which I was considering using, and then the JSR announcement happened.

The thing I like about the blog post and the mentioned dnt tool is it takes care of lots of bullshit you otherwise need to figure out when publishing to npm.

What is the relationship between JSR and DNT in this case? Should one be used over the other? Together?

If I want to publish a module so it’s available for Deno and NPM, what is the recommended approach now?


As JSR develops, you can expect to see more features like those of dnt start to show up in the npm compatibility layer. We've also been exploring how to create a good DX around simultaneously publishing JSR modules to npm, so publishers can control their namespace there as well. We definitely know it's a usage pattern folks are interested in.

In the immediate term, dnt is still a very strong choice for people that want to develop modules in TypeScript using Deno, and then publish them to npm. In the fullness of time, I expect that JSR will provide a pretty complete solution to this problem as well.


Man, if only they'd debuted with this, but, hindsight and all. My only thing is that TypeScript has no spec, standard, or competing implementations so it seems a bit risky to treat it as a first-class citizen as far as publishing packages goes.


The TypeScript syntax is very stable - there is no spec, but it's very stable. The type system on the other hand: not very stable, and yeah I agree - that would be risky to rely on.

However as JSR only relies on the syntax, not on the type system, it's no problem :)


Isn't the 'slow types' problem[0] a type system dependency, or am I missing something?

[0]https://jsr.io/docs/about-slow-types


It does not use TSC / do type analysis. This is purely syntax analysis.

The "slow types" restriction specifically exists because JSR does not use TSC or do type analysis :)


Ah, I see


I'm still surprised by so many projects betting on TS.

ES will introduce types at some point and all this effort around TS will again divide the JS community.


Technically ES did introduce types already: in the "lost version" ES4. Typescript's syntax was partially inspired by ES4, and while not "compatible" with the previous type system, there is more compatibility in the syntax than you might expect.

The Type Annotations [0] proposal that TC-39 (in charge of ES standardization) has kept in front of them (at Stage 1 so far) takes the "Python approach" of standardizing the syntax but not the type system still leaving that to external tools (in part because of the lessons from the mistakes of ES4 and the type system that was standardized but never really adopted). The proposal basically assumes that most types will (still) be in the Typescript type system (and would still need Typescript or compatible tool to type check) and maximal (but not complete) compatibility with Typescript syntax is a goal of the proposal.

But you don't have to take my word on that: the FAQ at the bottom of that Type Annotations proposal is well written and quite detailed.

[0] https://github.com/tc39/proposal-type-annotations
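The gist of the proposed syntax (a sketch):

    // Valid JavaScript under the proposal: the annotations parse but are
    // ignored at runtime, like comments. An external checker such as tsc
    // (or Flow, or Closure) is what gives them meaning.
    function stringsEqual(a: string, b: string): boolean {
      return a === b;
    }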


I used ActionScript 3 back in the noughties and was so excited for ES4 in the browser. Adobe open-sourcing Tamarin made it seem it was a done deal (I didn't follow the politics). I was so disappointed when it was killed. It did mean that when I tried Angular 2 the best part of a decade later I was delighted with how familiar TypeScript felt.


I've been complaining about the death of ES4 since it was killed back in 07 or 08.

The proposal might have compat types with TS but what about all the other TS features like generics. Now more than ever, it seems risky to bet the house on TS.


TC-39 isn't "betting the house on TS": the proposal is generic and intentionally doesn't define types; it defines syntax for annotations of types. It's basically "type information may appear between a colon after an identifier and an equals sign or other terminator", "type is a statement keyword with type information following it", and "interface is a block keyword with type information inside that block". That's pretty much the entirety of the proposal: it doesn't state what that type information is (just parsing limits on what it cannot be, so that it doesn't blow up language parsing), what it is used for (other than that the general idea at runtime is that type annotations are comments just like existing jsdoc comments), or who uses it. While the proposal is trying to stay syntax compatible (again, not types compatible, because the proposal doesn't include types) with Typescript, it also points out existing overlaps with Closure and Flow type syntaxes and is built so that both of those can use the new syntax for their own, different type systems.

This is why it is a Python-inspired proposal: Python didn't define types, Python defined "here's some places in the language that are effectively ignored like comments that type systems are expected to use". Python has made them more than "just" comments in later proposals and TC-39 holds the right to do that for ES/JS in the future but the current proposal stops about where Python's first proposal did at "types are just fancy inline comments for compilers like (but not limited to) Typescript/Closure/Flow".

Again the proposal itself is a really good read for specifics of where it looks like Typescript and what parts of Typescript are intentionally not supported. For instance as a good reminder there are only three (!) features of Typescript left that aren't ES-standardized and generate code: enums, namespaces, and class parameter properties. Of those, both enums and namespaces have been marked deprecated for a long time and many lint rules exist to help eliminate them from your code if you are still using them.


Is that a good reason to write in raw JS today?

I'm not betting anything on TypeScript. It was released 11 years ago, and it's incredibly valuable to me. If it takes another 5 years for ES to implement first-class types, and it's a good implementation, I'll start using ES for new work at that time. What's lost here? TS won't just stop compiling.


I'm not referring to end users like yourself but projects like Deno who have invested a fair amount of resources into supporting TS.


Oh, yes, that makes more sense.

Still, native typing is not even on the horizon for ES, so I think my point stands for businesses that are heavily invested in supporting TS. TS is what we need today, it doesn't make sense to ignore it because theoretically it could be obsolete in 5-10 years.


If and when it does, why do you think it wouldn't just be a subset of TS? Why reinvent the wheel when you have a perfectly good one that's been around for 12 years and is proven in production?


Consumers don't touch any ts, outside the (technically optional) .d.ts

The way I understand it, in theory the TypeScript version a library uses could disappear forever the day after the library had been published to JSR and all would be fine.

Until someone wants to maintain the code of course, but then you'd have the exact same problem with the elusive TypeScript version if the registry only took compiled output.

What JSR would break (or make a deliberately difficult path?) is using some private fork of typescript that isn't even available on the day of publication.

The benefit of JSR, as far as I understand it, is offering an easy path to publication that makes the type parts not an extra that some might skip but a reliable default. It does make me wonder, however, how long-term reliable the funding will be: JSR promises more service than registries before, while also being more tool agnostic. I'm not sure if it's true, but my impression was that in many cases, the registry was primarily bankrolled by however the tooling was funded, e.g. effectively becoming a marketing expense to selling tooling expertise. JSR appears particularly removed from any of that, and it's bad enough when the repository fades away in terms of static hosting; it's worse if it also leaves your library without a build process.


Our team at Socket (disclosure: I'm the founder) wrote up an excellent overview of JSR and everything we know so far about it here: https://socket.dev/blog/jsr-new-javascript-package-registry


I can't even tell that I'm blocking anything, but: "You are offline. This site requires an internet connection."


> JSR isn't a replacement for the npm registry; it's a superset of npm.

I first read this line to mean that JSR contains every package on NPM, and was just doing some post-processing on them. But it does seem to be its own registry. Maybe the intent of the line was to communicate that you can install packages from both?


I think this is mostly right - "superset" in that JSR modules can depend on npm modules, and projects using npm can use the npm registry and JSR together. The bottom line we'd want to communicate is that JSR is additive to npm, and the two can be used at the same time.


The usage of the "superset" expression is very confusing to me and I bet to many others.

I would recommend to use other terms such as "Additive" or "Complementary" to describe JSR.


That's not really what "superset" means, so I think you might want to change the wording.


JSR is almost literally a subset of npm in features. Npm allows publishing of anything, JSR only actual TS/ESM. Whether those modules have dependencies doesn’t expand the set IMHO


> I think this is mostly right - "superset" in that JSR modules can depend on npm modules

This is not what most people would think when a registry is said to be a “superset” of another.


Seeing Bun/Node/Deno support reminds me of a question: are there any good resources on writing JS that works in all 3 and the browser? For example if I wanted to write a simple filesystem API that would work the same across bunodeno?


I think you'd have to write a facade package with a consistent API that mapped to the node/deno/bun equivalents, since they're each quite different.

Best bet is to use the Node fs package and rely on the deno/bun compatibility layer.

More generally, Deno pushes for Web Platform APIs, and as more are proposed and implemented the runtime-specific APIs will become fewer and fewer.


Yeah this is basically what I'm trying to do (see response to sibling). Unfortunately I consider Node's APIs the worst of the 3. No shade on them it's just an older design. I really like Deno's approach since my code also needs to work in the browser. Writing HTTP handlers that take Request[0] and return Response[1] is a beautiful bit of symmetry with frontend code and feels like cheating.

[0]: https://developer.mozilla.org/en-US/docs/Web/API/Request

[1]: https://developer.mozilla.org/en-US/docs/Web/API/Response
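e.g. one handler shape for everything that speaks web-standard Request/Response (a sketch):

    const handler = (req: Request): Response => {
      const { pathname } = new URL(req.url);
      return new Response(`hello from ${pathname}`);
    };

    // Deno: Deno.serve(handler)
    // Bun:  Bun.serve({ fetch: handler })
    // Service worker: addEventListener("fetch", (e) => e.respondWith(handler(e.request)))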


If you use Node 20+ then there's a lot more compatibility with web platform APIs, so it is possible to write code that works on Node, browser and WinterCG-compatible runtimes like Deno. Obviously you'll need to avoid using Node builtins.


Bun’s goal is to be an actual superset of node, so you can write just for node and expect it to work in bun.

As for deno, I think if you publish ESM you’re mostly covered since deno can install from the npm registry.


You can't, and for good reason. Deno has slightly different APIs than Node. Bun also has subtle differences between their standard library implementation and Node's, but afaik, they're getting there.


I think maybe I was unclear. I'm talking about writing libraries that abstract across these differences and provide a single API, as sibling describes. I already know it's possible. I made a simple filesystem abstraction here[0] and a very simple HTTP library that uses it here[1]. They both work in Node/Deno and the browser. Unfortunately I ran into issues with Bun's slice implementation[2], but that should be fixed eventually.

My overall question is that I suspect there's a much better way of detecting and using the different backends, and I'm wondering what techniques are out there.

[0]: https://github.com/waygate-io/fs-js

[1]: https://github.com/waygate-io/http-js

[2]: https://github.com/oven-sh/bun/issues/7057
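For reference, what I do right now is plain feature detection, roughly:

    // crude but workable runtime sniffing (a sketch, surely not the best way)
    const g = globalThis as any;
    const isDeno = typeof g.Deno !== "undefined";
    const isBun = typeof g.Bun !== "undefined";
    const isNode = !isDeno && !isBun && !!g.process?.versions?.node;
    const isBrowser = typeof g.window !== "undefined" && typeof g.document !== "undefined";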


Oh boy. Reading the title ("JSR: The JavaScript Registry" at the time of writing) I had high hopes this might be what I'm looking for: Dependency management for JavaScript in the browser. It's something entirely different.

I tried to look for something lately that:

1. Makes it easy to download specified versions of JS libs.

2. Exposes these libs as proper ES6 modules.

I get why (2) is a bit tricky, it'd be fantastic but more of a nice to have anyway.

I don't really get why there's no popular tool for (1) yet. Sure, I can just fetch the files I want from a CDN and commit them to a vendor directory, or I can write a little script that downloads them from a CDN. Or hey, I can fetch frontend dependencies via NPM and have a script that builds/copies the stuff. But that's a bit much labour for something that I thought would be a relatively common requirement.

Am I in the minority to occasionally want to build web apps _without_ bundlers? Being able to skip the build step is very powerful both for keeping things simple and turnaround times fast, IMHO.


> Exposes these libs as proper ES6 modules.

We don't do this natively (yet?), but esm.sh supports JSR already: https://twitter.com/jexia_/status/1762516242626416750


Thanks for pointing this out!


You can just use npm and ship node_modules on your website. It's probably "huge" so you probably want to clean out dev dependencies first (`npm prune --omit=dev` is one way to clean that) and you might find it useful to search for big binaries to filter out and redundant directories that you don't need (libraries that still include all of UMD and CommonJS and ESM builds even though you only need one), and there may still be libraries that don't directly load in the browser and you need to spot bundle with a tool like esbuild to a vendor directory.

Mostly the only other glue you need after that is an import map.

I find this flow useful (ship an optionally pruned node_modules, spot build specific vendor libraries, add import map), especially for lightweight development/testing, and so I did document it specifically from start to finish for one of my libraries (which is sort of a "framework"), it includes a vendor build one-liner:

https://worldmaker.net/butterfloat/#/getting-started?id=setu...

(The Example section after the Dev Environment one shows the import map at the top of the example HTML if you are looking for that. I forgot that's where it was when re-reading this.)


esm.sh is a decent solution for this. It's not totally ideal... but it'll turn NPM modules into ES6-compatible URLs no problem.


Oooh, esm.sh looks pretty close to what I'm looking for, thanks!


Everything just has to sound cool these days because yolo

    yarn dlx jsr add @oak/oak


That does get to be a mouthful! But if you wanted, you can of course do:

    yarn global add jsr

so that subsequent installs could just be:

    jsr add @oak/oak


I still find `npm install @oak/oak` better.


The server that takes your TypeScript and does all of the work to publish a nice package on npm is great. But why wouldn't they publish to npm after that? Why make a new registry that is introducing fragmentation?


Glad the pinnacle of programming is still there: left-pad


I think this time it won't spread as much because most people just ask their LLM for a leftpad function.


What are the chances that the LLM suggests the NPM package?


The entire source code:

    export function leftPad(str, len, ch) {
      return new Array(len - str.length).fill(ch || ch === 0 ? ch : ' ').join('') + str
    }


It's not even needed anymore: "HN".padStart(10); That's it.


Does pad start respect RTL marks? /s


Seems like the logical way to do it. Though, honestly, unless you specifically need to do lots of different kinds of padding, something like this makes way more sense:

    const bespokePad = (x) => ('000000' + x).slice(-6)


That actually seems really inefficient; it could avoid the array allocation and fill call by just moving the length check...
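Something like this (untested sketch):

    export function leftPad(str, len, ch = ' ') {
      if (str.length >= len) return str;                // early out, no allocation
      return String(ch).repeat(len - str.length) + str;
    }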


You could also just use `String.prototype.padStart()` [1], and you get a nice optimized C++ implementation :D

[1]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


While it's not quite C++, here is Chromium's implementation: https://source.chromium.org/chromium/chromium/src/+/main:v8/...


It's Torque, which is essentially C++

https://v8.dev/_img/docs/torque/build-process.svg


While funny, worth remembering that .padStart didn't exist when left-pad was first published :)


Pass, not a Rust implementation /s


Is it even AI scale if not performed on the GPU?


No, you're right. The White House says no more C/C++ it's your civic duty.


I wanna see you write it from memory. Now do it without bugs fixed in 2013 in that package.


I get their stance on TypeScript "exported types must be explicit and not use inference", but I feel like this is going to cause a lot of friction in adoption.


1. You can bypass slow types checks during publishing. You will just get a lower JSR score.

2. TypeScript is shipping a feature in the next release (TS 5.5) that has quick-fixes in the editor to make it super easy to add explicit types :) The TS feature is called `isolatedDeclarations`


I would just like to add some of my findings around the developer friction with isolated declarations.

The whole reason isolated declarations is taking so long is because we also worry about making the requirements of this too onerous. To this end, with isolated declarations you do have to specify most types, but not all. For expressions where we can reasonably infer the type from local context, we do so.

This should ease some of the developer friction. Not all of it. Some things will not be inferrable, but things like primitive types, object literals, and tuples will be, so it should make it easier.
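Roughly (a sketch, not an exhaustive list):

    // fine under isolatedDeclarations: inferable from local context
    export const retries = 3;
    export const point = { x: 0, y: 0 };

    function helper() { return { id: 1 }; }

    // error under the flag: the return type would need cross-function
    // inference, so an explicit annotation is required here
    export function makeThing() {
      return helper();
    }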

We also worked on a code fixer that adds explicit type annotations where this new flag would raise an error, again to help people migrate if they choose to do so.


To be clear, for people reading and wondering what this is about: this is only a hard recommendation for the public API types. The reason for it is that by adding explicit types to the boundary of your package, the package becomes way faster for users to type check, because every user's machine doesn't need to do all the inference work or type check internal types unrelated to the public API. Additionally, it makes the published code more resilient to changes to TypeScript's inference in the future, because it's not doing inference. It also becomes way easier to generate documentation for the package (also, the ability to generate .d.ts or bundled .d.ts files without a type checker becomes easier).

Right now, the publish command errors and asks you to fix the issues or bypass it entirely via `--allow-slow-types`. In the future there will be a `--fix` flag to write the explicit types for you.


One solution to the package dependency swamp is to make a project with little to no dependencies? I'm rambling, but I would be curious to experiment with an approach where, when I needed a bit of code to solve a problem, the "package manager" (more like code procurer) would just find a snippet of code I need, perhaps reference its origin, perhaps add related unit tests, and then I would copy-paste it into my own code base.


I've experimented[0] with something along these lines. The basic idea is that it's desirable to be able to reuse snippets/functions across projects, but you shouldn't need to go full left-pad. So basically you run tuplates.py on your project, it goes through all files and any line comment that includes "tuplate_start(URI)" it goes out and fetches the URI (local filesystem or HTTP supported), and replaces everything until it finds a line comment with "tuplate_end". Nice thing is it enables basic templating in any language, and instead of checking templates into source control, you're committing working code. Also combines really nicely with jsdelivr where you can import specific versions of files directly from GitHub.

[0]: https://github.com/anderspitman/tuplates
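A source file then looks something like this (sketch; URI hypothetical):

    // tuplate_start(https://raw.githubusercontent.com/you/snippets/main/clamp.js)
    function clamp(n, lo, hi) {
      return Math.min(Math.max(n, lo), hi);
    }
    // tuplate_end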


I think this could function in a small-project scenario and I like the idea. I don't think it would be super maintainable in larger enterprise applications, though. You'd essentially be on the hook for maintaining more code as well.


Heard about it on Syntax podcast.

There is nothing wrong with JS-based development that can be fixed by _more_ ways to do the same thing, even if it adds a few new neat tricks.


Looks good to me. Lots of nice fixes.

Unfortunately, it seems we’re stuck with semver, because that’s a package manager thing and not a registry thing. Better use a lockfile.

Edit: another issue is that when I tried to sign up, in the GitHub auth page, it asks for “act on my behalf” permission. No thanks.


I think SemVer is a good thing. What do you mean by "we're stuck" with it?


For dependencies where you didn’t explicitly specify the exact version, taking the latest version is nondeterministic - it varies over time. Someone else who checks out your code will get different results.

I was hoping for something like Go’s minimum version dependency resolution.

A lock file does solve the immediate issue.


> Someone else who checks out your code will get different results.

> I was hoping for something like Go’s minimum version dependency resolution.

Uh and that’s different how? The default semver range is “minimum dependency version” with an upper bound in the current major.


The upper bound resolves to a different version after a new minor version is published. That’s nondeterministic. Someone else checking out your project will get different results, unless you use a lock file.


See, you don’t know how npm works. Npm does not care about the lockfile of dependencies. “Someone checking out my project” always gets the latest version of each dependency within the semver range, until their lockfile locks a version into place.


I meant someone checking out the exact same Github repo (for an application), not someone depending on a library I released. If you have a lock file, it works in that case, I believe?

But actually I use Deno, and I'm honestly not sure how that works either.

As for library dependencies, I know how it works in Go, which is how it should work. When you add a dependency, and it has its own dependencies that you don't use directly, you get the same version that they tested with, which is the lowest one they specified. It only gets overridden if something else requires a higher version.

This means that when a new version gets released somewhere, nothing happens until people notice it, bump a version, and hopefully test it. No library version changes unless there's a commit.

Taking the latest minor version means that it hasn't been tested by anyone downstream. (How could they, when it didn't exist before?)

New library versions should be tested by direct dependencies before they get used by indirect dependencies.


I know this is a nit and not the responsibility of JSR (looks great all! Good job! Can't wait to use it), but it bugs me that we still have a "node_modules" folder as the standard for the ecosystem. This sucks. I get it. I know why. Node is king. However, there are other runtimes out there that use this pattern simply because they don't want to reinvent a package management system (I'm with them on this) but have to use "node_modules" as their module folder simply for compatibility. I decided not to follow suit and use @modules instead. :P

I like JSR, and this in no way reflects my opinion of js,ts,jsr, or you all. I just think it’s time to move on from some dahl-isms (decisions made while getting node production ready).


What's the best way to include a binary blob (wasm binary) in your package? For NPM I've been using a bundler (esbuild's `binary` loader) but I'm not sure of the best way to do that in a modern, jsr-friendly way.


We will soon support WASM imports (`import source foo from "./foo.wasm"`). [1]

[1]: https://github.com/tc39/proposal-source-phase-imports
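Usage would look something like this (a sketch, per the proposal):

    import source wasmModule from "./foo.wasm";
    const instance = await WebAssembly.instantiate(wasmModule, {
      env: { /* whatever imports the module needs */ },
    });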


Excellent, this will make my life easier. Thanks!


If you're using emscripten, check out the SINGLE_FILE option, that embeds the WASM as a base64-encoded string into the JS.


This is a terrible name. JSR already stands for "Java Specification Request" which is basically an RFC or standard for Java (not JavaScript).

This is going to make Google searches even more difficult. Searching for Java jobs and getting JavaScript jobs is already bad enough. This name is just making a big mess with semantic name collision in our mental namespaces.

Just make it "RfJS" maybe for "Registry for JavaScript" (if that doesn't clash). I know Firefox had a few naming rounds going through Phoenix and Firebird IIRC first.

No opinion on the actual project itself.


JSR is also a company that produces synthetic rubber, a semiconductors company, a pharma company, a journal, and a company that produces merch for rock musicians. All of these rank higher on Google for JSR than Java Specification Requests.

Given that you're going to have a hard time pulling up the JSR you're looking for without adding qualifiers anyway, I don't think this name choice is going to substantially change anything.


This is great!

I very much like the added TypeScript support (which the standard NPM registry does not have) and that it's open-source.

I'll see about getting my packages uploaded onto here, too -- just having built-in support for TypeScript makes packaging x100 easier.

Regarding NPM, it's absolutely insane that we've been depending upon a closed-source, monolithic nightmare with poor UX for so long. You can't even use comments in package.json -- overall, it feels like a holdover we haven't figured out how to replace.

Furthermore, it's scary that Microsoft, through GitHub, now hold so much power over the JavaScript industry -- enshittification will most likely ensue.


I'm not sure jsr fixes comments in package.json, and I wouldn't blame npm for the missing feature.

package.json is used at runtime by node.js and strictly parsed as JSON. npm or jsr could strip comments on publish, but local development of the package would break.

Not sure if bun or deno use pjson at runtime or if they allow comments.

I think it's interesting that JavaScript projects still publish from the package root despite most projects compiling to dist nowadays. I'd like a tool to generate the whole package root, including a transpiled or generated package.json.


> package.json is used at runtime by node.js and strictly parsed as json

It still constitutes an unfriendly developer experience -- people have had to make workarounds to document what complicated scripts do, and why they use certain dependencies, etc. I understand how JSON.parse works, but this still isn't the best behaviour.

Note that this is just one example of unfriendly UX in NPM as a whole -- this isn't the hill I'm dying on, so to speak, I'm just giving an example of how I believe the industry is being held back by tools which just haven't kept up with demand.


> I'd like tool to generate the whole package root, including a transpiled or generated package.json

That's exactly what JSR does. We generate the package.json ourselves.


In development, or on publish?


Is there a similar site but for browser only javascript libraries?

I am not a web developer, and always find NPM etc way too much for my occasional needs.

When I search the internet for anything, 90% of the time I get an NPM-based library that can not work in the browser with just a <script src="lib.js">.

JavaScript has improved a lot and does not really need Node/compilation steps for almost all my use cases. Is there anything where I can search for browser only, client side javascript libs?


You can usually browse npm for what you need, and when you want to use it, look at unpkg.com/<name of package>

Usually this will give you a .min.js file ready for you to script include on your site, for example, unpkg.com/mithril

Sometimes library authors don't default the export to a browser ready minified version or ESM module, in which case, you can snoop around the built package at unpkg.com/browse/mithril/.

Most READMEs worth a damn should tell you how to use their libraries without npm if possible. Unfortunately, a lot of library authors suck.


> Unfortunately, a lot of library authors suck.

The nerve of them not catering to the small percentage of use cases for a project they often do in their free time and often for free. The entitlement is real.


Sorry I forgot we're not allowed to criticize open-source maintainers ever. But as a library author myself, it takes less than 10 seconds to add such a section in my README.


Unpkg just serves the raw files, which may contain require() calls or import from other npm packages. That won’t work.

The closest service is https://esm.sh, but you can’t download from it.


Unpkg serves whatever is published to NPM, and if it's a library intended for the browser, that often includes minified versions ready for use in script tags, for example, https://unpkg.com/mithril@2.2.2/mithril.min.js. Sometimes the default export is CJS (which has require() calls), in which case, you can usually use the browse url that I mentioned to see if there's another export you can use.

https://esm.sh/ is definitely a good option too if you're OK with modules.


Yeah but that's rarely the case, you can't offer that as a generic "solution" to the problem at hand. It's just Mithril's choice to both publish to npm and to officially link to unpkg. Packages can make different choices, like publishing to GitHub releases or just to include the file in the GitHub repo.

The reality is that only frameworks and very popular browser-specific packages do that.


I said "often", but you say "rarely", so I guess we'll just have to agree to disagree here. :)


For that you probably want unpkg, there might be some other similar alternatives that you can look for if you want. They're essentially mirrors of NPM but they automatically bundle packages up and serve it via http like a normal web JS file.

Some things still won't work because they rely on some server side library, like loading files or something, but you won't want those for web development anyway. You might also find that some libraries don't handle being included as a script very well, but you can instead use the import statement and that will probably work better - use <script type="module"> and inside that use import something from "<whatever the unpkg URL is>".

For production usage this isn't ideal because you're not as in control of the code you're depending on, it's less reliable, and it's more requests than if you just bundled everything together, but not everything needs to be for production usage!


> Is there anything where I can search for browser only

There used to be bower, thankfully that’s now gone.

> JavaScript has improved a lot

So has its ecosystem. Setting up parcel or Vite is straightforward and should get you out of those problems. The time when you could just download a .js file and stick it in your vendors folder is long long gone. Please bundle and minify your files before serving them.


I think you're looking for esm.sh. It transpiles both npm and jsr packages for the browser so that they can be used in a simple script tag.
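e.g. (package and version illustrative):

    <script type="module">
      import confetti from "https://esm.sh/canvas-confetti@1.9.2";
      confetti(); // no build step, no node_modules
    </script>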


`<script src="lib.js">` is legacy, you are supposed to `import` nowadays, like from unpkg.


I am not sure what the value of this is. It's a subset of the npmjs registry, as it's only for ESM-only packages.

If you want to use ESM modules on a website in a commercial setting, the security team will demand you host them on a CDN under your own control, etc. It's fun for personal projects?


JSR must be firmly embedded into my memory, because all I see is a 6502 assembly instruction. :)

https://en.wikipedia.org/wiki/MOS_Technology_6502#Instructio...


if it is made for typescript, shouldn't it be tsr?


I didn't understand the purpose of this even after browsing the site.


> JSR isn't a replacement for the npm registry

Shame.


Not to sound overly critical, but what value do Featured Packages | New Packages etc. provide?



Can JSR also be used as a CDN for browser module imports?


Not yet, but esm.sh mentions they have experimental support now to check out: https://twitter.com/jexia_/status/1762516242626416750


Looks like just another example of https://xkcd.com/927/


Obligatory xkcd reference. https://xkcd.com/927/





