As a long-time front-end developer, I'm not seeing a strong value proposition here to justify the further fragmentation another package registry is going to cause.
> You publish TypeScript source, and JSR handles generating API docs, .d.ts files, and transpiling your code for cross-runtime compatibility.
This sounds like another build service I don't control. There's already too much magic in publishing transpiled TypeScript packages, but at least the current tools let me control exactly what artifacts get pushed out.
> web-standard ECMAScript modules
The "web standard" part is meaningless considering that most production websites will bundle the files together as part of their build/optimization process for size and loading speed, leaving only a giant chunk(s) that resembles nothing like the original ES modules.
> JSR isn't a replacement for the npm registry; it's a superset of npm.
A superset of npm means that the packages created for this repository will not work in the npm ecosystem. What is this if not a "replacement" of the npm registry and further fragmentation of the JS ecosystem?
> JSR modules can be used with any JavaScript package manager, and in any project with a node_modules folder.
This makes the already complicated module resolution logic in most bundler/packaging tools even more complicated as they must account for the intricacies of another package manager. A house of cards being stacked on another house of cards...
> Module authors can count on great editor support from strongly typed modules, without the need to transpile and distribute typings manually.
The website seems to insist that distributing typings is a complicated process when it is just the .d.ts files bundled with the published package and an additional entry in the package.json file.
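For reference, the "additional entry" in question is typically just a `types` field (and/or a `types` condition under `exports`). A minimal, hypothetical package.json sketch:

```json
{
  "name": "my-lib",
  "version": "1.0.0",
  "type": "module",
  "main": "./dist/index.js",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    }
  }
}
```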
> Easy publishing with a single command - the CLI will walk you through the rest
It seems like every benefit that this project offers can be fixed in the current ecosystem by better client-side tooling that enforces standards during the publishing process while keeping full backward compatibility with over a decade of packages published and without fragmenting the ecosystem.
1. The focus on typescript and typings distribution is useful/interesting for the "promise" that all packages in the registry have typings. If you can trust that every package in JSR has types then you avoid/simplify that current cycle of: npm install a package, find out it doesn't have types, try to install a package of the same name with @types/ in front of it, find out it doesn't exist, look for an alternative library with types or write your own types or give up and `declare module "thatpackagename"` any-type it.
2. Just because "most" production websites do bundling doesn't mean that needs to still be the status quo in 2024. Bundling is increasingly a "premature optimization" and you might not need it. It can be worth investigating if your site/app works well enough with just ESM delivered straight to browsers in 2024 versus what you "gain" from bundling. You might be surprised at how well current browsers can load ESM and the optimizations they already do for you. But even if you do still need bundling ESM is a huge advantage over other formats (especially CommonJS) and every current bundler produces better, more optimized output the more ESM you provide. Their outputs also increasingly look like ESM, and if that isn't the current default it is the "soon" default. esbuild in particular (which also backs some of vite's tools and others) has an ESM bundle output that is delightful, isn't yet the default ("soon") but is a simple "format" switch away. Those ESM bundles are great in today's browsers. ESM matters a lot to bundlers, bundlers are increasingly ESM in their internals and output. Browsers are great with ESM. The "web-standard" part is pretty meaningful today and getting stronger.
To summarize: sometimes it is fine to prune and ship node_modules to your web server, you can use import maps to handle node-style "bare" imports, and you may only need to "spot bundle" to ESM the one or two dependencies that don't ship browser-ready ESM (which should be a shrinking list).
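For anyone who hasn't used one: an import map is just a bit of JSON in your HTML that maps bare specifiers to URLs. A sketch with illustrative package names and paths:

```html
<script type="importmap">
{
  "imports": {
    "lodash-es": "/node_modules/lodash-es/lodash.js",
    "some-lib": "/vendor/some-lib.esm.js"
  }
}
</script>
<script type="module">
  // "lodash-es" resolves through the map above instead of node-style resolution
  import { debounce } from "lodash-es";
</script>
```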
After that it's a matter of using your browser's Dev Tools to tell you what your performance looks like and where your bottlenecks are. That can tell you a lot about whether you really need to bundle more of your site/app or whether you can spot-bundle/adjust an import map for just a specific sub-graph. With HTTP/2 and HTTP/3 the per-file "penalty" is a lot smaller, though it still exists. Even on some properly configured HTTP/1.1 servers it is not always as bad as history tells us it was (properly configured HTTP/1.1 did support some connection reuse). About the only remaining thing a bundle might buy you on HTTP/2+ is better compression behavior on the bundle as a single whole, but even that isn't a given, because Brotli's dictionary was designed to work at the whole-connection level and Brotli is often the default compression in HTTP/2+. But again, your conditions may vary based on your real code and the specific servers and browsers you need to support, so leverage your browser Dev Tools and check the real numbers.
> This makes the already complicated module resolution logic in most bundler/packaging tools even more complicated as they must account for the intricacies of another package manager. A house of cards being stacked on another house of cards...
JSR is a registry, not a package manager. npm allows you to add multiple registries and JSR provides an npm API endpoint. From npm's point of view packages coming from JSR are the same as those coming from the npm registry itself. The only difference is where the package is coming from, but all semver resolution, etc are done as usual by whatever package manager you use.
Disclaimer: I wrote a good portion of the npm tarball generation code.
- `isolatedModules: true` is required
- Decorators: standard spec decorators only
- JSX: tsconfig options are lifted into JSX pragma comments on publish (configuration goes into code so that consumers can emit based on these options still)
TypeScript has very very few options that affect emit, and effectively everyone has settled on one subset now.
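To illustrate the JSX point above, here is a sketch of what "lifting tsconfig options into pragma comments" might look like (illustrative only, not necessarily JSR's exact output):

```tsx
/** @jsxImportSource preact */
// The tsconfig `jsxImportSource` setting is expressed as a per-file pragma,
// so a consumer's tooling can emit correctly without seeing the original tsconfig.
export function Badge(props: { label: string }) {
  return <span>{props.label}</span>;
}
```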
> TypeScript has very very few options that affect emit, and effectively everyone has settled on one subset now.
Very confused by this statement. There's lots of options that affect emit. And what do you mean a subset has been settled on? Take any 2 TypeScript projects and I'd wager there's a high chance they have options that emit code differently.
Config aside, TS emits .d.ts that is not always backwards compatible with older TS versions. Does jsr support typesVersions for such situations?
Another comment in the thread makes a very good point:
> Man, if only they'd debuted with this, but, hindsight and all. My only thing is that TypeScript has no spec, standard, or competing implementations so it seems a bit risky to treat it as a first-class citizen as far as publishing packages goes.
Add to that the frequent compilation issues that happen as TypeScript adds or removes types in the global imports, changes language conventions, and updates compilation checks whenever a new version gets released.
Look at the comment under the comment you mention - those issues are about the TS type system, NOT the TS syntax. TS syntax is highly stable, and is the only thing JSR relies on :)
Both `.d.ts` and `.js` files can be created without understanding the TS type system (this is what JSR does). Creating both of those is just a TS syntax transform - one that is highly stable (esbuild and friends all do it already).
We do some basic inference, but that's why we require explicit return types in the public API, see https://jsr.io/docs/about-slow-types . TypeScript itself will ship with a new `isolatedDeclarations` option in the next release which is the same thing. The point is that no inference other than simple literal inference needs to be done to get the type of the public API. That way generating .d.ts files can be done by any tool by merely looking at the syntax without type inference.
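For illustration, roughly the difference (hypothetical module, sketch only):

```ts
// "Slow type": the return type of a public export is left to inference, so
// consumers' tooling has to re-infer it from the implementation.
export function createClient(url: string) {
  return { url, get: (path: string) => fetch(new URL(path, url)) };
}

// "Fast type": the public boundary is annotated explicitly, so .d.ts and doc
// generation only need the syntax, not the type checker.
export interface Client {
  url: string;
  get(path: string): Promise<Response>;
}
export function createClientExplicit(url: string): Client {
  return { url, get: (path: string) => fetch(new URL(path, url)) };
}
```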
> The "web standard" part is meaningless considering that most production websites will bundle the files together as part of their build/optimization process for size and loading speed, leaving only a giant chunk(s) that resembles nothing like the original ES modules.
I mean the code itself will not be readable, but modern build output chunks use regular ESM modules, at least for Vite, Rollup, Parcel and so on. Webpack might be a different story, but still.
> This sounds like another building service I don't control. There's already too much magic in publishing transpiled TypeScript packages but at least the current tools let me control exactly what artifacts get pushed out.
For tools that natively support TypeScript, you can directly consume the TypeScript source code. No transpilation necessary. The complexities are only introduced (well, not really introduced, just moved somewhere else) for tools that do not natively support TypeScript.
> The "web standard" part is meaningless considering that most production websites will bundle the files together as part of their build/optimization process for size and loading speed, leaving only a giant chunk(s) that resembles nothing like the original ES modules.
That doesn't matter. If everyone settles on one standard, even if you do post-process, tools become simpler. The idea is not that you must ship your code unbundled to the browser. It is to reduce the amount of complexity incurred in the process from source -> dist. I am sure you agree that the smaller the difference between source and dist, the easier debugging and development in general are :)
> A super of npm means that the packages created for this repository will not work in the npm ecosystem. What is this if not a "replacement" of the npm registry and further fragmentation of the JS ecosystem?
They do work in npm. In npm packages you can depend on JSR packages, and on JSR you can depend on npm packages.
> This makes the already complicated module resolution logic in most bundler/packaging tools even more complicated as they must account for the intricacies of another package manager. A house of cards being stacked on another house of cards...
Nope! JSR is not a package manager. JSR is only a registry. When you install into a node_modules folder, the layout in that folder is the same for JSR and npm packages. So any downstream tool does not even need to know JSR is in the loop - it just works. This is one of the key points to ensure easy adoption of JSR by anyone.
> It seems like every benefit that this project offers can be fixed in the current ecosystem by better client-side tooling that enforces standards during the publishing process while keeping full backward compatibility with over a decade of packages published and without fragmenting the ecosystem.
I understand this.
The problem is the following: npm is owned by GitHub/Microsoft, and GitHub/Microsoft does not care about npm. To GitHub, npm is a cost center. npm has no dedicated product manager at GitHub. Last I heard, npm has _two_ dedicated engineers working on it. The last real new feature npm shipped was package provenance (last year). It took npm 5 years to ship the file browser (because GitHub/Microsoft did not care about npm, and thus effectively froze all feature dev work on npm).
If we want innovation in the space, we need to do it in a way that a) keeps backwards compat with npm (which JSR does, both ways), but b) moves away from npm in the long term. JSR is an attempt at this (while trying to stay within the constraints posed).
Thank you. I stand corrected on some points. The messy ecosystem of npm, yarn, pnpm, JS, TS, ES, and ESM has all cost me a good amount of white hair.
A good source of my confusion came with not immediately recognizing that a new package registry does not require a new package manager to work with. I think the "jsr" commands in the code samples and the jsr.json make it feel like there's a new package manager in play here. If the team has made it so that npm works with "jsr" and vice versa, kudos to them because that's a big win for the users.
As far as I can tell, the "npx jsr add @luca/flag" command given for using JSR with Node itself calls "npm i @luca/flag@npm:@jsr/luca__flag" and is equivalent. You get a normal node_modules+package.json+package-lock.json output that the npm command is fully able to work with.
It looks like JSR re-publishes all of its packages translated into JS + .d.ts files onto npm under the @jsr scope. Regular npm packages have no trouble depending on JSR packages because they just depend on the npm-published version of them.
yup, we added the `jsr` tool mostly so that you don't have to be aware of the @jsr scope that's used under the hood. We're not publishing to npm though, but rather mapping the @jsr scope to the JSR registry in the project's .npmrc file.
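Roughly what that generated config ends up looking like, as a sketch based on my reading of the docs (the registry URL and version range here are illustrative). In .npmrc:

```
@jsr:registry=https://npm.jsr.io
```

and in package.json:

```json
{
  "dependencies": {
    "@luca/flag": "npm:@jsr/luca__flag@^1.0.0"
  }
}
```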
An employee of GitHub has reached out to me to inform me that there are more than 2 engineers working on npm full time, including multiple full-time support staff and full-time staff to review possible malware. The engineering team working on npm is "substantially larger than 2". Additionally, they informed me that a new product manager for npm will be starting tomorrow.
Even though I made no claim to the contrary, I would like to emphasise that there are obviously current product managers at GitHub that product manage npm as _one of_ the projects they manage.
This is probably a marketing bug on our part. While we did want to design for TypeScript from the outset, you can definitely happily write and publish plain JavaScript code on JSR. We probably need to do a better job explaining and featuring this.
If you're building something that is designed for TypeScript, made for TypeScript and publishes TypeScript sources, then I find it really hard to understand why you'd name it "The JavaScript Registry".
An npm compatibility layer doesn't sound like it has anything to do with TypeScript? I'm assuming that you can use JavaScript anywhere you can also use TypeScript.
> Unlike with native JSR imports, you are not directly importing TypeScript code. Instead JSR transpiles the TypeScript code to JavaScript before it is installed into your node_modules directory. This generally means that your editor experience will suffer, because “Go to definition” and other features will link to transpiled JavaScript code, or to generated .d.ts files.
That said, I'm assuming that if you use Deno, you can write plain JS as well (i.e. the TS library will still work with JS). So yes, you'll need to use the npm compatibility layer if you don't use Deno, but those are the only supported use cases.
So I think they still don't want to imply that you have to use TypeScript if you use JSR in the runtimes it's targeting.
When would I want to use JSR? The "Why JSR?" section simply says JSR is a superset of NPM that can be used with any javascript package manager. But NPM itself can also be used with any javascript package manager as far as I know. It seems like the benefit they're trying to add is strict type support for packages, but the fact that it's a superset of NPM means all existing packages that lack types are still supported by JSR. So, what benefit is so great that it's worth segmenting the ecosystem?
Side rant: not at all a fan of the decision to punish libraries ("slow types") that use type inference by reducing their score.
In terms of "when you would use it" - it doesn't necessarily solve a different problem than npm. I think it arguably solves the same problem more effectively (though it is certainly early days and npm still does a lot of things JSR doesn't). I also don't know that we're segmenting the ecosystem, as JSR packages interoperate with npm packages, and you can use one from the other. JSR will end up being additive to npm.
For module authors, we're hoping JSR will be helpful in the following ways:
1.) You can develop and publish TypeScript source, and let JSR handle transpilation and generating .d.ts files for runtimes that don't natively support TypeScript. Especially nice if you are using Deno or Bun (that do natively support TypeScript), and don't have tsc in your workflow otherwise.
2.) JSR generates API docs for you on your package page based on your source code and comments.
3.) JSR has a great DX around publishing packages from GitHub Actions using OIDC (no juggling tokens)
4.) JSR automatically provides provenance for published versions of packages
For module consumers, it helps too:
1.) Compatible with both Deno and existing npm-based projects
2.) Package info and docs provided centralized on the jsr.io site
3.) Quality scores that encourage authors to make their packages fast and well documented
4.) Access to TypeScript source for packages (not just transpiled output)
Absolutely more to be done, but we're hoping that JSR will provide enough extra value to both module authors and consumers that both will prefer to use it when possible.
Sounds like a good value prop. One thing that I'm hoping is to reduce dependency on a single private company, npm inc., in publishing and distributing modules. But you didn't list this as a value prop. (I notice other posters are upset at the 'fragmentation' of module distribution, but I see this as a strength, not a weakness.)
The performance of type inference matters for runtimes that work with TypeScript files directly rather than using the transpiled .js + .d.ts files, like Deno. This has several benefits, in that we can jump to the source on "go to definition" rather than some random .d.ts file, among other things. Deno uses the original TS files for type checking as well, and that's why inference also affects type-checking performance in some runtimes.
By ensuring explicit return types in the public API, the generation of .d.ts files is turned into a mere syntax transform, rather than requiring the tsc compiler. This is something TypeScript will ship with in the next release itself. They're adding a new `isolatedDeclarations` option which is the same thing.
Type inference in dependencies does not reduce performance for users, but just for JSR. Here they’re going “the Google way” by adding some random metric that benefits them, not users. Once generated, .d.ts files do not contain inference.
One thing I'm not sure about when reading their policy is what'll happen in another kik situation. If someone were to claim the scope Microsoft (for example) that had no relation to Microsoft and then Microsoft came along and wanted the scope what would happen?
If MS got the scope, what would happen to the packages in that scope? Or, if MS didn't get it, what is done to clearly communicate that this is (presumably) an unofficial account with no relation to MS? In this early period I'd expect a lot of people to claim the scopes of notable companies, and those companies might take issue with that if they choose to use jsr down the line.
We do intend to take a more editorial approach to scopes, and assign scopes to users in a way we think is more intuitive for end users of JSR. We have reserved some obvious scope names already, but in the future, we'd likely entertain requests to reassign ownership of scopes for the benefit of the broader user community (as in the case of a brand owner requesting ownership of their brand name).
So in the case that a user published "@cocacola/foo", previously published versions of "@cocacola/foo" would remain available indefinitely (unless they were found to be malicious), but we would likely be willing to assign ownership of the "@cocacola" scope to a representative from that brand/company if they asked for it and we could verify their identity. The original author of "@cocacola/foo" would need to publish the module going forward under a different scope.
> but in the future, we'd likely entertain requests to reassign ownership of scopes for the benefit of the broader user community (as in the case of a brand owner requesting ownership of their brand name).
It would be great to find a way of structuring these registries/repositories in a way so there wouldn't be any name collisions, and also avoid the built-in support for companies to take names away from individuals.
Thankfully JSR won't be capable of a left-pad situation where packages can be unpublished - published packages are immutable[1].
As for the potential for disagreements over whether or not a scope should be transferred, that is a big reason why we want to figure out community involvement in governance sooner rather than later. We are gathering potential volunteers who want to discuss becoming a community moderator - if anyone would be potentially interested, they can sign up to join that conversation[2].
1. Use globally unique identifiers (e.g. GUIDs/UUIDs).
2. Use some pre-existing centralized name registry (e.g. domain names).
I don't know why the JS ecosystem is so resistant to either solution. Both are proven options (#1 used by COM, #2 used by Java). They do mean longer package names, but surely that's a small price to pay for a resilient future-proof solution? And besides, who really cares about long dependency names and why?
In the example of "@cocacola/foo" would it allow for the "@cocacola/foo" package to be updated with new versions by the new owners? Or would the foo package essentially be archived and read-only from this point on?
Domain expiration is rare, and virtually only happens when the related projects are dead anyways; freeze existing packages (no more updates until the original key for the domain comes back online, with manual override by registry administrators for edge cases) and have a reasonable waiting period (a couple of months to a year) before allowing the new owner to use the namespace (with different project names within).
Use URN instead of just a domain name. A domain name (with some schema) is obviously a subset, but anyone who doesn't want or need one can then use uuid: URNs.
The XML ecosystem did it that way for namespaces, and I think that it is still the most flexible and the most future-proof approach, since you can always add new schemas as needed.
Not if the system is built for DNS first. If a company/individual/organisation has a domain name, they are expected to use the domain. It is much less of an issue if someone has kik.users.registry.com when the company publishes packages under kik.com.
Kind of jarring now that the standard library documentation on the Deno website just redirects to JSR.
Speaking of which, the new deno.com site looks like a marketing website designed by an AI. Lacks all of the pragmatic unix-y character that Deno originally had, and is just filled with meaningless numbers and metrics and way too much blank space. It's like they're selling a product... which I guess is what Deno is now. VC funding will do that to you.
After reading the "Why JSR?" article I still don't understand why this exists.
It looks like it has a few niceties but also some limitations. To catch on you need to convince some significant segment of both package publishers and consumers to switch to it. That means there needs to be some killer feature they just can't get with the existing registries, and I'm not seeing anything like that.
> After reading the "Why JSR?" article I still don't understand why this exists
The cynical answer: they realized it's actually advantageous to have a centralized package registry instead of importing from random urls, and as a for-profit VC backed company they'd rather not have their entire ecosystem depend on the Microsoft-owned NPM.
The optimistic answer: having a second registry makes the entire JS/TS ecosystem less fragile by not having a single point of failure.
I am not a user of Deno, but I acknowledge that HTTPS module imports were one of the original selling points of Deno compared to Node/NPM. Was that position revised at some point, i.e. that a package repository is needed after all? What's the back story?
HTTPS imports will continue to work and be supported in Deno. However, as we observed their usage in the wild, a couple problems became clear:
1.) Duplicated dependencies - projects would often download multiple versions of the same dependency, because there was no deduplication happening based on semantic versions.
2.) Disappearing dependencies - under some circumstances, an HTTPS import URL would be unavailable, causing code dependent on these modules to break.
A central package repository could solve for both of these problems. We considered (and very nearly chose) to just use npm, but it introduced functional and UX problems we wanted to solve for users. So we set about building JSR, and did so in a way that wasn't tightly coupled to Deno (JSR didn't need to be - it is useful in the context of any JavaScript runtime).
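Concretely, the shift is from version-pinned URL imports to semver-resolvable registry specifiers. A rough sketch (module names are hypothetical):

```ts
// HTTPS import: the exact version is baked into the URL, so two libraries
// pinning different versions each pull in their own copy, and the module
// vanishes if the host ever stops serving that URL.
import { greet } from "https://example.com/greeter@1.2.3/mod.ts";

// JSR import: a semver range that tooling can resolve and deduplicate like
// any other registry dependency.
import { greet as greetFromJsr } from "jsr:@example/greeter@^1.2.0";
```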
> a couple problems became clear: 1.) Duplicated dependencies
I’m sorry but that did not come up when designing the system? It’s the whole principle of semver ranges that deno decided to do away with. I cannot believe nobody saw this as a drawback. It’s the first thing I thought when I saw hardcoded versions.
In case you are not aware (maybe you are) node has experimental support for http imports. I personally think this feature is a disaster for many reasons, but if you want to use it in your toy apps it is there in node.
I’m curious what this means in regards to this existing “use Deno to also publish to npm” workflow (https://deno.com/blog/dnt-oak) which I was considering using, and then the JSR announcement happened.
The thing I like about the blog post and the mentioned dnt tool is it takes care of lots of bullshit you otherwise need to figure out when publishing to npm.
What is the relationship between JSR and DNT in this case? Should one be used over the other? Together?
If I want to publish a module so it’s available for Deno and NPM, what is the recommended approach now?
As JSR develops, you can expect to see more features like those of dnt start to show up in the npm compatibility layer. We've also been exploring how to create a good DX around simultaneously publishing JSR modules to npm, so publishers can control their namespace there as well. We definitely know it's a usage pattern folks are interested in.
In the immediate term, dnt is still a very strong choice for people that want to develop modules in TypeScript using Deno, and then publish them to npm. In the fullness of time, I expect that JSR will provide a pretty complete solution to this problem as well.
Man, if only they'd debuted with this, but, hindsight and all. My only thing is that TypeScript has no spec, standard, or competing implementations so it seems a bit risky to treat it as a first-class citizen as far as publishing packages goes.
The TypeScript syntax is very stable - there is no spec, but it's very stable. The type system on the other hand: not very stable, and yeah I agree - that would be risky to rely on.
However as JSR only relies on the syntax, not on the type system, it's no problem :)
Technically ES did introduce types already: in the "lost version" ES4. Typescript's syntax was partially inspired from ES4 and while not "compatible" with the previous type system there is more compatibility in the syntax than you might expect.
The Type Annotations [0] proposal that TC-39 (in charge of ES standardization) has kept in front of them (at Stage 1 so far) takes the "Python approach" of standardizing the syntax but not the type system still leaving that to external tools (in part because of the lessons from the mistakes of ES4 and the type system that was standardized but never really adopted). The proposal basically assumes that most types will (still) be in the Typescript type system (and would still need Typescript or compatible tool to type check) and maximal (but not complete) compatibility with Typescript syntax is a goal of the proposal.
But you don't have to take my word on that; the FAQ at the bottom of that Type Annotations proposal is well written and quite detailed.
I used ActionScript 3 back in the noughties and was so excited for ES4 in the browser. Adobe open-sourcing Tamarin made it seem it was a done deal (I didn't follow the politics). I was so disappointed when it was killed. It did mean that when I tried Angular 2 the best part of a decade later I was delighted with how familiar TypeScript felt.
I've been complaining about the death of ES4 since it was killed back in 07 or 08.
The proposal might have types compatible with TS, but what about all the other TS features, like generics? Now more than ever, it seems risky to bet the house on TS.
TC-39 isn't "betting the house on TS": the proposal is generic and intentionally doesn't define types it defines syntax for annotations of types. It's basically "type information may appear between a colon after an identifier and an equals sign or other terminator" and "type is a statement keyword with type information following it' and "interface is a block keyword with type information inside that block". That's pretty much the entirety of the proposal: it doesn't state what that type information is (just parsing limits on what it could not be so that it doesn't blow up language parsing), what it is used for (other than that the general idea at runtime is that type annotations are comments just like existing jsdoc comments), or who uses it. While the proposal is trying to stay syntax compatible (again, not types compatible because the proposal doesn't include types) with Typescript it also points out existing overlaps with Closure and Flow type syntaxes and is built so that both of those can use the new syntax for their own, different type systems.
This is why it is a Python-inspired proposal: Python didn't define types, Python defined "here's some places in the language that are effectively ignored like comments that type systems are expected to use". Python has made them more than "just" comments in later proposals and TC-39 holds the right to do that for ES/JS in the future but the current proposal stops about where Python's first proposal did at "types are just fancy inline comments for compilers like (but not limited to) Typescript/Closure/Flow".
Again, the proposal itself is a really good read for specifics of where it looks like TypeScript and what parts of TypeScript are intentionally not supported. For instance, as a good reminder, there are only three (!) features of TypeScript left that aren't ES-standardized and generate code: enums, namespaces, and class parameter properties. Of those, both enums and namespaces have been effectively deprecated for a long time, and many lint rules exist to help eliminate them from your code if you are still using them.
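As a rough illustration of how little the proposal itself defines (sketch only; a JS engine would parse and ignore the annotations below, while a checker like TypeScript would still give them meaning):

```ts
type UserId = string;           // `type` as a statement keyword

interface User {                // `interface` as a block keyword
  id: UserId;
  name: string;
}

// Annotations after the colon are treated like structured comments at runtime.
function rename(user: User, name: string): User {
  return { ...user, name };
}

// Not covered: the few TS features that still generate code, e.g. enums,
// namespaces, and class parameter properties.
```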
I'm not betting anything on TypeScript. It was released 11 years ago, and it's incredibly valuable to me. If it takes another 5 years for ES to implement first-class types, and it's a good implementation, I'll start using ES for new work at that time. What's lost here? TS won't just stop compiling.
Still, native typing is not even on the horizon for ES, so I think my point stands for businesses that are heavily invested in supporting TS. TS is what we need today, it doesn't make sense to ignore it because theoretically it could be obsolete in 5-10 years.
If and when it does, why do you think it wouldn't just be a subset of TS? Why reinvent the wheel when you have a perfectly good one that's been around for 12 years and is proven in production?
Consumers don't touch any TS, outside the (technically optional) .d.ts files.
The way I understand it, in theory the TypeScript version a library uses could disappear forever the day after the library has been published to JSR and all would be fine.
Until someone wants to maintain the code, of course, but then you'd have the exact same problem with the elusive TypeScript version if the repository only took compiled output.
What JSR would break (or at least make a deliberately difficult path?) is using some private fork of TypeScript that isn't even available on the day of publication.
The benefit of JSR, as far as I understand it, is offering an easy path to publication that makes the type parts not an extra that some might skip but a reliable default. It does make me wonder, however, how reliable the funding will be long-term: JSR promises more service than previous registries, while also being more tool-agnostic. I'm not sure if it's true, but my impression was that in many cases, a registry was primarily bankrolled by however the tooling was funded, e.g. effectively becoming a marketing expense for selling tooling expertise. JSR appears particularly removed from any of that, and it's bad enough when a repository fades away in terms of static hosting; it's worse if it also leaves your library without a build process.
> JSR isn't a replacement for the npm registry; it's a superset of npm.
I first read this line to mean that JSR contains every package on NPM, and was just doing some post-processing on them. But it does seem to be its own registry. Maybe the intent of the line was to communicate that you can install packages from both?
I think this is mostly right - "superset" in that JSR modules can depend on npm modules, and projects using npm can use the npm registry and JSR together. The bottom line we'd want to communicate is that JSR is additive to npm, and the two can be used at the same time.
JSR is almost literally a subset of npm in features. Npm allows publishing of anything, JSR only actual TS/ESM. Whether those modules have dependencies doesn’t expand the set IMHO
Seeing Bun/Node/Deno support reminds me of a question: are there any good resources on writing JS that works in all 3 and the browser? For example if I wanted to write a simple filesystem API that would work the same across bunodeno?
Yeah this is basically what I'm trying to do (see response to sibling). Unfortunately I consider Node's APIs the worst of the 3. No shade on them it's just an older design. I really like Deno's approach since my code also needs to work in the browser. Writing HTTP handlers that take Request[0] and return Response[1] is a beautiful bit of symmetry with frontend code and feels like cheating.
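Something like this is what makes it feel like cheating — one web-standard handler, with only the wiring being runtime-specific (sketch; names and routes are made up):

```ts
// Runtime-agnostic: takes a web-standard Request, returns a Response.
async function handler(req: Request): Promise<Response> {
  const url = new URL(req.url);
  if (url.pathname === "/hello") {
    return new Response(JSON.stringify({ hello: "world" }), {
      headers: { "content-type": "application/json" },
    });
  }
  return new Response("not found", { status: 404 });
}

// Wiring is runtime-specific (roughly):
//   Deno:    Deno.serve(handler)
//   Bun:     Bun.serve({ fetch: handler })
//   Node:    needs a small adapter, since node:http uses its own req/res objects
//   Browser: the same shape as a service worker fetch handler
```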
If you use Node 20+ then there's a lot more compatibility with web platform APIs, so it is possible to write code that works on Node, browser and WinterCG-compatible runtimes like Deno. Obviously you'll need to avoid using Node builtins.
You can't, and for good reason. Deno has slightly different API's than Node. Bun also has subtle differences between their STL implementation and Node's, but afaik, they're getting there.
I think maybe I was unclear. I'm talking about writing libraries that abstract across these differences and provide a single API, as sibling describes. I already know it's possible. I made a simple filesystem abstraction here[0] and a very simple HTTP library that uses it here[1]. They both work in Node/Deno and the browser. Unfortunately I ran into issues with Bun's slice implementation[2], but that should be fixed eventually.
My overall question is that I suspect there's a much better way of detecting and using the different backends, and I'm wondering what techniques are out there.
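One common pattern (a sketch, not necessarily the best way) is detecting the runtime via globals and lazily importing a backend per runtime, so browser builds never touch Node/Deno/Bun code:

```ts
type Runtime = "deno" | "bun" | "node" | "browser";

function detectRuntime(): Runtime {
  const g = globalThis as Record<string, unknown>;
  if (typeof g.Deno !== "undefined") return "deno";
  if (typeof g.Bun !== "undefined") return "bun"; // check Bun before Node: Bun also exposes `process`
  if (typeof g.process !== "undefined" &&
      (g.process as { versions?: { node?: string } }).versions?.node) {
    return "node";
  }
  return "browser";
}

// Hypothetical usage: pick a filesystem backend lazily via dynamic import.
async function getFsBackend() {
  switch (detectRuntime()) {
    case "deno":    return import("./backends/deno.js");
    case "bun":
    case "node":    return import("./backends/node.js");
    case "browser": return import("./backends/browser.js");
  }
}
```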
Oh boy. Reading the title ("JSR: The JavaScript Registry" at the time of writing) I had high hopes this might be what I'm looking for: Dependency management for JavaScript in the browser. It's something entirely different.
I tried to look for something lately that:
1. Makes it easy to download specified versions of JS libs.
2. Exposes these libs as proper ES6 modules.
I get why (2) is a bit tricky, it'd be fantastic but more of a nice to have anyway.
I don't really get why there's no popular tool for (1) yet. Sure, I can just fetch the files I want from a CDN and commit them to a vendor directory, or I can write a little script that downloads them from a CDN. Or hey, I can fetch frontend dependencies via NPM and have a script that builds/copies the stuff. But that's a bit much labour for something that I thought would be a relatively common requirement.
Am I in the minority to occasionally want to build web apps _without_ bundlers? Being able to skip the build step is very powerful both for keeping things simple and turnaround times fast, IMHO.
You can just use npm and ship node_modules on your website. It's probably "huge", so you'll want to clean out dev dependencies first (`npm prune --omit=dev` is one way), and you might find it useful to search for big binaries to filter out and redundant directories you don't need (libraries that still include UMD and CommonJS and ESM builds even though you only need one). There may still be libraries that don't directly load in the browser and that you need to spot-bundle with a tool like esbuild into a vendor directory.
Mostly the only other glue you need after that is an import map.
I find this flow useful (ship an optionally pruned node_modules, spot-build specific vendor libraries, add an import map), especially for lightweight development/testing, so I documented it from start to finish for one of my libraries (which is sort of a "framework"); it includes a vendor build one-liner:
(The Example section after the Dev Environment one shows the import map at the top of the example HTML if you are looking for that. I forgot that's where it was when re-reading this.)
The server that takes your TypeScript and does all of the work to publish a nice package on npm is great. But why wouldn't they publish to npm after that? Why make a new registry that is introducing fragmentation?
Seems like the logical way to do it. Though, honestly, unless you specifically need to do lots of different kinds of padding, something like this makes way more sense:
I get their stance on TypeScript "exported types must be explicit and not use inference", but I feel like this is going to cause a lot of friction in adoption.
1. You can bypass slow types checks during publishing. You will just get a lower JSR score.
2. TypeScript is shipping a feature in the next release (TS 5.5) with editor quick-fixes that make it super easy to add explicit types :) The TS feature is called `isolatedDeclarations`.
I would just like to add some of my findings around the developer friction with isolated declarations.
The whole reason isolated declarations is taking so long is that we also worry about making its requirements too onerous. To this end, with isolated declarations you do have to specify most types, but not all. For expressions where we can reasonably infer the type from local context, we do so.
This should ease some of the developer friction. Not all of it. Some things will not be inferrable, but things like primitive types, object literals, and tuples will be, so it should make it easier.
We also worked on a code fixer that adds explicit type annotations where this new flag would raise an error, again to help people migrate if they choose to do so.
To be clear for people reading and wondering what this is about: this is only a hard recommendation for the public API types. The reason is that by adding explicit types to the boundary of your package, the package becomes way faster for users to type check, because every user's machine doesn't need to redo all the inference work or type check internal types and packages unrelated to the public types. Additionally, it makes the published code more resilient to future changes in TypeScript's inference, because it isn't relying on inference. It also becomes way easier to generate documentation for the package (and the ability to generate .d.ts or bundled .d.ts files without a type checker becomes easier too).
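To make the "syntax transform" point concrete, a small sketch (hypothetical module):

```ts
// mod.ts — public boundary fully annotated
export const VERSION: string = "1.0.0";
export function parse(input: string): { ok: boolean; value: number } {
  const value = Number(input);
  return { ok: !Number.isNaN(value), value };
}
```

```ts
// mod.d.ts — what a syntax-only transform can emit: drop bodies and
// initializers, keep the declared types. No type inference required.
export declare const VERSION: string;
export declare function parse(input: string): { ok: boolean; value: number };
```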
Right now, the publish command errors and asks you to fix the issues or bypass it entirely via `--allow-slow-types`. In the future there will be a `--fix` flag to write the explicit types for you.
One solution to the package dependency swamp is to make a project with little to no dependencies?
I'm rambling, but I would be curious to experiment with an approach where, when I needed a bit of code to solve a problem, the "package manager" (more like a code procurer) would just find a snippet of code I need, perhaps reference its origin, perhaps add related unit tests, and then I would copy-paste it into my own code base.
I've experimented[0] with something along these lines. The basic idea is that it's desirable to be able to reuse snippets/functions across projects, but you shouldn't need to go full left-pad. So basically you run tuplates.py on your project; it goes through all files, and for any line comment that includes "tuplate_start(URI)" it fetches the URI (local filesystem or HTTP supported) and replaces everything until it finds a line comment with "tuplate_end". The nice thing is it enables basic templating in any language, and instead of checking templates into source control, you're committing working code. It also combines really nicely with jsdelivr, where you can import specific versions of files directly from GitHub.
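Based on that description, usage looks roughly like this (my sketch, not the tool's actual docs; the URI and snippet are hypothetical):

```ts
// Before running the tool: a marker pair in line comments.
// tuplate_start(https://example.com/snippets/clamp.ts)
// tuplate_end

// After running it: the fetched snippet is inlined between the markers and
// committed as ordinary working code.
// tuplate_start(https://example.com/snippets/clamp.ts)
export function clamp(n: number, min: number, max: number): number {
  return Math.min(Math.max(n, min), max);
}
// tuplate_end
```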
I think this could function in a small-project scenario, and I like the idea. I don't think it would be super maintainable in larger enterprise applications, though. You'd essentially be on the hook for maintaining more code as well.
For dependencies where you didn’t explicitly specify the exact version, taking the latest version is nondeterministic - it varies over time. Someone else who checks out your code will get different results.
I was hoping for something like Go’s minimum version dependency resolution.
The upper bound resolves to a different version after a new minor version is published. That’s nondeterministic. Someone else checking out your project will get different results, unless you use a lock file.
See, you don’t know how npm works. Npm does not care about the lockfile of dependencies. “Someone checking out my project” always gets the latest version of each dependency within the semver range, until their lockfile locks a version into place.
I meant someone checking out the exact same Github repo (for an application), not someone depending on a library I released. If you have a lock file, it works in that case, I believe?
But actually I use Deno, and I'm honestly not sure how that works either.
As for library dependencies, I know how it works in Go, which is how it should work. When you add a dependency, and it has its own dependencies that you don't use directly, you get the same version that they tested with, which is the lowest one they specified. It only gets overridden if something else requires a higher version.
This means that when a new version gets released somewhere, nothing happens until people notice it, bump a version, and hopefully test it. No library version changes unless there's a commit.
Taking the latest minor version means that it hasn't been tested by anyone downstream. (How could they, when it didn't exist before?)
New library versions should be tested by direct dependencies before they get used by indirect dependencies.
I know this is a nit and not the responsibility of jsr (looks great all! Good job! Can't wait to use it), but it bugs me that we still have a "node_modules" folder as the standard for the ecosystem. This sucks. I get it. I know why. Node is king. However, there are other runtimes out there that use this pattern simply because they don't want to reinvent a package management system (I'm with them on this) but have to use "node_modules" as their module folder simply for compatibility. I decided not to follow suit and use @modules instead. :P
I like JSR, and this in no way reflects my opinion of js,ts,jsr, or you all. I just think it’s time to move on from some dahl-isms (decisions made while getting node production ready).
What's the best way to include a binary blob (wasm binary) in your package? For NPM I've been using a bundler (esbuild's `binary` loader) but I'm not sure of the best way to do that in a modern, jsr-friendly way.
This is a terrible name. JSR already stands for "Java Specification Request" which is basically an RFC or standard for Java (not JavaScript).
This is going to make Google searches even more difficult. Searching for Java jobs and getting JavaScript jobs is already bad enough. This name is just making a big mess with semantic name collision in our mental namespaces.
Just make it "RfJS" maybe for "Registry for JavaScript" (if that doesn't class). I know Firefox had a few naming rounds going through Phoenix and Firebird IIRC first.
JSR is also a company that produces synthetic rubber, a semiconductor company, a pharma company, a journal, and a company that produces merch for rock musicians. All of these rank higher on Google for JSR than Java Specification Requests.
Given that you're going to have a hard time pulling up the JSR you're looking for without adding qualifiers anyway, I don't think this name choice is going to substantially change anything.
I very much like the added TypeScript support (which the standard NPM registry does not have) and that it's open-source.
I'll see about getting my packages uploaded onto here, too -- just having built-in support for TypeScript makes packaging 100x easier.
Regarding NPM, it's absolutely insane that we've been depending upon a closed-source, monolithic nightmare with poor UX for so long. You can't even use comments in package.json -- overall, it feels like a holdover we haven't figured out how to replace.
Furthermore, it's scary that Microsoft, through GitHub, now hold so much power over the JavaScript industry -- enshittification will most likely ensue.
I'm not sure jsr fixes comments in package.json, and I wouldn't blame npm for the missing feature.
package.json is used at runtime by node.js and strictly parsed as JSON. npm could just strip comments on publish, but local development of the package would break.
Not sure if Bun or Deno use package.json at runtime or if they allow comments.
I think it's interesting that JavaScript packages are still published from the package root, despite most projects compiling to dist nowadays. I'd like a tool to generate the whole package root, including a transpiled or generated package.json.
> package.json is used at runtime by node.js and strictly parsed as json
It still constitutes an unfriendly developer experience -- people have had to make workarounds to document what complicated scripts do, and why they use certain dependencies, etc. I understand how JSON.parse works, but this still isn't the best behaviour.
Note that this is just one example of unfriendly UX in NPM as a whole -- this isn't the hill I'm dying on, so to speak, I'm just giving an example of how I believe the industry is being held back by tools which just haven't kept up with demand.
Is there a similar site but for browser only javascript libraries?
I am not a web developer, and always find NPM etc way too much for my occasional needs.
When I search the internet for anything, 90% of the time I get an NPM-based library that can not work in the browser with just a <script src="lib.js">.
JavaScript has improved a lot and does not really need Node/compilation steps for almost all my use cases. Is there anything where I can search for browser only, client side javascript libs?
You can usually browse npm for what you need, and when you want to use it, look at unpkg.com/<name of package>
Usually this will give you a .min.js file ready for you to script include on your site, for example, unpkg.com/mithril
Sometimes library authors don't default the export to a browser ready minified version or ESM module, in which case, you can snoop around the built package at unpkg.com/browse/mithril/.
Most READMEs worth a damn should tell you how to use their libraries without npm if possible. Unfortunately, a lot of library authors suck.
The nerve of them not catering to the small percentage of use cases for a project they often do in their free time and often for free. The entitlement is real.
Sorry I forgot we're not allowed to criticize open-source maintainers ever. But as a library author myself, it takes less than 10 seconds to add such a section in my README.
Unpkg serves whatever is published to NPM, and if it's a library intended for the browser, that often includes minified versions ready for use in script tags, for example, https://unpkg.com/mithril@2.2.2/mithril.min.js. Sometimes the default export is CJS (which has require() calls), in which case, you can usually use the browse url that I mentioned to see if there's another export you can use.
https://esm.sh/ is definitely a good option too if you're OK with modules.
Yeah but that's rarely the case, you can't offer that as a generic "solution" to the problem at hand. It's just Mithril's choice to both publish to npm and to officially link to unpkg. Packages can make different choices, like publishing to GitHub releases or just to include the file in the GitHub repo.
The reality is that only frameworks and very popular browser-specific packages do that.
For that you probably want unpkg, there might be some other similar alternatives that you can look for if you want. They're essentially mirrors of NPM but they automatically bundle packages up and serve it via http like a normal web JS file.
Some things still won't work because they rely on some server side library, like loading files or something, but you won't want those for web development anyway. You might also find that some libraries don't handle being included as a script very well, but you can instead use the import statement and that will probably work better - use <script type="module"> and inside that use import something from "<whatever the unpkg URL is>".
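Concretely, the module-script route looks something like this (the URL and export are illustrative — substitute the real CDN URL of the package you want):

```html
<script type="module">
  // type="module" lets you use `import` directly; the specifier just has to
  // point at an ESM build served over HTTP.
  import { greet } from "https://unpkg.com/example-esm-lib@1.0.0/dist/index.js";
  document.body.textContent = greet("world");
</script>
```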
For production usage this isn't ideal because you're not as in control of the code you're depending on, it's less reliable, and it's more requests than if you just bundled everything together, but not everything needs to be for production usage!
> Is there anything where I can search for browser only
There used to be bower, thankfully that’s now gone.
> JavaScript has improved a lot
So has its ecosystem. Setting up parcel or Vite is straightforward and should get you out of those problems. The time when you could just download a .js file and stick it in your vendors folder is long long gone. Please bundle and minify your files before serving them.
I am not sure what the value of this is. It's essentially a subset of the npmjs registry, as it's only for ESM-only packages.
If you want to use ESM modules on a website in a commercial setting, the security team will demand you host them on a CDN under your own control, etc. It's fun for personal projects?