Ah, in this case, I would then have to commit my dependencies into my VCS to maintain reproducible builds. I'm not sure I like that solution very much either. I've seen node_modules in multiple GBs, and I'm sure Deno's dependency sizes are going to be similar.
Checking in dependencies to version control is the sane option. Then you can more easily see what's updated and track regressions.
Some people like to refactor their code any time some syntactic sugar is added to the language - often adding a few bugs while doing it, which is a PITA, but version control is still better than no version control.
You might ask: why not add the OS to your SCM too, and the full software stack while you're at it? But you can generally draw a line between strong abstraction layers: hardware | kernel | OS | runtime | your app. Some modules sit behind strong abstraction layers, but others are just pure functions that you could just as well copy into your own repo.
I have only used Go once at work, and I actually dislike most of it (dependency management was one of the annoying things with Go), but it has never been a show-stopper, and thousands of developers were using it back when vendoring was the only option.
Go dependency management is quite good now with "go mod", plus your dependency tree isn't going to look anything like a typical JavaScript dependency tree; otherwise you're doing it wrong.
> that's what people using Go have been doing for years without complaining
I haven't seen anyone commit vendor and not complain about it. But now you finally don't have to commit vendor to get reproducible builds. All you need is a module proxy. The "all you need" is not really meant seriously, of course.
And I personally prefer to not commit vendor and complain about it.
You could use a separate git repository for the dependencies. That way you keep your core project repo tight and small and clean, but you still have your dependencies under version control. If that separate repo grows to a few GBs or more it doesn't really hurt anything.
Which then raises the question - how is it better than NPM?
If there are going to be centralized repositories (like NPM), and if I have to download my dependencies into a $DENO_DIR (like NPM), and if I am then loading these dependencies from local files (like NPM), how is it any different to NPM? Except for being less secure by default?
This is starting to look like a case of being different just so you can say you're different.
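For reference, the flow I'm questioning looks something like this (module URL and version are just illustrative):

```ts
// main.ts — fetched over HTTPS on first run, then cached in $DENO_DIR;
// later runs resolve from the local cache with no separate install step.
import { serve } from "https://deno.land/std@0.50.0/http/server.ts";

const s = serve({ port: 8000 });
console.log("listening on http://localhost:8000/");
for await (const req of s) {
  req.respond({ body: "hello\n" });
}
```

Swap the URL for a package name and $DENO_DIR for node_modules, and it reads a lot like NPM to me.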
NPM is a dependency management failure which is why you are ending up with hundreds of dependencies in the first place. It sounds like you want to reproduce that insanity in Deno. Deno is set up in such a way to dissuade you from the stupidity by default but allow it in very few steps if you cannot imagine a world without it.
In my opinion this is Deno’s biggest selling point.
> Deno is set up in such a way to dissuade you from the stupidity by default but allow it in very few steps if you cannot imagine a world without it.
Could you elaborate on this? Is it that Deno is against the whole 'small packages that do one thing well' principle and instead in favor of complete libraries? How exactly would it dissuade me from installing hundreds of dependencies?
The default design style for a Deno application is that the application becomes a single file, just like packages coming off Steam. This requires that dependencies are packaged into the application before it is distributed to others. The idea is to include only what you need, deliberately, and to manage it as a remotely written extension of your application.
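A rough sketch of what that packaging step looks like (assuming the `deno bundle` subcommand as it shipped in Deno 1.0; module URL illustrative):

```ts
// greet.ts — a remote dependency plus local code
import { bold } from "https://deno.land/std@0.50.0/fmt/colors.ts";

console.log(bold("hello from a single-file app"));

// Build step (as of Deno 1.0):
//   deno bundle greet.ts greet.bundle.js
// greet.bundle.js now has the colors module inlined — one file to distribute.
```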
Having a single executable file makes distribution easier, but while I'm developing the app I'll still have to manage all of its dependencies, right? How does Deno aid during development?
> The idea is to include only what you need, deliberately, and to manage it as a remotely written extension of your application.
I have a node app, in which I deliberately only included the dependencies I need. The package.json lists exactly 8 dependencies. However, the node_modules folder already has 97 dependencies installed into it. The reason of course is that these are dependencies of dependencies of dependencies of dependencies.
Wouldn't Deno have this same issue? Are the dependencies also distributed in compiled form as a single file, akin to Windows DLLs?
Except that now, the download deps in CI step can fail if one of hundreds of websites for my hundreds of dependencies goes down. If the main NPM repository goes down, I can switch to a mirror and all of my dependencies will be available again.
To be the rubber duck, if wiping the cache at each build is a risk to your CI, what could you do to keep your CI up?
1 - not wipe the cache folder at each build? It's easy and secure. Oh and your build will be faster.
2 - use a cached mirror of the deps you use? It's like 10min to put in place and is already used in companies that care about security and availability anyway (see the import-map sketch after this list).
3 - you have https://deno.land/x if you want to put all your eggs in the same npm basket
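For option 2, Deno's import maps (an unstable feature at the time of writing) can rewrite an import prefix to point at your mirror; a hypothetical setup:

```json
{
  "imports": {
    "https://deno.land/": "https://deno-mirror.mycompany.internal/deno.land/"
  }
}
```

Then run with `deno run --importmap=import_map.json --unstable main.ts` (flag spelling as of Deno 1.0). If the mirror caches on first fetch, a registry outage stops being a build outage.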
Yes, I think I'd probably settle for solution number 2.
I still don't understand how this is better than NPM, and how Deno solves the horrible dependency management of Node, but maybe if I actually build something with Deno I'll get some answers.
> [With NPM] the mechanism for linking to external libraries is fundamentally centralized through the NPM repository, which is not inline with the ideals of the web.
> Centralized currency exchanges and arbitration is not in line with the ideals of the web! - Cryptocurrency
Nek minute. Besides, let's get real here; they will just end up centralized on GitHub. How exactly is that situation much different from npm or any other language ecosystem's library directory being mirror-able?
I'd highly recommend mirroring packages anyway. Obviously this isn't always necessary for small projects, but if you're building a product, the laws of the universe basically mandate that centralized package management will screw you over, usually at the worst possible time.
Which again brings me back to something I'm still not understanding - How is Deno's package management better than NPM if it is extremely similar to NPM, but slightly less secure?
I'm only asking because lots of people seem to be loving this new dependency management, so I'm pretty sure I'm missing something here.
We need to distinguish between npm, the service (https://www.npmjs.com/) and npm, the tool.
Deno has the functionality of npm, the tool, built-in.
The difference is that like Go, Deno imports the code directly from the source repository.
In practice it's going to be github.com (but it can be GitLab or any code hosting that you, the author of a Deno module, use).
NPM is an unnecessary layer that both Go and Deno have removed.
It's better because it's simpler for everyone involved.
In Go, I don't need to "publish" my library. People can just import the latest version or, if they want reproducibility, an explicit git revision. Compared to Go, publishing to npm is just unnecessary busy work.
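The Deno equivalent of pinning a revision is just baking it into the URL (URLs and the commit hash here are illustrative):

```ts
// Pin to a published tag:
import { assertEquals } from "https://deno.land/std@0.50.0/testing/asserts.ts";

// Or pin to an exact commit straight off the hosting service (hypothetical hash):
// import { assertEquals } from "https://raw.githubusercontent.com/denoland/deno/<commit>/std/testing/asserts.ts";

assertEquals(1 + 1, 2); // same code, byte for byte, on every machine
```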
I've seen JavaScript libraries where every other commit is related to publishing a new version to npm, littering the commit history.
In Go there's no need for a package.json, which mostly duplicates information the repository already carries and that would otherwise be lost when publishing to npm (who's the author? what's the license? where's the actual source repository?).
As to this being insecure: we have over 10 years of experience in the Go ecosystem showing that in practice it works just fine.
The simplest approach is to either import anything anywhere, or to have a local module that imports external dependencies and have the rest of your code import them via that local module.
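The second style is conventionally a single `deps.ts` file (the filename is just a convention, not anything Deno enforces; URLs illustrative):

```ts
// deps.ts — one local module that pins and re-exports every external dependency
export { serve } from "https://deno.land/std@0.50.0/http/server.ts";
export { assertEquals } from "https://deno.land/std@0.50.0/testing/asserts.ts";

// app.ts — the rest of the codebase imports via deps.ts, never a raw URL,
// so bumping a version means editing exactly one file:
//   import { serve } from "./deps.ts";
```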
NPM, the tool, has had the feature to install directly from GitHub instead of npmjs.org for many, many years as well. No one really used it except as a workaround for unpublished fixes, because it has no other tangible benefits.
I like it because it's simpler. I know what happens when I import from a URL. I'd have a hard time whiteboarding exactly what happens when I `npm install`.
My least favorite thing about importing from NPM is that I don't actually know what I'm importing. Sure, there might be a GitHub repository, but code is uploaded to NPM separately, and it is often minified. A malicious library owner could relatively easily inject some code before minifying, while still maintaining a clean-looking repo alongside the package.
Imports from URL would allow me to know exactly what I'm getting.
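And Deno can verify that what I'm getting stays what I think it is, via its lock-file support; a sketch (flags as of Deno 1.0, module version illustrative):

```ts
// main.ts
import { parse } from "https://deno.land/std@0.50.0/flags/mod.ts";

console.log(parse(Deno.args));

// Record the SHA-256 of every remote module once:
//   deno cache --lock=lock.json --lock-write main.ts
// Every later run re-verifies the cached sources against lock.json:
//   deno run --lock=lock.json main.ts
```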