Does anyone else see the import directly from URL as a larger security/reliability issue than the currently imperfect modules?

I'm sure I'm missing something obvious in that example, but that capability terrifies me.



I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway. You can even depend on a non-npm source (GitHub, URLs...) from an npm-based package.

If you want to "feel" as safe, Deno has import maps, which work much like package.json.
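
For illustration, an import map is just a JSON file that maps short specifiers to URLs; the file name and mapping below are made-up examples, not anything Deno ships with:

    {
      "imports": {
        "http/": "https://deno.land/std@0.50.0/http/"
      }
    }

With that in place you can write `import { serve } from "http/server.ts";` and point Deno at the map file when you run (the exact flag has changed between versions).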

Overall, I think Deno is more secure because it cuts out the middleman (npm), and you can make an npm mirror with low effort; a simple fork will do. That means you can not only pin precisely which code you want, but also make sure nobody knows which packages you use.

Approach it with an open mind; this is a new "JSX" or async-programming moment. People will hate it at first, then start to see the value of the design down the road.


> I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway

npm installs aren't the same as installing from a random URL, because:

* NPM (the org) guarantees that published versions of packages are immutable and will never change in the future. This is definitely not true for a random URL.

* NPM (the tool) stores a hash of the package in your package-lock.json, and installing via `npm ci` (which enforces the lockfile and never updates it in any case) guarantees that the package you get matches that hash.

Downloading from a random URL can return anything, at the whim of the owner or anybody else who can successfully MITM your traffic. Installing a package via npm carries that risk only the very first time you ever install it. Once you've done that, and you're happy that the version you're using is safe, you have clear guarantees on future behaviour.
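
For example, an entry in package-lock.json looks roughly like this (the package name and version are just illustrative, and the hash is a placeholder):

    "lodash": {
      "version": "4.17.15",
      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz",
      "integrity": "sha512-<hash of the tarball>"
    }

`npm ci` refuses to install anything whose downloaded tarball doesn't match that `integrity` hash.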


My assumption would be that new middlemen will arise, but this time you can pick which one to use.


btw: https://github.com/denoland/deno/issues/1063

they know there is a bad mitm vector and won't fix it


This is why I think a content addressable store like IPFS would shine working with Deno


That solves this specific problem nicely, although AFAIK IPFS doesn't guarantee long-term availability of any content, right? If you depend on a package version that's not sufficiently popular, it could disappear, and then you're in major trouble.

It'd be interesting to look at ways to mitigate that by requiring anybody using a package version to rehost it for others (since they have a copy locally anyway, by definition). But then you're talking about some kind of IPFS server built into your package manager, which now needs to be always running, and this starts to get seriously complicated & practically challenging...


One advantage of having a centralized repository is that the maintainers of that repository have the ability to remove genuinely malicious changes (even if it's at the expense of breaking builds). Eliminating the middle man isn't always a great thing when one of the people on the end is acting maliciously.


I'm just thinking out loud here, but it seems to me that you could just make sure you're importing all your dependencies from trusted package repos, right? And since the URL for a package is right there in the `import` statement, it seems like it'd be pretty easy to lint for untrusted imports.
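
As a rough sketch of what such a lint could look like (the allowlisted hosts and the file handling here are hypothetical, not an existing tool):

    // check_imports.ts -- flag any import URL whose host isn't allowlisted.
    // Run with something like: deno run --allow-read check_imports.ts some_module.ts
    const ALLOWED_HOSTS = ["deno.land", "packages.mycompany.example"]; // hypothetical allowlist

    const source = await Deno.readTextFile(Deno.args[0]);
    const matches = source.match(/from\s+"https?:\/\/[^"]+"/g) ?? [];

    for (const match of matches) {
      // strip everything up to and including the opening quote, plus the closing quote
      const url = new URL(match.slice(match.indexOf('"') + 1, -1));
      if (!ALLOWED_HOSTS.includes(url.hostname)) {
        console.error(`untrusted import: ${url.href}`);
      }
    }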

I don't detest NPM in the way that some people do, but I have always worried about the implications of the fact that nearly the entire community relies on their registry. If they ever fell over completely, they would have hamstrung a huge amount of the JS community.


It's basically the same "exposure" as importing a random npm, but it has the benefit of being explicit when you do it.

It's also exactly what the websites you visit do. ;)


> It's basically the same "exposure" as importing a random npm, but it has the benefit of being explicit when you do it.

This is definitely false. For all the problems with the NPM registry and the Node dependency situation, an NPM package at a specific version is not just at the whims of whatever happens to be at the other end of a URL at any given moment it's requested. This is a huge vulnerability that the Node/NPM world does not currently have.


That is a fair point. I don't think most people who use npm packages really pay much attention, though, and you're still just an npm update away from getting something unexpected (because really, who puts explicit versions in package.json?).

Deno does have lockfiles: https://deno.land/manual/linking_to_external_code/integrity_...
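
Per that manual page, the workflow is roughly: write the lockfile once, then verify against it later (the file names here are just examples):

    # create/update the lockfile from your dependency entry point
    deno cache --lock=lock.json --lock-write deps.ts

    # later (e.g. in CI), re-download and fail if anything no longer matches
    deno cache --reload --lock=lock.json deps.ts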

I prefer imports from URLs. And I loathe npm. I get why people would disagree though.


Deno has lock files and caches files locally on first import.


I'm not sure how a lock file would help in this scenario, unless you're also committing your cache to source control (like a lot of folks did in the bad old days of NPM). The local cache is great, but that doesn't prevent the content of those URLs changing for someone who doesn't have access to your cache.


yeah, but we regularly clear out our cache and lock files, so this doesn't really solve the issue, unless you're committing all of your packages


Why are you _regularly_ clearing lock files? If you're bypassing lock files you're going to have the exact same issue with npm or yarn or any other package manager that downloads from the internet.


Dunno about OP but I pin versions in package.json because it allows me to control the versions and upgrade major versions only when explicit and necessary, and rely only on the lock file to keep it the same between commit time and the production build.


That doesn't actually work and gives you a false sense of reproducibility and stability. Sure, your top-level dependencies might not change without explicit changes to package.json, but every time you run npm install without a lock file, all transitive dependencies are re-resolved and can change.

Always commit your lock files, people.


What about the dependencies of your dependencies? You're gonna get burned when a breaking change gets introduced a few levels deeper than your package.json. Not everyone follows semver perfectly, and sometimes malicious code gets distributed as one of these transitive dependencies.


That's fine for one developer pushing to production from their own machine. But if you have a CI server and you're working with other people, you're going to want to know that everyone is working with the same modules.


What! Clearing lock files seems wild. How do you know you're getting the right code when you install dependencies?


For Deno the only issue is the first time when you do not have it cached. Deno compiles in all dependencies when building so the only point of failure is the machine you’re building on.

I don’t know the state of the art anymore, but I’m sure they have ways to make it easy to vendor deps in the repo.


> It's basically the same "exposure" as importing a random npm, but it has the benefit of being explicit when you do it.

I'm not sure how this works in detail here, but at least in NPM you got a chance to download packages, inspect them and fix the versions if so desired. Importantly, this gave you control over your transitive dependencies as well.

This seems more like the curl | bash school of package management.

Edit: This is explained in more detail at https://deno.land/manual/linking_to_external_code and indeed seems a lot more sane.

> It's also exactly what the websites you visit do. ;)

Well yes, and it causes huge problems there already - see the whole mess we have with trackers and page bloat.


The good thing about this is that you can effectively build a registry service that offers the same level of trust that npm provides, because at the end of the day that is the only differentiator in this scenario; npm can just as well return malicious code.


Thanks for sharing that link. Seems much more sane, but not without issues. I'm sure this will continue to be iterated upon.

Even with all of NPM's flaws, I do feel this is a bit of throwing the baby out with the bathwater. Time will tell.


AFAIK there is no option to allow a website to read and write arbitrary files anywhere on my hard drive, period. At most a website can ask the user to select a file or offer one for download. In the future maybe it can be given a domain-specific folder.

That's not true here. If I'm running a web server, I'm going to need to give the app permission to read the files being served and access to the database. That's something that never happens in the browser.


The tldr is Deno also gives you a chance to download + inspect packages, and then lock dependencies. The mechanism for import is different, but the tooling is good.


Sure do. I wonder if they have a checksum mechanism like browsers do?

You can add an “integrity” attribute to script tags in the browser.

https://developer.mozilla.org/en-US/docs/Web/Security/Subres...
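
For reference, it looks something like this (the hash is a placeholder):

    <script src="https://example.com/lib.js"
            integrity="sha384-...base64-encoded-hash..."
            crossorigin="anonymous"></script>

The browser refuses to execute the script if the fetched bytes don't hash to that value.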


One advantage of urls is that you can link to a specific git sha, tag, or branch for a dependency, e.g. on github.


So exactly like existing tooling can already do, then?


Sure, I probably phrased that poorly -- it's not a unique advantage, but it is a benefit of having URLs be the only way to link to dependencies, versus a centralized, dominant package manager.


It's not just about integrity. The URL may very well provide what it claims to provide, so checksums would match, but it's the direct downloading and running of remote code that is terrifying.

This is pretty much like all the bash one-liners piping and executing a curl/wget download. I understand there are sandbox restrictions, but are the restrictions on a per dependency level, or on a program level?

If they are on a program level, they are essentially useless, since the first thing I'm going to do is break out of the sandbox to let my program do whatever it needs to do (read fs/network etc.). If it is on a per dependency level, then am I really expected to manage sandbox permissions for all of my projects dependencies?


If you're afraid of "direct" downloading and executing of that code, then what do you think happens when you npm install/pip install a package? I'm very interested to see if you can point to a new attack vector that didn't exist with the previous solutions.


You can generate modules on the fly on the server, each requiring the next generated module recursively, blowing up your disk space. If Deno stores those files uncompressed, you can generate modules full of comments/zeros so they compress very well for the attacker but eat a lot of space on the consumer's side.


Does Deno have some built in way to vendor / download the imports pre-execution? I don't want my production service to fail to launch because some random repo is offline.




You can also use the built in bundle command to bundle all of your dependencies and your code into a single, easily deployable file. https://deno.land/manual/tools/bundler.
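
Roughly like this (the file names are made up):

    deno bundle ./server.ts server.bundle.js

The output is a single self-contained JS file, so the production box never has to fetch anything from a remote URL at startup.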


Deno caches local copies and offers control over when to reload them. In terms of vendoring, you can simply download everything yourself and use local paths for imports.


How would this work with transitive dependencies? Sure I can control which parts I import myself, but how do I keep a vendored file from pulling in another URL a level deeper?


Unlike Node, recommended Deno practice is to check your dependencies into the VCS.

> Production software should always bundle its dependencies. In Deno this is done by checking the $DENO_DIR into your source control system, and specifying that path as the $DENO_DIR environmental variable at runtime.

https://deno.land/manual/linking_to_external_code
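
So, roughly (the directory and file names are arbitrary examples):

    # populate the cache into a directory you commit to source control
    DENO_DIR=./deno_dir deno cache deps.ts

    # at runtime, point Deno at the committed cache
    DENO_DIR=./deno_dir deno run --allow-net server.ts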


    du -hs node_modules
    
    1.7G node_modules


> In terms of vendoring, you can simply download everything yourself and use local paths for imports.

So I basically have to do manually, what NPM/yarn do for me already?


I do not speak for the project, but based on my understanding part of the point was to avoid the magic of npm.

You can use lock files, bundles, and many other features that make dependency management easier.


Ah from that perspective I can see how this might appear to be better. Personally, I like the 'magic' of NPM (which to be honest I don't really think is all that magical, it's quite transparent what's happening behind the scenes). This 'magic' means I no longer have to write 200 line makefiles, so it definitely makes my life easier.


Some of that convenience will still be included. A couple of things that Deno does differently from Node: there is no standard index.* file to load, and import paths include the extension.


I assume you would just download the packages and serve them yourself.


Especially since HTTPS is not enforced! https://github.com/denoland/deno/issues/1063


CMIIW, but wouldn't enforced HTTPS mean you can't use intranet or localhost URLs?


you could use a flag to re-enable http :)


More than likely programming as a whole will get better because of this...

Do you trust this thing? If not, you're better off developing it yourself, or working with something you do trust.


Deno requires that you explicitly specify which permissions the process has. I think that's much better than praying that a package hasn't gone rogue, as with Node. If you don't trust the remote script, run it without any permissions and capture the output. Using multiple processes with explicit permissions is much safer.


I'm wondering about the practicality of importing from URLs. I didn't see it addressed, but an import like this will be awfully hard to remember.

    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";
Anyone know if there are alternatives or a plan for this aside from "use an IDE to remember it for you"?


The convention is to make a `deps.ts` and re-export what you need. Like this: https://deno.land/x/collections/deps.ts
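
So, roughly (two files shown together):

    // deps.ts -- the one place where URLs (and versions) live
    export { serve } from "https://deno.land/std@0.50.0/http/server.ts";

    // everywhere else in your code
    import { serve } from "./deps.ts";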

I don't find versioned URLs much more difficult to work with than package@<version> though.


i'm wondering if they'll end up adding a 'dependencies.json' to eliminate the boilerplate from 'deps.ts' and to simplify tooling. that'd be revolutionary! ;)

jokes aside, i wonder how import-via-url will impact tooling. having to parse arbitrary JS (or even run it, for dynamic imports?) seems like it'd make writing a "list all dependencies" tool much harder than a "dumb" JSON/TOML/whatever file would. though i guess Go does a similar thing, and afaik they're fine


Well they do have import maps! I think everyone likes shorthand package names.


You are not alone, this is very unsafe in my humble opinion.


How is it any different than how it works in the browser?


Does it also terrify you when code running in a browser does it?


The code running in my browser isn't a multi-tenant production server, with access to the filesystem and DBs.


Except that with Deno, everything IO-related is turned off by default and has to be explicitly granted when the process is started. It's the first bullet point on the landing page.

Here is the page with more detail. https://deno.land/manual/getting_started/permissions

It can even restrict access down to a specific directory or host. This is cool.
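
For example, something like this, where the host and path are made-up values:

    deno run --allow-net=api.example.com --allow-read=/srv/static server.ts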

Whereas any NPM module can map your subnet, lift your .ssh directory, and yoink environment variables, willy-nilly.

It's happened before.


That still doesn't prevent imported modules from yoinking anything you did grant access to, though. For instance, if my service connects to a DB then `uuid` can slurp the contents.

It'd be nice to have some capability model where modules can only access things through handles passed to them, but probably infeasible for a project like this.
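
A toy sketch of that idea in TypeScript (purely hypothetical; nothing enforces this in Deno today):

    // The module is handed a narrow capability instead of ambient DB access;
    // it can only call what it was explicitly given.
    interface KvHandle {
      get(key: string): Promise<string | null>;
    }

    export async function lookupGreeting(kv: KvHandle): Promise<string> {
      return (await kv.get("greeting")) ?? "hello";
    }

The catch is that nothing stops the module from ignoring the handle and reaching for Deno APIs directly, which is why this would need runtime support to mean anything.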


You can actually run things as Workers in Deno and get some sandboxing abilities: https://github.com/denoland/deno/blob/master/docs/runtime/wo...


From the article: "Also like browsers, code is executed in a secure sandbox by default. Scripts cannot access the hard drive, open network connections, or make any other potentially malicious actions without permission."


That just means you have to run with the --allow-net, --allow-read, etc. flags. But you are using those when writing any nontrivial Deno app like a webserver anyway.

"web browsers already do this ;)" isn't a good comparison.


"But I have to turn all that stuff on" is also not a good comparison.

Actually, no Deno webserver I've written gets fs access. Some only get --allow-net.


I think that's the main selling point of deno, sandboxing.



