Ryan Dahl commented on the thread, and it's being fixed in node. Nice to see assessment and responsiveness at the core of the project.
I'm evaluating node.js as an application platform choice for a large public infrastructure project. One thing that concerns me is (my perception here) that the public hardening of the server is yet to come. I've been around long enough to see that process play out on PHP, Django, Rails, etc.
We'll continue our evaluation but it's encouraging to know that issues like these are being discovered and addressed.
A look at node's HTTP parsing code, 1500 lines of hand-coded and rather pretty C, makes it clear that Ryan Dahl cares a lot about HTTP in node doing the Right Thing.
(This is also very handy for people writing HTTP servers and clients in other languages, since it's independent of node, and really fast and feature-complete.)
This is a particular case of http://news.ycombinator.com/item?id=3401900. Basically, weak hash functions allow an attacker to create lots of hash table collisions, degrading lookup performance from that of a hash table to that of a linked list. It is common to put POST'ed data into a hash table (the equivalent of ?foo=bar&baz=qux becomes {"foo": "bar", "baz": "qux"}).
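To make the mechanics concrete, here is a toy sketch. The hash function below is hypothetical and far weaker than V8's real string hash; it exists only to show why predictable hashing lets an attacker mass-produce colliding keys (e.g. as POST field names):

```javascript
// Toy hash: sum of character codes. Hypothetical and deliberately weak --
// a stand-in to illustrate the attack, not V8's actual algorithm.
function weakHash(key) {
  let h = 0;
  for (const ch of key) h += ch.charCodeAt(0);
  return h;
}

// Every permutation of the same characters collides under this hash,
// so an attacker can cheaply manufacture an arbitrarily large set of
// keys that all land in one bucket, turning lookups into a linear scan.
const colliding = ['abc', 'acb', 'bac', 'bca', 'cab', 'cba'];
console.log(new Set(colliding.map(weakHash)).size); // -> 1 (one bucket for all six keys)
```

With a real (but non-randomized) hash the colliding set is harder to find by hand, but it can still be precomputed once offline and reused against every server running that runtime.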
This title is a great piece of FUD. Anyone paying a little more attention would have seen the original paper (http://www.nruns.com/_downloads/advisory28122011.pdf), which states: "PHP 5, Java, ASP.NET as well as v8 are fully vulnerable to this issue and PHP 4, Python and Ruby are partially vulnerable".
That's not to say that Node doesn't need to fix this (and it seems like Bert from the core team is) but it's not a Node specific issue.
Node.js and client-side JavaScript should treat security issues differently because they face different risks. For example, a DoS against client-side JavaScript is not a big deal (it might slow down a single browser, or even just a single tab within a browser). On the server side, however, a DoS could take down an entire service, which is much more significant.
Thus you could say that V8 is "secure" on the client side but "insecure" on the server side because of the different risk assessments. It is poor security practice to take software designed for one security environment and assume it will be secure in others. If the Node.js project wants a secure system, it will need to take these security issues into consideration and harden the system appropriately.
He's saying it's FUD because the headline is misleading, not because he's trying to downplay the security issue. You and grandparent are likely in agreement with respect to your comment.
(The headline is misleading because the issue affects several major language runtimes, V8 included – yet only Node.js is mentioned.)
I wonder if this is a bigger deal for Node because it's single-threaded? Just one malicious POST request could slow down the entire server, whereas other languages that spawn a process for each request could easily kill a process that's using 100% CPU, right?
If you are using external processes (tools like gd's converters and so forth), yes. For code running only in node's environment, the accepted way to make better use of extra CPU resources used to be running one or more processes per core (using something like nginx as a reverse proxy to tie them to one port).
There is even a cluster module built in now to remove the need for an extra external tool to manage the processes: http://nodejs.org/docs/latest/api/cluster.html (there are fuller-featured options available as extra modules; I'm not sure how they compare efficiency-wise with the built-in one). I'm guessing this isn't the way to go if the processes need to communicate, but I've not looked into it deeply yet (my experiments with node haven't grown to the point of needing more than one core).
It's like everyone figured out how hashes work just a couple of days ago. What happened to spark all of this conversation? I also keep hearing that this problem is solved with randomized hash functions, but as far as I can tell that doesn't eliminate collisions. It just makes it roughly impossible to generate a set of keys that would cause enough collisions to actually be a problem. Hooray for data structures.
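That intuition is right, and a toy sketch shows why. The seed-mixing scheme below is made up for illustration (real runtimes use proper keyed hashes); the point is that collisions still exist by the pigeonhole principle, but which keys collide now depends on a secret chosen at process start:

```javascript
// Mix a secret per-process seed into a toy multiplicative hash.
// Illustrative only -- real fixes use stronger keyed hash functions.
const seed = Math.floor(Math.random() * 0xffffffff);

function seededHash(key, s) {
  let h = s >>> 0;
  for (const ch of key) {
    h = ((h * 31 + ch.charCodeAt(0)) ^ (h >>> 7)) >>> 0;
  }
  return h;
}

// Collisions still exist: with 8 buckets, some keys must share one.
// But WHICH keys share a bucket depends on the secret seed, so an
// attacker can no longer precompute a colliding key set offline.
const bucket = (key) => seededHash(key, seed) % 8;
console.log(bucket('foo'), bucket('baz')); // bucket indices vary run to run
```

Since the seed differs per process, a key set crafted against one server's hash layout is useless against another, and the attacker can't observe the seed to craft one online.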
Yes, if they are implemented via hash tables and do not randomize their hash generation somehow.
The talk at 28c3 specifically mentions PHP, Java, ASP.NET, and Python. Ruby 1.9 is fine, but other variants of Ruby are apparently vulnerable.
> Didn't I see this same thing about PHP the other day?
This affects all languages that use hash tables with non-randomized hash functions to store POST arguments; it's been discussed on Python's mailing list, for instance. It's also been noted on the Erlang list, but all Erlang frameworks apparently use proplists for POST mappings, so none of them is affected.
I believe that the critical ASP.NET framework update that Microsoft pushed just before the new year was to fix this issue as well, amongst a few others.
Not necessarily specific to node.js, but in general, instead of a standard webserver, use netcat, on multiple obscure ports, where each instance of netcat acts once and is discarded.
Thank you for the link, but my approach is neither "half-assed" nor "voodoo". I've used netcat, as described, for a small project, and it worked well. I'm about to do the same, for a big project, and I expect that it will again work well. Standard webservers are bloated. Speed, security and stability can be enhanced by distributing work across a system of one-shot processes. I take some inspiration from Jef Poskanzer's design decisions in thttpd.