Hacker News
Open Source Chat Server in Node.js with dynamic rooms (techcentroid.com)
19 points by mlakkadshaw on May 28, 2012 | hide | past | favorite | 19 comments


Not to be a killjoy, but isn't this the hello world of node projects?


Yes it is. It's the first project I've created in Node. EDIT: The only difference from other hello world servers is that it displays online users and supports dynamic rooms. (Dynamic rooms are not working at the moment due to excessive traffic.)


It seems like it's really getting killed under the load. This also isn't the first Node project demo I've seen that gets destroyed by a bunch of people visiting to check it out. Does anyone have tips for deploying Node so that your server won't just fall over? What's the point of socket.io supporting thousands of concurrent users if most people's deployment schemes allow for a maximum of, say, 250 concurrent connections?

Any tips or explanations would be most welcome.


Be sure to use Node's cluster API (http://nodejs.org/docs/v0.6.0/api/cluster.html) so each core on your server gets utilized. Looking at the source, this app doesn't appear to do that, so unless they're manually running an app instance on each core and load balancing between them, they may be underutilizing their hardware.

Beyond that: using Nginx to serve any static assets (http://stackoverflow.com/questions/5009324/node-js-nginx-and...) and standard stuff like making sure everything's cached that should be, load balancing, etc.
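For the static-asset side, a minimal nginx front for a Node app might look something like this (server name, paths, and port are illustrative):

```nginx
server {
    listen 80;
    server_name chat.example.com;

    # Serve static files straight from disk, bypassing Node entirely.
    location /static/ {
        root /var/www/chat;
        expires 1d;
    }

    # Everything else goes to the Node process.
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```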


At ClassDojo we support many thousands of concurrent users on node.js using cluster to create multiple worker processes and using multiple boxes with Amazon ELB in front of them. All static assets are served from the CDN.

Handling your state in memory is fine for examples such as this, but in general you should defer all state to the database layer or use something like redis. That way your app server will remain entirely stateless so any node.js process on any box can serve a request identically - you can just scale up by adding additional boxes.
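A sketch of what "defer state to the store" means in practice — the `store` object here is a stand-in for a Redis-style client (all names are illustrative; a real setup would use Redis sets):

```javascript
// Keep room membership in an external store instead of a process-local
// object, so any worker on any box can serve any request identically.
function makeRoomService(store) {
  return {
    join(room, user) {
      const key = 'room:' + room;
      const members = store.get(key) || [];
      if (members.indexOf(user) === -1) members.push(user);
      store.set(key, members);
    },
    members(room) {
      return store.get('room:' + room) || [];
    },
  };
}

// Any object with get/set works; in production this would be a Redis
// client (e.g. SADD/SMEMBERS) rather than an in-memory Map.
const memoryStore = {
  data: new Map(),
  get(k) { return this.data.get(k); },
  set(k, v) { this.data.set(k, v); },
};

const rooms = makeRoomService(memoryStore);
rooms.join('lobby', 'alice');
rooms.join('lobby', 'bob');
```

Because the app process holds no state of its own, scaling up is just adding boxes behind the load balancer.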


One good approach is to run multiple processes and have clients connect to any one at random. Then use a message queue, such as Redis or RabbitMQ, to read messages and delete them from the queue. Since a client is connected to only one of the servers, this eliminates the chance of sending the same message to the same client more than once. It also helps split the incoming traffic from the outgoing (which is usually far bigger, since one message is delivered to everyone else in a chat room).


XSS Vulnerable. USE AT YOUR OWN RISK


Please explain


https://github.com/lakkadshah/SImple-Chat-Server/issues/1

<IMG """><SCRIPT>alert("XSS")</SCRIPT>">


When you set your name you can include arbitrary HTML it seems.

It's hard to tell... not sure if I'm just running JavaScript on my own machine. It's very laggy.

Edit: Looks like it is fixed now.
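For reference, the usual fix is to escape user-supplied strings (names, messages) before inserting them into the page. A minimal escaper — not necessarily how the repo fixed it — looks like:

```javascript
// Escape the five characters that let user input break out of an HTML
// context, so a payload like <SCRIPT>alert("XSS")</SCRIPT> renders as
// inert text instead of executing.
function escapeHtml(s) {
  return String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
```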


You're opening yourself up to a serious race condition by using a global variable ('params') to share state across requests. The delay between the GET and the socket connect could easily be seconds; under load, it will likely be next to impossible to join the right room.

You probably want to use cookies (which socket.io helpfully supplies) for this instead.
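The failure mode, condensed (the socket objects are plain stand-ins for socket.io connections, and the variable names are illustrative):

```javascript
// The bug: one global shared across requests means "last GET wins"
// for every socket that connects afterwards.
let params = null;

function buggyJoin(socket) {
  socket.room = params; // whichever request wrote `params` last
}

// The fix: carry the room with the connection itself (e.g. read it
// from the handshake cookie/query) instead of from shared state.
function fixedJoin(socket, requestedRoom) {
  socket.room = requestedRoom;
}

// Two users load the page at nearly the same time:
const s1 = {}, s2 = {};
params = 'cats'; // user 1's GET
params = 'dogs'; // user 2's GET lands before user 1's socket connects
buggyJoin(s1);   // user 1 is now in the wrong room
buggyJoin(s2);

const s3 = {}, s4 = {};
fixedJoin(s3, 'cats');
fixedJoin(s4, 'dogs');
```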


Thanks, I'll look into it and release an update ASAP


Similar open source app written in ASP.NET MVC + SignalR: http://jabbr.net. I believe the author has been working on getting it to scale on Azure.


Seriously, limit the damn message rates. I can crash the server with one line in the chrome js console.
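A per-client limiter is only a few lines — a sketch (limits and names illustrative), run on each incoming message before broadcasting it:

```javascript
// Sliding-window rate limiter: allow at most `limit` messages per
// `windowMs` for each client, dropping the rest.
function makeRateLimiter(limit, windowMs) {
  const hits = new Map(); // clientId -> timestamps of recent messages
  return function allow(clientId, now) {
    now = now === undefined ? Date.now() : now;
    const recent = (hits.get(clientId) || []).filter(
      (t) => now - t < windowMs
    );
    if (recent.length >= limit) return false; // over the limit: drop
    recent.push(now);
    hits.set(clientId, recent);
    return true;
  };
}

const allow = makeRateLimiter(3, 1000); // 3 messages per second
```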


I'll be releasing an update soon. I've exposed sendMessage; I'll change it into something like a form submission and then extract the message from the response.


The challenge in any chat server is how to scale it. The rest is easy.


and the site title is still "Bootsrap from twitter"


XSS Vulnerability fixed. Thanks guys


You should make use of Redis pub/sub



