I read both HN and LW, and enjoy both. I've noticed people on HN describing LW as "cultish" a couple of times now, and am constantly surprised by this. Can you shed any light on why you feel this way?
I think there are two main things about LW that strike some people as cultish. (There are others, less important.) Both are less true than they were, say, a year ago.
1. Its distinctive brand of rationalism grew out of this huge long series of blog posts by Eliezer Yudkowsky, conventionally referred to on LW as "The Sequences". So: we have a group of people united by their adherence to a set of writings by a single person -- a mixture of generally uncontroversial principles and more unusual ideas. It's not a big surprise if this reminds some people of religious scriptures and the prophets who write them.
2. The LW culture takes seriously some ideas that (a) aren't commonly taken very seriously in the world at large, and (b) share some features with some cults' doctrines. Most notably, following Yudkowsky, a lot of LW people think it very likely that in the not too distant future the following will happen: someone will make an AI that's a little bit smarter than us and able to improve itself (or make new AIs); being smarter than us, it can make the next generation better still; this iteration may continue faster and faster as the AIs get smarter; and, perhaps on a timescale of days or less, this process will produce something as much smarter than us as we are smarter than bacteria, which will rapidly take over the world. If we are not careful and lucky, there are many ways in which this might wipe out humanity or replace us with something we would prefer not to be replaced by. -- So we have a near-omnipotent, incomprehensible-to-us Intelligence, not so far from the gods of various religions, and we have The End Of The World (at least as we know it), not so far from the doomsdays of various religions.
Oh, and LW is somewhat associated with Yudkowsky's outfit, MIRI (formerly the Singularity Institute), and Yudkowsky is on record as saying that the Right Thing to do is to give them every cent one can afford in order to reduce the probability of a disastrous AI explosion. Again, kinda reminiscent of (e.g.) a televangelist telling you to send him all your money because God is going to wrap things up soon. On the other hand, I do not believe that's his current position.
For the avoidance of doubt, I do not myself think LW is very cult-like.
> The fact that you disagree and think you understand the theory much better than I do and can confidently say the Babyfucker will not hurt any innocent bystanders, is not sufficient to exempt you from the polite requirement that potential information hazards shouldn't be posted without being wrapped up in warning envelopes that require a deliberate action to look through. Likewise, they shouldn't be referred-to if the reference is likely to cause some innocently curious bystander to look up the material without having seen any proper warning labels. Basically, the same obvious precautions you'd use if Lovecraft's Necronomicon was online and could be found using simple Google keywords - you wouldn't post anything which would cause anyone to enter those Google keywords, unless they'd been warned about the potential consequences. A comment containing such a reference would, of course, be deleted by moderators; people innocently reading a forum have a reasonable expectation that Googling a mysterious-sounding discussion will not suddenly expose them to an information hazard. You can act as if your personal confidence exempts you from this point of netiquette, and the moderator will continue not to live in your personal mental world and will go on deleting such comments.
Roko's Basilisk itself is fine: an interesting idea that is fun to consider. The notion that the idea is too dangerous to discuss publicly is rather cultish.
If I understand correctly, such violent reactions to Roko's Basilisk only come from a minority of LW people, but a prominent minority...