
> “You’re [X]?! The quantum physicist who’s always getting into arguments on the Internet, and who’s essentially always right, but who sustains an unreasonable amount of psychic damage in the process?”

> “Yes,” I replied, not bothering to correct the “physicist” part.

Didn't read much beyond that part. He'll fit right in with the rationalist crowd...



No actual person talks like that —- and if they really did, they’ve taken on the role of a fictional character. Which says a lot about the clientele either way.

I skimmed a bit here and there after that, but this comes off as plain grandiosity. Even the title is a line you can imagine a Hollywood character speaking out loud as they look into the camera, before giving a smug smirk.


I assumed that the stuff in quotes was a summary of the general gist of the conversations he had, not a word-for-word quote.


I don't think GP objects to the literalness so much as to the "I am known for always being right and I acknowledge it," which comes off as... not humble.


He is known for that, right or wrong.


I mean, Scott's been wrong on plenty of issues, but of course he is not wont to admit that on his own blog.


> —-

0.o

> No actual person talks like that

I think it is plausible that there are people (readers) who find other people (bloggers) basically always right, and that this would be the first thing they would say to them if they met them. n=1, but there are some bloggers that I think are basically always right, and I am socially bad, so there is no telling what I would blurt out if I met them.


To be honest, if I encountered Scott Aaronson in the wild I would probably react the same way. The guy is super smart and thoughtful, and can write more coherently about quantum computing than anyone else I'm aware of.


if only he stayed silent on politics...


Why would you comment on the post if you stopped reading near its beginning? How could your comments on it conceivably be of any value? It sounds like you're engaging in precisely the kind of shallow dismissal the site guidelines prohibit.


Aren't you doing the same thing?


No, I read the comment in full, analyzed its reasoning quality, elaborated on the self-undermining epistemological implications of its content, and then related that to the epistemic and discourse norms we aspire to here. My dismissal of it is anything but shallow, though I am of course open to hearing counterarguments, which you have fallen short of offering.


You are clearly dismissing his experience without much thought. If a park ranger stopped you in the woods and said there was a mountain lion up ahead, would you argue that he doesn't have enough information to be sure from such a quick glance?

Someone spending a lot of time building one or more skills doesn't make them an expert on everything, but when they start talking like they are an expert on everything because of the perceived difficulty of those skills, red flags start to pop up, and most reasonable people will notice them and swiftly call them out.

For example, take Elon Musk saying "At this point I think I know more about manufacturing than anyone currently alive on earth." Even if you rationalize that as an out-of-context deadpan joke, it's still completely correct to call it out as nonsense at the very least.

The more a person rationalizes statements like these ("AI WILL KILL US ALL") made by a person or cult, the more likely it is that they are a cult member who lacks independent critical thinking, having outsourced their thinking to the group. Maybe the group's thinking is "the best thinking"; in fact it probably is. But it's dependent on the group, so their individual thinking muscle is weakened, which increases their morbidity. (Airstriking a data center will get you killed or arrested by the US government, so it's better for the individual to question such statements than to rationalize them using unprovable notions like god or AGI.)


If a park ranger said "I looked over the first 2% of the park and concluded there are no mountain lions" - that is, made an assessment of the whole from inspection of a narrow segment - I don't think I would take his word on the matter. If OP had more experience to support his statement, he should have included it, rather than writing a shallow, one-sentence dismissal.


I think the recently popular way of saying "I looked over 2% and assumed it generalizes" these days is to call the thing ergodic.

Which of course the blog article is not, but then at least the complaint wouldn't sound so obviously shallow.


That's great, I'm definitely going to roll that into my vocabulary.


Also...

> they gave off some (not all) of the vibes of a cult

...after describing his visit with an atmosphere that sounds extremely cult-like.


At least one cult originates from the Rationalist movement: the Zizians [1], a cult that straight up murdered at least four people. And while the Zizian belief system is certainly more extreme than mainstream Rationalist beliefs, it's not that much more extreme.

For more info, the Behind the Bastards podcast [2] did a pretty good series on how the Zizians sprung up out of the Bay area Rationalist scene. I'd highly recommend giving it a listen if you want a non-rationalist perspective on the Rationalist movement.

[1]: https://en.wikipedia.org/wiki/Zizians [2]: https://www.iheart.com/podcast/105-behind-the-bastards-29236...


There's a lot more than one of them. Leverage Research was the one before Zizians.

Those are only the named cults, though; they just love self-organizing into such patterns. Of course, living in group homes is a "rational" response to Bay Area rents.


[flagged]


Ziz did leave the rationalist movement, but to imply there's no link is simply dishonest. Ziz spent a lot of time in the Bay Area Rationalist scene, and a lot of her belief system has clear Rationalist origins (such as timeless decision theory and the general obsession with an all-powerful AI). Ziz did not come up with this stuff in a vacuum; she got it from being a rationalist.

Also, I'm not sure if you did this intentionally, but Ziz is a trans woman. She may have done some awful shit, but that doesn't justify misgendering her.


Just because there is some weak relation to rationalism doesn't justify guilt-by-association. You could equally have pointed out that most of the Zizians identified as MtF transsexuals and vegans, and then blamed the latter groups for being allegedly extreme. Which would be no less absurd.

> She may have done some awful shit

Murdering several people is slightly worse than "awful shit".


It's not just some weak relation.

The Behind the Bastards podcast works by having the host invite a guest onto the show and tell them the story of an individual (or movement, like the Zizians) so the guest can react live. And in the discussion about the Zizians, the light-bulb moment for the guest, the point where they made the connection "oh, now I can see how this story is going to end up with dead people," happens well before Ziz breaks with the Rationalists.

She ultimately breaks with the Rationalists because they don't view animal welfare as important a priority as she does. But it's from the Rationalists that she picks up the notion that some people are net negatives to society... and that if you're a net negative to society, then perhaps you're better off dead. It's not that far a leap from there to "it's okay for me to kill people if they are a net negative to society [i.e., they disagree with me]."


> But it's from the Rationalists that she picks up on the notion that some people are net negatives to society

That belief has nothing to do specifically with rationalism. (In fact, I think most people believe that some people are net negative for society [otherwise, why prisons?], but there is no indication that this belief would be more prevalent for rationalists.)


The podcast Behind the Bastards described Rationalism not as a cult but as fertile soil perfect for growing cults, leading to the development of groups like the Zizians. (Both the Rationalists and the Zizians are at pains to emphasize their mutual hostility, but if you're not part of either movement, it's pretty clear how Rationalism can lead to something like the Zizians.)


I don't think that podcast has very in-depth observations. It's just another iteration of East Coast culture-media people who used to be on Twitter a lot, isn't it?

> the fertile soil which is perfect for growing cults

This is true but it's not rationalism, it's just that they're from Berkeley. As far as I can tell if you live in Berkeley you just end up joining a cult.


I lived in Berkeley for a decade and there weren't many people I would say were in a cult. It's actually quite the opposite. There's way more willingness to be weird and do your own thing there.

Most of the rationalists I met in the Bay Area moved there specifically to be closer to the community.


No, Guru Eliezer Yudkowsky wrote an essay about how people asking "This isn’t a cult, is it?" bugs him, so it's fine actually. https://www.readthesequences.com/Cultish-Countercultishness


Hank Hill: Are y'all with the cult?

Cult member: It's not a cult! It's an organization that promotes love and..

Hank Hill: This is it.


Does this quote make more sense in context? I don't watch the show.


It makes perfect sense as a comment, seeing the clip from the show doesn't add anything.


I think i'm seeing it now. "This is it" implies Hank is actually looking for the cult in question, something that I guess seems odd to me without knowing the plot of the episode.

Thanks for clarifying though! Oh wait, you didn't.


Extreme eagerness to disavow accusations of cultishness ... doth the lady protest too much perhaps? My hobby is occasionally compared to a cult. The typical reaction of an adherent to this accusation is generally "Heh, yeah, totally a cult."

Edit: Oh, but you call him "Guru" ... so on reflection you were probably (?) making the same point... (whoosh, sorry).


> Extreme eagerness to disavow accusations of cultishness ... doth the lady protest too much perhaps?

You don't understand how anxious the rationalist community was around that time. We're not talking about self-assured, confident people here. These articles were written primarily to calm down people who were asking, in a panic, "we're not a cult, are we?" approximately every five minutes.


I got to that part, thought it was a joke, and then... it wasn't.

Stopped reading thereafter. Nobody speaking like this will have anything I want to hear.


Scott's done a lot of really excellent blogging in the past. Truthfully, I think you risk depriving yourself of great writing if you're willing to write off an author because you didn't like one sentence.

GRRM has famously written some pretty awkward sentences, but it'd be a shame if someone turned down his work for that alone.


Since I had not engaged with the ASOIAF community for quite some time, I had forgotten that, apparently, the phrase "fat pink mast" is stored somewhere in my long term memory. I sometimes struggle to remember my own wedding anniversary, but "fat pink mast" pops right to the surface.

I'd like to thank my useless brain for deciding to write that one down.


Is it not a joke? I’m pretty sure it was.


It doesn’t really read like a joke, but maybe. Regardless, I guess I can at least be another voice saying it didn’t land. It reads like someone literally said that to him verbatim and he literally replied with a simple “Yes.” (That said, while it seems charitable to assume it was a joke, that doesn’t mean it’s wrong to assume so.)


I'm certain it's a joke. Have you seen any Scott Aaronson lecture? He can't help joking in every other sentence.


I laughed, definitely read that way to me


I think the fact that we aren't sure says a lot!


It says a lot, but what does it say a lot about?

https://www.smbc-comics.com/comic/aaaah


If that was a joke, all of it is.

*Guess I’m a rationalist now.



