
It's not that great intelligence (be it "humans with amplified intelligence" or AI) is that powerful.

It's only because we overestimate intelligence that we think so, probably as part of the "Great Man" fallacy.

Science is not advanced merely by people with "amplified intelligence" -- it's advanced through thousands of scientists working independently and in co-operation and tons of hard "manual" work of testing, verification, experimentation etc.

Not just some sage coming in with his insights, a la Newton and Einstein (though that did happen with more frequency in previous centuries, when more "low hanging fruit" discoveries were available).

As for politics, it's usually "sociopaths" and/or good manipulators and liars who get far ahead, not people with high IQ specifically -- and those traits sometimes even conflict (e.g. people with high IQ but Asperger's).



As you're yourself saying, sages with insights did occur relatively frequently when there was still low-hanging fruit available in the different sciences. But there is no reason to believe that there exists just one level of this low-hanging fruit that happens to sit at around the level exceptional human intelligence can grasp. Which means that this hypothetical "amplified intelligence" could very well have its own, next level of "low hanging fruit" that is simply too complex for current human intelligence to grasp, but is well within the reach of the more powerful intelligence.


>Which means that this hypothetical "amplified intelligence" could very well have its own, next level of "low hanging fruit" that is simply too complex for current human intelligence to grasp, but is well within the reach of the more powerful intelligence.

Perhaps, though I doubt it. We imagine intelligence as some kind of infinite scale that can extend forever.

I would place my bets on diminishing returns.


A "fallacy" would imply some hidden logical error or some loose thought, but this isn't the case: the idea that "great men are important" isn't by itself fallacious. The error lies in denying the contribution of "small men".

Furthermore, even if "small men" are important, why wouldn't they benefit by becoming greater? Instead of waiting on destiny to grant us another Leibniz or another von Neumann, maybe we could harness such powers directly from the source.

That is, of course, only if it works. Tampering with the brain is certainly one of those things that might have serious unintended consequences. What neuronal adaptations would arise from sticking a chip in someone's brain? For instance, the majority of savants have some sort of mental impairment. What if optimizing for specific skills makes it harder to learn new ones? Or maybe the contrary: optimizing for learning and generalist thinking might not increase computational intelligence, or might even reduce it.


> Science is not advanced merely by people with "amplified intelligence" -- it's advanced through thousands of scientists working independently and in co-operation and tons of hard "manual" work of testing, verification, experimentation etc.

Correct. Hence I think one low-hanging fruit in science would be not only to share information through peer-reviewed articles (or even ArXiv) but to enable actual fine-grained collaboration between different researchers.

Instead of waiting for a published paper (which might take months or years), there might be a way of saying "ok, this is promising", "this is crap", or "do it this way, it works better".

Oh, and of course, stop publishing crap like pseudocode algorithms and Excel calculations in favour of actual source code and open data.
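To make that concrete, here's a minimal sketch (in Python, with a made-up file name and column names) of what "source code plus open data" could look like in practice: the calculation behind a result shipped as a short runnable script against an openly published CSV, instead of an opaque spreadsheet.

  # Hypothetical example: the kind of script that could replace an opaque
  # Excel calculation in a paper's methods section. The file "results.csv"
  # and its columns ("dose", "response") are placeholders for whatever open
  # data the authors publish alongside the paper.
  import csv
  import statistics

  with open("results.csv", newline="") as f:
      rows = list(csv.DictReader(f))

  doses = [float(r["dose"]) for r in rows]
  responses = [float(r["response"]) for r in rows]

  # Print the summary statistics exactly as used in the paper, so anyone
  # can re-run the calculation and check the reported numbers.
  print("n =", len(rows))
  print("mean response =", statistics.mean(responses))
  print("dose/response correlation =", statistics.correlation(doses, responses))

The point isn't the particular numbers; it's that anyone can re-run the exact computation, see where it breaks, and send back "do it this way, it works better" long before a paper appears.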



