
> The human brain doesn't use a billion dollars in compute power, figure out what it is doing.

This may not be true, if we’re talking about computers reaching general intelligence parity with the human brain.

Latest estimates place the computational capacity of the human brain at somewhere between 10^15 and 10^28 FLOPS[1]. The world's fastest supercomputer[2] reaches a peak of 2 * 10^17 FLOPS, and it cost $325 million[3].

Realistically reaching 10^28 FLOPS today is simply not possible: projecting linearly from the figures above, the dollar cost would be about $16 quintillion (1.625 * 10^19 dollars).
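To make the arithmetic explicit, here's a quick sanity check in Python (a minimal sketch; the constants are just the figures quoted above, not independently sourced):

    # Linear cost projection from Summit's price/performance to the
    # upper-bound brain estimate (figures as quoted above).
    summit_flops = 2e17        # Summit peak, FLOPS
    summit_cost = 325e6        # dollars
    brain_flops_high = 1e28    # upper-bound brain estimate, FLOPS

    cost = brain_flops_high / summit_flops * summit_cost
    print(f"~${cost:.3e}")     # ~$1.625e+19, i.e. ~16 quintillion dollars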

So, when it comes to trying to replicate human intelligence in today’s machines, we can only hope the 10^15 FLOPS estimates are more accurate than the 10^28 FLOPS ones — but until we do replicate human-level general intelligence, it’s very difficult to prove which projection is correct (an error bar spanning 13 orders of magnitude is not a very precise estimate).

P.S. Of course, if Moore’s law continues for a few more decades, even 10^28 FLOPS will be commonplace and cheap. Personally, I am very excited for such a future, because then achieving AGI will not be contingent on having millions or billions of dollars. Rather, it will depend on a few creative/innovative leaps in algorithm design — which could come from anyone, anywhere.
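For a rough sense of what "a few more decades" means here, a sketch assuming an idealized two-year doubling of FLOPS per dollar (an assumption, not a measured trend):

    import math

    # Doublings needed to go from today's peak (Summit) to the
    # upper-bound brain estimate, if Moore's law were to hold.
    current_flops = 2e17
    target_flops = 1e28

    doublings = math.log2(target_flops / current_flops)  # ~35.5
    years = doublings * 2                                # 2-year doubling assumed
    print(f"{doublings:.1f} doublings, ~{years:.0f} years")

Under that assumption, the 10^28 target is roughly seven decades out, while the 10^15 lower-bound estimate is already exceeded by today's hardware.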

[1] https://aiimpacts.org/brain-performance-in-flops/

[2] https://en.m.wikipedia.org/wiki/TOP500#TOP_500

[3] https://en.m.wikipedia.org/wiki/Summit_(supercomputer)



The dirty secret, though, is that AI isn't doing anything nearly comparable to what a whole human brain is doing. It's performing the functions of perhaps a small subset of NNs in the brain, or maybe the equivalent of what a small rodent's brain is capable of doing. Obstacle avoidance, route planning, categorizing objects in vision, even language-related functions are more of a mapping than an understanding.

I think the point still stands that we're making very inefficient use of the hardware we have. Universities need to be smarter about this and figure out how such a limited network of squishy cells does all the things it does. That's the whole point of concentrating smart people in an environment where they're given the freedom to pursue ideas without worrying about whether or not they generate a profit. You learn the 'how' and the 'why' rather than just the 'what makes money'.


Most AI is doing overly complicated versions of plain old decision trees or just pattern matching.

Or, in the worst cases, they are presenting one thing and really just relying on hundreds or thousands of people in Bangalore to pore over the data sets, tagging and categorizing.


"Dirty secret" is a weird term for something practitioners and researchers are trying to tell anyone who will listen. It's a dirty secret on the marketing side.


Of course, and we’re already seeing incredible results, even from size-compressed deep neural networks running on custom acceleration hardware now embedded in most major smartphones.

I was simply responding to the parent post’s false claim (“The human brain doesn’t use a billion dollars in compute power, figure out what it is doing.”), in isolation from the rest of the post (which I generally agree with).


The higher numbers there (e.g. 10²⁸) are irrelevant. That estimate claims a single neuron is performing 10¹⁶ operations per second; that is, as much computation would be happening within a single neuron as in the sum of all computation between neurons in the whole brain!

Bostrom's estimate of 10¹⁷ is much, much more reasonable.

Note that this is still a number biased in favour of the brain, since for the brain you are measuring each internal operation in an almost fixed-function circuit, and for Summit you are measuring freeform semantic operations that result from billions of internal transitions. A similar fixed-function measure of a single large modern CPU gives about 10¹⁷ ops/s as well; the major difference is that a single large modern CPU is running a much smaller amount of hardware many times faster, and uses binary rather than analogue operations.


While I agree that 10^17 seems like a more accurate number, don’t forget that each neuron contains ~10^5 synapses, which all process at timescales of tens of microseconds. This gives you an additional factor of 10^9.


The 10¹⁷ includes that factor. 10¹⁰ neurons by 10⁵ synapses by 10² Hz. It seems unlikely to me that the meaningful temporal resolution is going to be 1000x that of the firing rate, but if you want to add a factor of 10 or so I wouldn't object.
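Spelled out, using the numbers in this comment:

    # Brain capacity estimate: neurons x synapses per neuron x firing rate.
    neurons = 1e10              # ~10^10 neurons
    synapses_per_neuron = 1e5   # ~10^5 synapses each
    firing_rate_hz = 1e2        # ~100 Hz

    ops = neurons * synapses_per_neuron * firing_rate_hz
    print(f"{ops:.0e} ops/s")   # 1e+17; add a factor of ~10 for finer timing resolution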


Neurons also seem to be doing local, protein-based computations.


There's a "tiny" secret: the beautiful power and plasticity of the human brain come from its being part of a physical body.

I recommend checking out some Antonio Damasio books for a fascinating read on this topic.


This sounds interesting.

I'm guessing you mean that much of the power of the human brain comes from its ability to interact with its environment?

I've been noticing more and more of a trend in recent years to treat the brain as separate from the body, and as if it is "trapped" in the body.


More than interaction, it's a living organism in the service of life.

Damasio has a very interesting take on how emotions, consciousness, etc. emerge from the way the brain and body together process information from the "external" world.


> if Moore’s law continues for a few more decades

Moore's law has already been dead for years.


Where'd you pull that from? Yeah, Intel has dropped the ball and stopped innovating, but TSMC and others still have us on track.



