
Depends on how you define AGI. If you define it as an AI that can learn to perform generalist tasks, then yes, transformers like GPT-5 (or 3) are AGI, since the same model can be trained on every task and will perform reasonably well.

But I guess what most people would consider AGI would be something capable of online learning and self-improvement.

I don't get the 2035 prediction though (or any prediction like it). It implies we'll have some magical breakthrough in the next couple of years, be it in hardware and/or software. That might happen tomorrow, or not any time soon.

If AGI can be achieved by scaling current techniques and hardware, then the 2035 date makes sense. Moore's law gives us about 64x the compute in hardware by then; add another 4x from algorithmic improvements and that's ~256x the compute to work with. I think with ARC-AGI 2 this was the kind of compute budget they spent to get their models to perform on a human-ish level.
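To make the arithmetic behind that explicit, here's a back-of-the-envelope sketch. The doubling cadence and the 4x algorithmic-gain factor are the assumptions from the comment, not established figures:

```python
# Back-of-the-envelope: effective-compute multiple available by some
# future date, assuming Moore's-law-style exponential hardware scaling
# plus a flat multiplier for algorithmic improvements.

def compute_multiple(years: float, doubling_years: float, algo_gain: float) -> float:
    """Effective-compute multiple after `years`, given a hardware
    doubling period and an assumed algorithmic-efficiency gain."""
    hardware_gain = 2 ** (years / doubling_years)
    return hardware_gain * algo_gain

# A 64x hardware gain over 2025-2035 means 6 doublings in 10 years,
# i.e. a ~20-month doubling cadence; times the assumed 4x algorithmic
# gain that yields the ~256x figure.
print(compute_multiple(10, 10 / 6, 4))  # -> 256.0
```

Note that a strict 2-year doubling over the same 10 years would give only 2^5 = 32x in hardware, so the 64x figure already bakes in a slightly optimistic cadence.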

Also, perf/W and perf/$ scaling has been slowing over the past decade. I think we got something like 6x-8x perf/W compared to a decade ago, which is a far cry from the numbers I wrote above.

Imo it might turn out that we discover 'AGI' in the sense of finding an algorithm that turns FLOPS into IQ and scales indefinitely, but that it's so expensive to run that biological intelligences keep a huge competitive edge for a very long time. In fact, biology might be astronomically more efficient at turning watts into IQ than transistors will ever be.



> I think with ARC-AGI 2 this was the kind of compute budget they spent to get their models to perform on a human-ish level.

It was ARC-AGI-1 where they used extreme compute budgets to get to human-ish performance. On ARC-AGI-2 they haven't gotten past ~30% correct. Average human performance on ARC-AGI-2 is ~65%, and a human panel gets 100% (because humans understand logical arguments rather than simply exclaiming "you're absolutely right!").


>> if you define it as an AI that can learn to perform generalist tasks - then yes, transformers like GPT 5 (or 3) are AGI

Thank you. This is the definition we need a proper term for, and it's what most experts mean when they say we already have some kind of AGI.



