It doesn't really seem like there's much utility in defining it. It's like defining "heaven."

It's an ideal that some people believe in, and we're perpetually marching towards it.



No, it’s never going to be precise, but it’s important to have a good rough definition.

Can we just use Morris et al. and move on with our lives?

Position: Levels of AGI for Operationalizing Progress on the Path to AGI: https://arxiv.org/html/2311.02462v4

There are generational policy and societal shifts that need to be addressed somewhere around true Competent AGI (50% of knowledge work tasks automatable). Just as with climate change, we need a shared lexicon to refer to this continuum. You can argue for different values of X, but the crucial point is that if X% of knowledge work is automated within a decade, there are obvious risks we need to think about.

So much of the discourse is stuck at “we will never get to X=99” when we could agree to disagree on that and move on to considering the X=25 case. Or predict our timelines for X and then actually be held accountable for our falsifiable predictions, instead of the current vibe-based discussions.


This is a great reply, thank you.

For me, I just zoom out a little further and say: at the rate AGI is approaching, what is the utility in trying to regulate it ahead of time?

Seems like advancement is slow enough that society can/will naturally regulate it based on what feels comfortable.

And it's a global phenomenon that can't have rules applied at the protocol level the way the internet can, because it's so culturally subjective.

Precedents need to be set first, and I think we'll only be able to call them when we see them.


It’s a good point. For epistemic hygiene I think it’s critical to actually have models of the growth rate and what it implies. E.g. we are seeing exponential growth on many capability metrics (some with doubling times of 7 months), but haven’t joined this up to economic growth numbers. In models where the growth continues, you could imagine things getting crazy quickly: e.g. one year AI contributes 0.5% of GDP, only measurable in retrospect; the next year 2%; the year after, 8%.
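To make that concrete, here’s a toy Python sketch of the compounding. The 7-month doubling time is the capability figure above; the roughly 4x/year GDP factor is just read off the 0.5% -> 2% -> 8% illustration. The variable names and numbers are purely hypothetical, not a forecast:

    # Toy model, not a forecast: compound a capability index with a
    # 7-month doubling time and an assumed ~4x/year growth in AI's
    # share of GDP (implied by the 0.5% -> 2% -> 8% illustration).
    capability = 1.0   # arbitrary index, normalized to 1.0 today
    gdp_share = 0.5    # hypothetical AI contribution to GDP, in percent

    for year in range(1, 6):
        capability *= 2 ** (12 / 7)   # 7-month doubling, compounded over 12 months
        gdp_share *= 4                # assumed annual factor, purely illustrative
        print(f"year {year}: capability x{capability:.1f}, AI share of GDP ~{gdp_share:.1f}%")

The point isn’t the specific numbers; it’s that under sustained exponentials, a contribution too small to measure becomes macroeconomically dominant within a handful of years.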

Personally I don’t think politicians are capable of adapting fast enough to this extreme scenario. So they need to start thinking about it (and building and debating legislation) long before it’s truly needed.

Of course, if it turns out that we are living in one of the possible worlds where truly economically meaningful capabilities are growing more slowly, or bottlenecks just happen to appear at this critical phase of the growth curve, then this line of preparation isn’t needed. But I’m more concerned about downside tail risk than about the real but bounded costs of delaying progress by a couple of years. (Though of course, we must ensure we don’t do to AI what we did to nuclear.)

Finally, I’ll note in agreement with your point that there is a whole class of solutions that are mostly incomprehensible or inconceivable to most people at this time (i.e. currently fully outside the Overton window). E.g. radical abundance -> UBI might just solve the potential inequities of the tech, and therefore make premature job-protection legislation vastly harmful on net. I mostly say “just full send it” when it comes to these mundane harms; it’s the existential ones (including non-death “loss of control” scenarios) that I feel warrant some careful thought. For that reason, while I see where you are coming from, I somewhat disagree with your conclusion; I think we can meaningfully start acting on this as a society now.


This is great food for thought.

I like your idea of developing a new economic model as a proxy for possible futures; that at least can serve as a thinking platform.

Your comment inspired me to look at historical examples of this happening. Two trends emerged:

1. Rapid change always precedes policy. I couldn't find any examples of the reverse. That doesn't discount what you're saying at all; it reinforces that we probably need to be as vigilant and proactive as possible.

and related:

2. Things that seem impossible become the norm. Electricity. The Industrial Revolution. Massive change turns into furniture. We adapt quickly as individuals even if societies collectively struggle to keep up. There will be many people who get caught in the margins, though.

Consider me fully convinced!



