a) For AGI - I personally think there is an intellectual limitation in humans that prevents us from ever reaching this point. Case in point: dogs can't talk. Arguing the 'inevitability' of AGI is, to some extent, arguing that one day dogs will be able to talk, because why shouldn't they? Will we make 'general AI' close enough to fool many people many times - essentially a bazillion cogs wired together and rigged to appear general? Probably. But to the level that it meta-tunes itself? No.
b) For your 'engineered brain': if you are mimicking natural intelligence with chemistry, biology, etc., you are still dependent on understanding natural processes which you didn't create - so it is only a clone, and not at all 'artificial'.
If you'll notice, the philosophical limitations of 'b' are much the same as those of 'a' - i.e. manually copying processes that already exist is not the same as creating them from scratch.
> arguing that one day dogs will be able to talk, because why shouldn't they?
You misunderstand how evolution works. Humans are the product of over 10 million years of brutal selection pressure in favour of intellect. Before that, we indeed had about the same vocabulary as dogs (you should really say wolves, by the way - dogs are our invention).
If wolves/dogs were subject to the same pressure, there's no reason why they would not eventually adapt the way humans did. However, humans got there first, and have so thoroughly colonised the earth that there is no chance of this happening.
> But to the level that it meta-tunes itself?
Well, you're extending AGI here into some kind of singularity runaway intelligence explosion. That's not within the scope of my argument. I have no opinion on that.
> it is only a clone and not at all 'artificial'
In my view, anything that is not naturally occurring is artificial.