I believe you. I'd be shocked, however, to learn that it's the consensus in the AI community. Also, many brilliant people said heavier-than-air flight would never happen. Fermi himself didn't believe in supercritical nuclear chain reactions. I'm sure there are other examples.
But the interesting question is: why is it never going to happen? I can see several reasons:
(1) Dualism is true: we have a soul without which our bodies could not function, and souls cannot be uploaded with mere earthly artefacts. Such machines not only wouldn't be us, they would be nearly empty shells. Like zombies, only mechanical.
(2) Subjective identity does not transfer to the machine along with the information contained in our bodies. While the machine would be a perfect copy, behaving like us and all, it wouldn't be us. The only problem is that nobody else can tell the difference, so your elaborate form of suicide goes undetected.
(3) Mind uploading requires an understanding of the human mind that is forbidden to us because of its deep self-referential nature. Gödel's theorems and such. A superior being could do it, but we cannot.
(4) Our minds run on quantum stuff that we can't copy without wrecking it (measuring a quantum system tends to have significant consequences for that system).
(5) We never attain the necessary technological level. Moore's law may cease to apply, or we could wreck ourselves in a major catastrophic event (World War III, a virus escaped from a lab, environmental collapse, a Gray Goo accident, Skynet turning the solar system into paper clips… or the banal asteroid).
I think AI people only have authority on (3). Anyway, to the best of my knowledge:
(1) is bunk. The world most likely runs on math, and probably relatively simple math. Sure, we haven't found the complete laws of physics yet, but I have hope.
I wouldn't bet my life on (2) just yet, but it does sound improbable. My confidence is grounded in the fact that current physics says copy-and-paste transportation works: if you were disassembled, scanned onto a hard drive, then reassembled elsewhere out of local material, it would still be you, not just a copy. Mind uploading only adds the problem of a change of substrate (from neurons to silicon). I'm not sure that's a real problem though, because we could still imagine downloading you back into your body, only with a few additional memories.
(3) shouldn't be much of a problem, provided we only need a low-level understanding of our inner workings. Self-referential or not, computer viruses are capable of copying themselves, because they don't need to comprehend themselves to do the copying (see the sketch below).
(4) is highly improbable. I know of no evidence pointing that way, and our neurons tend to respond to stimuli in much more obvious ways (easily detectable electrical impulses, changes in blood flow, sizeable chemical reactions…). Anyway, we should probably ask neuroscientists.
(5) is the most probable in my opinion. That would really suck, but there is a real chance we mess ourselves up too deeply to ever recover.
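To make the response to (3) concrete, here is a minimal Python sketch of that kind of blind self-copying (the filename and message are made up for illustration; a self-replicating virus does roughly the same thing, minus the malicious payload):

    # A program that replicates itself without any model of its own workings:
    # it treats its own source purely as data and copies the bytes verbatim.
    import shutil

    # __file__ is this script's own path; "copy_of_self.py" is an arbitrary target name.
    shutil.copyfile(__file__, "copy_of_self.py")
    print("Replicated myself without understanding myself.")

The program never parses or interprets its own code; copying the bytes is all replication requires, which is the sense in which self-reference need not block self-copying.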
Conclusion: like I said, it's a long shot. Not so long a shot, however, that we should give up hope just yet.
Your computer virus analogy is a bit off, as the virus was created by a superior being. Self-replication is something that was programmed into the virus; the virus did not develop this ability itself.