Let's explore the concept of "cloud-based" neurons. Say your brain has just three neurons, all connected with each other. You remove one of them and connect the two remaining ones to an external terminal - a physical device capable of transmitting electrical currents, controlled by some complicated software "in the cloud". That physical device becomes the third neuron. Instead of one terminal, it could be two different devices, together playing the role of the original third neuron for each of the two other original neurons. For simplicity, let's assume there's just one terminal, serving both remaining original neurons, just like the third original neuron did.

From the point of view of the two remaining biological neurons, nothing has changed. But from the point of view of the third neuron - the replaced one - the system did change. Now it is connected to three other terminals: the two original ones, and one other physical terminal "in the cloud". That terminal in the cloud is part of the software system, so it is connected to potentially billions of other components (e.g. transistors). And all of them are now involved in playing the role of the original third neuron that we replaced. So we effectively replaced a system of three interconnected neurons with a system of billions of interconnected neurons. This can have a profound effect on the "consciousness" of the original brain (in quotes because I don't think consciousness is possible with just three neurons).
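To make the setup concrete, here is a minimal sketch in Python of the wiring described above. Everything in it is illustrative - the class names and the trivial relay logic are my own stand-ins under the assumptions of the thought experiment, not a claim about how such a system would actually work:

    # Sketch: two local neurons wired to a terminal that forwards
    # signals to a cloud-hosted system playing the role of the
    # removed third neuron. All names here are illustrative.

    class CloudSystem:
        # Stands in for the billions of components "in the cloud".
        def compute(self, signal):
            # A real system could do arbitrarily complex work here;
            # this one trivially relays the signal.
            return signal

    class Terminal:
        # The physical device the two remaining neurons actually see.
        def __init__(self, cloud):
            self.cloud = cloud

        def stimulate(self, signal):
            # Forward to the cloud; emit whatever comes back as a
            # real electrical signal.
            return self.cloud.compute(signal)

    class BiologicalNeuron:
        def __init__(self, downstream):
            self.downstream = downstream

        def fire(self, signal):
            return self.downstream.stimulate(signal)

    # From this neuron's point of view, nothing has changed: it still
    # gets a response from "the third neuron".
    neuron_a = BiologicalNeuron(downstream=Terminal(CloudSystem()))
    print(neuron_a.fire(1.0))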
I don’t see why I would. It would change you, and I said that in my original comment. But it would change you slowly, over time, on the order of years, much like how children change dramatically from, say, 7 to 17. They are still the same person even though their mental capacity and personality have grown and evolved significantly. I should have said a billion neurons a month instead of a week, though, since weekly replacement would finish in less than two years, but monthly would take over 8 years.
>So we effectively replaced a system of three interconnected neurons with a system of billions of interconnected neurons
If I had any quibble, it would be this statement. While I would say it replaces a system of 3 parts with one of billions, I wouldn’t say it is effectively billions of neurons. In the same way, while an emulation of a mechanical watch in software would use many more transistors per watch part emulated, I wouldn’t say there were effectively billions of second hands.
We should clarify the definition of "uploading". I see two cases:
1. We gradually replace each neuron in-place - without any external connections. This means a biological brain slowly turns into a silicon brain (or whatever material is used for the new neurons).
2. We gradually replace each neuron with a connection to an external computational system which provides the signals that the original neuron would output. This is what I mean when I say "brain uploading".
The first case is more straightforward, and I'm not discussing it here - I expect it to be possible when the technology is ready.
The second case might have fundamental issues. The main issue is the preservation of consciousness - let's try to define it first. Consciousness is a property of a brain, which most likely depends on the signals between neurons, and which might or might not additionally depend on each neuron's internal electrochemical state. When we change a brain, we change its consciousness. If we change it too much, we might alter consciousness to the degree where it is no longer the same identity. This might be happening to a degree as a person grows, though I'd argue that the person would still be the same, in the sense that the person would consider himself to remain the same person over the years. For example, I remember myself 20 years ago, and even though I was somewhat different in terms of views, habits, drives, etc., I'm sure I was still me - I was not my neighbor Mike - I'm still the same captain of mostly the same ship.

If we replace neurons as in case 1 above, I believe the continuity of consciousness can be preserved. But in case 2, I'm not so sure, because we are fundamentally changing the structure of the brain, not just individual components. Going back to my original example of just 3 neurons, replacing one neuron with a silicon-based device in-place does not fundamentally change the overall structure of the 3-neuron brain. It's still just 3 neurons operating together as a whole. However, if we replace one neuron with a connection to an external computer, we no longer have the 3-neuron brain. We have 2 neurons which operate as before, because they still receive the same inputs from the third neuron as before, but the third neuron has fundamentally changed - it is now receiving external signals that the original third neuron did not receive, and therefore the structure of the original brain is now different. I don't know if we can claim the consciousness of the original 3-neuron brain is preserved. The problem is that the other two neurons cannot adapt to this change, because from their point of view nothing has changed.

Now if we consider that the brain has 100B neurons, and we replaced 50B of them per case 2 - now I'm not sure what is happening to the consciousness, because half of the brain is receiving external signals from some complex computational system. Does the new consciousness extend to that external system? Why, or why not? If yes, how is it impacted?
Re: "emulating a mechanical watch in software". Can we upload the mechanical watch to the cloud? No, we can only replicate it. But that is not relevant to our discussion, because a mechanical watch has no consciousness which could change or be transferred.
That is an interesting perspective. I actually think that a gradual shift to an emulated brain would keep you more “you” than replacing neurons in place with non-biological neurons. The reason is the emulation itself, which is why I used the watch analogy - it was not about duplication vs. transferring, but about how an emulation of physical objects and the connections between them is, for all intents and purposes, the same as the “real” version.
In your 3-neuron brain analogy, the emulated neuron would not be receiving external signals. Its physical properties would be emulated in software, so the neuron would function, i.e. behave, the same way as the original. Much the same way that an emulated mechanical watch mainspring will tension when you “wind” it and release that energy to “power” the watch the same as the watch it was emulating. Of course the winding, tension, etc. aren’t really happening. They are software running in a computer, but the key is they are behaving in the same way as real watch components.
Back to our emulated third neuron: it would not really be going through the electrochemical process of changing its action potential based on the chemical signal received from one of the other two neurons. What it would be doing is emulating that process in software, in the same way as the biological one it replaced. It is not “receiving external signals” in the sense of being a computer serving as a neuron, or like plugging your brain into a computer so you can directly interface with it. It is a computer running software, and that software is running a physics engine of a neuron; the only external signals transmitted to that neuron running in software are emulated duplicates of the signals sent from the other two neurons. However the neuron responds, based on the rules of the physics engine (which, again, is built so that the neuron behaves the same as its biological model), that response would be converted to a real signal sent back to the other two neurons. So, nothing would fundamentally change for the other neurons to adapt to.
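To make the “physics engine of a neuron” idea concrete, here is a minimal sketch in Python. It uses a leaky integrate-and-fire model purely as a stand-in for whatever biophysical model a real emulation would use; the names and parameters are illustrative assumptions, not a real implementation:

    # Minimal sketch of an emulated neuron. A leaky integrate-and-fire
    # model stands in for the real biophysics; all parameters are
    # illustrative.

    class EmulatedNeuron:
        def __init__(self, threshold=1.0, leak=0.9):
            self.potential = 0.0        # emulated membrane potential
            self.threshold = threshold  # firing threshold
            self.leak = leak            # per-step decay of the potential

        def receive(self, weight):
            # An emulated duplicate of the chemical signal from a
            # biological neighbor arrives as a number; the
            # "electrochemistry" happens only inside the model.
            self.potential = self.potential * self.leak + weight

        def step(self):
            # If the emulated potential crosses the threshold, fire.
            # The returned spike is what the terminal device would
            # convert back into a real signal for the two biological
            # neurons.
            if self.potential >= self.threshold:
                self.potential = 0.0
                return True
            return False

    neuron = EmulatedNeuron()
    for incoming in [0.4, 0.5, 0.6]:
        neuron.receive(incoming)
        if neuron.step():
            print("emulated neuron fires; terminal emits a real signal")

The two biological neurons never see this code; they only see the physical signal the terminal emits when step() returns True.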
Now, let’s look at the non-biological silicon (or whatever material) brain. A silicon neuron would not behave the same as a biological one, because it is made of totally different materials, likely chosen to act faster, more efficiently, more accurately, etc. In that case, the new neuron would not respond in the same way as the biological one it replaced. I would say that would alter the functioning of the brain much more drastically.
I'm having trouble understanding, or perhaps accepting, your point of view. When you say "Its physical properties would be emulated in software", what really happens is still electrical signals flowing between transistors. "Software" is a concept - it does not exist in a physical world - it's our interpretation of the physical processes that happen when we apply voltage to transistors. I'm not 100% sure, but I think that consciousness might similarly be a concept - "our" interpretation of what is going on between our biological neurons. Note I put "our" in quotes, because it's self-referential here, and that's why I'm not 100% sure. But if we get back to my 3-neuron scenario, the third neuron which is emulated by the external system has not disappeared into the "software". It's still a physical device, which functions in the physical world, and is part of the original brain, despite the fact that it is now physically different from the original biological neuron it has replaced. It still produces (relays) the same signals towards the original two neurons, but it does not compute those signals. Both the inputs and outputs of this new proxy neuron are different, because it has to communicate with the external system. The two original neurons still communicate with a physical device, not with some abstract software entity.
However, if we look at consciousness differently - as something that is purely in the signals and not in the physical devices carrying the signals, then it changes the perspective. In that case it's harder for me to imagine the effect of extending computational signals into the cloud, because I'm not sure if the external signals become an extension of the original consciousness, or if they become more like sensory inputs to the local brain - but then the local brain becomes smaller? It's hard for me to comprehend this perspective.
>"Software" is a concept - it does not exist in a physical world - it's our interpretation of the physical processes that happen when we apply voltage to transistors.
Not exactly. Software is still a physical thing, in that it is a specific configuration of transistors, disk magnetization, etc. An emulation is a specific configuration of those transistors that replicates how the original object behaves according to physics.
> Both the inputs and outputs of this new proxy neuron are different, because it has to communicate with the external system. The two original neurons still communicate with a physical device, not with some abstract software entity.
They do only interact with the software indirectly, but if it is properly emulated, it will behave in exactly the same way. To simplify even more than a 3-neuron system, I want to talk about logic gates. As I’m sure you know, logic gates are the basic building blocks of a computer that perform simple operations like AND, NOT, OR. Well, you can build a mechanical version, like this [1] one with marbles. So you can build a super simple computer using marbles and physical gates. Now what if you built a machine that would input and output marbles according to a software emulation of a physical gate and a falling marble? Even though the physical logic gates connected to it would be interacting with a complex physical computer, not the abstract software entity, the results would be identical. If you built a computer with half physical logic gates and half marble input/output machines controlled by abstract emulated logic gates, the computer would still function the same. I think that if you did it with neurons, the brain would function the same.
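Here is a toy sketch of that half-and-half computer in Python. The “physical” gate here is of course just another piece of software, so the only claim the sketch can make is the structural one: the surrounding circuit cannot tell which implementation it is wired to. All names are made up for illustration:

    # Toy sketch: a circuit built from interchangeable gate
    # implementations. PhysicalAndGate stands in for a marble machine;
    # EmulatedAndGate stands in for the software-backed input/output
    # machine. Both are illustrative.

    class PhysicalAndGate:
        def output(self, a, b):
            # marbles falling through a mechanical AND gate
            return a and b

    class EmulatedAndGate:
        def output(self, a, b):
            # a machine accepts/releases marbles according to a
            # software model of the same gate
            return bool(int(a) & int(b))

    def half_adder(and_gate):
        # sum bit is XOR; carry bit comes from whichever AND gate
        # implementation we were handed
        def run(a, b):
            return (a != b), and_gate.output(a, b)
        return run

    physical = half_adder(PhysicalAndGate())
    mixed = half_adder(EmulatedAndGate())

    # The circuit behaves identically regardless of which gate
    # implementation it is wired to.
    for a in (False, True):
        for b in (False, True):
            assert physical(a, b) == mixed(a, b)
    print("identical behavior for all inputs")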
It seems to me that neurons are biological variations of logic gates. They receive electrochemical inputs from a limited set of options, and, based on physical differences in how the neuron is wired and what input it receives, the neuron produces an output from a limited set of options. We have seen through many medical cases that every part of your personality can be changed by physical changes to your brain, whether injury, disease, or surgery. There is no core “you” separate from the interaction of 100 billion neurons. So to me, if half those neurons are actually a software “abstraction”, I don’t see why my subjective experience would be any different.
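For what it’s worth, the “biological logic gate” framing has a classic toy model: a McCulloch-Pitts-style threshold unit, where weighted inputs either push the neuron over a firing threshold or not. The weights and thresholds below are illustrative choices, not biology:

    # A McCulloch-Pitts-style threshold neuron: sum the weighted
    # inputs and fire iff the total crosses a threshold. With one
    # threshold it acts as an OR gate, with another as an AND gate.

    def threshold_neuron(inputs, weights, threshold):
        total = sum(i * w for i, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    AND = lambda a, b: threshold_neuron([a, b], [1, 1], threshold=2)
    OR = lambda a, b: threshold_neuron([a, b], [1, 1], threshold=1)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))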
Can you poke holes in the reasoning above?