You only know you are you because of your memories. The future brain will have the memories, so it will think it's you, just like you think today that you are the same you that was awake 3 days ago and 10 years ago.
Are you afraid of nonexistence when you fall asleep, and envious of tomorrow's you, who will live in your body and get to see what the future brings?
I don't think there's much difference between these scenarios - the consciousness is interrupted in both cases.
I think there is. If you cloned me while I'm sleeping and I never knew about it, you could torture and kill the other "me" and I'd be none the wiser, so that definitely seems like it isn't "me" because for whatever definition of me that I have, it involves this particular instance of conscious experience. Other conscious experiences, even if they are similar, are not me if they don't share my conscious experiences.
Programming-wise, I'm an instance (an object), not a class.
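To make that concrete, a minimal sketch in Python (the language and the Person class are just my illustration, not anything from the discussion): two objects built from the same class, with identical state, are still two distinct instances.

    class Person:
        # the class is the blueprint; every object built from it is a separate instance
        def __init__(self, memories):
            self.memories = memories

    me = Person(memories=["woke up", "went to work"])
    clone = Person(memories=["woke up", "went to work"])  # identical state

    print(me.memories == clone.memories)  # True  - the contents match
    print(me is clone)                    # False - still two distinct instances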
This is the part I'm always surprised people don't seem to get. If someone makes an exact copy of me, the original me is still going to think "I'm me. That other thing is a copy of me."
So, if the original me has to die for the clone to be created... unless my consciousness somehow jumps to the clone, I experience a death which I do not come back from.
> unless my consciousness somehow jumps to the clone
You don't have consciousness when you sleep (at least during some parts of it). Your consciousness disappears every night for several hours and is recreated every morning. Why is that not a problem, but when there are 2 of you it suddenly becomes a problem?
If you were sleeping while someone made a clone - would that make it OK? If not - why?
There's no evidence of anything external that has to "jump".
I disagree that I disappear during states of "unconsciousness". Sleep/general anesthesia/drugs are just different states of brain activity, but I'm still "me" during these states. We don't say people become no one, or someone else when they sleep.
So, a copy of your consciousness (if such a thing is possible) is different than sleeping. No matter how much the clone believes it is me, it will never be me. I still die in my scenario, even though the other feels differently about it.
To add to that, asking "who am I" is a philosophical question, and I'm not a strict empiricist, so I don't categorically dismiss the idea that there exists an "I" outside of my natural body.
I think the point about unconscious states was not that you "become no one", but rather that it is an interruption to the continuity of consciousness.
As a thought experiment, imagine you were magically replicated during sleep such that, at the time of replication, both "you"s were completely physically identical. Both would wake up thinking that they are the same person that went to sleep, but now there are two of them. Are either/both/none of them "you"?
My answer would be that they both are, and I don't think there's anything special about our physical selves, nor any reason that we're limited to being singletons (to give a code analogy). The combination of our physical self and our cumulative experience and memories is what defines "us". Thinking of identity in possessive terms and the desire to think of ourselves as unique is a quirk of psychology, I think.
What about multiple personality disorder? Are there 2 of you time-sharing a body?
I don't see how you can distinguish "different brain-states that are still me" from "different brain-states that are not me" other than by looking at the data in those brain-states. If the data is the same, what makes it OK to say "these 2 brain-states are the same person, and those 2 are not"?
> We don't say people become no one, or someone else when they sleep.
We say a lot of things that are wrong or simplified; it's not an argument any more than the word "sunrise" is an argument for geocentrism.
> I don't categorically dismiss the idea that there exists an "I" outside of my natural body.
I agree that the existence of souls would change things, but it's untestable and unnecessary for explaining the phenomena we experience, so I consider it a waste of time to discuss.
I'm not sure; are you comfortable living one day only, and leaving all the problems for future you to take care of? Because by that logic you barely exist at all, and the moment you go to sleep, the rest of your life is somebody else's problem, because you effectively die. That is what you're saying (and that is what people like us would expect from a mind upload).
So you tell me why it's a problem, but it seems I can go to sleep all I want, and when I wake up the next day, all my problems are still my own. I doubt that if someone makes a clone of me and I go to sleep, I'm going to wake up as both me and that other clone. How would that even work if there are two of us? There's only one of "me" to go around.
Oh, it's very awkward, yes, because it implies there's something else going on here that is required to hold this information, but we're all supposed to think that's impossible...
> Or you can decide to leave everything and become a monk. I don't see how that's an argument one way or another.
You seem to have missed what I said.
I mean that, if I don't wash the dishes today, and I go to sleep, it's still me who's going to have to wash those dishes tomorrow. It's not someone else. It's me. That "me" is the question. I can't just not wake up as "me" tomorrow. If continuity wasn't a thing then the concept of "me" shouldn't exist at all, so there would be no responsible party in the first place, but every morning I find that I need to wake up, go to work, and then sometimes wash the dishes.
Of course, the entire experience of "me" could be falsified by a 3rd party, sure, but the world itself could also be falsified; neither of these beliefs is particularly actionable. But "the person who didn't wash the dishes and the person who then suffers the consequences is a continual entity" is actionable: it means I should wash my dishes.
> Why? It seems to be the crux of your argument, but you don't justify it in any way.
Because I can't wake up and control two people at the same time. That doesn't even happen in the case of multiple personality disorder.
You sure can. You could have a stroke and change personality completely. The fact that it's still considered "you" by society and the future you doesn't mean it's true.
> "the person who didn't wash the dishes and the person who then suffers the consequences is a continual entity"
This only requires you to believe that the future you is you; it doesn't require it to be true. It's similar to arguing "there must be a God, because why else would I pray?". Well, you can simply be wrong.
> If continuity wasn't a thing then the concept of "me" shouldn't exist at all
Why? There are a lot of concepts out there that exist contrary to the facts. Free will probably doesn't exist. Absolute time doesn't exist, yet people use it routinely as their model of reality.
> I can't wake up and control two people at the same time. That doesn't even happen in the case of multiple personality disorder.
You could make a device that controls the muscles of another body based on your neural impulses. Would that change your opinion about identity and consciousness? I doubt it. If I'm right, then that's not your real argument.
> If you cloned me while I'm sleeping and I never knew about it, you could torture and kill the other "me" and I'd be none the wiser
If I tortured you, then healed you and removed your memories, you would be none the wiser as well. Does that mean I wasn't torturing you, but someone else?
If I cloned you, tortured your clone and killed it while you were asleep, then merged its memories into you - was it you I was torturing or not? Why is the difference meaningful?
Programming isn't a very good analogy, because there is no definition of consciousness, so even if we recreate the problematic situations in some programming model, we still don't have any answers. Example:
You can serialize an instance of a class, delete the instance, then deserialize it. The memory address will be different; the == operator will return true in some languages and false in many others; but there's no single answer to whether the object is the same or not when it comes to consciousness. There is no clear analog of pointer identity in the real world. If we had evidence of souls that would be it, but we don't :)
Some GC languages can move objects in memory when the application is running. Is this the same object? What does this say about consciousness? Nothing IMHO.
When the GC moves an object in memory and updates all references to it - is it the same object, or a different one? What if it created a pool of objects, serialized the object at address 1, loaded a different object there, and deserialized the old object from address 1 into address 2? Is the object at address 1 or the object at address 2 the old one? Pointer equality isn't very useful here.
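To put the serialize/delete/deserialize round trip into actual code, here's a rough sketch in Python with pickle (the Brain class and the choice of pickle are just my illustration). Value equality survives the round trip, pointer identity does not - and which of the two is the "right" test is exactly what the analogy can't decide.

    import pickle

    class Brain:
        def __init__(self, memories):
            self.memories = memories
        def __eq__(self, other):
            # "same data" comparison: equal if the stored memories match
            return isinstance(other, Brain) and self.memories == other.memories

    original = Brain(["childhood", "yesterday"])
    blob = pickle.dumps(original)      # serialize the instance
    restored = pickle.loads(blob)      # deserialize it into a fresh object

    print(original == restored)        # True  - the data is identical
    print(original is restored)        # False - a different object at a different address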
> for whatever definition of me that I have, it involves this particular instance of conscious experience
> If I cloned you, tortured your clone and killed it while you were asleep, then merged its memories into you - was it you I was torturing or not? Why is the difference meaningful?
If I merged the bash logs where I nuked rm -rf / on a given computer into the logs of one where I have not run that command, the result is not the same.
> You can serialize an instance of a class, delete the instance, then deserialize it.
This would not be the same object, since you could do the same thing without deleting the original.
Programming is a great analogy. I think the answer for identity/consciousness, even in programming, involves an uninterrupted flow of unique locations in time/space that has continued existing as a cohesive whole.
> Some GC languages can move objects in memory when the application is running. Is this the same object?
In general, if there can be "two" instances simultaneously under a given approach (like mind transfer), then it is not the same "one." But this is a flow, so Ship of Theseus alterations like you propose do not change identity; they are part of the flow.
> Then sleeping is the same as death.
I can't rule that out, actually, although I seem to be conscious for some of it, so there's that.
> If I merged the bash logs where I nuked rm -rf / on a given computer into the logs of one where I have not run that command, the result is not the same.
What if you had 2 git repositories, and merged changes from one into the other? Or 2 disk images?
Why would memories be logs and not the contents of the repository in this analogy? And if you care about changes to the brain state that aren't memories, just add those changes to the brain state. You argue that no matter how perfectly we merge these brain states, it's still not "me" after the merge. I argue that if there's no difference after the split, then the question is meaningless.
> This would not be the same object since you can do that, but not delete the object.
So what? Why would the existence of a copy change anything? In programming it doesn't, yet you seem to argue it does.
You can make a clone of a disk or a memory dump and restore it on another computer. Why would uniqueness matter? You can run 100 copies on 100 computers and use that for redundancy.
When a program is running on a modern CPU, it executes code speculatively (branch prediction), going down both sides of some branches. Does that split the identity into 2 and immediately kill one of them?
Let's say for a moment that the universe is a simulation, and there is a backup. Does that make "us" suddenly not "us" because there is a copy? Why?
Because of all these corner cases I don't think programming is a good analogy.
> Why would the existence of a copy change anything? In programming it doesn't, yet you seem to argue it does.
It isn't the copy that changes things. It is the potential of a copy that makes it glaringly obvious that the result is not the same instance. If there is some process that makes a second me, I don't have access to the copy's thoughts. This proves the second me isn't actually me, because I'm still here, and we are trying to define what I am; at a minimum, my definition of my consciousness does not include other consciousnesses that I don't have access to.
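A tiny sketch of that last point, assuming Python and a made-up Mind class: once a copy exists, each instance carries state the other simply cannot see.

    import copy

    class Mind:
        def __init__(self, thoughts):
            self.thoughts = thoughts

    me = Mind(thoughts=["wash the dishes"])
    other_me = copy.deepcopy(me)             # a perfect copy at the moment of duplication

    other_me.thoughts.append("tortured in the other room")  # the copy's experience diverges

    print(me.thoughts)        # ['wash the dishes'] - the original never experiences it
    print(other_me.thoughts)  # ['wash the dishes', 'tortured in the other room']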
> Does that split identities into 2 and immediately kills one of them?
Yes.
While I think programming isn't perfect as an analogy, there is no other domain of thought with as many good metaphors for this. On the contrary, I think these corner cases are where you can actually refine your reasoning about these things. I think many of your corner cases are really useful because it is entirely conceivable that many could leave the realm of silicon and enter the world of flesh and blood (like memory merges, etc).
There's no merge. The question is not whether it is a crime; that's irrelevant.
The question is which one you'd prefer I terminate: you, the one I'm talking to, or the other person in the other room. The preference most people have is "the other one", which pretty much lays bare that these two instances are not, in fact, equivalent, because they are "in" one of these consciousnesses and not the other.
> then these 2 copies diverged and are not the same
I agree.
For me, this implies that divergence, even divergence of location (which happens automatically on a copy), is a change of identity, and so true identity must be inextricably linked to a continuity of location through time. Because of this, I reason that any copy operation, even a destructive one, is a change of identity and not-me, because it is really just the creation of a second instance, and as mentioned, I would prefer that this instance keep running over the other if faced with my murderous thought experiment.
If merged into me, I agree it could become part of new-me.
Brain activity continues during sleep and general anaesthesia (even in the absence of sufficient electrical energy to show up on an EEG scan, chemical reactions are taking place). I see no reason whatsoever to believe that this continued brain activity bears no relation to our perception of continuity in consciousness.
I think the question you seem to be asking flippantly deserves more thought than you are giving it.
Even if memories are structural, network theory tells us that not all parts of a network are necessarily visible or accessible from all points at all times. The "You" making an executive decision could only be seeing a subset of your entire memory structure.
This difficulty can be expressed even more clearly by looking at how neural networks actually work. It's not just structure that determines output; it's a combination of structure and sigmoid function, spread out over time as the results of previous computations fine-tune the network.
You'd have the hardware to run your person on, but you'd have no information on the full informational state stored by which circuits are firing, which are building up to fire, and how fast each one is building up.
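To illustrate (a toy leaky integrate-and-fire sketch in Python, my own simplification rather than anything biologically faithful): two units can share identical "structure" - the weights and threshold - and still behave differently, because part of the information lives in the transient build-up toward firing.

    class Neuron:
        # toy leaky integrate-and-fire unit: "structure" is (weight, threshold),
        # but behaviour also depends on the transient membrane potential
        def __init__(self, weight, threshold):
            self.weight = weight
            self.threshold = threshold
            self.potential = 0.0          # transient state: build-up toward firing

        def step(self, signal):
            self.potential = 0.9 * self.potential + self.weight * signal
            if self.potential >= self.threshold:
                self.potential = 0.0      # fire and reset
                return 1
            return 0

    a = Neuron(weight=0.6, threshold=1.0)
    b = Neuron(weight=0.6, threshold=1.0)   # structurally identical
    a.step(1.0)                             # a has built up some potential; b has not

    print(a.step(1.0), b.step(1.0))         # 1 0 - same structure, different state, different output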
Even if you did, you'd lack a "reboot" harness capable of generating that state on demand across the entire brain, one that could both physically share the space the brain occupies AND not interfere with the mechanisms by which it operates.
What you haven't realized is that your conception of identity has been fine-tuned since birth through your use of language to represent "You". Whatever woke up after the theoretical reboot would NOT be the "You" that went into it. It would be "You" to itself. Little more, little less.
If you sit down and think it through, you should be able to realize that even if you could build something to do this theoretically, practically implementing it would be a fool's errand, courtesy of physics: namely, sensitive dependence on initial conditions and the Pauli Exclusion Principle.
Never mind the engineering considerations. How do you measure a successful reboot? What is your error tolerance?
Or the sociological/ethical implications. Can the environment/society support you? Could you adapt? Would you even be able to assimilate? Would you bring some sort of toxic ethic or moral taint that society considers taboo?
Hell, we have kids based on a natural drive endemic to our biology. Now you're talking about placing on our descendants the literal decision of "Should we let this one exist to influence us?"
Maybe it seems I'm being flippant, but I've given it quite a lot of thought. It's one of the recurrent themes of "rationalist" sci-fi, a genre I like very much.
> You'd have the hardware to run your person on, but you'd have no information on the full informational state stored by which circuits are firing, which are building up to fire, and how fast each one is building up.
You just include that in the brain scan.
> It would be "You" to itself.
Yes. And I think this is enough. Current me would be a stranger to me 20 years ago.
> How do you measure a successful reboot? What is your error tolerance?
There was a sci-fi story where people had backup chips in their brains that measured all activity and learned over years to simulate that activity. If the simulated and real activity agreed to 99.999% over a long period of time, the biological brain was euthanized and the chip took over the body.
IMHO that's quite a good standard for error tolerance: all measurable effects are the same for a long period of time (let's say a year).
I'm not persuaded the Pauli Exclusion Principle makes this impossible; it depends on how sensitive our brains are to quantum effects, and I don't think we have data one way or the other (do correct me if I'm wrong).
It might be an impossibly high standard, but even then: if my future decisions depend on quantum fluctuations that would be simulated differently on different hardware, my current identity isn't built on those future decisions YET. If we're really throwing dice when making some decisions, I don't consider those dice or their results to be part of my identity now. If I rewound time and got a different decision the second time, I'd still think of that person as me, just in a different mood.
As for disruption of society, the economy, etc. - sure, these are big concerns. But we're talking philosophy here, not sociology.
What about after general anesthesia?