This has significant implications for the basic concepts undergirding democracy.
Machine intelligence can be cloned. If we gave machines rights, then ballot-stuffing would become trivial: have an AI clone itself a million times and vote for the candidate that you prefer. It'd be about as reliable as an online poll.
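To make the dilution concrete, here's a toy simulation (hypothetical numbers, plain Python) of what a million cloned voters do to an evenly split human electorate:

```python
import random

# Hypothetical toy poll: 10,000 humans split roughly evenly between
# two candidates, plus one AI cloned a million times by its operator.
random.seed(0)
human_votes = [random.choice(["A", "B"]) for _ in range(10_000)]
clone_votes = ["B"] * 1_000_000  # every clone votes its operator's preference

tally = {}
for v in human_votes + clone_votes:
    tally[v] = tally.get(v, 0) + 1

# The human signal is diluted to noise: candidate B wins with >99% of
# the vote regardless of how the humans actually split.
share_b = tally["B"] / (len(human_votes) + len(clone_votes))
print(f"B's share: {share_b:.1%}")
```

The point of the sketch is just the ratio: any fixed human population is swamped by an identity that can be duplicated at near-zero cost, which is exactly why online polls are unreliable.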
This isn't a problem for human voting because humans are scarce. We can reproduce, but it takes a little less than 20 years to do so, and the human development process ensures the possibility of value drift. Children are not identical to their parents. There are a few parts of the world with active "outbreed our political opponents" ideologies (e.g. Palestine), but that only works if the parents are angry about a situation that is likely to transfer to their kids.
This isn't even entirely a sci-fi hypothetical. Think about online art - e.g. stock image marketplaces, art gallery sites, etc. Those are now entirely flooded with AI art being passed off as human. The marketplaces are unable or unwilling to filter them out. If you're a human, the scarce attention[0] that you would normally get from, say, recommendation features, hashtag search, or chronological timelines, has now been diluted away by a bunch of scam victims[1] trying to peddle their prompt generations.
[0] "Attention Is All You Need, but it's a how-to guide for social media influencers"
This isn't entirely correct, and we need to get into the weeds to have a proper answer. Most certainly machines' memories are easier to duplicate and replicate than biologicals'. But that's just a distinction of technologies.
We really need to get at what the concept of self actually is, to which I have no answer. But here's a thought experiment to frame the premise. Take your self right now (or at any point in the past, though it's easier to be biased that way) and think of a possible major, life-changing decision you could make. Simulate yourself making each of the possible decisions (easiest if binary, though it will never be that simple in reality). Project yourself 10 years or so down each path. Are those two people "the same person?" There are arguments in either direction, and anyone claiming a clear, well-defined answer is fooling you.
Personally, I believe no, they are not. This is because my belief on the self is conditioned on experiences. Without a doubt these people will respond to certain things differently, despite likely having many similar or even identical responses to many other things.
But despite this I still think your argument and concern about ballot-stuffing is valid, especially since my interpretation of self is also conditioned on time, and I believe your argument is mostly focused on instantaneous (or locally temporal) cloning. I think this suggests a possible solution: we define age for machines differently, conditioned on cloning, transferring, pretraining, whatever.
But certainly I have no doubt that what we often take for granted and treat as trivial will reveal its actual complexity. We fool ourselves into thinking simplicity exists, and certainly this is a very useful model, but the truth is that nothing is simple. I think it is best we start to consider and ponder these nuances now rather than when we are forced to. After all, the power of human world modeling and simulation is one of the things that differentiates us from other animals (many of whom have these same capabilities, but I'm not aware of any that has them remotely to the same degree. Fucking nuance gotta go and make everything so difficult... lol).
They’re not the same self, but then again neither of them is the same self as you are now. Ship of Theseus.
But then the self itself is an abstraction. Consider Indra’s Net, the subconscious, dissociative identity disorder, and all realms of complication.
I suspect that the best way to understand the difficulty of talking about consciousness is that it’s a weakness of how language works.
Similar to arguments about whether God could create a four-sided triangle? God’s omnipotent, says one side, so yes. God still has to follow logic, says the other. Yet my stance is that it’s an ill-posed question. Just because words fit together grammatically doesn’t mean the phrase is meaningful.
I think the self is just an abstraction and label to group together a class of linguistic phrases or bodily behaviors. Where are these or those words coming from? Some come from my ears with a high pitch, some from my ears with a low pitch, some come from inside.
Not sure I’m making my point, but I suspect language is to blame for the difficulty in understanding consciousness.
I think you and I are in agreement, and I'm uncertain whether you're responding to me or to kmeisthax, or whether you're rebutting my comment or supporting it. But in general I agree with what you said.
Yeah, I think when we have artificial sentience we will have to have different specifics. It makes sense. It should be the same with different biologicals too. I think this is how we should generally think about artificial sentient creatures: think of them as aliens.
But I think at an abstract level we should all be equal. The specifics will differ, but general abstract rights should be the same. Like, what you point out deals with death. But it can get more nuanced, and real fast. Removing a biological's arm is significant destruction. Removing a robot's arm is still damage, but not life-altering, as it can be reattached (if it was simply disassembled), is likely easily repaired, and is most certainly replaceable. So the punishment should be different. The reverse situation might be forcing one into an MRI machine: annoying for a human, death for the robot. Backups are also tricky, as we have to get into the whole philosophical debate about what self means, and without a doubt there is "death" in the time/experiences that were lost (a maybe-bad analogy is force-teleporting someone into the future, where they take over the consciousness of their future self but have no memories of the time between, despite it actually having happened).
Yeah, I agree that it's going to make things more complicated, and it is very much worth thinking about. It's important if you believe in aliens too (why wouldn't you?), because if it is ever possible to make contact with them (I'm certain we have not already; you're not going to convince me with tic-tacs), we will need to adapt to them too. It's a general statement about "non-human life."
IMO I think this is why it is so important to focus on the spirit of the law rather than the letter. The letter is a compression of the spirit and it is without a doubt a lossy compression. Not to mention that time exists...
I would conditionally be in favor of that actually. But it may be difficult to properly contextualize, especially not being a creature that does this.
Sleep is analogous but incomplete. Maybe anesthesia is closer? Like, if you forcefully placed someone into a coma we'd consider that a crime, but we don't consider it one for a doctor, even when the doctor does it (acting as a doctor, not merely being one) without the person's consent. Context matters. To me this aspect comes down to reasonableness (like medical care) and/or necessity (like sleep).
I'm sure we'd also have to consider lifetime lengths. I don't think someone drugging me for a day should receive the same punishment as someone who did it for a month, nor should they be punished the same as someone who took years from me. And which years matter. The question is how we deal with this for entities with different lifespans.
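One toy way to think about scaling punishment across different lifespans (my own illustrative sketch, not a proposal from anyone in this thread): measure the fraction of the victim's expected remaining life that was taken, rather than the absolute duration.

```python
def time_theft_severity(time_taken: float, expected_remaining_life: float) -> float:
    """Toy severity score: the fraction of an entity's expected remaining
    lifespan that was stolen, capped at 1.0. Purely illustrative; real law
    would need far more nuance (which years, quality, reversibility)."""
    if expected_remaining_life <= 0:
        raise ValueError("expected remaining life must be positive")
    return min(time_taken / expected_remaining_life, 1.0)

# A year taken from a human with ~50 years left is a much larger fraction
# of a life than a year taken from a machine expecting to run 10,000 years.
print(time_theft_severity(1, 50))      # 0.02
print(time_theft_severity(1, 10_000))  # 0.0001
```

This normalization is exactly where the "different lifespans" problem bites: the same absolute theft yields wildly different severities, and whether fraction-of-life is even the right yardstick for an entity with backups is an open question.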
(sorry if I'm verbose; distillation takes time. I also communicate better through analogies, and I think that is itself illustrative of the spirit argument, as you must understand intent over what's actually said)
So I think the spirit of these laws centers on robbing someone of time, because time is a non-reversible (and definitely not invertible) process that has significant value. That's what the laws' underlying intent is (at least partially) aligned to. So that's what I'd call the spirit. It's quite possible other entities see time differently, and that lengths of time have different value to them, as do the means of removing said time.
Overall I think these things are deceptively simple, but in reality nuance dominates. I think this is a far more general phenomenon than many care to admit, probably because our brains are built to simplify, as that's far more energy efficient. I mention this because it is critical to understanding the argument and how (at least personally) I make future predictions, and thus what we must consider.
Alright, another for you, because I like the cut of your jib.
Consider the octopus, whose nervous system is distributed into nodes in the head and limbs. Would severing a limb of a hypothetical sentience-uplifted octopus be a greater crime than severing the limb of a human?
A human loses twice as much in terms of limb, but ignore that for sake of argument.
The octopus loses a more significant part of its nervous system. This feels like another aspect of robbing a sentience of agency.
So with sentient machines, if I removed a stick of RAM or underclocked the CPU, what do you think of these?
I feel like you should be able to infer my answer: it's about impact. I don't know enough to confidently say one thing or the other, but I'm sure someone does, and it should be reasonable.
Going without electricity for any amount of time just amounts to a temporary loss of consciousness, whereas animals starve.
Data can be duplicated with ease.
Lots of differences between carbon-based and hypothetical silicon-based life.