Murder is typically understood as a uniquely biological crime: one person kills another. It shouldn’t apply to someone made effectively immortal by uploading their brain, but the idea is not as simple as it may seem. There has been a fair bit of debate over whether deleting a person’s extra digital consciousness counts as murder, and whether it should be legal for a licensed organisation to have this power. To be clear, this is not about deleting all copies of a digital brain, but rather the extra copies that exist when multiple copies are active at once, whether by accident or by illegal replication; hence ‘murder and copied consciousness’.
What is murder?
Murder is a staple of many crime dramas and news stories. Everyone understands the concept, and most people recognise it as one of the worst crimes committed on a regular basis. The slight grey area comes with euthanasia (assisted suicide), which is regarded as murder, with a few exceptions, in some countries. The definition for murder is given by Google as ‘the unlawful premeditated killing of one human being by another’. The definition for euthanasia is given as ‘the painless killing of a patient suffering from an incurable and painful disease or in an irreversible coma’.
The arguments about murder and copied consciousness
Of course, if someone were to delete all copies of a person’s digital brain without their consent, that would effectively be murder. What we want to examine are the more realistic scenarios with a significant grey area.
One such situation would be a person illegally copying their consciousness. In this scenario, the copied consciousness would be considered rogue, something like the digital-being equivalent of an undocumented immigrant. Here, though, there is no technological limit on the number of illegal copies a person could make; theoretically a wealthy person could, if unchecked, build a large militia with a goal that would harm society. Even without this, each digital being would technically be the same person, which compromises the entire financial, legal and governmental system.
Another way this can happen is an accidental second copy. This is possible when a person’s active brain copy gets destroyed and two backup copies are accidentally activated to ‘resurrect’ the person. A backup database could potentially get its priority value wrong and activate a stored consciousness when it shouldn’t, effectively making a second ‘you’ without your knowledge. In this case, such a clear error could be fixed either by deleting the copied consciousness or by deleting both active copies and reverting to a common state, whichever makes more logistical sense. Alternatively, if deactivation were possible, that could avoid the murder question entirely.
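The accidental-activation scenario can be made concrete with a small sketch. Everything here is hypothetical: the class, method names and priority scheme are invented purely to illustrate how a faulty priority lookup could wake a second copy, and how a one-active-copy guard would prevent it.

```python
# Hypothetical sketch of a backup store for uploaded consciousnesses.
# All names (BackupStore, activate, resurrect) are invented for illustration.

class BackupStore:
    def __init__(self):
        self.active_ids = set()   # person IDs with a currently running copy
        self.backups = {}         # backup_id -> (person_id, priority)

    def add_backup(self, backup_id, person_id, priority):
        self.backups[backup_id] = (person_id, priority)

    def activate(self, backup_id):
        person_id, _ = self.backups[backup_id]
        # Guard against the accidental-second-copy scenario: refuse to
        # activate a backup while a copy of this person is already active.
        if person_id in self.active_ids:
            raise RuntimeError(f"{person_id} already has an active copy")
        self.active_ids.add(person_id)
        return person_id

    def resurrect(self, person_id):
        # Pick the single highest-priority backup for this person; a bug in
        # this priority comparison is exactly what could wake two copies.
        candidates = [(prio, bid) for bid, (pid, prio) in self.backups.items()
                      if pid == person_id]
        if not candidates:
            raise LookupError(f"no backups for {person_id}")
        _, best = max(candidates)
        return self.activate(best)
```

Without the guard in `activate`, two databases each believing their backup has top priority would both wake a copy; with it, the second activation fails loudly instead of silently creating a second ‘you’.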
Arguments against the ability to delete
On to the actual arguments. Though this position can be complex, with many niche distinctions and much semantic reasoning, the argument can effectively be summarised as follows:
‘When you create a copied consciousness, the small or large changes that each copy experiences create a slightly different cognitive make-up and memory. Such a change is significant enough for the copy to be defined as technically a different person, and thus deleting this digital cognition would be the same as deleting a person’s entire brain archive; in other words, murder.’
Proponents of this side are essentially arguing that any change in experiences creates a slightly different mental outlook, enough to change how a person would behave or think. Personally, I do think this argument holds water, as it aligns well with my belief that what we interpret as free will is an illusion.
Arguments for the ability to delete
In favour of the ability for a licensed body to delete, we have a very different type of argument. Of course, someone could simply reject the idea that slightly different experiences cause a distinct cognition, but that rejection isn’t really substantiated. What most people on this side, including me, would argue is that a copied consciousness is practically the same person, even if this isn’t technically true.
More to the point, in many cases it would be very hard to distinguish between two branched consciousnesses. If allowed to integrate into society normally, there will be cases where two or more digital beings claim, and truly believe themselves, to share one identity, and then claim possessions held in the name of that identity. This becomes more and more of an issue the longer an extra copy is allowed to persist.
For example, if there is a bank account in the name of John Smith, and two people each legitimately have John Smith as their original brain-uploader, the bank cannot know which John Smith is the legitimate one – they both are! It could get much worse. Suppose John Smith copy 1 commits a crime. The police may arrest John Smith copy 2, and all the evidence would point to him, despite copy 2 never having committed the crime.
To avoid this, proponents of this side argue that people must consent to having extra copies deleted by a licensed organisation, and to having that organisation enforce a maximum of one active copy at a time. In short, the effects of allowing multiple versions of one person to go around would be so disastrous that it simply cannot be allowed. The only workaround would be a societal shift so large that it could destroy democracy, which relies on one vote per person; property, which requires clear ownership; legally recognised partnerships such as marriage and civil partnerships; and so much more. If we want to keep hold of these things which so define the modern world, we would likely need to delete all extra active copies of a digital brain.
Likely regulatory outcome on copied consciousness
Frankly speaking, the opinion of the public doesn’t always matter to regulators. What government officials do depends on what they see as beneficial, and for them, the argument that doesn’t compromise the system they have been trusted to run will always win out. What is potentially more interesting is how other crimes might change. Assault against digital beings may be treated as property damage instead; safety laws may change because digital beings can survive much harsher environments; and certain insurance options may be done away with.