Metaphysics of a Digital Brain: why it really doesn’t matter

First of all, a small disclaimer: the points regarding metaphysics in this post are mostly opinions, not facts. You do not need to agree with them; they are offered for contemplation and discussion in the comment section. Alternative arguments are welcome, though they are not immune from criticism, which is also true of all the arguments put forth within this post.

What does ‘metaphysics’ mean?

First, we need to establish a definition of metaphysics. One common dictionary definition is: 'the branch of philosophy that deals with the first principles of things, including abstract concepts such as being, knowing, identity, time, and space.' For our purposes, this applies chiefly to identity, being, and life. The relevant questions about digital immortality can be simplified to 'Is a digital copy of our cognition really the same person?' and 'Does an artificial body running someone's cognition really count as life?' Other phrasings may be used, but these are the questions under examination.

Would a digital brain truly be the same person?

When discussing the possibility of uploading our cognition into a digital form, many people argue that the resulting being would be a different person: the digital brain is not the same as the biological brain, and therefore cannot be considered the same individual. This is the same metaphysical argument as the teletransportation paradox (also known as the 'duplicates paradox'), which asks whether a teleported person, reconstructed from different atoms in exactly the same relative configuration, is truly the same person. The paradox goes further, asking what would happen if multiple copies were made, or if the original were kept intact. Some would argue that the copy is not the same person because it is not made from the same materials as the original.

Here’s what I think:

Naturally, a person whose brain has been converted into a digital form would not retain their biological body, but that invites the Ship of Theseus analogy: if all the parts of a ship are replaced one by one, is it still the same ship? The analogy extends to more familiar cases: an amputee with an artificial limb is still considered the same person despite not having all the same parts as before. Even in everyday life, cells die and new ones are made in their place, yet we still consider ourselves the same person. This is because our cognition (usually) follows a logically continuous path, and we have a biological predisposition to believe we are the same person.

As for a digital brain, as long as it feels the same as before, and as long as the person in question acts much as they would have before the procedure, I would argue that for all practical and personal purposes it may as well be the same person. In short: as long as I think I am the same, I do not care whether I 'truly' am or not. Having said this, I do think a digital copy should be regarded as the same person, except in strictly biological senses of the term. More important, though, is that people would have multiple strong reasons to upload their brains to digital form.

The (solvable) duplication problem:

As discussed earlier with the teleporter analogy, there is one major problem: the possibility of multiple copies of one person. Regardless of whether you define a digital being as the same person its cognition came from, many copies all claiming to be the same person would pose a huge problem for society, law, and a whole host of other areas of life. There must, then, never be more than one active copy of each person at any time. It also seems logical that if two active copies ever do arise, both should be destroyed and a single backup activated. This is the only practical way to avoid the problems that would inevitably follow from multiple copies of one person, and it would need to be agreed upon by all governments and consumers to avoid a collapse of the legacy legal system.

Why metaphysics won’t affect the outcome significantly:

One reason the whole duplicates-paradox argument can be ignored is its lack of impact on the outcome. Even assuming as little as 1% of the remaining biological population undergoes the procedure each generation, at least half of the population will have achieved cognitive immortality after about 70 generations. Though this sounds like a long time, it is insignificant compared to the billions of years each person with digital immortality could live. In all likelihood, the transition to digital existence will be much faster, as people decide that existence as a digital being is worth it in exchange for billions of years of life. As such, I would argue that even if the general consensus is that digital beings are not really human, eventually most people will gain digital immortality.
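The 70-generation figure can be checked with a quick compound-adoption calculation. The sketch below is illustrative only, and assumes a constant 1% of the remaining biological population uploads in each generation; the function name and parameters are mine, not from any real model:

```python
def generations_to_majority(rate=0.01, target=0.5):
    """Count generations until the uploaded fraction reaches `target`,
    assuming `rate` of the remaining biological population uploads
    each generation (compound decay of the biological fraction)."""
    remaining = 1.0  # fraction of the population still purely biological
    generations = 0
    while 1.0 - remaining < target:
        remaining *= (1.0 - rate)  # 1% of those remaining upload
        generations += 1
    return generations

print(generations_to_majority())  # 69, i.e. roughly 70 generations
```

The biological fraction after n generations is 0.99^n, and 0.99^69 ≈ 0.4999, so the uploaded share first passes one half at 69 generations, matching the 'about 70' estimate above.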

Let us know your thoughts in the comment section below.

Loui Coleman

Author of Generation Byte
