The Universal Risk of Mind Uploading

Mind uploading will be the biggest change ever to the way people live, ushering in a fully digital era. One component of this will inevitably be societal, economic and political risks and struggles. Below, we list what we think are the biggest universal risks and give our take on how each could be avoided or managed. None of these risks is certain to materialise, but all are highly plausible.

Power struggles and abuse

One of the major benefits of uploading your brain to digital form is digital immortality. If this is achieved by people in high positions of power, however, it could easily lead to what are known as ‘eternal dictators’. With no biological ageing, replaceable bodies and digital brains that can be copied and stored in databases, a ruler need never die. Consequently, entire regions, or even extraterrestrial planets and moons, could be held under tyranny indefinitely. From there, rights could be removed, populations could be enslaved and brain software could be forcibly installed to limit freedom and increase compliance.

This would be difficult to stop peacefully once it happened. The most likely outcome would be some form of war, but given dictators’ tendency not to obey international (or universal) law, the result would be uncertain. Thankfully, preventative measures can be taken, such as making digital brains compatible only with known compliant databases and blacklisting unknown digital brains from all data services and software enhancement platforms. This doesn’t remove the possibility entirely, of course, but it means someone would have to either steal or independently develop the technology, and would still have to build their own software and infrastructure.
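The compliance idea above amounts to an allowlist check: a service accepts a digital brain only if both the brain and the database appear on vetted registries. A minimal sketch, in which all identifiers and registry names are hypothetical:

```python
# Hypothetical sketch of the 'compliant databases only' safeguard:
# a data service accepts a connection only when both the brain and
# the target database appear on compliance registries.

COMPLIANT_DATABASES = {"db-eu-01", "db-na-02"}    # vetted storage hosts
REGISTERED_BRAINS = {"brain-4417", "brain-9023"}  # lawfully uploaded minds

def may_connect(brain_id: str, database_id: str) -> bool:
    """Allow a digital brain onto a data service only if both sides
    are on the registries; unknown brains are rejected outright."""
    return brain_id in REGISTERED_BRAINS and database_id in COMPLIANT_DATABASES

assert may_connect("brain-4417", "db-eu-01")
assert not may_connect("brain-0000", "db-eu-01")  # unknown brain is blacklisted
```

In practice the registries would need cryptographic signing rather than plain lookups, but the deny-by-default shape is the point: a stolen or home-built brain copy is locked out of all services unless it can be registered.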

Class divides

This is one universal risk that is little talked about. Assuming the upload process is a paid service, as opposed to an NHS-style public one, it will likely be very expensive, at least for the first few decades. Only wealthy individuals could afford to upload, while poorer people could not. Eventually, this would produce the biggest financial divide in history: the rich immortal, the poor stuck with biological bodies. Left unchecked, the gap would widen further still as the immortal accumulated wealth with practically zero chance of dying.

This is easier to avoid, though it depends on the regional government. One solution is a Universal Basic Income (UBI), something widely seen as inevitable in an increasingly automated world. Provided the service were reasonably priced, only those with very little financial discipline would be unable to afford it, which seems fair enough. Alternatively, and perhaps preferably, the upload itself could be provided free, with the artificial body and the software and hardware enhancements sold by a for-profit corporation, removing this universal risk entirely.

Digital/Biological divides

Another potentially huge universal risk is the divide between digital and biological people, something we have looked at in depth before. A divide at some scale is inevitable, though that scale is unknowable as yet. In short, it is very easy for people to perceive segregation between digital and biological people and consequently harbour negative attitudes. This would not end well for biological humans, who would be far inferior technologically and intellectually, and it could bring financial collapse or recession.

This would be virtually impossible to prevent completely, though it can be somewhat managed. The two main factors in this are good regulation and open and honest discourse, which are down to governments and the media respectively.

Longevity of dated ideas

It is a practically global phenomenon that younger generations look at the political and social ideas of older generations and brand them old-fashioned. This changing of attitudes depends on each generation being brought up with ever greater social acceptance, and it is a consequence of living in a developing biological society. Digital immortality removes that generational turnover. The likely effect is that for the first few decades, those who have uploaded their brains will tend to hold views society deems modern and liberal, but as time progresses those same views could come to be seen as dated.

In the scale of things, this is a low universal risk, with the only potential problems being a feeling of segregation and social divides. It is not too much of a concern, however, as the software a digital being would run means misinformation would be harder to spread and ideas would be more likely to be evaluated objectively.

‘Ghost towns’

One inevitable feature of the transition to the digital era would be the complete disuse of many buildings, products and places, something we looked into in our sustainability series. On top of this, industrial decline could leave many industry-reliant towns and cities with very little employment. This would effectively force people into urban or remote areas, creating numerous places that are largely or completely abandoned. If not demolished or patrolled, these places could become hotspots for illegal activity, with knock-on effects on the economy. This is unlikely to be avoidable and would persist as an issue for centuries, or until the places are demolished.

Undetectable error

Our final major universal risk is a copying error in the transfer process. Such an error would be undetectable, because checking the copy creates a logical paradox: the only reference to compare it against is the mind being copied. Wherever you stand in the metaphysical debate, unless a copy is 100% accurate it can be regarded as not the same person. This is an incredibly complex issue, especially as the biological brain is so hard to understand, with many cognitive processes being subconscious and memories being unreliable. Furthermore, if errors crept in every time a mind was copied, they would accumulate, and later copies would eventually differ significantly from the original. Errors could also be caused by ionising radiation or other external interference, which would be largely unpreventable.

Thankfully, this should be avoidable with the right technology. One method, used in space hardware, is to store an odd number of copies (more than one) of each byte of the human part of the digital being’s data. For example, if each byte is stored five times, a corrupted byte can be repaired by comparing the five versions bit by bit and taking the modal value (1 or 0) of each bit as correct. Unfortunately, this doesn’t help with the initial upload itself; there, one would have to hope the process was reliable.
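The repair step described above is a per-bit majority vote across the stored copies (the same idea behind modular redundancy in spacecraft computers). A minimal sketch, with the copy count and byte values chosen purely for illustration:

```python
# Sketch of byte repair by majority vote: each byte is stored as an odd
# number of copies, and each bit of the repaired byte takes the modal
# value (0 or 1) across the copies.

def majority_repair(copies: list[int]) -> int:
    """Reconstruct a byte from an odd number (>1) of possibly
    corrupted copies by voting on each of the 8 bits independently."""
    assert len(copies) % 2 == 1 and len(copies) > 1, "need an odd count > 1"
    repaired = 0
    for bit in range(8):
        ones = sum((c >> bit) & 1 for c in copies)
        if ones > len(copies) // 2:   # modal value of this bit is 1
            repaired |= 1 << bit
    return repaired

# One of five copies has a flipped bit; the other four outvote it.
original = 0b10110110
copies = [original, original, 0b10110010, original, original]
assert majority_repair(copies) == original
```

With five copies, any byte survives corruption of up to two of its copies at each bit position; real systems add periodic "scrubbing" so corrected copies are rewritten before further errors accumulate.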

What do you think? Let us know in the comment section below.

Loui Coleman

Author of Generation Byte
