Supposedly, after a person had uploaded their brain to digital form, they would have a comprehensive virtual copy. This copy could be analysed and separated into its constituent brain regions. Theoretically, these regions could be replaced with regions from other people or some form of AI, creating a sort of hybrid ‘configurable brain’. In the case of a Copy/Paste Hybrid Brain (CHB), in which a user takes unedited parts from other people’s digital brains, this opens up the possibility of partially selling your mind.
Selling your Mind
Reasons to sell
One reason, true of pretty much all selling, is simply the hard cash. Selling your mind would be a perfect example of ‘passive income’: income where the effort is upfront and none is required to sustain the product. This could work either as a member of a larger organisation responsible for the actual transfer, or through affiliation in the case of an Artificially-Averaged Hybrid Brain (AAHB). Regardless, given that any seller would already be immortal, this would simply provide a safe and steady income source.
Alternatively, selling your mind could be done for altruistic purposes. A person could, for instance, sell regions of their brain to replace faulty regions in other people’s brains, or a particularly creative person might want to literally share that creativity with others. One intriguing potential use case is a social influencer or celebrity doing this for their fans.
Sadly, though, it is almost guaranteed that some reasons will be selfish or even manipulative. Continuing with the celebrity idea: assuming the celebrity was popular enough, they would likely sell a large number of parts. This could effectively brainwash the fan base, causing fans to unknowingly change their behaviour and perspectives. The idea is taken further when you consider a political populist who might want to share their political disposition.
Arguments against the freedom to sell
On the other side of the debate are the arguments against selling your mind. These mainly boil down to retaining a sense of identity, privacy and security. There are other arguments, but they usually reduce to one or more of these three. Many arguments are also against configurable brains more generally, which of course extends to selling your mind.
The identity argument has two main sides. On one side, people argue that a CHB, AAHB or AIHB reduces and conflates a person’s identity. For some, this would effectively make them a fake version that merely resembles their previous self. On the other side, people point out that identity is the notion on which practically all human systems are built. This ranges from relationships to finances and, more importantly, democracy and law. As soon as you inject large proportions of your population with brain regions from a select few, you risk ruining your population’s democratic, legal and financial integrity.
Privacy remains a big issue with technology today, with companies seemingly able to do what they want, due in part to ignorant politicians. In the case of selling your mind, this could be problematic if someone who wanted to sell anonymously was doxxed, or if a mistake in the copying process exposed sensitive personal information.
As is currently the case, interests could be established and a digital profile built up and sold to advertisers. In fact, a particularly sketchy company could buy brain regions under the pretence of configurable brains and instead harvest their data, either to sell or to manipulate society with. A political group, for example, could buy this information and use it to craft policies likely to win broad support, regardless of how moral they are.
This idea is taken further when you consider security. Up until now, we have assumed that the selling has at least taken place on legal markets, but this isn’t necessarily the case. For example, an organised crime group could pay off people working in secure locations and literally harvest their entire memory bank. They could also abduct people and cut off all communication methods with some sort of Faraday cage. This means that any pre-programmed distress routine reliant upon remote activation would be nullified, and the death would not be registered. This could lead to a person having no active copies, or being wrongly perceived as dead and ending up with two active brain copies: a serious legal and societal problem. This argument is less about the actual selling of brain regions and more about the ability to separate regions through software.
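The weakness described here is that the distress routine depends on remote activation: a signal that must get out, and cannot from inside a Faraday cage. A dead-man’s-switch design inverts this, treating silence itself as the alarm. A minimal sketch in Python (all names, intervals and the registry itself are hypothetical, purely to illustrate the design difference):

```python
from datetime import datetime, timedelta

# Hypothetical policy values: how often a copy checks in, and how much
# slack is allowed before silence is treated as distress.
HEARTBEAT_INTERVAL = timedelta(hours=1)
GRACE_PERIOD = timedelta(hours=6)


class CopyRegistry:
    """Tracks the last heartbeat received from each registered brain copy."""

    def __init__(self):
        self.last_seen = {}

    def heartbeat(self, copy_id, when):
        """Record a check-in from a copy at the given time."""
        self.last_seen[copy_id] = when

    def missing_copies(self, now):
        # A remote-activation distress signal never arrives from inside a
        # Faraday cage. Here, instead, any copy whose heartbeats have
        # stopped is flagged: the blackout itself raises the alarm.
        deadline = HEARTBEAT_INTERVAL + GRACE_PERIOD
        return [cid for cid, seen in self.last_seen.items()
                if now - seen > deadline]


registry = CopyRegistry()
registry.heartbeat("copy-A", datetime(2040, 1, 1, 12, 0))
registry.heartbeat("copy-B", datetime(2040, 1, 1, 18, 0))

# Twelve hours later, copy-A has gone silent past the deadline.
print(registry.missing_copies(datetime(2040, 1, 2, 0, 0)))  # → ['copy-A']
```

Under this scheme the abduction scenario is detected rather than hidden, though it trades one failure mode for another: any innocent communications outage now risks a false alarm, which is why the grace period exists.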
Beyond this, sellers with certain political or social motivations could illegally sell brain regions that drastically change the user’s brain and mental security. Users could be made more compliant to an extremist or otherwise corrupt group; more sceptical of government, science, banks or other public institutions; more susceptible to isolationist arguments and angrier at innocent minority (or majority) groups; and so on. Without knowledge of this, and without a system where the process is trialled to check compatibility or gain approval, the change might not be registered and thus not reported, meaning any action to combat the effects would be severely disadvantaged. People might not even know that they have been corrupted.