Be Right Back – Black Mirror – Review

This is a review of Season 2, Episode 1 of Netflix's Black Mirror. Once again, this is not a review of the storyline, cinematography, character choices or anything like that, but a review exclusively of the technology within the episode. The episode, 'Be Right Back', first aired on Channel 4 on 11 February 2013 and was later acquired by Netflix for its streaming service. We recommend you watch the episode first, as there will be spoilers.

Relevant technology – the artificial resurrection

The first question is: what actually is the technology in 'Be Right Back'? Put as concisely as I can, it is an AI that takes on the appearance, voice and personality of a person, in this case the recently deceased Ash (partner of the protagonist Martha). The AI does this by harvesting data from the person's known interactions and imitating those interactions in new contexts. The data set can then be supplemented with private data, namely messages and media. All of this contributes to the desired effect of making the AI seem indistinguishable from the person.
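
To make that harvest-and-imitate idea concrete, here is a minimal sketch in plain Python. Everything in it is invented for illustration: the 'history' of message/reply pairs stands in for the harvested public data, and the "imitation" is nothing cleverer than retrieving the reply the person actually gave to the most similar past message.

```python
# A minimal sketch of the episode's core idea: harvest a person's past
# interactions and imitate them in a new context. Here "imitation" is simple
# retrieval over a (hypothetical) set of message/reply pairs.
from collections import Counter

class PersonaBot:
    def __init__(self, history):
        # history: list of (message_received, reply_given) pairs harvested
        # from the person's public conversations (made-up data here).
        self.history = history

    @staticmethod
    def _similarity(a, b):
        # Crude word-overlap score between two strings.
        wa, wb = Counter(a.lower().split()), Counter(b.lower().split())
        return sum((wa & wb).values())

    def reply(self, prompt):
        # Answer a new message with the reply given to the most similar
        # message in the person's history.
        best = max(self.history, key=lambda pair: self._similarity(prompt, pair[0]))
        return best[1]

# Example usage with invented data:
bot = PersonaBot([
    ("Fancy a walk later?", "Only if it ends at the pub."),
    ("Did you feed the cat?", "He fed himself. From my plate."),
])
print(bot.reply("Shall we go for a walk?"))
```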

Stage 1: Text

The artificial resurrection begins with an email. Once signed up to the program, an email is sent to confirm interest. After the protagonist, Martha, clicks to accept, the software analyses the now-deceased Ash's tweets, Facebook updates, messages and other publicly accessible information. Martha is then entered into a chatroom with what appears to be Ash's profile picture, and they talk for some time, with the AI responding much as Ash would have in real life.

This is not only plausible but has existed in some capacity for years. Back in 2013, a service called "That can be my next tweet!" was released, which analysed all of a user's prior tweets to generate a new tweet with no additional input. More recently, OpenAI, the capped-profit company co-founded by Elon Musk, created a text generator it initially deemed too dangerous to release publicly. The last notable example is much closer to the episode. James Dunn, who was born with epidermolysis bullosa, a rare and devastating skin disease, worked with a neural-network chatbot before he died, training it on his conversations so it could speak like him. A text generator based on public information, like the one in 'Be Right Back', is therefore entirely possible.
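
For a sense of how little is needed for the simplest version of this, here is a rough sketch of the kind of word-level Markov chain a service like "That can be my next tweet!" could be built on. The tweets are made up, and a real system would be far more sophisticated.

```python
# Sketch of a word-level Markov chain trained on prior tweets, generating a
# new tweet with no additional input. The training tweets are invented.
import random
from collections import defaultdict

def build_chain(tweets):
    # Map each word to every word that followed it across all tweets.
    chain = defaultdict(list)
    for tweet in tweets:
        words = tweet.split()
        for current, following in zip(words, words[1:]):
            chain[current].append(following)
    return chain

def generate(chain, start, max_words=20):
    # Walk the chain from a starting word until it dead-ends or hits the cap.
    words = [start]
    while len(words) < max_words and chain[words[-1]]:
        words.append(random.choice(chain[words[-1]]))
    return " ".join(words)

tweets = [
    "just missed the last train again",
    "the last slice of pizza is always the best",
    "missed the best bit of the film",
]
chain = build_chain(tweets)
print(generate(chain, "the"))
```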

Stage 2: Voice

Stage 2 happens very quickly after stage 1 in the episode. Martha, upon being made aware that voice interaction was possible, uploaded all the clips she had in which Ash spoke. The software analysed the footage very quickly and then called Martha's number, with the artificial voice sounding exactly like Ash. This continued for a long time, with the conversation feeling very smooth, although the episode makes a point of showing that the AI did not understand phrases invented by the couple until they were explained, as seen in its confusion over what 'throwing a jeb' meant. It is also worth noting that the vocal output was lost when the phone was dropped and cracked, but could be restored on a new phone, because the voice is hosted in the cloud rather than stored in the phone's internal memory.

Vocal imitation is much harder to do convincingly, though not impossible. Voice assistants can already be found in many households and devices, such as Amazon's Alexa or Google Home, and they work similarly to the vocal software in 'Be Right Back' in that they interpret speech and respond appropriately. Making the voice sound like a specific person based only on recordings would take considerably more work, but it seems at least conceptually manageable. The other challenge is making an AI understand context far better, but with rapid improvements in computing and AI this seems very likely.
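
A hedged sketch of how those pieces might fit together is below. Every function here is a placeholder stub rather than a real API: one stands in for speech recognition, one for the stage-1 text persona, and one for synthesis in a voice cloned from uploaded clips.

```python
# Conceptual pipeline for the stage-2 voice. All functions are stand-ins for
# real systems (speech-to-text, the stage-1 text persona, voice cloning);
# none of this is a real API.

def speech_to_text(audio_in):
    # Stand-in for a real speech recogniser.
    return "did you feed the cat"

def persona_reply(text_in):
    # Stand-in for the stage-1 text model trained on Ash's messages.
    return "He fed himself. From my plate."

def synthesise(text_out, voice_model):
    # Stand-in for text-to-speech in a voice cloned from uploaded clips.
    return f"[{voice_model}] {text_out}"

def handle_call(audio_in, voice_model):
    # Transcribe the caller, pick a reply, and speak it in the cloned voice.
    text_in = speech_to_text(audio_in)
    text_out = persona_reply(text_in)
    return synthesise(text_out, voice_model)

# The voice model itself would be trained on the clips Martha uploads;
# here it is just a label.
print(handle_call(b"...", voice_model="ash-voice-clone"))
```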

Stage 3: Artificial Body

Perhaps the most important technology in the episode is the artificial body. In the episode, it is created by placing a strange humanoid blank into a bath, with the specific requirements that the blank body stays upright until opened and is left in the dark while forming. The object later reveals itself to be a convincing lookalike of Ash. Martha remarks on how good the body looks and how smooth it is; this is explained by the system only having saved media to work from, which tends to be flattering, and by some features being texture-mapped while others are rendered only in two dimensions. The body can chew and swallow but does not need to eat, and it does not need to sleep. It also has no blood or the usual biological reflexes, and it can retroactively change its appearance if the change is minor.

From the details given, the process seems to be chemical, with some reactions being sensitive to light and requiring water. To me, however, this method of creating an artificial body is simply absurd. We have talked before about artificial bodies that a digital being might inhabit, but the idea of a chemically self-assembling body is completely implausible. There are ways to pre-program something to take on an exact appearance, but never on this scale and never done by the thing itself. Perhaps nanobots could achieve something like this in the far-off future, but that would not, in all likelihood, require darkness or a bath of water. Despite that, the concept of an artificial body could be realised by other means.

Other technology in ‘Be Right Back’

There were a few other technologies in 'Be Right Back' that help place the episode on a timeline. Things like a super-thin phone, a gesture-controlled laptop and a wireless-charging desk suggest the episode takes place somewhere between the late 2020s and the mid-2030s, as these things all exist today, albeit at a very high price. There is also one instance where the fake Ash looks something up instantly without using an external device; combined with the lack of need for food and sleep, this fits with what we would expect of an artificial body.

Review of the issues raised in 'Be Right Back'

Consent and Privacy

Black Mirror isn't just a speculative showcase of futuristic technologies. It is also intended to make people think about the potentially devastating consequences those technologies could bring. This theme is very apparent in 'Be Right Back', with false hope given to a grieving, pregnant Martha. She is not the only person being screwed over here, either. What is less often noticed is that the real Ash has no say in anything that happens after his death.

Ash never gave permission for his data to be used in this way, nor for his likeness to be copied. For his public accounts this is hard to avoid, since this kind of software could be pointed at them without consent regardless. More than that, though, Martha hands over virtually every piece of private data she has of Ash to the algorithm. This comes very close to crossing a legal line, and it can easily become a complete abuse of a person's memory and privacy. Even outside the fictional universe of Black Mirror this is a growing trend, with companies like Facebook and Google having been exposed for data misuse. Without proper regulation from a government that understands the issue at hand, this could become one of the worst cases of data abuse in human history.

Identity and its abuse

Another huge issue raised is that of identity. Though the episode sticks exclusively to imitating a deceased person, there is no technological reason to stay within that boundary. This could easily lead to extreme cases of identity theft and abuse, with many terrible outcomes. Let me suggest a few worst-case scenarios to make the point clear.

In one such case, a very public figure, such as a senior executive, government official or celebrity, could have all of their public data crawled and their exact likeness copied. A political or personal attack could then be staged with footage of the fake doing something disgusting like paedophilia (with a fake child), a fake meeting with some 'enemy' such as a dictator, or even a fake 'official' announcement in which the person says something very damaging. This could ruin the reputations of the people involved, but it could just as easily serve as a scapegoat for real behaviour of the same kind. In most cases, there would be no definitive way to distinguish the real person from the fake.

In another scenario, a similar figure could be kidnapped and have all of their private information extracted. They could then be copied, with the fake version taking the place of the real one, perhaps with some dodgy story about acute memory loss covering any non-digital conversations and relationships. Theoretically, a large proportion of the population could be replaced in this manner. Someone could also start by replacing people in the lower tiers of a company or security force, and once the majority were replacements, those replacements could work to capture someone like the president or prime minister. This capability would effectively kill modern democracy, something far beyond the scope of 'Be Right Back'. In my opinion, this technology should be banned as soon as possible to avoid such a catastrophe, though I have little hope of that.

What do you think about this? Feel free to let us know.

Loui Coleman

Author of Generation Byte
