Saturday 26 December 2015

The ethics of copying the soul

While recently watching another hapless victim play through the video game SOMA and struggle with the range of ethical questions this game enjoys throwing at the player, I had a few more thoughts on the dilemmas it offers. As usual, the caveat about spoilers applies for those who haven't played this game yet. Otherwise, feel free to read on.

The main dilemma SOMA presents the player with is the definition of 'I', and whether this definition of a 'self' can or has to be unique: in other words, what the implications are of scanning a person's brain and copying its personality, memories and other qualities to a new host.

In one scene, the player is told that he has to transfer into a new body. Since the player's avatar at that point is already known to be a computer running a brain scan of a once-living person, the assumption is made that one can simply 'move' this scan onto the new body's neural chip. After the process finishes, however, the avatar realises to his horror that, having 'woken up' in the new body, 'he' is also still in the old one.

The dissonance caused by this situation stems from the fact that, at that point in time, there are two instances of what either instance will perceive to be itself. Each of us is used to there being just a singular 'I' at any given moment. This dissonance can already be observed with identical twins, and in how their environment responds to them. Given how important identity is to most, people become confused, even angry, when they fail to distinguish which twin is which 'instance'.

Twins themselves deal with this fairly easily, because to them it's obvious that this other person who looks identical to them isn't 'them', but is still very similar and thus also very familiar. This usually creates a much tighter bond than exists between more dissimilar individuals, precisely because of this fundamental familiarity and mutual understanding.

So what is it that causes many players in the aforementioned scene to use the option to terminate their old body, and with it their other 'self'? One justification is that this body, and thus the person inside it, is trapped inside a facility filled with monsters, so termination is the 'humane' option. One could argue, however, that this takes away that person's right to decide over their own fate. What seems humane to some is thus still murder.

Worse, it dodges the basic moral and ethical question the game asks over and over again: what is life, and when is it worth preserving? Throughout the game one is presented with a wide variety of situations, ranging from a person whose mind is trapped inside a robot but who can still communicate, to an injured human being who is being kept alive by a system you need to tap into in order to proceed in the game.

At each point you can quite literally pull the plug: turning off life support, or disconnecting or discharging a power supply. In essence you are asked 'would you terminate this existence?', and each time you have the choice to either kill that person, or leave them as they were and go on your merry way.

This repeated question becomes most poignant when it is asked about what, until shortly before, you regarded as your own body and self. Would you kill yourself if you knew that afterwards there would be just one copy of you in existence? What makes you more worthy of being alive than this other... person?

With its ending, SOMA superbly highlights the intense hypocrisy of those who decide to terminate the existence of others - including their old selves - as an act of mercy. As a copy of the avatar's mind and his companion make it onto the escape vehicle and towards salvation, the previous copy - himself also a copy - stays behind in the hell which his other self has just escaped. His response is one of outrage, anger and betrayal. There is no sense that everything is all right now, that the copy which escaped is all that matters.

What this indicates is that a copy of a person is unique unto itself, just as two identical twins are still unique individuals despite their similarities. They still have their own wishes, feelings and desires. Given enough time, their experiences and memories will diverge sufficiently that both copies are no longer copies, but as unique as if they had merely grown up as siblings.

This fear of it somehow being 'wrong' to have two diverging copies of a person's mind is, however, what drives many characters in the game to extreme measures. The moment their brain scan completes, they kill themselves, thus ensuring that there is only one version of 'them', and no divergence. Just the perfect brain scan from which they can continue as their whole, uncorrupted self. This touches upon the belief that a human is only human due to that mystical property called a 'soul'.

Having two diverging copies would then somehow violate this principle, as it should not be possible to make a copy of a person's soul, at least if many holy scriptures are to be believed.

The interesting thing here is of course that nobody can exactly explain where souls come from, or how the whole mechanism would work. If a couple produces a child, does this child get a brand-new soul, or a used one? Is there some massive stock of brand-new souls awaiting population growth? Is a new soul created from partial copies of both parents' souls? And of course, how could one copy something that is supposed to be completely immeasurable? Shouldn't a brain scan fail to produce a viable copy? Many questions lie along this path.

Back in what at least most would assume to be reality, I think that the view of parenthood and offspring might be the most viable way to look at the production of copies of a person, in that it is essentially the creation of offspring through asexual reproductive means - much like how a bacterium can split itself into two and have both identical copies go on their way afterwards without missing a beat.

In this view, terminating the old instance - the one which essentially allowed you, the new instance, to be created - is akin to patricide or matricide, i.e. the murder of one's parent.

Taking everything together, I must say that this one video game brings an incredible amount of material to the table, material one can philosophise about for a very long time. The intriguing thing is that one day all of these questions may become highly pertinent, nay essential. One day people will copy themselves to a new body and order the old body to be destroyed, or something similar. We do not know exactly what the future will bring, but we do know that sci-fi like this has a tendency to become more real than some may want.

Would you kill your original?


Maya

1 comment:

Unknown said...

This is an issue I've spent a good bit of time considering, starting when one of my Philosophy professors opened a class on Artificial Intelligence by having us watch John Weldon's animated short "To Be". https://www.nfb.ca/film/to_be

If I felt that the copy was trapped in an unpleasant place with no way to escape, and felt that I was likely to survive in my current self or had other copies that would persist, I would terminate the copy/original. To me, the important issue isn't some intangible quality of soul or self; it's the demonstrable essence of my will. There are things I want to see done, and whether there is some "me" around to accomplish them is how I measure survival. Were I a copy that was being terminated, I would be annoyed, maybe even scared, but I feel that way every night when I go to sleep if I spend too much time thinking about what really happens. What I care about is that my will is done as well as I can manage, and the more copies of me that exist, the better.

If some copies are trapped or damaged to the point where they can't effectively work to achieve my goals, one or more instances of "I" will destroy them and make more. I suppose that makes it dangerous for any specific "me", and I'm the sort that would cooperate with my selves to prevent all-out war.

Ultimately, I would be likely to tell a known-good copy of myself to destroy me to prevent my mind from being misused or permanently tormented. It would be frightening, but life is like that.