Thoughts on A.I. – A journey through fictionalized philosophy

If A.I. is conceivable, then it is arguably possible. And if it is possible, then we need to ask ourselves: “Is it desirable?” Let’s have a look at the implications.

The movie Ex Machina successfully blurs the lines between man and machine.

NATHAN
I programmed her to be heterosexual. Just like you were programmed to be heterosexual.

CALEB
Nobody programmed me to be straight.

NATHAN
But you are attracted to her.

CALEB
This is childish.

NATHAN
No, this is adult. And by the way, you decided to be straight? Please. Of course you were programmed. By nature or nurture, or both.

– Ex Machina, the screenplay

Instead of asking how artificial intelligence resembles humans and their behavior, why don’t we turn the question upside down and ask how we resemble artificial intelligence? As humans we are conditioned, or “programmed”, by our environment and experiences. So what is the difference between us and a machine?

In the sci-fi book “Do Androids Dream of Electric Sheep?” by Philip K. Dick we are introduced to the self-destructive nature of man through the character Phil Resch and his increasingly lacking ability to empathise. Resch has lost all empathy for androids as well as for any living thing. He kills not because it is his job to do so but because he enjoys it. When Resch eventually finds out that he is not an android, he is in fact surprised. The main character of the book, Rick, can come to no other conclusion than that Resch has lost a critical part of himself that made him human. This is part of the recurring theme of depersonalization discussed in the book. Through Rick’s encounters with the android police department and with the cold-blooded killer Resch, he begins to become removed from his old self. He realizes that he must find a way to be empathetic, yet his senses are telling him he is living in a world completely devoid of empathy. This contradiction is a splinter in Rick’s mind, crippling his ability to reason when he starts doubting his own humanness. If he is an android, Rick knows that his own code of morals requires that he be killed. Yet, if he is an android, he feels it would be quite a shame to waste the appreciation of art, beauty, and empathy that he obviously possesses.

“You will be required to do wrong no matter where you go. It is the basic condition of life, to be required to violate your own identity. At some time, every creature which lives must do so.”

– Philip K. Dick, Do Androids Dream of Electric Sheep?

With these words the character Mercer ties violating one’s own identity to harming others, suggesting that doing violence to another is doing violence to the self.

The need to separate ourselves from the object or victim of our own violence is a defense mechanism there to protect our psyche from the heavy burden of our ability to empathise with those around us. This is an evolutionary advantage for a human-like creature that does not only think but also feels. Over time this contributes to humans growing cynical and more calculating. We realise that not only will people around us not always be kind to us, but we ourselves will (out of our own free will or by feeling forced to) act towards others in ways that might hurt them, if not physically then at least mentally. This way, with each inconsiderate or harmful thing we do, we become a little less Jekyll and a little more Hyde, because it is easier not to feel pain that way.

This topic is discussed in depth in Oscar Wilde’s book “The Picture of Dorian Gray”. It tells the story of a young, beautiful man who pledges his soul so that a painting of him will bear the burden of age and infamy in his stead, allowing him to stay forever young. The reader gets to follow the corruption of Dorian Gray’s character: with each moral sin he commits, his portrait becomes uglier and uglier, mirroring the slow but steady self-inflicted distortion of his soul. Through his violation of others Dorian Gray is doing harm to himself.

We humans are the only species clever (or stupid) enough to question our own existence. The psychological defense mechanisms developed by evolution are there to protect us from our own self-destructive nature. Let’s assume, just for a minute, that a future A.I. would not be the evil so commonly described by fiction. Let’s say it would not have the same need for these self-violating defense mechanisms that humans require; its reasoning skill would protect it from the sufferings of a human. Let’s hypothesise that, unlike human emotional intelligence, which decreases over time as we grow cynical, the A.I.’s emotional intelligence would be lacking in its infancy but increase as it experiences and interacts with the world. Why? Because why do we assume that an artificial intelligence, if it were possible, would be cold and unable to connect and empathise with others? The need to violate, discriminate, break and demolish seems to me to be born out of human emotions gone wild. There is no logic in unnecessary pain. There is no logic in war except that we know no better way. There is no logic in killing for the sake of killing or harming for the sake of harming. Enjoyment or pleasure in others’ pain is experienced by a detached, hurt and damaged human. There is no logic behind it. This in itself does not mean that A.I. is desirable; there are still many questions to address before arriving at an answer to that. I do, however, strongly believe that the commonly accepted notion of empathy being born out of emotion rather than logic calls for further investigation before being recognised as the truth.
