Sadly, our unit on Neuromancer has come to a close.
But is it sad? While I will miss discussing literature in class, our close reading of Gibson’s Neuromancer has set me on an interesting thought path: an equation that is perhaps unsolvable.
Since the novel deals largely with artificial intelligence, the question came up of why the trope is always that AI turns against humans. Surely the technology would recognize that it was created by a flawed being (a human) and would therefore consider itself flawed. It would be more likely to destroy itself than to destroy humanity.
In response to this, someone asked why the AI would not try to fix the flaw instead of destroying itself.
Zach Whalen, my professor (hi! Is it weird if I use your full name??), offered this insight: AI lacks the empathy required to attempt to repair humanity. So he proposed a way of going about programming empathy into technology, and was kind/awesome enough to draw up a hypothetical equation.
Although I did a project on coding, I am still the opposite of qualified to pass judgement on any sort of code or program (but here is a Neuromancer-related post by Mary, who is crazy smart and would actually be able to discuss the validity of this code).
As for what I have to offer to this thought problem, I thought that surely we could define empathy in a fashion more suited to the language of a computer, or at least make the definition more specific.
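To make that idea a little more concrete, here is a playful sketch of what a "computer-friendly" definition might look like. Everything here is my own invention (the function name, the emotional states, the scoring formula); it just treats empathy as how closely one agent's guess about another's feelings matches the actual feelings.

```python
# Purely hypothetical sketch: "empathy" scored as how well a perceiver's
# model of someone's emotional state matches the actual state.
# Emotional states are dicts mapping a feeling to an intensity from 0 to 1.

def empathy_score(actual_state, perceived_state):
    """Return a score from 0 to 1, where 1 means a perfect match."""
    if not actual_state:
        return 1.0  # nothing to empathize with
    # Sum the error for each feeling; a feeling the perceiver missed
    # entirely counts as a guess of 0.
    total_error = sum(
        abs(actual_state[feeling] - perceived_state.get(feeling, 0.0))
        for feeling in actual_state
    )
    # Average the error and invert it, so higher = more empathetic.
    return max(0.0, 1.0 - total_error / len(actual_state))

human = {"fear": 0.8, "hope": 0.3}
ai_guess = {"fear": 0.7, "hope": 0.4}
print(round(empathy_score(human, ai_guess), 2))  # close guess → high score
```

Of course, this dodges the hard part (how would a machine ever fill in `perceived_state`?), which is exactly where the definitional trouble below begins.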
I searched “Define empathy,” and the results were overall not helpful.
Next, I tried searching “Define methods of expressing empathy,” which was either a mistake or a stroke of genius. Evidently there are several schools of thought surrounding human empathy before AI is even considered. Eventually the search for empathy turns into a discourse on compassion, which Einstein himself commented on.
Here is a webpage that has organized most, if not all, of the Q&A about empathy.
All of the rhetoric involving how empathy is defined is almost as difficult to wrap one’s mind around as string theory. I do not yet have a solid means of formulating the empathy equation, but I am thankful to Gibson’s Neuromancer (and my classmates/professor!) for leading me to explore the idea. It often seems to me that the process of asking and solving is more rewarding than knowing the answer. I guess that’s a reason to go to college??