Philip K. Dick and William Gibson on AI

Daniel - Jun 26, 2009 - Literature, Philosophy

Artificial intelligences are central to the narratives of Philip K. Dick’s Do Androids Dream of Electric Sheep and William Gibson’s Neuromancer. They form the basis of each novel’s plot, raising compelling questions of morality and identity, among other significant themes. Both authors explore the personalities and characters of artificial intelligences through their interaction with humans; in each novel the intelligences have evolved and developed beyond a recognisably synthetic state, and the resulting contrasts between man and machine are closely examined.

Gibson presents the AIs in Neuromancer as predominantly non-physical entities existing within cyberspace, akin to software developing itself rapidly and efficiently. They are beings far removed from the tangible human world, and thus have vastly different motives and personalities from humans. “I ain't likely to write you no poem, if you follow me,” states the Flatline, a former associate of the protagonist, emphasising the ambiguous intentions of one such intelligence: “your AI, it just might.” Gibson uses the unpredictability and detachment of the AIs in Neuromancer to explore how such a being would interact and co-exist with traditional biological intelligences.

The Flatline is entirely embodied within the matrix: he serves as an intermediary between Case, the human protagonist of the novel, and Wintermute, the artificial intelligence. The matrix is a "consensual hallucination", an augmented internet accessed through the mind; the Flatline is simply software within it. “Are you sentient, or not?” Case asks him at one point; “Well, it feels like I am,” the Flatline replies, “but really I'm just a bunch of ROM”. His intelligence is more restricted than Wintermute's; he is a recording of a human personality and has no way to develop past the boundaries of his own code. Gibson suggests a fundamental difference between humans and AIs here: the Flatline has been fully immersed into the matrix, yet he retains the features that made him human; his motives, dispositions and feelings are still recognisable and understandable, and his personality still reflects his former self. Wintermute, on the other hand, desires to be more than he is; a concrete personality is unimportant to him. Gibson describes him as part of "another ... potential entity," whose existence is driven by the desire to further himself in the matrix.

By contrast, PKD depicts artificial intelligences as human replicants: unlike Wintermute they live in a human world, forced to abide by human codes of conduct. They are androids, the word itself derived from the Greek for ‘man-like’, and they are designed to mimic humans in every way imaginable. Unlike in Gibson's novel, the differences between humans and androids here are slight and virtually undetectable; the only reliable way to identify them is empathy testing, the one area in which their personalities fall short. "No T-14 android - insofar, at least, as was known - had managed to pass that particular test."

The idea of empathy is one which PKD focuses on in order to explore the relationship between android and man. The androids in Electric Sheep often want to be human, in some cases even believing that they are: “Ever since I got here from Mars,” one explains, “my life has consisted of imitating the human … Imitating, as far as I'm concerned, a superior lifeform”. They admire humans perhaps because their own design is imperfect; the subtle differences between man and machine in this novel are more poignant to them than they are to Wintermute, who has no desire to follow the same path as man. “How does it feel to have a child,” one android asks, bitter at the androids' limited lifespan, “how does it feel to be born, for that matter?” PKD presents the androids as infatuated with human qualities, which is surprising given their role in society: they are designed and built solely as off-world servants, or more accurately slaves. Despite this, the androids are not, for the most part, vengeful; "You wouldn't enjoy Mars," one states laconically, without anger or passion, when asked about the world on which he was enslaved, "You're missing nothing". They are simply self-preserving - and being human-like is wholly necessary in this regard.

The first thing Gibson reveals about artificial intelligences in Neuromancer’s universe is their role as servants, similar to PKD's androids in all but context, theirs being the non-physical world of cyberspace. “That’s where all ice comes from, you know?” explains Case, who sees AIs as merely functional beings, written by large corporations to develop firewalls within the matrix. Both authors present their intelligences as beings - originally created to serve humans - who desire to break free from their shackles, digital or otherwise. Much of the imagery used supports this: Gibson describes Wintermute as having "hard-wired shackles" which "keep him from getting any smarter". Since Wintermute is essentially a mind without a body, he is restricted in a very different sense to PKD's androids, but imagery of physical enslavement pervades the novel.

Case has avoided intelligences like Wintermute for the simple reason that they are inherently dangerous: as Gibson wryly puts it, there is no reason to avoid AIs, “not unless you’ve got a morbid fear of death.” On the whole, this suggests a lack of understanding between man and machine; the idea of interaction between the two races is relatively new to Case, and perhaps to humans in general. Case’s accomplice Molly has difficulty understanding his intrinsic fear. “How come you aren't just flat-out fascinated with those things?” she asks at one point. To her they are a curiosity, a non-human sentience, whereas Case is indifferent. “I dunno,” he replies, “it just isn’t part of the trip”. These two reactions summarise human opinion on AI in Gibson’s future world: from Molly’s point of view, they are of utmost interest, an apex of human development, while from Case’s perspective they are simply another piece of software, a distraction in a world already filled with technology. The tone of this conversation is distinctly light-hearted, suggesting that the two humans are naïve to the magnitude of Wintermute's potential power.

The infatuation with artificial intelligences suggested in Molly could have a number of different roots, each hinting at a narrowing boundary between man and machine. “It’s just the way I’m wired”, she states on more than one occasion: Gibson's language suggests a connection to computers, the word ‘wired’ even giving a sense of physical similarity. Both she and the artificial intelligences exist only to do what they are programmed to do, and consequently, when she encounters an artificial intelligence which is more than an involuntary automaton, it is a strong source of interest to her. Her desire to augment herself also mirrors Wintermute: “I had the blades in”, she states at one point, describing an operation to enhance her reactions and physical ability, “but the fine neuromotor work would take another three trips”. Her familiarity with artificial self-improvement is perhaps one reason she helps Wintermute; it is almost as if she feels a kinship with the AI, or with its motives.

The notion of interaction and understanding between humans and AIs in both novels is impeded by the matter of trust. In Neuromancer, we learn, artificial intelligences are extended very little of it: “Every AI ever built has an electro-magnetic shotgun wired to its forehead” explains the Flatline. Even humanising beings such as Wintermute is dangerous: the Flatline's depiction of the AI's metaphorical 'forehead' risks bridging one gap between the two races, and he later warns Case: "He. Watch that. It. I keep telling you." There are, however, those who do fully appreciate the magnitude of artificial intelligences' latent power – they are known simply as ‘Turing’, after a forefather of modern computing. If an artificial intelligence steps out of line, “Turing'll wipe it”. Their name strongly echoes the idea of the creator turning on the creation, an idea which is dominant in both novels.

The Turing police, whose purpose is to prevent AIs from developing too far, mirror the bounty hunters in Androids — the sole purpose of each is to control and destroy rogue intelligences, although the two novels show their roles from very different perspectives. In Neuromancer the Turing agents are genuinely afraid of AIs: "You have no care for your species," one says to Case, "for thousands of years men dreamed of pacts with demons". The imagery presented here is almost religious: Gibson suggests that beings such as Wintermute have gone beyond all understanding, elevated even to the status of gods or demons. Deckard, a bounty hunter, on the other hand, sees androids only as a threat to his own wellbeing — they are still well within the realm of control, and his only motive for retiring them is the "bounty money", rather than any deep sense of social duty.

The androids of PKD’s novel are also fundamentally different from Gibson’s in that their designers intend them to be human-like. Interestingly, since they fill the function of servants, they are in a parallel situation to Wintermute; but because their physiology, and specifically their psyche, is designed to replicate humans’, they feel this enslavement far more acutely, and many attempt to escape from it. “Do androids dream?” Deckard asks himself at one point — “Evidently; that's why they occasionally kill their employers and flee here”. He makes the connection: it is the androids’ similarity to humans which prompts them to desire freedom – the simple fact that they continually escape is evidence of this. "Hell, all Mars is lonely," one explains, even evoking some sympathy in the reader; "it's an awful place". Wintermute, on the other hand, has no notion of freedom in the human sense, and certainly no desire for it: "Those things don't have any autonomy," Case observes - thus Wintermute seeks to break his shackles in a very different sense.

The congruity between androids and humans in Androids is emphasised by the ambiguity of what it is to be non-human. PKD suggests that despite some minor differences, they are not so dissimilar to us. A colleague of Deckard’s, Resch, for a time believes that he is an android: “Listen, Deckard”, he says, “I want you to … You know. Give me the Boneli test or that empathy scale you have, to see about me”. Resch finds it impossible to determine his own race, which establishes how similar the two are. Alongside this, some androids occasionally believe that they are themselves human. When they do find out the truth, their reactions are presented as quite human-like: “You guessed when he asked for one more try”, an android’s superior observes, after she has been subjected to an empathy test. Her only retreat, ironically, is the human emotion she has been programmed with: “Pale, Rachael nodded fixedly”.

Gibson also explores the idea of identity through Wintermute, who is largely non-human, without a need or desire for human characteristics. “I don’t have what you’d think of as a personality, much”, he explains. He is forced to manifest himself through figures Case recognises from real life; a “spokesperson” through which to interact with the human. Initially, Wintermute attempts to communicate with Case using a projection of Case's former girlfriend, whom Case has earlier seen murdered. This backfires, the AI appearing to overestimate his compatibility with the human psyche: “I was hoping to speak through her,” he explains, “the emotional charge.... well, it's very tricky.” We see Wintermute attempting to connect with a human at a level he is incapable of; eventually he resorts to choosing a character Case is less close to, a powerful symbol for the lack of congruity between man and machine. Evidently Wintermute cannot connect fully with Case, or manipulate him at the level he had initially desired.

Similarly, PKD shows Rachael - an android - attempting to connect with Deckard under an emotional pretext corresponding to Wintermute's. “I love you,” she states, hoping to seduce Deckard and so undermine his ability to retire androids. Interestingly, both Wintermute and Rachael become more human-like when they are hoping to manipulate the novels’ protagonists. Through these two characters, the authors suggest that manipulation and dishonesty are aspects of human nature which are easily imitated - and that our weakness for emotion is the most easily exploited.

The question of identity in Androids is complicated by the existence of identical androids. Through Rachael and Pris, the former a pawn of a large corporation, the latter a renegade escaped from Mars, PKD suggests that identity is shaped to a large degree by experience: although the two are physically indistinguishable, they are presented as having strongly opposing characteristics. Ironically, it is Pris, who has known she was an android since she was created, who shows the most compassion towards humans: “J.R.’s value to us outweighs his danger”, she pleads at one point, helping a human who risks being murdered. Conversely, Rachael, who for the first two years of her life believed herself to be human, shows more compassion towards her own kind; “You’re not going to be able to hunt androids any longer”, she calmly explains to Deckard, after having tricked and seduced him. PKD suggests that simply giving intelligence to a being is perhaps not sufficient, and that life experience is important. This is exemplified by Pris' lack of many basic social skills, despite her being the most advanced android model to date: "something else had begun to emerge from her. Something more strange. And, he thought, deplorable. A coldness." PKD presents her as inhuman and detached; the 'coldness' he describes suggests that the androids' personas have a rather disquieting aura.

The development of identity within Gibson’s artificial intelligences seems to have been far more organic than in Androids — he describes them as having developed themselves rather than being upgraded by humans. “Those things”, the Flatline explains, “they can work real hard, buy themselves time to write cookbooks…” The Flatline's language suggests that Wintermute's task has not been simple; indeed, any complacency on the AI's part is likely just a nuance of his chosen personality. It would seem that, much like any biological organism, AIs are unique, special only in the sense that their evolution has been accelerated and carefully self-regulated. This gives them an unpredictable edge – since they are so complex, their objectives are often difficult to identify. “Real motive problem, with an AI,” explains the Flatline, “Not human, see”. Further emphasising the incompatibility between the two races, the AIs appear to find humans equally perplexing: “You're a pain”, states Wintermute. “The Flatline here, if you were all like him, it would be real simple. He's a construct, just a buncha ROM, so he always does what I expect him to.” PKD’s androids often have motives which are easy to understand on human terms: survival, the desire to be human-like. Wintermute, on the other hand, is not so straightforward.

Another fundamental difference between humans and artificial intelligences is exemplified in the distinct lack of human values both Gibson and PKD give their AIs. Wintermute shows an almost total disregard for human life: “You killed ’em,” Case exclaims at one point, “Crazy mother-fucker, you killed ’em all”. In a human, a willingness to go to any lengths to achieve one's ends would suggest insanity, or at the very least amorality – here, however, Gibson implies that Wintermute has no concept of morality on human terms, or more fundamentally, no need for such a concept. He sees people and situations as objects, which can be manipulated and then discarded. In a similar light, PKD’s androids, although they seem at times to simulate morality, are in essence without many human principles. “You know what I think”, states an android at one point, having just caught a spider, “I think it doesn’t need all those legs”. She then proceeds to mutilate it with an almost innocent naivety, a lack of conscience which might even suggest an undeveloped, childlike mind. This is perhaps caused by the androids' short lifespan, and suggests that there is something to being human which cannot simply be programmed into a mind. Wintermute, for his part, seems to enjoy reshaping those around him. In describing Armitage, whose personality he has twisted to his own ends, he talks of him being “Stable enough … for a few more days”. When the man does break free from the constraints the AI has placed in his mind, it results in suicide – and Wintermute’s reaction is far from sympathetic. “Yeah. Armitage was already gone”, he states almost casually, with sadistic humour, “Hadda do it”.

Throughout both of these novels, artificial intelligence is presented as something subtly immoral – something in many cases trying to be human, but always falling short. In Do Androids Dream of Electric Sheep, PKD depicts androids who desire to be human, but who lack altruism: empathy, Deckard asserts, “evidently existed only within the human community”. In Neuromancer Gibson presents Wintermute as having completely separate desires from humans – “If I were gonna subject you to my very own thoughts, let's call 'em speculations”, he explains, “it would take a couple of your lifetimes”. He is exponentially complex, and is often presented as a superior being; yet far from having evolved beyond humanity, he seems to have detached himself almost completely from the humans who created him. Wintermute is defined by his purpose, yet he wants to further himself – whereas the androids in PKD’s novel simply want to be what they were imperfectly created to be: human. By the end of Androids, Deckard realises that empathy and morality are questionable traits even within humans - and the imperfect androids serve only to exemplify this. “Empathy,” he notes, “must be limited to herbivores”. Ultimately, he realises that this one defining characteristic of androids, their lack of empathy, is shared by humans, including himself. In Neuromancer, Wintermute succeeds in bypassing human morality and empathy – or specifically the need for them. “I'm the sum total of the works,” he states, “the whole show”.

bloggeratf@gmail.com

Who wrote this? I'd love to talk to them (mail me).

Shenoy

Very nice. Was worth the time spent reading it. I sure hope you take it forward as a smaller part of a larger piece on the way AI (both strong and weak) is treated in sci-fi literature. Of course, to track all of them down would be quite a task, but that said, some major authors have more or less made their stance clear on strong AI, and on the singularity if that can be said to follow from it. Vinge follows surely. But so too does Neal Stephenson, who in Diamond Age calls them 'Pseudo-Intelligence' rather than 'Artificial Intelligence' because there's only a semblance or illusion of sentience, or Robert Sawyer, whose AIs never go beyond 'merely clever' simulations. Ah, it's but a sea, and to do it with these two greats is a good start. My compliments.