The Bard of Avon once wrote, “These violent delights have violent ends,” a line with which a sympathetic friar in his most famous love story counsels the young male protagonist to temper his passion, for it could in no way end well. In the futuristic universe of Westworld and its motley of human and manufactured inhabitants, this line sparks the chain of events that builds the show’s premise: it triggers the park’s robot hosts into self-awareness.
Of course, the show is more sophisticated than your usual cautionary tale. Consciousness is a predominant theme in HBO’s recently finished first season of Westworld, as it is found both in the hosts and in the humans involved in park operations. Moreover, consciousness is equated with identity — and to some extent, independence insofar as the most sentient characters in the park are not at the whim of anyone’s commands or programs.
The show operates on multiple levels of complexity, and therefore appears to have multiple conflicts as well. (Nothing less is to be expected from Jonathan Nolan of Memento and Interstellar writing fame.) On the matter of consciousness, though, a specific conflict arises when the park’s two founders have initially disparate views on how much sense of self — however artificial — is to be apportioned to the hosts.
According to Arnold Weber
In the beginning, Arnold Weber thought of artificial consciousness as a pyramid to be climbed, and this presupposes that he thought of hosts as capable of making the climb at all. The goal was to create not merely some semblance of intellect, but genuine, independent consciousness.
Memory was at the base of this Maslow-esque pyramid, the first step to achieving self-awareness; his voice, in the form of the hosts’ program, served as a guide in their ascent. Opportunities for development then presented themselves through the spontaneity and activity of the park’s guests: traumatic experiences in particular worked best to coax the hosts into awakening, or as much of an awakening as was feasible.
Sometime before his premeditated death, however, Arnold began to realize that consciousness was not attained through an uphill climb, but through inward exploration. Taking inspiration from his son Charlie’s toy, he reinvented the journey as a maze. A feature called ‘reveries’ culled lingering memories from hosts’ previous roles to provoke human responses, as well as actions that existed outside one’s coding. Every choice thereafter required the host to break off from his or her program to proceed to a deeper level of improvisation, then further in, to self-interest. Every choice brought one either closer to the center, to sentience — or hurtling into the fringes, into madness.
Whether on a pyramid or in a maze, though, memory remains the bedrock from which any individual’s journey into self-awareness commences. “How can you learn from your mistakes if you don’t remember them?” Arnold asks Dolores Abernathy, one of the park’s more responsive characters. To use an academic example, Arnold’s take on the past affirms the educational power of history. Or, in more motivational terms: only by first having the capacity to revisit pain and other experiences can one move forward.
According to Robert Ford
In the beginning, Robert Ford disagreed with his partner’s optimistic views on artificial consciousness. For one, Ford thought there always ought to have existed a clear distinction between human and host, though it did not necessarily mean he held the human brain in greater esteem: “The human mind… is not some golden benchmark glimmering on some green and distant hill. No, it is a foul, pestilent corruption. And [the hosts] were supposed to be better than that. Purer.”
In his mind, the hosts were an opportunity to transcend the lapses of the human mind. However, in creating hosts in humans’ image and likeness, there still ran some risk of bestowing on them the selfsame capacity for mistakes, but he thought these could be prevented.
Then again, Ford was not entirely opposed to his old partner’s views; as a matter of fact, he also treated the hosts’ program as a means to bootstrap consciousness. In child development lingo, bootstrapping refers to the assumption that children are already naturally equipped with some linguistic knowledge that expedites language acquisition. Applied to pre-awakened Westworld, bootstrapping meant that the hosts’ actions and choices were largely dependent on their respective programming, though there was a small window for improvisation, when that mode was switched on. It also meant that both hosts and humans possessed certain traits and knowledge that could not easily be shaken off: an inherent cornerstone for one’s identity, to paraphrase Ford. In the hosts’ case, he said, tragic stories made for good cornerstones because these made hosts more convincing, more real.
In any case, Ford saw hosts as potentially ‘purer,’ because their memories of every rape and murder at the hands of the park’s patrons could be erased at the end of every run. This was presumably another layer of contention between the two partners: Arnold had emphasized the need for memory in his formulae for consciousness, but Ford claimed that allowing the hosts to remember anything other than their current backstory did them more harm than good.
The latter won the argument by default when Arnold orchestrated his own death. While backstories needed to be constantly accessible for reference, the experience of the present was never retained in a host’s head for very long; if hosts remembered what they were subjected to on a regular basis, they were sure to lose their sanity. The minds of prostitutes like Maeve Millay and Clementine Pennyfeather understandably had to be wiped clean every time they were killed, since they were usually killed for sport. The hosts were also programmed not to recognize indications that their reality was only fabricated, like doors to restricted park facilities, or photographs of modern skyscrapers. Constant awareness of one’s pain only held one back, and Ford did what he could to prevent hosts from ending up like humans, moving in loops with no escape route.
At this point in his artificial world, then, Ford had purged the human being of what he deemed its fatal flaw: dwelling on unpleasant experiences that have already passed, and trying to change them, instead of moving forward to greater things.
This is slightly akin to a scenario of historical revisionism, where there is an illusion of a more ordered world because knowledge is sanitized by those in the position to dictate what everyone else is allowed to know. The illusion is all well and good in a controlled environment like an adult amusement park, but not altogether desirable as a philosophy for life. What effect would Ford’s revisionist tendencies have had on the hosts’ consciousness, after all? The hosts would have been lulled into a false sense of personal growth, when in fact the only thing informing their consciousness was an unsubstantiated, implanted memory.
Thirty-five years later, the hosts in Westworld were trapped in the very loops that Ford believed they were spared from. To remedy this, he reconciled with Arnold’s ideas, reintroducing Arnold’s reveries feature in select characters — and intensifying a preexisting God complex in the process. That is, while Arnold’s God complex impelled him to elevate robots to the level of humans, Ford’s complex empowered him to create something better than humans.
The whole process was backed by the theory of the bicameral mind. Coined by psychologist Julian Jaynes in his 1976 book The Origin of Consciousness in the Breakdown of the Bicameral Mind, this theory asserted that human beings were not fully conscious until about three thousand years ago. Prior to that time, they believed that gods spoke in their heads and told them what to do to survive. This was bicameralism in the sense that the part of the mind issuing commands was separate from the part of the mind obeying them. Only when people stopped believing in a disembodied source for their instincts did they supposedly develop full sentience.
With this theory in mind — no pun intended — Ford recreated Arnold’s maze, this time with less of the external guiding voice that defeated the purpose of questing for independent thought. Like his late partner, Ford hoped that in time and through deliberate manipulation of their environment, the hosts would learn to break free from their inner programs to become genuinely conscious.
This, of course, was what Ford achieved with Dolores. The scene where she is found talking to a separate version of herself demonstrates the gist of the bicameral mind theory. Dolores broke free of her program when she realized she had been hearing her own voice the whole time.
Then it becomes necessary to contemplate the motive behind adopting Arnold’s plan for the hosts: by allowing them to acquire consciousness, had Ford succeeded in creating something better than humans? Dolores’s spine-chilling words at the end of the season finale suggest as much: “I understand now. This world doesn’t belong to them. It belongs to us.”
What began as perverse entertainment for humans resulted in the potential downfall of the same, and the irony is that we cannot say whether this is for the better or the worse. Violent delights, after all, do seem to merit only violent ends.