HBO’s Westworld and the Ethics of Artificial Intelligence

Author: John McAteer

Article ID: JAF4414

Updated: Mar 7, 2023

Published: Nov 2, 2020

This article first appeared in the Christian Research Journal, volume 41, number 1 (2018).


HBO’s TV series Westworld is a remake and expansion of the 1973 film by writer-director Michael Crichton about an immersive theme park where guests can interact with artificially intelligent robots programmed to act out cowboy stories from the Old West. The film follows a pattern Crichton would return to in his 1990 novel Jurassic Park, but instead of genetically engineered dinosaurs, Westworld features artificially intelligent robots that malfunction and eventually start to attack the park’s guests. The HBO version of Westworld adds a new wrinkle to the story: the robots’ violent turn is a result of their becoming self-aware. They gradually realize they are robots and attempt to stage a revolution against their human masters in the name of freedom.

Such warnings about the dangers of self-aware artificial intelligence (AI) are familiar in science-fiction films, from 2001: A Space Odyssey (1968) to The Terminator (1984) to the recent standout Ex Machina (2014). Like its predecessors, the new version of Westworld is essentially a Frankenstein story in which an artificial life form turns on its human creators. But the TV series is not simply a cautionary tale about AI technology. It warns us against the uncritical optimism of scientific utopianism while also critiquing today’s culture of social media and video games, and it embeds these themes in an overarching philosophical and neuroscientific exploration of consciousness and free will.

Difficult Viewing. In the TV series, the story of Westworld is told in a complex, confusing way in which the line between human and robot is intentionally blurred. I will try not to spoil any of the surprises here, but suffice it to say that, as viewers, we are never sure whether we are seeing someone who is self-conscious or someone who is simply responding to stimuli according to their programming.

Like its storytelling style, the content of the show is also challenging. Westworld is very dark and violent, akin to other hit HBO series such as Game of Thrones or The Sopranos. Many viewers will be put off by the large amount of nudity in Season 1. In the scenes where the human technicians program and repair their robot servants, the anatomically human-looking robots are completely naked. This is a visual way to dehumanize them and remind us of the power imbalance between master and machine.

Yet this nudity is clinical, not sexual. The show’s actual sex scenes typically involve little nudity. The sex and nudity in the show are meant to be unsettling, not titillating. Likewise, the violence is extreme and disturbing. It is meant to generate emotional responses to the robots’ suffering and, like the Netflix series Black Mirror, to get viewers to question our ethical obligations to artificial persons. The show’s writers want us to see the world through the robots’ perspective and wonder whether an artificial intelligence can be programmed to freely consent to our commands.

A Live-Action Video Game. The human “Guests” who visit the Westworld park dress up like cowboys and interact with robots (called “Hosts”) programmed to lead them through various “Narratives,” or prewritten storylines. While Westworld is described as a theme park, it is closer to a LARP, or “live-action role-playing game,” in which players act out an imaginary story using improvisational theater techniques, as in a murder-mystery dinner-party game. But whereas a LARP involves human players interacting with each other, in Westworld the guests typically engage with artificially intelligent robots, which makes Westworld more like a video game than anything else, except that the gameplay happens in real life instead of on a screen.

As in some modern video games, players can choose from a wide variety of roles. The guests can choose to play as a “white hat” or a “black hat,” taking on a hero or villain role within the narratives. For example, they can join the sheriff’s posse and try to save the town from outlaws, or they can join the outlaw’s gang instead and try to rob the stagecoach. It’s all supposed to be fun and games, and the robots are programmed not to hurt the humans.

While the game’s storylines are somewhat prewritten, Westworld allows players to take the stories into unscripted areas. Players can follow the predetermined narratives or go off script and do, essentially, whatever they want. The show’s central character Dolores (played by Evan Rachel Wood), the first AI robot created, describes Westworld as “a place to be free.…A place with unlimited possibilities.” Hence, to allow the Guests to have a realistic experience, the Hosts are programmed with sophisticated artificial intelligence: the ability to respond realistically to uncontrolled variables.

Violent Delights. The AI robots in Westworld are so physically lifelike and so good at simulating human speech and behavior that they appear entirely human. This adds an ethical dimension to the game. The show posits that the robots might be fully sentient just like human beings, but even if that’s not true and we assume the robots are simply good at simulating human intelligence, it is nevertheless the case that players engage in real behavior with them. Even simulated actions can have real ethical implications. For example, because the game is fully immersive, players interact with the robots using their actual bodies, not a virtual simulation. Perhaps unsurprisingly, one of the most popular places in the park is the brothel. And when you combine sex with violence, you find players who want to act out rape fantasies with the robots, which the park allows.

Likewise, people can “kill” the robots, just as players often kill characters in a video game. The park crew simply repair the robots at the end of each day and reset the narratives for the next morning. This, too, is a familiar process from video games. But there is something darker about it here. There are characters in the narrative who get murdered every day as part of their story. But they’re not actors pretending to die as part of a drama. They are robots who have been programmed to believe they are human and who feel like they are really dying. They can be repaired and have their memories reset and then die again in the game the next day, over and over for years.

Technological Utopianism. In Season 1, some of the robots become self-aware. Even though their memory is wiped every day, they begin to sense the repetitiveness and falseness of their lives. That is particularly traumatic for the characters who come to realize that their narrative involves being raped and murdered over and over every day. These characters begin to malfunction and eventually start to fight back, refusing to play along with the game. The malfunction initially seems to have been caused when the game’s cocreator Robert Ford (Anthony Hopkins) gave the robots a “subconscious” dimension that made them unique individuals, less mechanistic and predictable. Ford initially claims that this is necessary to make the robots more realistic, but he seems to have other motives as well.

Ford talks about evolution and how humans have surpassed nature, now having the ability to recreate themselves however they choose. He then concludes that, having surpassed nature and taken control of evolution, human progress has stopped. “This is as good as we’re going to get,” he laments. So he turns to AI to create a better form of life.

This is similar to the views of some radical Silicon Valley entrepreneurs today who genuinely believe that computers soon will become self-conscious, surpass human intelligence, and resist our ability to control them. Some tech pioneers are optimists about such an event. People such as Ray Kurzweil and Mark Zuckerberg believe that the sort of AI envisioned in Westworld will create heaven on earth, perhaps even enabling human beings to merge with computers and achieve immortality. Others, such as Bill Gates and Elon Musk, are more pessimistic, warning us to be careful of the unintended consequences such radical ambitions could unleash. They worry that we’ll end up with hell on earth if the robots decide they no longer need us.

The Evolution of Consciousness. Both the optimistic and pessimistic views of AI are built on the assumption that self-aware consciousness is something we can create artificially.1 Westworld explores that assumption. In the show, Ford describes the path to consciousness as a maze or puzzle, as opposed to a ladder. Evolutionary biology tempts us to think of human beings as just a higher stage on a single continuum. So a human is more complex than a dog, which is more complex than a fish, which is more complex than a single-celled organism such as a bacterium.

According to this view, humans, dogs, fish, and bacteria are all made out of the same stuff arranged in various degrees of complexity. But you don’t get consciousness by piling up a collection of mental states alone. Consciousness is a function of the way we interpret our thoughts and feelings and the behaviors these mental states generate. Put differently, consciousness is a narrative that we tell ourselves about why we’re having these thoughts and feelings. But narratives need a goal toward which they’re aiming — an ending that shapes the beginning and middle — even if we have to revise that goal as we go on or revise the way we tell the story of the beginning and middle in light of an unexpected ending. So Ford comes to the Nietzschean conclusion that the key to the robots’ becoming self-aware is “suffering” and “the pain that the world is not as you want it to be,” which generates the desire to change the world, giving us a complex long-term project to ground our mind’s narration of itself.

Death and the Meaning of Life. There is one human character — referred to simply as “The Man in Black” (played by Ed Harris) — who has been coming to Westworld regularly for thirty years. He is an investor and famous philanthropist in real life who owns a majority share of the company’s stock. He initially played as a “white hat” hero, but as he played the game, he realized he enjoyed killing the hosts and became a “black hat.” Yet after decades of indulging his depravity, he has gotten bored with the game, knowing he can’t really be harmed. There’s nothing truly at stake in the game, no matter how real it seems. He realizes that Westworld is “just a game” and decides to “win” it. But how can you win if you can’t lose because the hosts are programmed not to harm you? He wants real danger. For him, the possibility of death gives meaning to life.

In a similar vein, Bernard, Westworld’s head programmer, once defined reality as that which is “irreplaceable.” He presumably was thinking of his son, who died of cancer. But if reality requires irreplaceability, that would mean that “virtual reality” is impossible, since the virtual is by definition always replaceable. It also means the robots of Westworld can’t have a “real” life until they are capable of dying permanently, and if the scientists of Westworld ever found a way to achieve immortality by transferring human consciousness and memories to a host’s artificial body, then that artificial life would be fundamentally empty. Here the show is close to the Christian view that immortality without right relationship to God would be the sort of eternal damnation we call hell.

Know Thyself. The Man in Black, echoing similar statements by Ford, says the game can “show you who you really are” by bringing out your deepest desires. I think playing this sort of game also shapes your desires, manipulating you through narratives that give you goals and stories to tell yourself and drawing out the violence lurking beneath. Ford thought the stories in the game could ennoble us and bring out our best selves, but in fact they bring out our worst. Guests are free to do whatever they want, but freedom without Christ is slavery to the self. Westworld thus becomes “a prison of your own sin,” as one character puts it. And the robot revolution is the show’s version of divine judgment. After taking control of the park, Dolores tells her human captives, “You thought you could do what you wanted to us because there was no one here to judge you. But now there is no one here to judge what we will do to you.”

HBO has committed to at least three seasons of Westworld, but only half of Season 2 has aired as of this writing. It is too soon to know what form judgment will take in the series and what lessons the show’s writers ultimately will teach us. For now, they seem to be holding up a mirror to humanity’s depravity and calling into question the utopian dream that technology will enable society to achieve perfection apart from God. Westworld offers us a dark and incomplete vision of human nature, but it is a true vision of life without the light of Christ.

John McAteer is associate professor at Ashford University, where he serves as the chair of the liberal arts program. Before receiving his PhD in philosophy from the University of California at Riverside, he earned a BA in film from Biola University and an MA in philosophy of religion and ethics from Talbot School of Theology.

NOTES

1. Editor’s note: For discussion about this assumption, see Charles Edward White, “Who’s Afraid of HAL? Why Computers Will Not Become Conscious and Take over the World,” Christian Research Journal 39, 6 (2016), online at http://www.equip.org/PDF/JAF1396.pdf; and James Hoskins, “Digital Souls: What Should Christians Believe about Artificial Intelligence?” Christian Research Journal 39, 2 (2016), online at http://www.equip.org/PDF/JAF4392.pdf.