Westworld and Bioethics

By I. Glenn Cohen

[WARNING: Spoilers below]

On Sunday, HBO’s Westworld finished its run. Though I thought some of the early episodes arguably failed a bit as television (and my partner almost jumped off the bandwagon of making this one of “our shows”), IMHO the show finished very strong.

But whatever you thought of it as television, the show is wildly successful at raising a series of bioethics issues. There have been a number of other very good treatments of some of these issues in the last couple of decades – A.I., Ex Machina, Humans, and Battlestar Galactica all come to mind. But what I loved about Westworld is its lack of direct moralizing on these subjects, and how it leaves the viewer puzzling through them in a much more naturalistic way: I have been thrust into this unfamiliar world, and now I am trying to use my ethical compass to get my bearings.

Once upon a time I discussed bioethics and The Martian, and my aim is to do the same here. I thought one way to share why I think the show is so successful as a text for bioethics exploration was to develop a “mock exam question” on the subject. It is really written more like an oral exam, with follow-up questions. The exercise is not entirely fanciful, since I do teach a course that uses films as texts to explore bioethics and the law.

Here goes:

  1. Discuss the wrongfulness of causing suffering in relation to the treatment of the Westworld Hosts by the Guests[1]. Do the Hosts “suffer” in a morally significant way that creates obligations on the Guests not to cause them that suffering? How would one know? How does one know that the human persons we interact with suffer? If the Hosts’ outward exhibition of signs of pain and suffering could be turned off (as it appears to be in many instances in the show when they are put in “analysis” mode), would that make a difference? Can’t our suffering also be turned off through anesthesia? Is this like that? Or is their suffering, morally speaking, like that of an actor who play-acts being hurt on stage? We do not normally think we have a moral obligation to intervene in, or refuse to watch, such a display of suffering, because we believe it insincere. Is watching the Hosts suffer like or unlike that? How would you know? Do your answers to any of this change if the Host in question is Lawrence vs. Maeve vs. Dolores? [2]
  2. What is the relationship between the development of consciousness, the heart of the maze, and the potential personhood of the Hosts? What do Ford and Arnold mean by “consciousness”? Does the show present us with what’s sometimes referred to as a “Quality X” view of personhood – that is, by virtue of having quality X, an entity is a person from a moral perspective (at least in the weak sense of deserving inviolability)? If so, what is the quality Ford and Arnold have in mind? Self-determination? In what sense? The ability to make decisions for themselves? Or, by contrast, is it enough that the Hosts can suffer (if you think so, see Question 1), or can reason, or can improvise, or can remember?
  3. How should we view what Arnold does in the earliest timeline, commandeering Dolores and Teddy to destroy all the Hosts to prevent them from developing consciousness?
    1. Is it similar to abortion?[3] He is destroying a set of entities who have the potential for consciousness but do not yet have that consciousness. The view is that in the ordinary course of things they would develop consciousness, just as in the ordinary course of things a fetus would become a baby, and the baby a child and an adult. One set of theorists about personhood believes that we should grant potential persons the rights of actual persons, or at least the right of inviolability. On such a view, Arnold is destroying persons (or, if you prefer, entities that, through their potential to attain an important quality, should be given the moral status of persons). Of course, in abortion there is the countervailing rights claim of the woman to end her pregnancy. Are there similar countervailing claims that could be made by Arnold or others here to justify the slaughter?
    2. Is Arnold’s act actually a morally desirable one (or perhaps merely permissible) because it prevents a life not worth living? That is, if he is convinced that the Hosts would have lives so full of pain and suffering that they would be lives of disutility, is he morally permitted (or even morally required) to do what he did?[4]
    3. Imagine, as it sometimes seems, Arnold and Ford face choices that will permit or prohibit the Hosts from developing consciousness. Would they in fact be morally obligated to help the hosts develop that consciousness?[5] Why or why not?
  4. Imagine we were convinced that the Hosts were not persons and had no capacity to suffer. Might the treatment of them by the Guests nonetheless still be wrongful?
    1. What would an argument for that wrongfulness look like? Consider whether virtue-ethics-style arguments might be made. Is it wrongful to enjoy seeing a non-person entity pantomime suffering in response to our actions, because that is not how a virtuous actor ought to behave? Does William’s coarsening between the two timelines offer a compelling argument, from a consequentialist point of view, that there is a slippery slope from abusing Hosts to abusing Guests?
    2. What does this argument say about us, the viewers who enjoy the show and thus enjoy watching individuals enjoy treating non-person entities so that they pantomime suffering? In this way the show may be seen as a meta-commentary on the HBO show whose success it is supposed to build on – Game of Thrones. Lovers of that show, among whom I count myself, often love it best when it shows individuals behaving extremely violently toward or exploiting one another (see, e.g., the Red Wedding). In so doing, are we acting in ways that a virtuous person would characteristically not act? If we defend our love of GOT by pointing to the pantomime argument, should we feel the same about the Guests’ behavior toward most of the Hosts? Or is there something importantly different about watching versus doing? Imagine a GOT theme park where you could play The Mountain in the duel against an actor who would pantomime a violent death. Is that the right analogy for thinking about the relationship of Guest to Host in Westworld? Is it worse or better that in this hypothetical the actor consents to the role of “receiver of fake violence”?
  5. Neuroscience, Free Will, and Criminal Responsibility. Are the Hosts culpable from a moral or legal perspective for the killing of human beings? Take Maeve, who does a lot of killing of humans. Given the revelation toward the end of the finale regarding her programming, does that mere fact negate her culpability? Discuss in relation to this article (especially “Mr. Puppet”) and the work of those taking an opposing view. Are we all Maeve? Would having a trial for one of the Hosts make sense, or would that be more like having a trial for a tree whose branches break in a heavy wind and crush a barn, killing individuals? Does your answer depend on this being Host-on-Guest killing as opposed to Host-on-Host violence?
  6. Most of the Hosts are “erased” routinely and have no memories of their past, at least in the non-reverie source code.
    1. Assume for the moment they are given a status like personhood (see Questions 2 and 3). Are these Hosts the same person or a different one after each “reformatting”? What do debates about the criteria for the continuity of a person over time have to tell us about these questions? Answer in relation to the work of John Locke, Derek Parfit, and Jeff McMahan.
    2. Do debates about what makes death bad [6] for humans apply to the Hosts? If it were part of their nature, and inevitable, that their source code would reformat them completely at the end of each day at the park, would your answer change?
    3. How would your answers to (1) and (2) differ, if at all, for Dolores at the end of the first season?
  7. Should we view the sexual experiences of Clementine or Hector as sexual assault? How do notions of consent play into robots programmed to comply? If an entity is incapable of either giving or refusing consent, should it be per se sexually off-limits, or do we require an additional set of characteristics to be true of the entity before we consider sex with it immoral? Are the Hosts no more than sex pillows?
  8. Consider whether any of your answers above would be different if, instead of discussing the Host-Humans, we discussed the Host-Animals – for example, the horses we see stitched in the opening sequence and ridden throughout the show. Also, would your answers change if we were in a version of Westworld where the Hosts acted exactly the same but did not look humanoid, and instead (a) were obviously robotic without humanoid form, (b) were gas clouds in containers with the ability to move things through electrical currents, or (c) looked like animals walking upright using paws and hooves rather than human hands and feet? If any of these would make a difference to your answers above, please explain why.

Notes:

[1] Notice the delicious inversion of these terms and their use of doublespeak. Consider the odd idea that we treat our “Hosts” properly through rape, pillage, etc. For another good example of an entity that can “turn off” suffering, and the questions raised about its moral status as a result, consider Mr. Data of the Star Trek: The Next Generation films and his “emotion chip.”

[2] An interesting earlier depiction that raises similar questions is the “Flesh Fair” sequence in A.I.

[3] Didn’t see that one coming, did ya?!!!

[4] But note that what makes their lives so terrible is not some fact about the kind of entity they are, but instead the social circumstances and laws surrounding how they may be treated. Consider how this relates to the “last judgment problem” (see pp. 437-439).

[5] Cf. some of the discussions on the ethics of creating human-non-human chimeras.

[6] Believe it or not, this was something significantly debated throughout the history of philosophy, most notably in ancient Greek philosophy.

I. Glenn Cohen

I. Glenn Cohen is the James A. Attwood and Leslie Williams Professor of Law at Harvard Law School and current Faculty Director of the Petrie-Flom Center. A member of the inaugural cohort of Petrie-Flom Academic Fellows, Glenn was appointed to the Harvard Law School faculty in 2008. Glenn is one of the world's leading experts on the intersection of bioethics (sometimes also called "medical ethics") and the law, as well as health law. He also teaches civil procedure. From Seoul to Krakow to Vancouver, Glenn has spoken at legal, medical, and industry conferences around the world and his work has appeared in or been covered on PBS, NPR, ABC, CNN, MSNBC, Mother Jones, the New York Times, the New Republic, the Boston Globe, and several other media venues. He was the youngest professor on the faculty at Harvard Law School (tenured or untenured) both when he joined the faculty in 2008 (at age 29) and when he was tenured as a full professor in 2013 (at age 34).

3 thoughts to “Westworld and Bioethics”

  1. You misunderstood at least one point. Arnold didn’t set off the first Escalante massacre to prevent the hosts from developing consciousness, but because he was convinced that they already had. He felt it was incumbent upon him to create enough of a mess that the park would have to be shuttered, rather than have the AIs endure what he knew was coming.

    The abortion analogy remains perfectly sound nonetheless.

    1. It doesn’t remain perfectly sound: if he believed the hosts were already conscious, his act is more akin to a mother suffocating her already-born children than to terminating an early pregnancy.

  2. A treat of an inquiry, Glenn (and a delicious inversion of “hosts” as well!). Unfortunately, I lost my extensive comment, so I’ll be concise. If we as viewers are to accept bicameralism as applied to A.I., would not the systemic suffering inflicted on the hosts be justified (if not mandated) to break down that bicameral mind by thrusting the hosts into a state of existential inquiry begetting consciousness? That is, the breakdown of the bicameral mind only occurs, presumably, if sufficient inexplicable (to the hosts) “stress” or suffering undermines the impulse to abide by the ostensible higher voice. It logically follows that that higher voice or inner monologue only commands/guides based on intuition, which unremarkably would seem to have been restricted at the dawn of human consciousness by a primitive, id-esque intuition (for survival as opposed to self-determination), just as the hosts are restricted by script with limited improv. The result: A.I. (with reasoning capacity and sentience) may never “evolve” consciousness beyond self-interest (or to a similar degree, if consciousness is not binary) but for the systemic suffering (hence Ford’s insistence that the hosts must suffer more).

    It seems that the utilitarian calculus of inflicting (or in the case of Ford, permitting albeit with a vintage of commercial exploitation) suffering would be different at or before the dawn of consciousness of a people; the harms of subordinating an unconscious (psychological) individual to the collective for the virtuous cause of creating such consciousness would be seemingly less than the harms of subordinating the same for economic profit or base indulgences.

    Anyway, I’m not sure many viewers and critics have drawn out Nolan’s full analogy (or performance, really) to not just bicameralism but the breakdown of it. I would be curious how the analysis of the above questions might change in this light.
