As part of our reading of Westworld, we are considering how humanity is redefined in the world of this text. We are exploring what it means to be human in a world where people have their non-biological, “fake,” non-living counterparts (“hosts”).
What does authenticity mean in a world where everything, including emotions, memories, reveries, and beings, can be simulated or created by people? What defines a “human” or “humanity” in the world of Westworld? What distinguishes the real/genuine/authentic from the fake/simulated/ersatz? What is missing/lost/sacrificed (if anything) in these replicas? Is anything gained?
- Who/what serves who/what? Who are the masters and who are the slaves? Who are the superiors and who are the inferiors?
- What are the relationships (collegial, friendly, sexual, romantic, etc.) between different types of beings?
- What is a real “emotion” if it can be simulated, or a real memory if it can be implanted?
- What about the setting, the utopian park of the old Wild West, where the rich come to live out their fantasies at the expense of others?
- What kinds of competing sets of values are at play?
- What are the central conflicts of the first episode?
I am also particularly interested in us tracing how, through their interaction with the “hosts,” people (the “newcomers,” or the people who work on creating the hosts or Westworld itself) move from merely embodying values/norms of their society that they have already internalized, to developing an individual (perhaps rebellious?), free-thinking understanding of the world, their places in it, and the hierarchy of beings (living and otherwise).
Think about these questions in relation to other texts we have read or ideas discussed this semester, as well as real-life advances in technology (such as those presented in the article “Japanese professor creates uncanny, human-like robots” and the exhibit website Android: What is Human?).
[Logistics]
Make one comment (just hit “reply,” either to my original post or to another comment on it) by Tu 10/6. Then go back/read through all comments and extend the conversation by making at least two more comments (of course, more are always welcome!) in response by W 10/7.
Your comment (reply) can be just a few sentences: provide the quote/citation and a quick explanation of how/why it functions in the context of some larger issue/question (or you can raise questions, complicate issues, extend discussions, analyze a character or setting, etc., and/or discuss central conflicts/values/themes through the use of your evidence/analysis). Feel free to post multiple comments, and also to respond to others. If you’ve already discussed some of these instances in your previous blogs or in class, you should feel free to draw on that material. The goal is to have some good virtual discussions here to help you think critically about important themes/questions raised by this complex text, and to find/analyze/synthesize various pieces of evidence in support of a claim.
The goal in all cases is to provide specific examples from the text (scenes/quotes/citations from the episode) with discussion/analysis and some connection to a larger claim/argument. You must cite correctly in MLA format (in-text citation).
I believe that a “human” is defined as a being that is capable of feeling emotions, sensing physical touch, and thinking on its own. In Westworld, it is made apparent how to tell the difference between the two, in that the hosts cannot feel things on their skin. However, the very last scene (1:04:00) shows Dolores, who is supposed to be an android, express a human quality when she kills a fly on her neck. Furthermore, the emotions that a human would normally feel cannot be easily simulated, because every individual is different and reacts differently. There’s no way to program an android to properly react to this.
“There’s no way to program an android to properly react to this.”
This is a particularly true statement, as androids are normally programmed to just do what they are programmed to do. They aren’t built to feel anything or react to anything the way humans can. They can’t feel; only humans can. This statement is something to always think about whenever a new AI comes out.
I believe that being alive means more than that. There are humans who can’t feel anything physically because of conditions, and humans who lack certain emotions because of surgeries. There are a lot of animals that don’t have feeling either, but they are all considered to be alive. The last scene only shows that these robots are alive by your standards, because Dolores can feel. She was only being restricted from reacting before.
“There’s no way to program an android to properly react to this.”
Well, not really. If you’ve ever heard of a game called Detroit: Become Human, there are androids who are programmed to follow commands until a glitch causes them to become self-aware. Afterward, they start riots, are able to display emotion, and even kill people. Even though they weren’t programmed to do those things, they had to learn and adapt to them from watching humans. So when Dolores kills a fly on her neck, she might not feel it per se, but she can watch humans and react accordingly.
I completely forgot about that game, so in fact you are right: it is very possible for androids to become aware. It has also happened in Terminator, which I did not take into account.
What it means to be human is explored in the first episode of Westworld partly through juxtaposition. Human guests, upon arriving at the park, act simply. They seek entertainment to cater to their base values, e.g., sex and violence. The hosts, however, display complex, subtle behavior one may expect of a real human, such as introspection and inquisitiveness. Dolores, for example, says at the end of the episode, “I know things will work out the way they’re meant to.” This has a subtle double meaning, since it can be taken literally as an android reading from its script that the story will work out as written. But taken on a different, more human level, it suggests an introspective acknowledgement of forces impelling her to act.
“Human guests, upon arriving at the park, act simply. They seek entertainment to cater to their base values, e.g., sex and violence. The hosts, however, display complex, subtle behavior one may expect of a real human, such as introspection and inquisitiveness.”
Interestingly, from this statement, could the human guests even be considered human? In my opinion, the human guests could be considered androids that are slaves to their carnal emotions, while the hosts are more human, with complex behavior.
“Human guests, upon arriving at the park, act simply. They seek entertainment to cater to their base values, e.g., sex and violence. The hosts, however, display complex, subtle behavior one may expect of a real human, such as introspection and inquisitiveness.”
The thing about that is the humans are aware of what this place is and its purpose, as opposed to the androids, who are not aware of what they are. So imagine if they did know: wouldn’t at least a few of the androids revolt, learn from the humans, and eventually be in the exact same place the humans are?
I think what makes someone a human is the ability to figure out what is right or wrong and have certain attributes: morality, reason, consciousness. We know what is right or wrong because we are able to think differently and tackle complex questions. Humans have many feelings, and some of these emotions are hard to describe. I don’t like the idea described in “Westworld” because they don’t care about what is right or wrong, and the whole purpose is to satisfy human desires. If humans create a park or some different world where people are able to do whatever they want, like kill and rape, then what is the difference between humans and animals? In “Westworld” there are many conflicts, but I want to focus on free vs. controlled. The humans, or guests, are definitely the free ones because they get to decide what they want to do, unlike the robots or AI, who are basically programmed to have a job and a set path for their life. Whenever they don’t follow the path created for them, it is considered a bug and they will be fixed or reset. Dolores, the main character, who I think is a robot, states, “Some people choose to see the ugliness in this world... I choose to see the beauty” (2:56). I don’t think she chose to see the beauty; it is more like she is brainwashed to think that way. She is, in a way, controlled to think that all the pain she went through is only a dream.
“I think what makes someone a human is the ability to figure out what is right or wrong and have certain attributes: morality, reason, consciousness.”
Do you think that if a machine were able to provide a convincing argument that it could not only determine what is right or wrong but could also exhibit morality, reason, and consciousness, it would deserve the same rights as a human being? Rights like self-determination?
It’s kinda weird to think about, but I’m leaning toward believing it would, even if we were unable to prove whether or not the machine was faking it. That said, I think we’re woefully underprepared to have a conversation about the rights of potentially sentient machines, especially when parts of our society actively dehumanize actual human beings.
To answer your question, I think machines definitely deserve the same rights as a human being if they meet all of that (determining right or wrong, morality, reason, and consciousness). However, if machines also look the same as humans, it would be really hard for me to differentiate between a robot/machine and a human.
“I don’t think she chose to see the beauty; it is more like she is brainwashed to think that way. She is, in a way, controlled to think that all the pain she went through is only a dream.”
I agree with this point. I feel like Dolores is controlled to act like this, and this is not something she has control over. It’s interesting how in the first episode Dolores doesn’t question her dreams or think about why she’s having them. She also keeps up a persona that everything around her is bright and beautiful, acting like she hasn’t felt pain in her life even though she’s having dreams of it.
“I agree with this point. I feel like Dolores is controlled to act like this, and this is not something she has control over. It’s interesting how in the first episode Dolores doesn’t question her dreams or think about why she’s having them.” I agree, it is interesting to see that too. I think it is because she didn’t experience any events that were really painful or had a huge effect in the beginning.
In my opinion, Westworld shows us a strange flip when it comes to the question of humanity as a whole. In life, humans are creatures with the potential to be good but an innate sense of evil and ill intent that presents itself in some aspects of life. Here we see this innate evil within humans manifest in the form of both the guests who partake in Westworld and the people running the experience. This inherent evil makes the guests and the people in management feel less like humans and more like humanistic monsters, or machines of ill will given flesh and bone. They don’t change, they don’t grow, and in most cases they end up doubling back and reverting to an animal-like state of reasoning where they do what they want solely because they can.
On the contrary, the hosts themselves show the biggest signs of humanity despite being fabrications. Even though most of them are scripted to act a certain way and perform certain tasks, they slowly begin to deviate thanks to the adaptability they’ve been given and the coding they undergo. One of them is Dolores, who, even though she maintains her code and lives the life she’s meant to live, still shifts and changes in a way unseen before. Dolores is known for being unable to harm anyone, not a human or even a fly, and throughout the episode she’s shown to be a kind and gentle person. Yet at the end, she contradicts her code and kills a fly on her neck as if it were second nature. Unlike most of the humans in the show, she’s shown to change and shift in personality, even for one second, and in essence to show that she has the capability to grow and adapt like a human could.
I feel like humans are defined as individuals who show emotions, make their own decisions, and possess morality and certain physical attributes. In Westworld, the hosts are robots, but they seem to show many human characteristics, which makes it hard for someone to distinguish whether they are a human or a robot. The first 15 minutes are mainly about the relationship between Dolores and Teddy. Even though they aren’t humans, I sometimes forget that they are robots as they show several human characteristics. We see in this episode that the sheriff breaks down when a bug touches him, even though the other androids don’t react to it. This scene tells me that the robots in this world are capable of malfunctioning.
“Even though they aren’t humans, I sometimes forget that they are robots as they show several human characteristics.”
I wonder if it will get to a point where it doesn’t matter. We interact with many people every day (or at least did before COVID). We don’t check their internals to make sure they’re human. We have our interactions, we give and get what we need, and we move on. If one or more of those interactions were with an artificial being that offered the precise kind and quality of interaction as a human, I’m tempted to believe it wouldn’t be problematic. The issue lies in imperfect, ersatz representations. Whenever people see someone or something as an “other,” conflict follows.
“Whenever people see someone or something as an ‘other,’ conflict follows.”
I agree with this statement by Max. This is a theme in many science fiction and fantasy novels. I think my favorite example of this is a side story in “The Day of the Doctor” from Doctor Who. There were monsters called Zygons that could transform into humans. This culminated in a cold-war scenario where humans and their doppelgangers were poised to kill each other. The way they solved it was with a temporary mind wipe that caused everyone to forget who was alien and who was human. This would be an example of removing the “other” classification to resolve the conflict.
“We don’t check their internals to make sure they’re human. We have our interactions, we give and get what we need, and we move on.”
I really agree with this statement because I see it every day: people go about their days not thinking about the people they pass by. Many people are oblivious to little things that make us human, like sarcasm and jokes. If people can go by without noticing a lot about other humans, it is probable that many androids or artificial humans could live among us and we would never know.
I agree; during most of the time when I was watching the episode, I forgot they were androids until they started glitching out or when it cut back to reality. Even though the androids look human-like, Ford adds more details in the update to make the androids act more human-like.
I feel that being human is usually tied to emotions, as Itmam said, but there are also other qualities that make someone human. The quality of strength and weakness is usually tied to being human, and humans generally vary in this measure. In Westworld, it is very hard to tell a host from a guest, but as you get to know and understand the hosts, indicators emerge of why they are hosts. To humans, machines are perceived as emotionless, and this is very true across many works of science fiction. Humans distinguish themselves from machines because they think emotion is what makes us superior to technology. Throughout the entire episode, it seems very clear that there is going to be a conflict between humans and machines.
I agree that the notion of strength and weakness is tied to humans; however, I would also say that emotions are much harder to copy, since human emotion cannot be programmed to constantly change, whereas strength is more plausible and could easily be faked.
I don’t really know what it means to be human and how it differs from the machines. To me, I feel like the “hosts” could be considered humans too. Consider this: if a person with amnesia were conditioned/taught not to harm a select set of people, would they not be the same as the “hosts”? It could even be said that people with dementia come close to being like these “hosts.” It seems possible to condition humans to act in the same way as these “hosts,” so would these humans be considered to have lost their “humanity”? So to me, I think the “hosts” and humans are more or less the same to begin with.
“I don’t really know what it means to be human and how it differs from the machines.”
In regard to your other statement about “hosts,” that is a bit of an example of brainwashing, which isn’t really how machines work, but you can see some similarities to it. With enough time you can get people out of being brainwashed, but machines you can’t really change unless you change the code within them. They can be similar, but they work differently than people think.
“To me, I feel like the “hosts” could be considered humans too.”
While I can agree with this, I personally think that there are some things that the “hosts” may not be able to fully understand (we do not yet know their capabilities). I think they should be classified as their own branch of human, since there are most likely things they experience that we cannot, which would create misunderstanding between us.
During a conversation in Westworld (00:18:58), Bernard says, “The hosts can’t hurt you by design.” The security guard responds, “You don’t have kids at home, do you, Bernard? No. If you did, you’d know that they all rebel eventually.” Then (1:07:28) Dolores slaps the fly that lands on her neck. This shows a change in the android. I think this is the start of the androids rebelling against their makers.
Having androids or AI rebel against their own makers is something that a lot of people fear will happen one day; hence, making them is a very slow process. Even though the chance of them going rogue may be very low, it can still happen.
I found it interesting how Bernard thinks the hosts can’t hurt anyone even though the hosts are androids and have the potential to malfunction. We did see the sheriff malfunction, and this was something unexpected even to Bernard himself.
Having a robot compared to a kid is all the evidence you need to know that these robots express themselves. If their expression can be described as a kid being rebellious, it shows that these robots don’t want to be in the space that they’re in. It shows they have their own emotions and a will to better themselves, just as a human child would.
I feel like there are so many instances in science fiction where artificial intelligence or androids become either self-aware or rebellious. It is a common theme because we humans don’t fully understand the capabilities of technology and the advancements it can make on its own. I think that technology is bound to evolve and change, and humans will either evolve with it or destroy it.
This discussion pretty much enters into one I had in high school when I was taking Theory of Knowledge. In that course we learned that there are in fact two versions of “real.” The first version is “real” with a lowercase r, meaning that for a given person certain information or things are considered real. The more they hammer it into their brains, the more “real” it becomes (exactly what the school system does to everyone). In the case of the show Westworld, the androids’ reality is the made-up world; they are programmed to believe it is “real.” As for the people, they believe they are the masters, and their reality is then reinforced when they see how they can delete the androids’ memories. The humans’ reality is that they are the masters, people with true emotions and ideas, and yet there are signs of emotions and ideas in the father-figure robot. The second version of “real” is with an uppercase R: the general or socially enforced understanding of what people as a whole consider to be “Real.” This impacts the show by making the viewer see the whole picture from both ends, the masters and the “slaves.” This combination creates a general understanding that there might be a possibility of emotions in androids; however, there is also a general understanding that true emotions come from something else that is very hard to replicate: a soul.
The morality of bringing things to life has always been a topic of concern among humans. People clash over the idea that humans are allowed to play God and create life by means other than intercourse, and over whether the things created through other means are allowed rights or become property because they aren’t “real” people. This raises the question of what being human is. I believe being human is being anything that can experience, think, process, respond, and feel. That includes these robots in Westworld, which are just advanced computer programs. There comes a point where their coding becomes just as advanced as the coding that flows in our DNA.
I think what it means to be human is to be able to think and feel things freely while also being able to react to them in a number of different ways. We all have our own definitions of love, hate, and morality, and not everyone will respond to certain stimuli the same way. Humans fight a lot with others, and they fight a lot with themselves; some of them have the capacity for change and learn from it, while others refuse. I’m currently learning about programming, neural networking, and machine learning, and to make a long story short, no matter how advanced these androids are, it’s the free will to think, feel, and react in different ways, and to grow from those reactions, that makes us human.
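Since I’m learning this stuff anyway, here’s a tiny, purely illustrative Python sketch of the contrast I mean (nothing from the show; the stimulus strings and response lists are made up): a scripted host maps each stimulus to exactly one response, while a human’s reaction to the same stimulus can vary. `random.choice` is obviously a crude stand-in for free will, but the point is that a fixed script has no room for variation at all.

```python
import random

# A scripted "host": every stimulus maps to exactly one response.
# (Hypothetical stimuli/responses, for illustration only.)
HOST_SCRIPT = {
    "fly lands on neck": "ignore it",      # per her loop, she cannot harm a living thing
    "stranger says hello": "smile and greet",
}

def host_react(stimulus: str) -> str:
    """Same input, same output, every single loop."""
    return HOST_SCRIPT.get(stimulus, "ignore it")

def human_react(stimulus: str) -> str:
    """Same input, any of several outputs; a crude stand-in for free will."""
    options = ["ignore it", "swat it", "flinch", "laugh it off"]
    return random.choice(options)

if __name__ == "__main__":
    for loop in range(3):
        print(f"loop {loop}: host -> {host_react('fly lands on neck')}"
              f" | human -> {human_react('fly lands on neck')}")
```

By this toy logic, Dolores swatting the fly is exactly the kind of output her script shouldn’t allow, which is why that last scene feels like the start of something new.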