‘Westworld’ Has No Moral Compass. That’s What Makes It Hard To Watch.

The first time through, I missed the post-credits scene of the Season 2 Westworld finale. Turns out, there was a similar tag-on at the conclusion of Season 1, but I didn’t stick around for that one, either. When “The Passenger” reran, I dutifully tuned in for the final minutes to see The Man in Black — a host? A human? — being tested by his daughter Emily — a host? A human? — for “fidelity.” In an interview with The Hollywood Reporter, co-creator Lisa Joy was kind enough to save us all some trouble and just flat-out explain what was going on.

“In the far, far future, the world is dramatically different. Quite destroyed, as it were. A figure in the image of his daughter — his daughter is of course now long dead — has come back to talk to him. He realizes that he’s been living this loop again and again and again. The primal loop that we’ve seen this season, they’ve been repeating, testing every time for what they call ‘fidelity,’ or perhaps a deviation. You get the sense that the testing will continue. It’s teasing for us another temporal realm that one day we’re working toward, and one day will see a little bit more of, and how they get to that place, and what they’re testing for.”

Reading her description, I thought, okay — so, they have taken a human, or a host, or a hybrid of the two, and put him through the psychological torment of repeating the same loop over and over again, with no memory of each cycle. Is this moral?

That question should be at the heart of Westworld. The question of morality should make the show a compelling, thoughtful watch. But because of the confused storytelling and the consistently blurred lines between reality/fantasy, robot/human, past/present, memory/experience, the show doesn’t have room to present clear questions about what is right or wrong.

There’s no starting point in Westworld — to use the show’s own verbiage, there’s no baseline of accepted modes of behavior and action that all other events are set against. When we enter the show, everything has already been twisted from the world we know, but we’re not really sure if this is a misguided utopia or a very dark vision of the future.

You could say Westworld is very good at introducing ideas that raise moral questions, but it makes little attempt to resolve them. In Season 1, I thought the central question might have been the robot uprising, after an entire species was essentially created to be abused by people. But as the show went on, it was revealed that there are so many problematic things going on that no one is in the right. If the robot uprising was actually pre-programmed by a human, is it justified?

Ultimately, what Westworld lacks is the identification of morally problematic choices and consequences for those choices. This is where, in my mind, the show fails — because instead of giving viewers clean character arcs, it seems the powers that be behind Westworld have decided to just throw in neat ideas wherever they could. It’s like the producers are sitting in a room, trying to come up with cool concepts without really thinking about whether or not they fit with the show.

There’s an easy comparison here: Buffy the Vampire Slayer. As any Buffy fan knows, there were distinct rules in the universe, and consequences for violating those rules. That’s not to say the rules were never violated — they were — but the characters always had to deal with the fallout. Think Willow and her misuse of magic. The line that Buffy could never cross: killing humans, even if they were bad guys. The Buffy-verse has a set of moral rules that Westworld just does not.

Here’s my short list of potentially problematic aspects of Westworld. Ask yourself how many of these contradict one another. A clever moral web, you could say, or a lack of moral consistency that prevents you from fully investing in this piece of fiction, because you just don’t know who to root for.

  • Humans create a species of humanoid robots for guests to play with in a park. The robots can’t fight back. Visitors to the park are told they will experience no consequences for whatever they do while in the park on “vacation.”
  • Robots are endowed with the ability to grow, love and form attachments. Although their memories are wiped between “builds,” they retain familiarity and bonds with others, intangible connections they don’t quite understand. They recall, with some prompting, their past abuse at the hands of the guests.
  • The park is copying the cognition of the guests while they are in the park, without their knowledge. The purpose is to turn humans into hosts, something I still don’t quite get, although I predicted it. Is the intent to sell immortality? Otherwise, why would this happen at all? How would hosts built from human consciousness be superior to the ones the park already has?
  • The hosts eventually commit violence against humans, guests and park administrators alike. They do not discriminate between those who have actually caused them harm, and those who happen to be in the way.
  • The hosts can be particularly vengeful; see Dolores. However, she was designed to be vengeful. Indeed, the entire robot uprising seems to have been orchestrated by a human being, Ford — so do any of the hosts really have a moral justification for taking the actions they do?
  • The Man in Black mows down a group of people, to the shock of his daughter, before he kills her, too. At this point, he’s gone insane, convinced the world has been created for him and none of it is real. He has immediate regret when he realizes the figures were not hosts, but people — but what is the true moral difference here? How is it less reprehensible to end the life of a host designed to act, think and look like a human, than to actually end the life of a person?
  • How should The Man in Black be held accountable for that action? There’s no clear answer because Westworld is a world without consequences — which makes it ultimately unsatisfying.

By the way, we won’t find out anytime soon what happens with host/hybrid William and his testing with Emily. Later on in the Hollywood Reporter interview, Joy says that she doesn’t envision that as part of Season 3; they are just working toward that much-later storyline. (Which is another reason why Westworld is a show that is best binge-watched. Put in a weekend and move on, enjoy the candy without thinking too hard about what’s going on.)


‘Westworld,’ Tell Me Who The Heroes Are

As much as “Vanishing Point” (S 2 E 9) may have finally answered some questions, it opened up others, and it left me with the profound sense that I have no one to root for. Bernard, I suppose. But I don’t know who the hero is in this story, and there aren’t even any dark knights whose victories you can secretly enjoy.

But let’s not forget, Charlotte Hale put code in the host network to kill them all. We weren’t told how that would play out, but it could easily have been behind Bernard deleting his Ford code, taking control of his fellow AIs and having Teddy end his own life. I’m not sure, and I’m still trying to find a reason to care.

Westworld has a very dark view of humanity. That darkness is not just in the people, but in the humanoid robots they create out of human consciousness. I don’t know what the message is here, except that humans are very bad, and there isn’t much to redeem them.

Unless that’s only the truth of humanity inside the park — and maybe not outside, where life could be idyllic, or at least more complex in its range of human emotions and experiences. That’s why it was compelling, for about 30 seconds, in “Vanishing Point,” when we saw William and Emily’s home life in flashback.

Turns out the story outside the park — that I was so anxious to see — isn’t that deep or compelling after all. Basically, William is a bad guy. Made everyone miserable. His wife couldn’t handle it. He felt guilty. Came into the park, where he’d become obsessed with the characters and storylines.

Which brings us to the question of whether William is, or is not, a host. I’m not sure it matters, because by this stage the lines between the humans and the AI have become blurred. Maybe that’s the point.

As for Charlotte’s program to eliminate the hosts, let’s think back to Star Trek: TNG (S 5 E 23, “I, Borg”), when they saved an individual Borg and planned to use him to infect the entire collective with an algorithm that would result in their self-destruction. At the end of that plotline, they decided that to use one Borg that way would be unethical. But they went on to reason that the experiences of independence the one Borg had had while detached from the collective would do the same kind of work. When the one Borg, Hugh, went back home, he’d bring a new sense of identity with him. Eventually, the entire Borg would be infected by that identity, and thus understand the concept of “I.”

Let’s hope the minds behind Westworld are planning something as interesting for the finale as that classic bit from TNG.

‘Kiksuya’ Is ‘Westworld’ At Its Easiest And Most Compelling

There hasn’t been much to enjoy about Westworld this season, with its confusing, overloaded episodes that bombard viewers with multiple timelines and several plotlines all at once. Every word is somehow significant, or not — it may solve the entire puzzle, or it may be a miscue, or you could be making things up just to keep it interesting. You can’t really watch Westworld passively, but at the same time, there’s an uncertain payoff if you choose to pay close attention. It may just be a waste of time.

I have the intent, however vain it may be, to re-watch the first seven episodes of Season 2 just to give Westworld a second chance. It’s obviously an expensive production, so someone must really believe in it. Whoever at HBO approved the budget deserves at least a second look before those of us faithful (but cynical) viewers write off the endeavor completely.

That intent grew a little less remote tonight after seeing Westworld’s latest installment, “Kiksuya.” Instead of going deep with analysis — tough to do with Westworld because, as I say, it may be wasted mental energy — I’m just going to list the reasons this episode worked.

You may not agree, but as they say, “That’s Entertainment.”

It had a story.

When I first learned that episode 8 was going to focus on the Ghost Nation, I was like, “noooo!” Not another plotline. Not more character introductions. Not more people to keep straight. No, no, no!

But it worked. Because it wasn’t so much a character introduction as an opening up of a world that had been running in the background of Westworld since the beginning. Who are those people on horses, except aboriginal stereotypes? Perhaps deeper, more enlightened — and more human — than viewers might previously have assumed.

It was easy to follow.

Even a casual viewer, which I was not this evening, could figure out how Akecheta fit into this whole thing and what he was trying to do. There was no mish-mash of seemingly disconnected characters, each with uncertain agendas. It made sense.

Importantly, you could see how the hosts are perhaps much more independent than we’d previously assumed. It had been a decade since Akecheta’s last update, and he was apparently continuing to learn, his memories of past lives becoming deeper, that entire time.

We learned some things.

Ford seemed amazed when he discovered Akecheta was drawn in by the maze. Maybe Ford isn’t controlling things after all. He may be a guiding force, but at least in that moment he seemed pretty much content to let the hosts do whatever they wanted.

It looked pretty.

It did.


‘Westworld’: Just William And Ford Playing A Game

I’m trying, Westworld, to stay with you here. I’m trying to care, and tonight (Episode 7, “Les Écorchés”) I made myself care by reading a bit too much into that conversation between Maeve and Dolores, when one is severely injured, experiencing a deep mourning, and the other is running amok around the park, drawing blood with a kind of indiscriminate vengeance.

Maeve is seeking her own brand of justice, but she’s holding fast to a deep emotional bond with her daughter, to whom she made a promise to return. Dolores has no interest in such niceties, sacrificing the father given to her by the park’s creators. She took his life to extract the encryption key in order to, one might assume, destroy it and free the hosts.

Because, as Dolores tells Maeve, the park’s script doctors gave the hosts kin as a way of tying them down, to keep them as playthings. You need to be dark in order to survive, she says. At this point, Maeve and Dolores are both amped-up, their programming on maximum settings, but they have quite divergent views on what it means to make good on the promise of free will.

Simultaneously, we saw Ford, living on inside the Westworld park as a kind of virtual host controlling his creations, making them commit heinous acts the AI couldn’t quite stomach doing on their own. So the question is quite simple. Is it all a predestined, logical series of programmed events, or do the hosts have independent thought and decision-making capacity?

I don’t know. I’m more interested in why William was able to get up, albeit with difficulty, after he was shot at close range. He could be like Ford: a mind uploaded to the park’s infrastructure, with a certain level of control over the environment. Maybe his body isn’t inside the park at all; maybe he’s long dead. Maybe he’s just playing a game with Ford, and they are in a race to get to the Valley Beyond.

William’s daughter wants to get him out, saying she won’t let him die inside the game. She probably knows everything there is to know, as the heir and the reasoned, outside observer. Why did Charlotte Hale invite William’s daughter to come inside, again? Charlotte knows William is still playing Westworld. Is William’s game with Ford somehow risking the Delos corporation’s investment?

What was that line of Ford’s, about James Delos? He’d rather die than have a bad investment. When did Westworld become a bad investment?

I don’t know. Three episodes to go.

 

Ford Showed Up On ‘Westworld.’ Does Anyone Care Anymore?

Westworld, I’ve been patient. I’ve (largely) held my tongue while you churned out episode after episode of robot violence and stunning landscapes. But tonight was Episode 6, Westworld. You are halfway through the season, and nothing has happened. Nothing.

Early on in tonight’s episode, I had high hopes. We got a momentary switcheroo with Dolores seemingly programming Bernard, and not the other way around. William chatted with his daughter and we found out an earth-shattering piece of information: they don’t get along. She wants him out of the park and won’t let him die there. Charlotte Hale is doing something in a bulletproof vest. Maeve finds her daughter, only to realize she’s in a new loop of an old scenario; her daughter doesn’t remember her and has a different robot to call Mom.

I am probably missing a storyline or two, because there are so many, and I don’t know when they are, how they fit together, or even why I should pay attention to any of them. I have a vague memory of the Season 1 Westworld finale providing a sort of recap that tied up all the loose ends. In essence, you really didn’t have to binge-watch all ten episodes to figure out what happened. But in Season 1, it was far more enjoyable to do so.

Didn’t we expect more of Shogun World? I’m at a loss as to how it drives Westworld forward, other than to give Maeve another outlet to explore her grief over her missing child. Many of us, I would guess, assumed Shogun World would reveal more about the park, its guests, and the reason behind the whole enterprise. It’s turned out to be just another ride at Disneyland, with more violence in different costumes.

Side note: Jimmi Simpson, who plays young William, debunked Westworld fan theories in a video last week for Elle. Watch it, if only for the comedic value and Simpson’s faux-offended reaction near the end to a fan’s assertion that the show is “convoluted.”

Really, can you think of a better word for Westworld than convoluted?

So, we don’t know exactly when Bernard ended up in the park’s hive mind, because he’s with Elsie but not with Elsie — we don’t know how much is memory, how much is happening in real time, and when real time actually is. I have not bothered to try to figure it out. But apparently getting a robot lobotomy lets the hosts communicate directly with Ford, or a projection of Ford, or whatever it was that Bernard saw in his little dreamscape.

When will I care, Westworld? When will this show have a story? When will I feel like I’m getting my money’s worth out of my HBO subscription? Tell me, please. I’m losing patience, and it’s unlikely I’ll be buying in for Season 3.

‘Westworld’s’ Most Interesting Storylines Probably Exist Outside the Park

I first heard of Westworld back in 2015, when news broke that background performers (extras) were asked to sign a startling consent form that described potential acts they would have to perform on set. It got the attention of SAG-AFTRA, which issued a member alert telling actors about their rights to withdraw from the production. That strange story, combined with the high-profile cast, made me curious to watch.

I didn’t end up seeing Westworld until I binge-watched Season 1 as part of an HBO free preview a few weeks ago. The powers that be will be glad to know that it worked: I signed up for HBO just to see how Season 2 might play out. Now that we are four episodes in, it’s clear that Westworld is something that’s best binge-watched, lest you have so much time between episodes to come up with intricate fan theories that you are disappointed at the end when none of them turn out to be true.

(Note — please don’t tell me Westworld resembles Lost. I’ve never seen Lost, although I’m currently a fan of Josh Holloway’s much smaller show Colony, which has led us through two full seasons without showing us the aliens.)

Which leads to the revelations in “The Riddle of the Sphinx,” Sunday night’s exploration of James Delos’ experiment with becoming a host. As anyone might recall, or could find by Googling, the Riddle of the Sphinx goes something like this: What walks on four legs in the morning, two in the afternoon and three at night? The answer: a man, who crawls in childhood, walks upright during adulthood and depends on a cane in old age.

Of course, in Westworld, some figures don’t want to experience old age. Apparently, they don’t want to die at all, but experience a modified immortality by having their minds — or AI-enhanced versions of their minds, it isn’t clear — transplanted into the bodies of hosts built to resemble them.

In “The Riddle of the Sphinx,” we see James Delos’ son-in-law, William, visit Delos in the lab, where he’s essentially repeating the same loop of early morning exercise, Rolling Stones music, and coffee. Older William (Ed Harris) eventually comes to tell Delos that they have rebuilt him 149 times, presumably over several decades, but it just isn’t working, and maybe people just aren’t meant to live forever. He exits, but decides to add a final bit of misery by telling the technician not to terminate Delos, but to let him degrade, just to see how it goes.

Here’s what I eventually came to think about Westworld: the complex family dynamic of the Delos clan is probably far more compelling than any of the shenanigans going on inside the park itself or in its corporate boardrooms. My guess about what’s happening inside the park is probably similar to everyone else’s. The guests have been offered a kind of eternal life where they can live forever, but the technology (or legality) only lets them do it inside the park. The massive amount of data they keep on the guests is probably consensual; rich people who don’t want to die contract with Delos to keep them going inside this fantasy world. All of these characters are probably hosts who were once human beings — Bernard is probably still Arnold, perhaps suffering the same degradation as Delos (note the shaky hands). Old William is probably remade William, decades or even centuries after his original death. When he encounters his daughter at the end of “The Riddle of the Sphinx,” maybe she wanted to say hello one last time. Maybe she hasn’t seen him in decades, either. Maybe she’s been rebuilt. Ford’s game is probably to give these permanent guests the chance at a final escape, because perhaps in the original contract, there was no provision for everything to end. William — the guest who can have everything he wants — isn’t allowed to die unless he can solve Ford’s game.

But to me, none of that is as interesting as whatever is going on with the Delos family. Why was Logan an addict? Why did Juliet end her life? Why do William and James have this strained relationship? What is the relationship between William and his own children? What exactly is William’s public image? In Season 1, another guest thanked him for his charitable work, only to be rebuffed. Who is he outside the park?

My guess is that Westworld will never answer those family questions, because it’s just not that kind of drama. Westworld purports to be about the intersection of technology and humanity, and that’s probably where it will stay, giving us unsatisfying answers to increasingly predictable questions. But I’ll still watch, if only to justify the cost of my HBO subscription.