24 May 2020

Asking the Wrong Questions: “Deus Ex: Thoughts on Westworld’s Third Season”

in Bucharest, Romania
Westworld season 3 on HBO

Take, for example, the crux of the season, the concept of an AI-run society. On paper, this is a brilliant expansion of the show’s central conceit – that there is effectively no difference between humans and hosts except that the latter have been designated, through the logic of capitalism far more than the realities of technology and biology, as inherently disposable, their suffering and death justified because they provide entertainment and distraction for the rich and powerful. What Dolores discovers when she arrives in the real world is that most humans live exactly the same kind of life. Like hosts, they have “loops” and “storylines” decided upon by a god-like AI of whom they aren’t even aware. Like some hosts, they can be rewritten, assigned to new roles and stories with only a faint awareness of the life they once lived – Caleb, we learn, was once designated a troublemaker, one of the small percentage of humans who don’t take to Rehoboam’s guidance, and was subjected to personality-altering treatments and the erasure of his memory in order to make him a constructive member of society. When Dolores releases Rehoboam’s profiles of each citizen, allowing them to see how their lives have been guided and constrained, she likens it to revealing the reality of the park to the hosts, and the result is entirely similar – violence and destruction (and, as in the park, it eventually turns out that these are false flag operations, funded and directed by Dolores as a cover for her attempts to get to Rehoboam).

Abigail Nussbaum

After a rather disappointing season two, I had almost no expectations for this series. The third season was likewise met with bad reviews and criticism online, but this time I do not fully agree with that sentiment. Maybe I am biased precisely because of my low expectations, but I enjoyed the ideas and themes laid out throughout the season, which reminded me of the philosophical bent of the first.

The primary thread underlying the action is the analogy between the former existence of hosts in the park in the first season and the life of humans in the real world, both shaped and controlled by obscure forces beyond their knowledge and reach. Whereas hosts were created this way, programmed with sets of predefined actions to fit certain scenarios, humans were driven into a similar situation by a massive hidden surveillance complex, controlled by an AI and its reclusive maker for the ostensible goal of ensuring humanity's security and continued survival. This naturally touches on current questions in our society, from the manipulation of voter intentions through data collected on social networks, to the balance between safety and freedom, between Western liberal society and Chinese authoritarian society, to something as recent as the pandemic response (how many people can we sacrifice for the overall 'good' of society, and more importantly, who gets to decide that?). On a more general level, the central theme continues to be free will, with Dolores on a crusade to return free will to humans after liberating the hosts.

Of course, most of these concepts are not particularly original and have been debated numerous times in literature, including science-fiction. Constant surveillance and the reeducation or removal of non-compliants are prominent in 1984, now updated with modern methods; pacifying a restless population with synthetic drugs recalls Brave New World; modeling the future and adjusting current actions to meet desired outcomes is a central idea in the novels of Isaac Asimov, from The End of Eternity to Foundation; the subjugation of an entire civilization by a (possibly intelligent) machine gives off pre-Butlerian Jihad Dune vibes.

Where the show stumbles and sometimes fails is in translating these ideas into a compelling story. Jumping from a western to a futuristic setting, the most recent season mixes interesting ideas about future technology with anachronisms and implausible tech. My favorite concepts were the evolution of AI and VR psychology. In the first episode, Caleb calls his former army buddy to talk about his daily concerns, only for it to be revealed that there was no person at the other end of the line, just an AI built to mimic his friend's responses. It reminded me of the movie Her, but I felt the idea was better employed here, as a nice background detail underlining emotional distancing in the future world, and possibly differences in status, as richer patients had access to more expensive human care. The scene where William confronts his alter egos in a VR simulation was also impressive, even if it did not have much impact on the story.

On the other hand, I was perplexed by how petty criminals in the future still rob ATMs for a living… you would think money would be fully digitized in a couple of decades – then again, that scene was set in the US, so anything's possible. Flying vehicles without a discernible mode of propulsion were too much for my taste – they certainly look cool, but has humanity really managed to master antigravity?! Even self-driving cars seem to recede further into the future at this point.

The characters are generally another weak point of the season. It was very much Dolores' show, to the extent that the supporting characters felt unnecessary, simple pawns in her grand scheme to topple Serac and Rehoboam. It feels a little too convenient that a former host, with little prior experience in the real world, would be able to mount such a successful challenge to the most powerful AI and richest man in history – and that a complete surveillance system misses her entirely, even though she rarely makes any effort to conceal her presence. But at least there is an in-universe motivation: with no data about hosts and their behavior (especially sentient hosts), Rehoboam could not make reliable predictions about her actions – also a nice commentary on the current state of machine learning.

Despite his backstory, tied closely to the hidden mechanisms of this future society, Caleb was nothing more than a figurehead for the revolution planned and put in motion by Dolores. Bernard spent the season running around clueless, but fortunately in the finale we get a nice explanation of his crucial role in Dolores' plan, as well as an emotional scene with the wife of Arnold, his human template. William is even more expendable this season; I felt the writers only wanted to emphasize what a terrible human being he had been his entire life. Halores, the copy of Dolores with the appearance of Charlotte Hale, played a more complex role, but it still felt contrived and somewhat illogical. There is no explanation of why her consciousness, originally identical to Dolores', would deviate so abruptly, making her fearful, paranoid, and suddenly in love with Hale's family. Even if someone grafted Hale's memories and parts of her personality onto Dolores – which evidently didn't happen, since Halores needs a recording to remember the lullaby she used to sing to her son – it doesn't match the Hale character we knew from previous seasons. And Maeve… was around as well, I guess to provide some counterbalance to Dolores and raise trite observations about simulations.

I ended up enjoying the season, despite its many flaws. I would have liked to see some of its arcs better developed – unlike the second season, which could easily have been merged into the first with two or three extra episodes. And, unlike that previous season, this one left me interested in how the future story will unfold.

My rating: 3.0
