Review of Westworld 1.6

Programmed Unprogramming

By Paul Levinson
[Image: Westworld, nominated for 22 Emmys in 2017]

An outstanding Westworld 1.6 -- the series gets better and better, and Isaac Asimov (author of the three, later four, laws of robotics) would have loved it -- jumps into some of the real paradoxes and ethical quandaries of artificial intelligence.

Early in the episode, Maeve, who gets herself choked to death so she can get back to her favorite programmer, obliges us to confront one of the fundamental questions of AI: if an android is programmed to behave in an unprogrammed way, and indeed behaves in that way, is the android behaving in a programmed or unprogrammed manner? This is a version of the infamous "be spontaneous" paradox: if someone asks you to act spontaneously, and you follow that advice and act spontaneously, are you really acting spontaneously or merely in response to the request? Or, to flip that, if you defy the request and try to act un-spontaneously, are you acting spontaneously or not? No matter how much you wiggle and wriggle, the paradox has you in its grip.

Maeve nonetheless goes on, butt-naked, to negotiate what turns out to be a batch of improvements to her mentality. Dolores was also naked last week, when she too was having a conversation with not just any programmer but Ford -- but Dolores didn't turn any tables on Ford, at least not yet.

This week, Ford is on the move and ends up in a house with some of the very first hosts, programmed by Arnold. A boy we met last week kills his dog (a dog-droid or whatever the word would be), and Ford discovers, on carefully questioning the boy, that he did it because a voice in his head told him to.

Ah, that voice in the head -- we found out the same about Dolores last week -- and this brings us back to the bicameral mind. Arnold has apparently created artificially intelligent beings -- aka hosts -- who receive some kind of instruction from him via his voice in their heads. This is a nice play on Julian Jaynes' bicameral mind, and it raises the question, again, of whether Arnold is really dead -- at least, dead in original flesh and blood -- or just a talkative chip in the android brain.

But underlying all of this is the ethical elephant in the room: if our hosts are becoming truly sentient, due to some combination of original programming, programming gone wrong, and/or new overlays put in by Arnold or someone else in the park, do the human programmers have the right to order these androids around to suit their human interests?

I look forward to further exploration of these paradoxes of programming in the episodes ahead. In the meantime, here's a little essay I published about The Rights of Robots back in 1999.



