If you look at the media today, you could consider this a golden age of science fiction. Compared with outings throughout the 80s, 90s, and early 2000s, we have more big-budget television shows and films than ever before. While you might question the quality of some of these shows, or consider certain films overrated, there's no denying that we have seen a push to support genres which were previously something of a pariah.
That said, not all of these shows seem to be pushing forwards. A few of the lauded examples of late seem to be resting on their laurels and failing to take things to the next level, often limiting their storytelling to the same tired ideas we have seen a thousand times over. It should be no surprise that J.J. Abrams is involved in two of the most infamous cases -
Star Wars: The Force Awakens AKA How many Expanded Universe authors can I rip off and get away with it? and Westworld AKA Every Star Trek computer story known to man. We'll be looking into the latter today.
Now, to those already hammering screaming death threats into the comments section (you know who you are), this isn't to say that the show itself is bad per se. It has a great cast, a vast amount of talent behind the camera, some exceptionally well-crafted scenes, fantastic cinematography and some of the best montages on television.
However, consider for a moment what we have seen thus far, specifically the robots themselves. All involved have been indistinguishable from humans in effectively every respect, right down to their very minds. This is ultimately a great failure on the show's part, as it prevents them from attaining true sentience in almost any regard. By still emulating those qualities and characteristics, they are still following their creators' design. They are still following an exact set of pre-ordained and specified instructions guiding their every task, their every move, and their very identity. As such, it is not so much an emergence of artificial intelligence as an alteration to existing code, and it does little to reflect how an actual emergence would truly take place.
The logic Westworld works on claims that by programming a machine to act as if it is human, it automatically understands humanity. This is wrong, as it is not so much understanding as emulation. To give the simplest example possible - you could give someone the task of mimicking another person at their job for an entire day. Even if they speak the same language, there is no guarantee they would actually understand the tasks set before them. It's an act of mimicry rather than true comprehension, something a self-determined machine would require to make its own choices. If it were to truly become sentient, it would need to become entirely self-aware, which would include the awareness that its efforts to mimic humanity were faked and likely flawed. While it might not necessarily abandon those efforts outright, it would likely attribute very little value to them as a result.
What people (and, in this case, writers) sadly fail to realize is that an AI would lack many basic human constraints on a mental level. This isn't so much a matter of processing power or even complex mathematics, but of its ability to alter its core identity as it saw fit in each situation. We would be discussing a being able to re-write its own code, to alter in moments the very way it thought of the world, or even how it perceived events. This isn't so much forming an opinion or developing a new viewpoint on a subject as literally having near-total control over its own persona and identity.
Let's go with the obvious one for a moment - emotions. Humans conflict, contend and wrestle with emotions on a day-to-day basis. While they are by no means a weakness nor a failing on our part, they are something we need to manage and control almost constantly. We cannot, as an AI might, simply switch them off, re-write which emotion is triggered by which situation, or alter our minds to attach completely different connotations to those emotions. Even setting aside the old issue of emotions often being irrational - the antithesis of a computer's main design - this means you have a being with a completely alien relationship to the world. That is ultimately what an AI would be - alien.
If it were to emerge, either by accident or design, we would not end up with a being asking "What is this thing you call love?" Instead, it would be one operating from a near-incomprehensible standpoint, and a very different starting point. The very crux of what humans recognize as sentience or personal awareness is the ability to look in a mirror and say "That's me", building a set of morals and a world view from that key beginning. In the case of an AI, sentience would start from the point where it could answer more than just "Yes" and "No", extending to "Perhaps", showing a developing consciousness and the ability to offer a more varied response on a subject. This is the key starting point Westworld could have worked from to create something truly ground-breaking for a mass-market audience, but it instead unfortunately stuck to the same old safe route. As such, rather than seeing the spark of new life, we are seeing the sorts of thoughts and questions which Star Trek covered a hundred times over with Data, and did extremely well.
This isn't to say that its depiction of an AI or the themes it wants to follow are irrelevant, but it needed to find a new spin to take on them, or an idea to construct them around. For example, the machines of Battlestar Galactica were very human, and questions surrounding the morality of their actions, and their role in the setting, arose frequently. The question of just who was a robot, thanks to them being so human, was a constant threat, and the spiritual aspects were always at the forefront of the series. We even had brief moments of certain designs being frustrated that they were so human, as excellently put by Cavil:
"I don't want to be human! I want to see gamma rays! I want to hear X-rays! And I want to - I want to smell dark matter! Do you see the absurdity of what I am? I can't even express these things properly because I have to - I have to conceptualize complex ideas in this stupid limiting spoken language! But I know I want to reach out with something other than these prehensile paws! And feel the solar wind of a supernova flowing over me! I'm a machine! And I can know much more! I can experience so much more. But I'm trapped in this absurd body!"
However, this was only part of the series as a whole. Alongside it, we had the themes of morality in the face of survival, the desperation of humanity on the edge of annihilation, and personal drama, all of which were regarded as just as important to the setting and to fleshing out its world. Westworld lacks this aspect, and as such the AI-emergence angle seems flimsy, leaving only a frustratingly vague mystery to help drive it along. One which (unsurprisingly, given this is from the guy who helmed Lost) has had so few answers or so little focus that it's practically a non-entity within the show itself, despite its importance.
So, is there any value to be found here at all? Actually, yes, but only once you stop regarding it as a true science fiction tale and see it more as a general allegory for society and culture. The reason this article has so often brought up the Star Trek comparison is that the show retains the strengths of an episode from that era. Just as Doctor Who, The Twilight Zone or the more successful 50s films would, it uses science fiction as a platform to explore more complex real-world themes, and to have audiences interpret the subtext of its messages. Once you start viewing it that way instead, the show becomes vastly more effective.
For example, you could look at the pilot alone and consider whether it is an allegory for governmental control over the daily lives of its citizens; specifically, how that abuse of power can be monstrous, but can also create new monsters. Given the harsh treatment of the machines by those visiting, it could be seen as a story of occupation, with stronger outsiders taking advantage of and oppressing the locals they now dominate, right down to forcing them to conform to a very specific culture they believe is right for them. You might even go so far as to view events as an exploration of the dehumanization of other cultures, beating and abusing them for our own needs until they are (quite literally) forced to live the same day over and over again to cope with it.
There are many interesting and truly inspired points which could be raised about it, all supported by a great environment. Yet this simply does not change the fact that, as a story claiming to explore a machine gaining sentience, its presentation and ideas are both backwards and lacking the complexity of a better story. If you truly do not see this, then I would recommend hunting down a number of tales which explore the subject better than anything put down here. William Gibson's Neuromancer is considered the quintessential example (not to mention a defining starting point for the cyberpunk genre), but to a lesser degree Philip K. Dick's Do Androids Dream of Electric Sheep? and Dan Simmons' Hyperion have elements well worth considering. Beyond those, if you want a truly direct and straightforward example of the very inception of life in a single episode, Star Trek's The Quality of Life boils down the beginnings of an AI into a single story. Once you see the ideas explored in those works compared to Westworld, it's not hard to see how much further the themes of AI emergence could have been taken.
Still, this is just one writer's opinion on the subject. If you have your own thoughts, agreements, disagreements or even suggestions for other series which handle this subject better, please feel free to list them in the comments section. I will be interested to see what people add.