The recently released Eve no Jikan movie is a succinct and mostly successful retelling of the six original net animation episodes released between August 2008 and September 2009. Set in a future where commercial helper robots are commonplace, the titular "Time of Eve" café is a place where the delineation between robots and humans is removed and the problems encountered with such a blending are made manifest.
Like many robot-focused stories, Eve no Jikan's central conceit is artificial intelligence. Divorcing that from the concept of robotics is important: the mechanics are already available in various forms, not least Honda's famous ASIMO robot. Autonomous (rather than "intelligent") robots have their own set of hurdles to overcome, starting with the mundane - navigating simple environments - and growing precipitously more complex: speech recognition, language parsing, decision making... The list goes on.
Cognitive understanding and machine learning come well before any semblance of intelligence can be imparted. Professor Noel Sharkey once put it best when demonstrating a predator-prey robot experiment: asked by a member of the press whether the predator-class robots would attack and suck the life out of his cellphone, Sharkey replied that they had enough trouble not running into walls.
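Even that "mundane" first hurdle of navigating a simple environment hides real algorithmic work. As a minimal sketch (the floor plan and coordinates are entirely made up), breadth-first search over an occupancy grid is one of the simplest ways a robot can plan a route that doesn't run into walls:

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over an occupancy grid.
    grid: list of strings, '#' marks an obstacle (a wall)."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path  # shortest route, since BFS explores by distance
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and (nr, nc) not in seen:
                seen.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None  # no route avoids the walls

# A toy floor plan: the robot must route around the wall, not into it.
floor = ["....",
         ".##.",
         "....",
         "...."]
route = find_path(floor, (0, 0), (2, 3))
```

This is, of course, the easy version: the grid is known in advance and perfectly accurate, which is precisely what a real robot's sensors never give it.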
Putting aside the minefield that is the question of what intelligence is, learning would naturally seem to be a key part of any definition. Electronics form a natural barrier to this, however, with their deterministic nature: you can't get out more than you put in. That's a vast simplification that ignores complex-systems theory and Turing completeness (we'll come back to him), but the short version is that a system will do what it is specified to do and nothing more.
This is fine for simple systems with only a few logic gates, but the outcome gets trickier to predict when scaling up to millions of them, as in modern multi-component systems. It is then possible to arrive at emergent behaviour - an unexpected outcome, but one that, given enough time, could be wholly specified. The term is mostly applied when using learning algorithms or neural nets.
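Conway's Game of Life is the classic illustration of this: a handful of local rules, wholly specified and entirely deterministic, yet producing behaviour nobody wrote down explicitly. The "glider" below, for instance, walks diagonally across the plane even though nothing in the rules mentions movement at all:

```python
from collections import Counter

def step(cells):
    """One generation of Conway's Game of Life on an unbounded plane.
    cells: set of (x, y) coordinates of live cells."""
    # Count how many live neighbours every relevant cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in cells)}

# A glider: after four deterministic steps the pattern has reproduced
# itself one cell down and one cell across - emergent 'movement'.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
moved = glider
for _ in range(4):
    moved = step(moved)
```

Every step here could be predicted with pencil and paper, which is the point: "emergent" names our surprise, not any magic in the system.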
All very academic, but how does this relate to Eve no Jikan? Starting from the outside and working in: it is assumed the robots within the Time of Eve café are visually indistinguishable from humans, which is remarkably difficult to achieve. Think of all the rubber masks, mannequin heads and video game faces which come close but just look "wrong" somehow. That's the uncanny valley: the closer something comes to looking human, the more repulsed by it people tend to be. This is only tackled obliquely with the nameless robot that enters Time of Eve, the café's rules demanding it be treated without prejudice.
Visual identification implies individuality, and given how unique a person's face really is: are all the robots individually designed? Ghost in the Shell: Stand Alone Complex tackled this with "face sculptors", artisans who could craft a unique persona for a machine. But surely the protagonists of Eve no Jikan could recognise the likeness of a particular model if they were mass produced? Similarly, fluid, human movement is, like faces, hard to mimic, as anyone who has seen animatronics can testify.
The next layer down then is the mechanical operation of the robots. Their autonomy must rely on batteries, which have remained woefully unchanged for many decades. Powering not only the physical movements but also the complex computer core, and operating for hours without visible recharging, is quite the achievement. Wireless recharging in buildings would obviate much of this issue (rarely do we see Sammy roaming the streets for any length of time), but the high inefficiency of such a method would be of paramount concern for the energy companies and for the consumers footing the bill.
Going down a level is the software underpinning the robots. Covering all the details shown would take a book in itself, so to name a few that stood out, starting with speech recognition: ever had to train a speech recognition program? It can take hours, and even then minute changes in your voice - after a hot drink, during a cold, or across the soprano-tenor octave change of puberty - are enough to throw off most systems, not to mention the perils of background noise. Distinguishing multiple simultaneous voices, or unknown and heavy accents (Kansai et al.), is a goal that even today is far away.
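One reason timing trips recognisers up: an utterance must be aligned against stored templates even when spoken faster or slower than the original. Dynamic time warping is the classic technique for exactly this. A minimal sketch - the "acoustic features" below are made-up numbers standing in for the real measurements (such as MFCC vectors) an actual system would extract:

```python
def dtw_distance(a, b):
    """Dynamic time warping: aligns two sequences that may be spoken
    at different speeds and returns the total mismatch cost."""
    INF = float("inf")
    cost = [[INF] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost[i][j] = abs(a[i - 1] - b[j - 1]) + min(
                cost[i - 1][j],      # a lingers on a frame
                cost[i][j - 1],      # b lingers on a frame
                cost[i - 1][j - 1])  # both advance in step
    return cost[len(a)][len(b)]

# Toy feature sequences for two word templates.
template_yes = [1, 3, 5, 3, 1]
template_no = [5, 5, 1, 1, 5]
# The same 'yes' spoken twice as slowly: every frame lingers.
utterance = [1, 1, 3, 3, 5, 5, 3, 3, 1, 1]

best = min((dtw_distance(utterance, t), name)
           for t, name in [(template_yes, "yes"), (template_no, "no")])
```

The slowed-down utterance still matches its template perfectly because the warping absorbs the stretch - but note how fragile this is: shift those feature values slightly, as a head cold would, and the costs drift.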
Speech synthesis, the other side of the communication coin, has seen great strides, with a model of the human speech system relatively easy to construct. The results aren't perfect - far more GLaDOS than SHODAN - but are a world apart from the mechanical speaking clocks that many people are familiar with, which simply glue together disparate sound bites without any kind of blending.
The part in between those two is natural language processing: turning what a robot is told to do into discrete items. Breaking a phrase into the different parts of speech (verbs, nouns etc.) is hard enough with written documents (something Google can attest to), let alone fragmentary speech. Going the other way and constructing a response beyond "Understood" is equally difficult, especially when attempting conversation. One of Alan Turing's most lasting legacies is the so-called Turing Test, commonly believed to be a way to tell whether a system is "intelligent" but more correctly a measure of whether a system can imitate a human. It is widely held to be only a basic step towards discovering whether a machine has a mind or intelligence, but one that is distinctly relevant to Eve no Jikan.
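As a toy illustration of that first step - and only a toy; the hand-written vocabulary below is my own invention, nothing like how real systems learn grammar from enormous corpora - a household robot's command parser might reduce an order to a verb-object pair:

```python
# Hypothetical vocabulary for a household robot. Real NLP systems
# learn these categories statistically rather than from fixed tables.
VERBS = {"bring", "pour", "clean", "take"}
NOUNS = {"coffee", "cup", "table", "tray"}
STOPWORDS = {"the", "a", "an", "please", "me"}

def parse_command(utterance):
    """Reduce a spoken order to a (verb, object) pair, or None when
    the phrase doesn't fit the tiny grammar."""
    verb, obj = None, None
    for word in utterance.lower().strip("?!.").split():
        if word in STOPWORDS:
            continue
        if word in VERBS and verb is None:
            verb = word
        elif word in NOUNS and obj is None:
            obj = word
        else:
            return None  # unknown word: a real system must guess or ask
    return (verb, obj) if verb and obj else None
```

Note how brittle it is: one word outside the vocabulary, one clause of hesitation, and the parse fails outright - which is roughly where the gulf between this and genuine conversation begins.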
The last task that the Eve no Jikan software definitely performs is image processing, specifically the isolation of particular objects within a scene, be they people such as Rikuo and Masakazu or manipulable objects like coffee cups. Progress is certainly being made in this area, with facial recognition now a part of many modern video cameras; however, object recognition and identification is still difficult, relying on either a large library of existing objects to pattern match against or accurate (so far non-existent) generalisation algorithms, i.e. identifying the specific "properties" that define a chair. Programs such as Adobe Photoshop are pushing different aspects of this into the commercial domain, with "Content-Aware Fill" in the latest version, as well as separate research into live video manipulation.
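That library-based pattern matching can be sketched in miniature: slide a stored template over the image and keep the position with the smallest pixel difference. The "photograph" and "cup" here are made-up grids of brightness values, not real image data:

```python
def match_score(image, template, top, left):
    """Sum of absolute pixel differences between the template and one
    window of the image: 0 means a perfect match."""
    return sum(abs(image[top + r][left + c] - template[r][c])
               for r in range(len(template))
               for c in range(len(template[0])))

def find_object(image, template):
    """Slide the template over every window of the image and return
    the best-matching (row, column) position."""
    h, w = len(template), len(template[0])
    positions = [(top, left)
                 for top in range(len(image) - h + 1)
                 for left in range(len(image[0]) - w + 1)]
    return min(positions, key=lambda p: match_score(image, template, *p))

# A toy 'photograph' with a bright cup-shaped blob at row 1, column 2.
scene = [[0, 0, 0, 0, 0],
         [0, 0, 9, 0, 9],
         [0, 0, 9, 9, 9],
         [0, 0, 0, 0, 0]]
cup = [[9, 0, 9],
       [9, 9, 9]]
found = find_object(scene, cup)
```

This finds a known cup at a known scale and orientation; it says nothing about what makes a cup a cup, which is exactly the generalisation problem that remains unsolved.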
The most nebulous of the elements shown in Eve no Jikan relates to the science fiction author Isaac Asimov and his Three Laws of Robotics:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Their beauty comes from their ambiguity, and this is perhaps the greatest emotional triumph of the series. Tex, or THX, evoking George Lucas's brilliant debut feature-length film about a man trying to break free of a robot-policed dystopia (incidentally, 1138 is the identifier the authorities give to the Time of Eve café), is given an order by Masakazu's father not to speak, yet contravenes that order when it believes Masakazu is in danger. The implication is that Tex not only understands the situation in the café but is able to extrapolate that it would lead to harm.
It's quite a stretch for a machine to be able to do that, but not so surprising when the logic required to adhere to the three laws is already incredibly complex, not least the definition of "harm". Physical harm would be the most obvious, but it could generalise to emotional harm (something Tex doesn't take into account, and which would likely produce a group of very petulant robots overly concerned with not hurting anyone's feelings) or even long-term harm, such as damage to the environment indirectly hurting humans.
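That complexity is easy to demonstrate by its absence: reduced to code, the Three Laws themselves are a trivially short priority filter - provided someone has already answered every hard question. In the sketch below (entirely my own construction), each difficult judgement arrives as a pre-computed flag, which is precisely the part a real robot would have to extrapolate for itself:

```python
def may_act(action):
    """Check a proposed action against the Three Laws in priority order.
    Every flag is assumed to have been judged already - deciding what
    counts as 'harm' is the genuinely hard, unmodelled part."""
    if action["causes_harm"]:
        return False                      # First Law
    if action["prevents_harm"]:
        return True                       # First Law: inaction clause
    if action["disobeys_order"]:
        return False                      # Second Law
    if action["endangers_self"]:
        return False                      # Third Law
    return True

# A robot ordered to stay silent, where speaking would prevent a human
# coming to harm: the First Law's inaction clause overrides the order.
speak_up = {"causes_harm": False, "prevents_harm": True,
            "disobeys_order": True, "endangers_self": False}
```

The ten lines of logic are the easy ninety-five percent; filling in those four booleans is where the whole of the preceding discussion - perception, language, extrapolation - lives.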
There are countless volumes of research and speculative fiction written on these topics, not least robot sexuality, morality and law. It's a rich area to explore but one that is, for the foreseeable future at least, fictional. It has certainly not been my intent to debunk the fiction that Eve no Jikan presents, more to explore its roots in reality and to refresh myself on how much has changed in the seven years since my university tenure. It's rare to find a production like Eve no Jikan which covers such a topic succinctly and, crucially, in a thoroughly entertaining way.
That said, this is one of the best examinations of the Three Laws in 'realistic' situations I've yet experienced: it works on an emotional level as a soft SF piece, but also explores the nuts and bolts of the setting, as it were.
While I'm on the subject, I finished reading The Stories of Ibis by Hiroshi Yamamoto around the same time the fansub of this movie was posted online. The sheer number of parallels and common themes between the two made my head spin! The penultimate component story of the novel goes into a lot of depth about how A.I. could actually integrate into human society - it's one of the best stories of its type I've ever read. I strongly recommend it.