Emotional Robots
Anthropomorphic Robots
Could robots ever duplicate humans?
Since humans first conceptualised robots, we have aspired to give them
human form and abilities.
Science fiction has long represented humanoid robots in film and literature.
Even our pets haven’t been left out of the robotic picture, e.g. the Sony Aibo and the Roboraptor.
Question: What are the advantages and disadvantages of a robotic pet compared with a biologically live pet?
In his 1942 short story “Runaround”, Isaac Asimov set out these laws for robots:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with
the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or
Second Law.
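The Three Laws form a strict priority ordering: each law yields to the ones above it. As an illustration only (the flag names and the `permitted` function below are invented, not from Asimov), that ordering might be sketched like this:

```python
# Hypothetical sketch: Asimov's Three Laws as a strict priority ordering.
# A proposed action is vetoed by the first (highest-priority) law it violates.

def permitted(action):
    """Return (allowed, vetoing_law) for a proposed action described as flags."""
    # First Law: never harm a human, by action or by inaction.
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        return False, "First Law"
    # Second Law: obey human orders unless obeying would break the First Law.
    if action.get("disobeys_order") and not action.get("order_would_harm_human"):
        return False, "Second Law"
    # Third Law: self-preservation, subordinate to the first two laws.
    if action.get("endangers_self") and not action.get("needed_for_higher_law"):
        return False, "Third Law"
    return True, None

print(permitted({"harms_human": True}))     # vetoed by the First Law
print(permitted({"endangers_self": True}))  # vetoed by the Third Law
```

The point of the sketch is the ordering: a check lower in the function can never override one above it, just as the Second and Third Laws are subordinate to the First.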
http://library.thinkquest.org/25500/index2.htm
“Benedict de Spinoza in the 17th century described emotions as bodily changes that result in the
amplification or attenuation of action and as processes that can facilitate or impede action. For
Spinoza, emotion also included the ideas, or mental representations, of the bodily changes in
emotion.
Today, there continue to be researchers who define emotions in different ways. At the extremes,
emotions can be seen as biological responses to situations over which we have little control. Plutchik’s
theory of emotions is one that stems from the biological perspective. At the other extreme, there are
psychologists who define emotions more by the conscious experience rather than by the biological
response. One of these is Schachter.
Most researchers today accept that emotions operate on a number of levels. A common view is that
emotions consist of:”
Subjective feelings
That is how the individual interprets what they are feeling at any point. These are inner personal experiences. How one individual interprets "being in love" will often be different to the next person. Subjective feelings in response to an emotion cannot be readily observed. As a result, the self-report method is often used to collect data in this area.
Expressive behaviour
This refers to the outward signs that an emotion is being experienced. Such behaviour can be intentional or unintentional and includes facial expressions as well as body language.
Physiological responses
This involves bodily changes which occur when we experience an emotion. It involves the operation of the brain as well as the autonomic nervous system, and research by Candace Pert suggests that it also involves the cells in our body. It is often our awareness of the arousal that makes us suddenly aware that we are experiencing an emotion.
Scientists have already made progress with robots that can exhibit basic facial expressions.
[Photos: Nexi and Zeno; at right, Ever-2]
http://web.mala.bc.ca/clemotteo/Pandora/Phil%20362/should_robots_feel.htm
“Should we create such a machine, if the possibility becomes available to us? Are there uses for such a
machine that could not be satisfied by a complex automaton? Is there anything about real emotional
response that would be necessary for a machine to operate autonomously, and still interact with human
beings? What are the dangers? What are the ethical ramifications?
By simulation, I mean a robot that is functionally the same as a human in its behavior, but is still to be
considered simply a machine. A robot that will react to inputs from its environment exactly the same as a
person would, but is not itself a person, only a clever imitation of one. By replication, I mean a robot that
actually is a person, not just an imitation of one. A “strong AI” intelligence that has a real, conscious mind.
Both of these arguments can be looked upon as controversial with regard to whether or not a
replication AI can actually be built. Few would argue that an excellent simulation of a
human would not eventually be built. Sooner or later, as technology advances, it is highly likely a
simulation that can pass for human will be devised. What seems to be controversial about this idea of
simulation is that some in the field of AI feel that there would be no difference between a perfect simulation
of a person and a replication of a person. That is, a functionally perfect simulation would be a replication.
This seems to me somewhat of a behaviorist view, one in which only the inputs and outputs are important.
“Folk” psychologists, as well as philosophers, believe that computers lack the ability to reproduce emotions.
Before considering whether or not robots should have emotions, I think it will be useful to take a
cursory look at least into why human beings have emotions. I think there are a couple of ways of looking at
this. First of all, human emotions are integral to our lives, the way we define ourselves as persons. Our
personalities are closely connected and determined by the things we love and hate, our fears and our search
for happiness. We value our emotions as intrinsic to our beings, even when they cause us pain, fear, or
anxiety. That said, another way of considering emotions is to ask how they are functionally beneficial to us.
It is not impossible to imagine human beings without emotions, cold, calculating beings still fully capable of rational thought.
The question is: do emotions serve a functional purpose that cannot be satisfied by a purely reason-based process?
Greenspan argues that emotions show important differences from more rational beliefs and judgments (Emotions and Reasons, p. 4):
(1) Although its appropriateness may be explained in terms of belief warrant, the evaluative component of emotion …
(2) The affective component of emotions gives them a special role to play in rational motivation, as “extrajudgmental” reasons for action.
Because of this, imagination and intuition can play a role in our actions, as these “extrajudgmental”
emotions allow them as reasons for action. Greenspan also describes emotions as resistant to direct rational
control, saying “… it will assume resistance to direct control, of the sort we have over action. In this respect
and others emotion seems to stand in between action and belief, exhibiting some features of both.”
From this I think we can determine some practical uses for human beings to have emotions.
Emotions may help us to take action in situations where we may feel something that we can’t justify through reason alone.
After looking at some practical applications for emotion, some of the dangers inherent in emotion
should become readily apparent. Emotions can and often do lead people to take unwarranted action. A
perfect example of this is jealousy. Science fiction abounds with stories warning of the dangers of an
emotional robot. For instance, HAL 9000, the conscious computer in Arthur C. Clarke’s book 2001: A Space
Odyssey becomes paranoid, afraid for his life, leading him to murder the crew of the ship he controls.
Another science fiction author, Isaac Asimov, envisioned such robots being kept in check by his famous
Three Laws of Robotics, central beliefs programmed into the robot to prevent it from harming humans.
However, human emotions have the ability to circumvent even our most basic beliefs (consider the jealous
husband), so one could easily argue that an emotional robot might ignore such laws when in a highly
emotional state.
Societal Integration
We are all integrated into a society, which teaches us morals, ethics, and most importantly here, limits on the
actions we take to express our emotions. The dangerous killer robots of science fiction tend to exist in a
social vacuum, removed from human society and without a society of their own. A possible solution to the
unpredictability of an emotional robot is social integration. Humanoid robots might be taught to feel as
humans do. … Even if the robots we interacted with did not really feel, I think most people would still be happy with simulations as
companions. Only building simulations would eliminate all the risks of an emotional robot deciding to harm
a human, and we still haven’t discussed any practical reason to have a true replication.
1. Searle, John. Minds, Brains and Science. Cambridge, MA: Harvard University Press, 1984.
2. Greenspan, Patricia S. Emotions and Reasons: An Inquiry into Emotional Justification. London,
3. Ullrich, Robert A. The Robotics Primer: The What, Why and How of Robots in the Workplace.
6. Clarke, Arthur C. 2001: A Space Odyssey. New York: Harper, 1968.
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=296
Co-ordinator Dr Lola Canamero said the aim was to build robots that
"learn from humans and respond in a socially and emotionally
appropriate manner". The 2.3m euro scheme will last for three years.
[Photo caption: the movie I, Robot depicted emotionally complex machines]
"The human emotional world is very complex but we respond to simple cues, things we don't
notice or we don't pay attention to, such as how someone moves," said Dr Canamero, who is
based at the University of Hertfordshire.
Sensory input
The project involves building a series of robots that can take sensory input from the humans they
are interacting with and then adapt their behaviour accordingly.
Dr Canamero likens the robots to babies that learn their behaviour from the patterns of movement
and emotional state of the world around them.
The robots themselves are simple machines - and in some cases they are off-the-shelf machines.
The most interesting aspect of the project is the software.
Dr Canamero said: "We will use very simple robots as the hardware, and for some of the
machines we will build expressive heads ourselves.
"We are most interested in programming and developing behavioural capabilities, particularly in social and emotional
interactions with humans."
The robots will learn from the feedback they receive from humans.
"Tactile feedback and emotional feedback through positive reinforcement, such as kind words, nice behaviour or
helping the robot do something if it is stuck."
The university's partners are building different robots focusing on different emotional interactions.
'Detect expressions'
The robots will get the feedback from simple vision cameras, audio, contact sensors, and sensors
that can work out the distance between the machine and the humans.
Artificial neural networks are being used because they are very useful for adapting to changing
inputs - in this case detecting patterns in behaviour, voice, movement etc.
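The article gives no implementation details, but the kind of adaptation it describes — a network adjusting its weights as feedback arrives — can be caricatured with a single artificial neuron trained online. Everything below (the "behaviour cue" inputs, labels, and learning rate) is invented for illustration:

```python
# Minimal sketch of online learning in a single artificial neuron.
# Inputs are invented "behaviour cues" (e.g. proximity, contact frequency);
# the target is an invented label (1 = positive interaction, 0 = negative).

def step(x):
    return 1 if x > 0 else 0

def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input cue
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = step(w[0] * x[0] + w[1] * x[1] + b)
            err = target - y  # update after every sample, i.e. online
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Invented training data: close proximity + frequent contact => positive.
data = [((1.0, 1.0), 1), ((0.9, 0.8), 1), ((0.1, 0.2), 0), ((0.0, 0.1), 0)]
w, b = train(data)
pred = step(w[0] * 0.95 + w[1] * 0.9 + b)  # a new, unseen interaction
```

The property the article relies on is visible even at this scale: the weights keep shifting as long as the incoming samples disagree with the neuron's predictions, so the behaviour tracks the input patterns.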
One of the areas the robots will be learning from is human movement.
"The physical proximity between human and robot, and the frequency of human contact - through
those things we hope to detect the emotional states we need."
The robots will not be trying to detect emotional states such as disgust but rather will focus on
states such as anger, happiness, loneliness; emotions which impact on how the robot should
behave.
'Imprinted behaviour'
"It is very important to detect when the human user is angry and the robot has done something
wrong or if the human is lonely and the robot needs to cheer him or her up.
"We are focusing on emotions relevant to a baby robot that has to grow and help humans with
everyday life."
One of the first robots built in the project is exhibiting imprinted behaviour - which is found among
birds and some mammals when born.
"They get attached to the first object they see when born.
"It is usually the mother and that's what makes them follow the mother around.
"We have a prototype of a robot that follows people around and can adapt to the way humans
interact with it.
"It follows closer or further away depending on how the human feels about it."
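The "follows closer or further away" behaviour amounts to a distance set-point nudged by feedback. A toy version of that idea (the function, step size, and bounds are all invented, not from the project):

```python
# Toy sketch: a follower robot adapts its preferred distance to feedback.
# Positive feedback (the person seems comfortable) -> follow a little closer;
# negative feedback (the person backs away) -> keep more distance.

def adapt_distance(distance, feedback, step=0.1, lo=0.5, hi=3.0):
    """Return a new preferred following distance in metres."""
    if feedback == "positive":
        distance -= step
    elif feedback == "negative":
        distance += step
    return max(lo, min(hi, distance))  # clamp to safe bounds

d = 1.5
for fb in ["positive", "positive", "negative"]:
    d = adapt_distance(d, fb)
# d has drifted closer overall: two steps in, one step back
```

The clamp matters in practice: however the feedback accumulates, the robot never tailgates closer than `lo` or drops contact beyond `hi`.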
Dr Canamero says robots that can adapt to people's behaviours are needed if the machines are to
play a part in human society.
At the end of the project two robots will be built which integrate the different aspects of the
machines being developed across Europe.
The other partners in this project are the Centre National de la Recherche Scientifique, Universite
de Cergy Pontoise, Ecole Polytechnique Federale de Lausanne, University of Portsmouth,
Institute of Communication and Computer Systems, Greece, Entertainment Robotics, Denmark
and SAS Aldebaran Robotics, France.
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=296
“The most advanced robots in the very fine movie I, Robot had the ability to interpret the emotions
of the human beings around them. They did it by analyzing the stress patterns in the voices they
heard. In phrasing it just that way, the film pays homage to an earlier computer that did just the
same thing, the HAL 9000 computer from 2001: A Space Odyssey.
Today, Affective Media Limited in Scotland is working to help computers better understand people
in various stages of emotional stress. Affective Media even has an online demo with an animated
character named Tetchy the Turtle, who accepts voice samples and analyzes them. “
The photos and much of the content in this paper were sourced from:
WWW LINKS
http://www.dmoz.org/Computers/Robotics/Robots/
http://www.headrobot.com/famousrobots.php
http://en.wikipedia.org/wiki/List_of_fictional_robots_and_androids
http://www.ai.mit.edu/projects/sociable/movies/emotion-narrative.mov
MIT Kismet
eMuu
http://technology.newscientist.com/channel/tech/robots/dn13959-strokable-robot-rabbit-talks-with-
touch.html
http://io9.com/374951/an-emotional-robot-shows-how-it-feels-++-and-is-creepily-convincing
“MIT thinks that the world needs an emo robot. That’s why they created this next-generation tiny
humanoid robot called Nexi. It’s an ‘emotional robot’ designed by roboticist Cynthia Breazeal’s
group at the MIT Media Lab. It’s known as an MDS (Mobile/Dexterous/Social) robot, which
basically means it can move its body, hands, and face in a way that suggests human emotions. Its
arms, wrists, and hands are fully adaptable to clutch and raise up to 10 pounds and by the looks of
it this thing is probably a cutter too.
It possesses changeable features including eyes, eyebrows, eyelids, and mouth movement. It
creeps us out in a whole new cartoon way. It also moves on a pair of animatedly self-balancing
wheels. So, if you hurt its feelings, it will have no problem chasing you down and killing you.”
http://web.mala.bc.ca/clemotteo/Pandora/Phil%20362/should_robots_feel.htm
http://news.bbc.co.uk/2/hi/technology/6389105.stm
http://www.livescience.com/technology/080407-nexi-robot.html
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=1072
http://www.engadget.com/2007/06/06/kansei-makes-a-comeback-with-reactive-facial-expressions/
Kansei
Kansei can make thirty-six different facial expressions (using the 19 movable parts
underneath its silicone face mask). When the robot interacts with people, the words that people use
are treated as keywords.
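The report doesn't say how Kansei's keyword matching works internally, but the behaviour it describes — scan an utterance for trigger words, pick an expression — can be sketched in a few lines. The word lists and expression names here are invented:

```python
# Hypothetical keyword -> facial-expression lookup in the spirit of the
# Kansei description; the real system draws on word-association data.
KEYWORDS = {
    "war": "fear",
    "bomb": "fear",
    "love": "happiness",
    "sushi": "happiness",
}

def expression_for(utterance):
    # First recognised keyword wins; anything else reads as neutral.
    for word in utterance.lower().split():
        if word in KEYWORDS:
            return KEYWORDS[word]
    return "neutral"

print(expression_for("I love sushi"))  # happiness
```

A real implementation would weight associations rather than take the first hit, but the lookup captures the basic pipeline: words in, expression out.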
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=882
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=296
“Dr Christian Jones, the chief executive of Affective Media, puts it this way:
"When you are depressed or sad, the pitch of your voice drops and your speech slows down.
When you are angry, the pitch rises and the volume of your voice goes up. We betray our
emotions as we talk in dozens of subtle ways. Our recognition system uses 40 of these. It ignores
the words you use, and concentrates exclusively on the sound quality of speech. It can tell your
emotional state the very first time it hears your voice."”
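None of Affective Media's 40 features are published here, but two of the cues Dr Jones names, pitch and volume, have simple signal-processing proxies. A sketch with synthetic waveforms (the signals, sample rate, and amplitudes are invented for illustration):

```python
import math

# Crude proxies for two of the cues described above: zero-crossing rate
# rises with pitch, and RMS energy tracks loudness/volume.

def zero_crossing_rate(samples):
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    return crossings / (len(samples) - 1)

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Synthetic "voices": a quiet 110 Hz tone vs a loud 440 Hz tone.
n, rate = 1000, 8000
quiet_low = [0.2 * math.sin(2 * math.pi * 110 * t / rate) for t in range(n)]
loud_high = [0.8 * math.sin(2 * math.pi * 440 * t / rate) for t in range(n)]

features = {
    "quiet_low": (zero_crossing_rate(quiet_low), rms(quiet_low)),
    "loud_high": (zero_crossing_rate(loud_high), rms(loud_high)),
}
```

The louder, higher-pitched signal scores higher on both features, which is exactly the pattern Jones describes for anger versus sadness; a production system would extract many such features per frame and classify over them.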
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=1648
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=1658
The basic idea is to identify specific landmarks on a person's face, like the eyes, eyebrows, mouth,
nose and so on. Then, a mesh can be created by connecting these landmarks. Finally, the
computer is able to create a three-dimensional mesh model of an individual face.
Carnegie Mellon University researchers are pushing this idea even further. Their active
appearance modeling software can identify faces that are partially occluded by objects.
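The landmark-and-mesh idea can be illustrated in a few lines: name some points, connect them with a fixed edge list, and the edge lengths give a simple geometric description of the face. The coordinates and edges below are invented; real systems use dozens of landmarks and fit them automatically:

```python
import math

# Toy facial "mesh": named 2-D landmarks joined by a fixed edge list.
landmarks = {
    "left_eye": (30.0, 40.0),
    "right_eye": (70.0, 40.0),
    "nose_tip": (50.0, 60.0),
    "mouth": (50.0, 80.0),
}
edges = [
    ("left_eye", "right_eye"),
    ("left_eye", "nose_tip"),
    ("right_eye", "nose_tip"),
    ("nose_tip", "mouth"),
]

def edge_lengths(points, mesh):
    """Distance along each mesh edge; together these describe the face shape."""
    out = {}
    for a, b in mesh:
        (ax, ay), (bx, by) = points[a], points[b]
        out[(a, b)] = math.hypot(bx - ax, by - ay)
    return out

lengths = edge_lengths(landmarks, edges)
```

Extending the same idea to 3-D coordinates gives the three-dimensional mesh model the passage mentions, and tracking how the edge lengths change over time is one way to read expressions off the mesh.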
http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=1186
http://www.tauzero.com/Rob_Tow/BirdsAndBeesEmotionalRobot.html
http://www.foxnews.com/story/0,2933,296616,00.html
[Photo: Zeno]
http://www.uberreview.com/2007/04/mike-the-emotional-robot.htm
Basically, Mike is a robot made by Brazilian students that can express its emotions through the
colors of its eyes: red is for anger, green is for happiness, and orange is for sadness. Now we only
need to know how a robot can feel those emotions, but that detail will be discussed in a future
post.
Mike will be presented tomorrow at the International Robotics and Artificial Intelligence Fair, taking
place in São Paulo, Brazil.
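Mike's signalling scheme is just a three-entry mapping from emotion to eye colour. As a sketch (only the emotion-to-colour pairing comes from the article; the RGB triples and fallback are invented):

```python
# Eye-colour signalling as described for Mike: red = anger,
# green = happiness, orange = sadness.
EYE_COLOURS = {
    "anger": (255, 0, 0),      # red
    "happiness": (0, 255, 0),  # green
    "sadness": (255, 165, 0),  # orange
}

def eye_colour(emotion):
    # Unlisted states fall back to a neutral white.
    return EYE_COLOURS.get(emotion, (255, 255, 255))

print(eye_colour("sadness"))
```

As the article notes, the hard part is not this display layer but deciding which emotional state the robot is in to begin with.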
http://www.cs4fn.org/alife/robot/blade.php
blade
Anthropomorphic Robot
http://www.internationalrobotics.com/comrobots_anthro_mil.htm
Techxellent Training Solutions 2008 Sue Inness 0414184033 14
"Anthropomorphic" Robots, while Animatronic in nature, are in a category of their own, simply
because they represent the highest level of expression of Animatronics from the point of view of
their more fluid and human-like body motions. International Robotics was responsible for
launching two Anthropomorphic Robot programs for its client Ford Motors, which have been used
successfully at Auto Shows throughout the world.
The Anthropomorphic Robot offered by IRI is also in a class of its own amongst other
Anthropomorphic machines, in that it is the world's first and only Robot combining a revolutionary
mix of electric motors and miniature hydraulic chambers, using plain water instead of Hydraulic
fluid. We call this "Aquadraulics". Read on, and you'll discover how truly revolutionary this
technology is, and how much sense it makes to own or rent one. Of course, we'll give it any shape
or appearance you need."
"Anthropomorphic" means having a shape like a human, and "tele-operated" means operated from a remote location by tele-presence.
http://www.androidworld.com/
http://www.takanishi.mech.waseda.ac.jp/research/voice/index.htm
Cartooning Links
http://www.tpub.com/content/draftsman/14263/css/14263_203.htm
http://www.cartoonconnections.com/toonbulletin_1.htm
http://www.toonzone.com.au/teachb.html
Emotions
http://www.hamiltoneducation.org.uk/topics/LKS2_RiseOfTheRobots.pdf