Feelix Growing

Feelix Growing is a research project, started on December 1, 2006, [1] that aims to design robots that can detect and respond to human emotional cues. It brings together 25 roboticists, developmental psychologists, and neuroscientists from six countries. [2]

The aim of the project is to build robots that "learn from humans and respond in a socially and emotionally appropriate manner". [3] The robots are designed to detect emotional cues from humans and use them to adapt their own behavior. The designers want to ease the integration of robots into human society so that robots can more readily provide services. The project aims to create robots that can "recognize" a given emotion in a human, such as anger or fear, and, after repeated interactions, adapt their behavior to the most appropriate response. [4] The project thus emphasizes development over time.

Robots are expected to be able to read emotions by picking up on physical cues like movement of body and facial muscles, posture, speed of movement, eyebrow position, [4] and distance between the human and the robot. [2] Project participants want to design the robots to detect those emotional cues that are universal to people, rather than those specific to individuals and cultures. [3]

The robots are designed not only to detect emotions in people but also to have emotions of their own. According to Dr. Lola Cañamero, who leads the project, "Emotions foster adaptation to environment, so robots would be better at learning things. For example, anything that damages the body would be painful, so a robot would learn not to do it again." [4] Cañamero says that the robots will be given the equivalent of a system of pleasure and pain. [2]

The robots will be controlled by artificial neural networks. Rather than building complex hardware, the project coordinators plan to focus on designing software and to use mostly "off the shelf" hardware that is already available. The only parts they plan to build themselves are the heads, with artificial faces capable of forming facial expressions. [3]
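The project's actual network designs are not described in the sources cited here, but the general idea of a neural network mapping observed physical cues to emotion labels can be sketched as follows. Everything in this example, including the feature set (eyebrow raise, mouth curvature, speed of movement, proximity), the emotion labels, and the network size, is an illustrative assumption, not the FEELIX GROWING design:

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["neutral", "happy", "angry"]

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class TinyEmotionNet:
    """A minimal two-layer network: cue features -> emotion probabilities."""

    def __init__(self, n_features=4, n_hidden=8, n_classes=3):
        self.W1 = rng.normal(0, 0.5, (n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_classes))
        self.b2 = np.zeros(n_classes)

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)   # hidden activations
        return softmax(self.h @ self.W2 + self.b2)

    def train(self, X, y, lr=0.5, epochs=500):
        # Plain gradient descent on cross-entropy loss.
        Y = np.eye(self.W2.shape[1])[y]            # one-hot targets
        for _ in range(epochs):
            P = self.forward(X)
            dZ2 = (P - Y) / len(X)                 # softmax + CE gradient
            dZ1 = (dZ2 @ self.W2.T) * (1 - self.h ** 2)
            self.W2 -= lr * (self.h.T @ dZ2); self.b2 -= lr * dZ2.sum(0)
            self.W1 -= lr * (X.T @ dZ1);      self.b1 -= lr * dZ1.sum(0)

    def predict(self, X):
        return [EMOTIONS[i] for i in self.forward(X).argmax(axis=-1)]

# Toy feature vectors: [eyebrow_raise, mouth_curve, speed, proximity]
X = np.array([[0.0,  0.0, 0.2, 0.5],   # neutral
              [0.3,  0.9, 0.4, 0.3],   # happy
              [0.9, -0.8, 0.9, 0.1]])  # angry
y = np.array([0, 1, 2])

net = TinyEmotionNet()
net.train(X, y)
print(net.predict(X))
```

In a system like the one the project describes, such a classifier would be only one component; the adaptation over repeated interactions would come from continuing to update the network from feedback during use rather than from a single offline training pass.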

The project, financed with 2.5 million euros by the European Commission [2] (the executive body of the European Union), is set to last three years. Participants hope to have a model of robot suitable for use in homes and hospitals by the scheduled end date.

The name Feelix is derived from the words feel, interact, and express.

References

  1. Cañamero, Lola. "FEELIX GROWING: FEEL, Interact, eXpress: a Global approach to development with interdisciplinary grounding". European Commission. Retrieved November 1, 2011.
  2. Sampedro, Javier. "¿Qué sienten las máquinas?" [What do machines feel?]. ELPAIS.com, March 4, 2007. Retrieved March 5, 2007.
  3. "Emotion robots learn from people". BBC News, February 23, 2007. Retrieved March 4, 2007.
  4. Wighton, Kate. "Robo-doc's on call today". Times Online, March 3, 2007. Retrieved March 5, 2007.