Closing the loop: from affect recognition to empathic interaction

Iolanda Leite, André Pereira, Samuel Mascarenhas (INESC-ID and Instituto Superior Técnico, Porto Salvo, Portugal; iolanda.leite@inesc-id.pt)
Ginevra Castellano (Dept. of Computer Science, School of EECS, Queen Mary Univ. of London, United Kingdom)
Carlos Martinho, Rui Prada, Ana Paiva (INESC-ID and Instituto Superior Técnico, Porto Salvo, Portugal)

ABSTRACT
Empathy is a very important capability in human social relationships. If we aim to build artificial companions (agents or robots) capable of establishing long-term relationships with users, they should be able to understand the user's affective state and react accordingly, that is, behave in an empathic manner. Recent advances in affect recognition research show that it is possible to automatically analyse and interpret affective expressions displayed by humans. However, affect recognition in naturalistic environments is still a challenging issue, and there are many unanswered questions related to how a virtual agent or a social robot should react to those states, and how that improves the interaction. We have developed a scenario in which a social robot recognises the user's affective state and displays empathic behaviours. In this paper, we present part of the results of a study assessing the influence of the robot's empathic behaviour on the user's understanding of the interaction.

Categories and Subject Descriptors
H.5.2 [Information Interfaces and Presentation]: User Interfaces; I.5.2 [Pattern Recognition]: Design Methodology; J.4 [Computer Applications]: Social and Behavioural Sciences

General Terms
Human Factors, Design, Theory.

Keywords
Affect recognition, empathy, artificial companions.

1. INTRODUCTION
Empathy plays an important role in human social interaction.
Hoffman [8] defines empathy as "an affective response more appropriate to someone else's situation than to one's own". It includes perspective taking, the understanding of the affective states of others, and the communication of a feeling of care [6]. Therefore, empathy is often related to helping behaviour and friendship: people tend to be more empathic towards friends than towards strangers [11].

Although no precise definition of the internal processes of empathy exists to date, most researchers agree that empathy has at least two phases: first, the assessment of the other's affective state and, second, a reaction (either affective responses or "cognitive" actions) taking into account the other's state. Therefore, to endow social robots or virtual agents with empathic capabilities, we need to (1) recognise some of the user's affective states and (2) define a set of empathic behaviours to be expressed by the robot taking those states into account. These two phases are equally important. As discussed by Cramer et al. [5], the incorrect assessment of the user's affective states (and consequent inappropriate empathic behaviours) can have negative effects on users' attitudes towards robots.

Our goal is to develop an empathic robot companion capable of recognising some of the user's affective states and reacting in an appropriate manner.

[Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. AFFINE'10, October 29, 2010, Firenze, Italy. Copyright 2010 ACM 978-1-4503-0170-1/10/10 ...$10.00.]
We hypothesise that with this social capability, users will be willing to continue the interaction and eventually establish a social relationship with the robot. To achieve this goal, we are developing an affect recognition system capable of detecting the user's naturalistic affective states in a real-world environment [3]. Once our companion is able to recognise some of the user's affective states, another important question arises: how can we use the knowledge about the user's state to actually improve the robot's behaviour? To evaluate the impact of empathic behaviours on people's perceptions of robotic companions, we developed a scenario in which an iCat robot observes a chess match between two players and behaves in an empathic manner by commenting on the game and disclosing its affective state. In this paper, we present part of the results of an experiment conducted within this scenario.

2. RELATED WORK
Previous studies have shown some of the benefits of modelling empathy in virtual agents [12, 13]. Compared to agents without empathic capabilities, empathic agents can better relieve user frustration [9, 10], foster empathic feelings in users [14], assist users in stressful situations [16], or even provide social support and comfort [1].

In contrast to the extensive list of related work concerning empathy in virtual agents, empathic robots have only recently started to appear. This may be due to the effort required to recognise the user's affective state in human-robot interaction. While during interaction with virtual agents the user is often in front of a computer and the affective state can be predicted, for example, from task-based information that the user provides to the agent or from predefined dialogue utterances, interaction with robots tends to be more open-ended, and thus perceiving user activity becomes a more challenging task.
Nevertheless, this is changing, considering the first working prototypes of automatic affect recognition using different modalities such as vision, speech or physiological signals [19].

Most of the research addressing empathy in human-robot interaction has focused on emotional contagion, which is a particular aspect of empathy. One such example can be found in [7], where an anthropomorphic robot that recognises a simple set of user emotions (through speech) mirrors those emotions using facial expressions while the user reads a fairy tale in an "emotional" way. In the same line of research, Riek and Robinson [17] conducted a study in which a robot with the form of a chimpanzee head mimics the user's mouth and head movements. The results of this study suggest that people interacting with the facial-mimicking robot considered the interaction more satisfactory than participants who interacted with a version of the robot without mimicking capabilities.

More recently, a study assessing the effects of empathic behaviours on people's attitudes towards robots was performed [5]. The experiment consisted of a video-based survey in which participants saw a four-minute video of an iCat robot playing a cooperative game with an actor. Depending on the condition, the robot displayed inaccurate or accurate empathic behaviour towards the actor. Results indicate that inaccurate empathic behaviours have significant negative effects on users' trust towards robots. Also, the relationship between the robot and the actor was perceived as closer by participants who watched the robot displaying accurate empathic behaviours. This study is similar to the one presented in this paper, but with some key differences in the interaction: the evaluation in the earlier study was video-based, whereas in our study subjects interacted directly with the robot and the interaction lasted at least one hour.

3. SCENARIO
We developed a scenario where the Philips iCat robot [18] observes a chess game between two players, autonomously reacting emotionally and commenting on the moves played on an electronic chessboard (see Figure 1). The robot treats the two players differently, empathising with one of them, the "companion", and behaving in a neutral way towards the other player, the "opponent".

To empathise with the companion, the iCat uses a role-taking approach. After every move played on the chessboard, the robot assesses the companion's affective state by appraising the contextual information obtained from the game. A previous study has shown that, in a game scenario context, the state of the game is relevant to discriminate the valence (positive or negative) of the player's affective state [2]. In the future, we intend to combine this information with the affect recognition system that we are developing, which also takes into account visual information such as smiles, head movements and other facial features [3].

Figure 1: Users interacting with the iCat.

After analysing the state of the game using a chess heuristic function from the perspective of the companion, the iCat predicts the companion's affective state and updates its own affective state accordingly. In this way, the robot's facial expressions are congruent with the companion's likely affective state, and the comments vary depending on whether the move was played by the companion or by the opponent (for more details on the generation of the robot's empathic behaviours, please see [15]).

When the iCat comments on the companion's moves, the comments are much more empathic, in an attempt to motivate and encourage the companion (e.g. "you're doing great, carry on!", "don't worry, you didn't have better options", ...). When the iCat comments on the opponent's moves, the utterances merely indicate the quality of the move in a very neutral way (e.g. "not a very good move", "you played well this time", ...).
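The role-taking loop described here can be sketched as a small decision rule: the change in a chess heuristic, evaluated from the companion's perspective, yields a valence; the robot mirrors that valence in its facial expression and picks an empathic comment for the companion's moves or a neutral one for the opponent's. This is a minimal, hypothetical illustration, not the authors' implementation: the function name, the `delta` signal, and the utterance tables (populated with example sentences from this section) are all assumptions.

```python
# Example utterances taken from Section 3 of the paper.
EMPATHIC = {
    "positive": "You're doing great, carry on!",
    "negative": "Don't worry, you didn't have better options.",
}
NEUTRAL = {
    "good": "You played well this time.",
    "bad": "Not a very good move.",
}

def react_to_move(delta: float, moved_by_companion: bool) -> tuple[str, str]:
    """React to one move.

    delta: change in the chess heuristic from the companion's perspective
    (positive means the board improved for the companion).
    Returns (facial_expression, comment).
    """
    # The robot's expression mirrors the companion's predicted valence.
    companion_valence = "positive" if delta >= 0 else "negative"
    expression = "happy" if companion_valence == "positive" else "sad"

    if moved_by_companion:
        # Empathic feedback: praise good moves, encourage after bad ones.
        comment = EMPATHIC[companion_valence]
    else:
        # Neutral feedback on the opponent's move quality: a move that
        # worsens the board for the companion is a good opponent move.
        comment = NEUTRAL["good" if delta < 0 else "bad"]
    return expression, comment

print(react_to_move(1.2, moved_by_companion=True))    # ('happy', "You're doing great, carry on!")
print(react_to_move(-0.8, moved_by_companion=False))  # ('sad', 'You played well this time.')
```

Note that the expression depends only on the companion's valence, never on who moved, which captures the paper's point that the robot's face stays congruent with the companion's likely state even when commenting neutrally on the opponent.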
Also, during the game, the robot looks at the companion twice as often as it does at the opponent and, while commenting on the moves, it uses the companion's name more frequently (also twice as often). The empathic and neutral behaviours displayed by the robot were inspired by characteristics of empathic teachers [4].

4. STUDY
The objective of this experiment was to evaluate people's perceptions of an empathic robot. Forty subjects took part in the experiment (36 male and 4 female, ages ranging between 18 and 28, mean age 21.5). All participants were undergraduate or graduate students recruited via email who knew how to play chess and had never interacted with the iCat before.

Participants were paired up based on their availability and, at the assigned time, were asked to play an entire game against each other with the iCat at their side commenting on the moves. At the end of the game, subjects were guided to a different room where they answered a friendship questionnaire and filled in a set of open-ended questions to assess the goals and expectations participants had when interacting with the iCat ("I liked that iCat...", "When I played bad, iCat...", "When I played well, iCat...", "When I was feeling insecure about the game, iCat..." and "What would make me interact with iCat again is...").

Two different conditions concerning the iCat's behaviour were evaluated. Players towards whom the robot behaved in an empathic manner belong to the empathic condition, and the remaining players belong to a control group (neutral). This gives 20 subjects in each condition.

Figure 2: Most frequent answers to the open question "I liked that iCat...".

4.1 Results
In this subsection we present the most interesting findings collected from the open questions that subjects were asked about their experience with the iCat. As we are working with qualitative data, the corpus was analysed manually. For each question, similar responses were categorised and associated with a label.
After that, the frequencies of these categories were analysed for each condition (neutral and empathic).

4.1.1 I liked that iCat...
Participants in both conditions stated that they liked that the iCat provided feedback on their moves, and that the robot used their names when speaking (for more details see Figure 2). In the empathic condition, almost half the subjects also mentioned that they liked the iCat because it encouraged them in the difficult moments of the game:

"iCat knew exactly the best moves I should play, and even when the game was almost lost it kept giving me hope to continue"

Another participant in the empathic condition even mentioned that the robot elicited feelings of empathy in him:

"I liked that the iCat used my name and commented my moves. Its facial expressions and movement made me feel empathy"

4.1.2 When I played bad, iCat...
In both conditions, most users acknowledged that the robot warned them about their bad moves. In addition, some of the subjects in the empathic condition answered that the iCat got sad when they played badly, and the opposite for the neutral condition (the iCat got happy). Four participants in the empathic condition also mentioned that when they played bad moves, the robot encouraged them to play better:

"The iCat got sad... but it was nice to me, saying that he was expecting more."

4.1.3 When I played well, iCat...
Almost all subjects said that the iCat congratulated them when they played good moves. Some participants in the empathic condition stated that the robot got happy when they played good moves, and some subjects in the neutral condition said that the robot got sad.
As in the previous question, eight subjects from the empathic condition also added that the robot encouraged them to play better:

"When I played a good move, iCat demonstrated his support, and I felt good with myself."

In some situations, participants in the neutral condition did not agree with the robot's evaluation of the game, yet they tried to take advantage of the situation:

"When I played well, sometimes the iCat said I didn't, but I was taking risks. In some situations I was trying to bluff and fool my opponent, and iCat's opposite comments were good for me because my opponent seemed to give lots of relevance to them."

4.1.4 When I was feeling insecure about the game, iCat...
In this question, nearly one third of the subjects indicated that they did not feel insecure in any part of the game. For the other cases, opinions differed between conditions. While around half the participants in the neutral group did not notice any differences in the iCat's behaviour, six subjects in the empathic group stated that the robot encouraged them when they felt insecure during the game.

"When I felt insecure during the game, the iCat tried to make me calm, so I could better play my next moves."

On the other hand, some of the subjects in the neutral condition recognised that the iCat supported their opponent more:

"It didn't help much... I got the feeling that iCat was supporting my opponent the whole time and didn't care about me."

4.1.5 What would make me interact with iCat again would be...
The answers to this question could be categorised into four topics: (1) subjects who would like to interact with the robot again because they had fun during the interaction, (2) subjects who would like to play against the iCat, (3) those who wanted to improve their chess skills, and (4) participants who would like to repeat the interaction as it is (playing another chess match with the iCat commenting on the game).
The frequencies of these categories for each condition are depicted in Figure 3. While in the neutral condition almost half the subjects would like to interact with the iCat again just for fun, participants in the empathic group would like to play another game in this same setting and improve their chess skills.

In addition to these motives, some of the participants would also like the iCat to explain the reasons for its comments in more detail, for instance, why a certain move was good or bad, or to suggest other moves when users play a bad move. Furthermore, some users in both conditions claimed that they would like to perform other types of activities with the iCat.

Figure 3: Most frequent answers to the open question "What would make me interact with iCat again...".

5. CONCLUSIONS AND FUTURE WORK
In the last few years, promising methods for affect recognition in real-world settings have been reported in the literature, some of which can be extremely useful in human-robot interaction. However, robots capable of recognising the user's affect in real time and selecting appropriate responses taking the user's state into account are still not numerous. We believe that the latter capability is as important as the former: what is the advantage of an accurate affect recognition system if, in the end, the robot behaves in the same way? Therefore, while working on an affect recognition system, we are also addressing aspects of empathy and how they may influence the relationship established between the user and the robot.

This paper presented the results of a study on people's perceptions of a robot displaying empathic and neutral behaviours. By analysing the answers that users provided, we can conclude that the empathic behaviours of the robot were well recognised by users. Participants towards whom the iCat behaved in an empathic manner found the robot more encouraging and more sensitive to their feelings.
Also, more subjects from the empathic condition would like to interact with the robot again in this scenario.

This study has some limitations in terms of the sample. Ideally the sample should be gender-balanced, but only four women participated in this experiment, as it was performed at a computer science university where most students are male.

In the future, we intend to integrate into this scenario an affect recognition system that considers not only the context of the task but also visual information from the user, and to perform an experiment with repeated interactions (the same users playing several games). With more accurate information on the user's affect, the robot should be able to respond to its companion in an even more socially acceptable manner. We are also planning to improve the robot's responses by, in the long term, adapting certain empathic behaviours to a particular user.

6. ACKNOWLEDGMENTS
This work was supported by the EU FP7 ICT-215554 project LIREC (LIving with Robots and intEractive Companions), by FCT (INESC-ID multiannual funding) through the PIDDAC Program funds, and by three scholarships (SFRH/BD 41358/2007, 41585/2007 and 62174/2009) granted by FCT.

7. REFERENCES
[1] T. Bickmore and D. Schulman. Practical approaches to comforting users with relational agents. In CHI '07 Extended Abstracts on Human Factors in Computing Systems, pages 2291–2296, New York, NY, USA, 2007. ACM.
[2] G. Castellano, I. Leite, A. Pereira, C. Martinho, A. Paiva, and P. McOwan. It's all in the game: Towards an affect sensitive and context aware game companion. In Proceedings of the 3rd International Conference on Affective Computing and Intelligent Interaction (ACII 2009), pages 1–8, Sept. 2009.
[3] G. Castellano, A. Pereira, I. Leite, A. Paiva, and P. W. McOwan. Detecting user engagement with a robot companion using task and social interaction-based features. In ICMI-MLMI '09: Proceedings of the 2009 International Conference on Multimodal Interfaces, pages 119–126, New York, NY, USA, 2009. ACM.
[4] B. Cooper, P. Brna, and A. Martins. Effective affective in intelligent systems: building on evidence of empathy in teaching and learning. In A. Paiva, editor, IWAI, volume 1814 of Lecture Notes in Computer Science, pages 21–34. Springer, 1999.
[5] H. Cramer, J. Goddijn, B. Wielinga, and V. Evers. Effects of (in)accurate empathy and situational valence on attitudes towards robots. In HRI '10: Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction, pages 141–142, New York, NY, USA, 2010. ACM.
[6] A. P. Goldstein and G. Y. Michaels. Empathy: Development, Training, and Consequences. New American Library, 1985.
[7] F. Hegel, T. Spexard, T. Vogt, G. Horstmann, and B. Wrede. Playing a different imitation game: Interaction with an empathic android robot. In Proc. 2006 IEEE-RAS International Conference on Humanoid Robots (Humanoids06), pages 56–61, 2006.
[8] M. Hoffman. History of the concept of empathy. Cambridge University Press, 2000.
[9] K. Hone. Empathic agents to reduce user frustration: The effects of varying agent characteristics. Interacting with Computers, 18(2):227–245, 2006.
[10] J. Klein, Y. Moon, and R. W. Picard. This computer responds to user frustration. In CHI '99 Extended Abstracts on Human Factors in Computing Systems, pages 242–243, New York, NY, USA, 1999. ACM.
[11] D. L. Krebs. Altruism: An examination of the concept and a review of the literature. Psychological Bulletin, 73(4):258–302, 1970.
[12] S. W. McQuiggan and J. C. Lester. Modeling and evaluating empathy in embodied companion agents. International Journal of Human-Computer Studies, 65(4):348–360, 2007.
[13] M. Ochs, C. Pelachaud, and D. Sadek. An empathic virtual dialog agent to improve human-machine interaction. In AAMAS '08: Proceedings of the 7th International Joint Conference on Autonomous Agents and Multiagent Systems, pages 89–96, Richland, SC, 2008. International Foundation for Autonomous Agents and Multiagent Systems.
[14] A. Paiva, J. Dias, D. Sobral, R. Aylett, P. Sobreperez, S. Woods, C. Zoll, and L. Hall. Caring for agents and agents that care: Building empathic relations with synthetic agents. In AAMAS '04: Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems, pages 194–201, Washington, DC, USA, 2004. IEEE Computer Society.
[15] A. Pereira, I. Leite, S. Mascarenhas, C. Martinho, and A. Paiva. Using empathy to improve human-robot relationships. In Proceedings of the 3rd International Conference on Human-Robot Personal Relationships. Springer, 2010.
[16] H. Prendinger and M. Ishizuka. The Empathic Companion: A character-based interface that addresses users' affective states. Applied Artificial Intelligence, 19(3-4):267–285, 2005.
[17] L. D. Riek, P. C. Paul, and P. Robinson. When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry. Journal on Multimodal User Interfaces, 3(1-2):99–108, 2010.
[18] A. van Breemen, X. Yan, and B. Meerbeek. iCat: an animated user-interface robot with personality. In AAMAS '05: Proceedings of the Fourth International Joint Conference on Autonomous Agents and Multiagent Systems, pages 143–144, New York, NY, USA, 2005. ACM.
[19] Z. Zeng, M. Pantic, G. I. Roisman, and T. S. Huang. A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(1):39–58, January 2009.