ECA control using a single affective User dimension

Fred Charles, Florian Pecune, Gabor Aranyi, Catherine Pelachaud, Marc Cavazza*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

User interaction with Embodied Conversational Agents (ECA) should involve a significant affective component to achieve realism in communication. This aspect has been studied through different frameworks describing the relationship between user and ECA, for instance alignment, rapport and empathy. We conducted an experiment to explore how an ECA's non-verbal expression can be controlled to respond to a single affective dimension generated by users as input. Our system is based on the mapping of a high-level affective dimension, approach/avoidance, onto a new ECA control mechanism in which Action Units (AU) are activated through a neural network. Since 'approach' has been associated with prefrontal cortex activation, we use a measure of prefrontal cortex left-asymmetry through fNIRS as a single input signal representing the user's attitude towards the ECA. We carried out the experiment with 10 subjects, who were instructed to express a positive mental attitude towards the ECA; in return, the ECA's facial expression reflected the perceived attitude under a neurofeedback paradigm. Our results suggest that users are able to successfully interact with the ECA and perceive its response as consistent and realistic, both in terms of ECA responsiveness and in terms of the relevance of its facial expressions. From a system perspective, the empirical calibration of the network supports a progressive recruitment of various AUs, which provides a principled description of the ECA response and its intensity. Our findings suggest that complex ECA facial expressions can be successfully aligned with one high-level affective dimension. Furthermore, this use of a single dimension as input could support experiments in the fine-tuning of AU activation or their personalization to user-preferred modalities.
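To make the described pipeline concrete, the sketch below illustrates the general idea of the abstract in Python: a single approach/avoidance score derived from prefrontal fNIRS left-asymmetry is mapped through a small network onto AU intensities, with staggered onsets standing in for the "progressive recruitment" of AUs. All names, weights, thresholds and the choice of AUs are hypothetical illustrations, not the authors' actual calibration or implementation.

# Illustrative sketch only: maps one scalar affective signal (fNIRS prefrontal
# left-asymmetry) onto Action Unit (AU) intensities via a one-layer network.
# Every numeric value below is a made-up placeholder.
import numpy as np

# Hypothetical AUs, listed in the order they would be recruited.
AU_NAMES = ["AU12", "AU6", "AU25", "AU1"]

def left_asymmetry(left_hbo: np.ndarray, right_hbo: np.ndarray) -> float:
    """Toy asymmetry score from left/right prefrontal oxy-Hb time series."""
    return float(np.mean(left_hbo) - np.mean(right_hbo))

def au_activations(asym: float) -> np.ndarray:
    """Scalar input -> per-AU intensity in [0, 1].

    Each AU has its own (gain, bias); the staggered biases switch AUs on one
    after another as the asymmetry grows, mimicking progressive recruitment.
    """
    gains = np.array([4.0, 3.0, 2.5, 2.0])        # hypothetical gains
    biases = np.array([-0.5, -1.5, -2.5, -3.5])   # staggered onsets
    return 1.0 / (1.0 + np.exp(-(gains * asym + biases)))  # sigmoid

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = 0.8 + 0.05 * rng.standard_normal(100)   # simulated oxy-Hb samples
    right = 0.3 + 0.05 * rng.standard_normal(100)
    a = left_asymmetry(left, right)
    for name, level in zip(AU_NAMES, au_activations(a)):
        print(f"{name}: {level:.2f}")

Running the sketch prints decreasing intensities across the four AUs for a moderately positive asymmetry, which is the qualitative behaviour the abstract attributes to the calibrated network.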

Original language: English
Title of host publication: ICMI 2015 - Proceedings of the 2015 ACM International Conference on Multimodal Interaction
Publisher: Association for Computing Machinery, Inc
Pages: 183-190
Number of pages: 8
ISBN (Electronic): 9781450339124
DOIs
Publication status: Published - 2015 Nov 9
Externally published: Yes
Event: ACM International Conference on Multimodal Interaction, ICMI 2015 - Seattle, United States
Duration: 2015 Nov 9 - 2015 Nov 13

Publication series

Name: ICMI 2015 - Proceedings of the 2015 ACM International Conference on Multimodal Interaction

Conference

Conference: ACM International Conference on Multimodal Interaction, ICMI 2015
Country/Territory: United States
City: Seattle
Period: 2015 Nov 9 - 2015 Nov 13

Keywords

  • Brain-computer interface
  • Embodied Conversational Agent
  • Functional near-infrared spectroscopy (fNIRS)

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Human-Computer Interaction
