Allocentric emotional affordances in HRI: The multimodal binding

Jordi Vallverdú*, Gabriele Trovato, Lorenzo Jamone

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

The concept of affordance perception is one of the distinctive traits of human cognition, and its application to robots can dramatically improve the quality of human-robot interaction (HRI). In this paper we explore and discuss the idea of “emotional affordances” by proposing a viable model for implementation into HRI, which considers allocentric and multimodal perception. We consider “two-way” affordances: a perceived object triggering an emotion, and a perceived human emotional expression triggering an action. In order to make the implementation generic, the proposed model includes a library that can be customised depending on the specific robot and application scenario. We present the AAA (Affordance-Appraisal-Arousal) model, which incorporates Plutchik’s Wheel of Emotions, and we outline some numerical examples of how it can be used in different scenarios.
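
As an illustrative aside (not part of the original record), the following is a minimal sketch of how the two-way, customisable affordance library described in the abstract might be organised in code. All class names, mappings, and the arousal scale below are hypothetical assumptions, not the authors' implementation; only Plutchik's eight primary emotions are taken as given labels.

    # Hypothetical sketch of a two-way emotional affordance library.
    # NOT the authors' implementation: names, mappings, and the arousal
    # scale are illustrative assumptions only.

    from dataclasses import dataclass, field

    # Plutchik's eight primary emotions (Wheel of Emotions).
    PLUTCHIK_PRIMARY = (
        "joy", "trust", "fear", "surprise",
        "sadness", "disgust", "anger", "anticipation",
    )

    @dataclass
    class EmotionalAffordanceLibrary:
        """Customisable library for a specific robot and scenario.

        Holds the two affordance directions described in the abstract:
        - object_to_emotion: a perceived object triggers an emotion
        - emotion_to_action: a perceived human emotion triggers a robot action
        """
        object_to_emotion: dict[str, tuple[str, float]] = field(default_factory=dict)
        emotion_to_action: dict[str, str] = field(default_factory=dict)

        def add_object_affordance(self, obj: str, emotion: str, arousal: float) -> None:
            if emotion not in PLUTCHIK_PRIMARY:
                raise ValueError(f"unknown emotion label: {emotion}")
            self.object_to_emotion[obj] = (emotion, arousal)

        def add_emotion_affordance(self, emotion: str, action: str) -> None:
            if emotion not in PLUTCHIK_PRIMARY:
                raise ValueError(f"unknown emotion label: {emotion}")
            self.emotion_to_action[emotion] = action

        def appraise_object(self, obj: str) -> tuple[str, float] | None:
            """Direction 1: perceived object -> (emotion, arousal)."""
            return self.object_to_emotion.get(obj)

        def respond_to_emotion(self, emotion: str) -> str | None:
            """Direction 2: perceived human emotion -> robot action."""
            return self.emotion_to_action.get(emotion)

    # Example customisation for one hypothetical scenario.
    library = EmotionalAffordanceLibrary()
    library.add_object_affordance("sharp_knife", "fear", arousal=0.8)
    library.add_emotion_affordance("sadness", "offer_comforting_gesture")

    print(library.appraise_object("sharp_knife"))   # ('fear', 0.8)
    print(library.respond_to_emotion("sadness"))    # 'offer_comforting_gesture'

The split into two dictionaries simply mirrors the two affordance directions in the abstract; a concrete system would populate them per robot and per application scenario.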

Original language: English
Article number: 78
Journal: Multimodal Technologies and Interaction
Volume: 2
Issue number: 4
DOIs
Publication status: Published - Dec 2018
Externally published: Yes

Keywords

  • Affordance
  • Allocentric
  • Emotion
  • Empathy
  • HRI
  • Libraries
  • Multimodal

ASJC Scopus subject areas

  • Neuroscience (miscellaneous)
  • Human-Computer Interaction
  • Computer Science Applications
  • Computer Networks and Communications
