Emotional affordances for human–robot interaction

Jordi Vallverdú*, Gabriele Trovato

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

25 Citations (Scopus)


This paper introduces a new concept for improving human–robot interaction (HRI) models: 'emotional affordances'. Emotional affordances are all the mechanisms that carry emotional content as a way of transmitting and/or collecting emotional meaning about a context; they include, among others, bodily expressions, social norms, value-laden objects and extended space. This rich concept opens new ways of understanding the multimodal and complex nature of emotional mechanisms. Building on the grounded emotional mechanisms of human cognition and behaviour (that is, mechanisms based on, and resulting from, the bodily structure and its coupled relationship with the natural and/or social environment), the purpose of this paper is to define a framework for the design of a taxonomy of emotional affordances, useful for a modal and improved understanding of the domains of emotional interaction that can emerge between humans and robots. This process will make it possible, in subsequent research steps, to define processing modules as well as to elicit visual display outputs (expressing emotions). Consequently, this project provides robotics experts with a unified taxonomy of human emotional affordances, useful for the improvement of HRI projects.

Original language: English
Pages (from-to): 320-334
Number of pages: 15
Journal: Adaptive Behavior
Issue number: 5
Publication status: Published - 1 Oct 2016
Externally published: Yes


Keywords

  • Emotional affordance
  • affective libraries
  • cognitive libraries
  • emotional taxonomy
  • grounded cognition
  • grounded emotion
  • human–robot interaction
  • modelling

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Philosophy
  • Artificial Intelligence
