DAVID: An open-source platform for real-time transformation of infra-segmental emotional cues in running speech

Laura Rachman*, Marco Liuni, Pablo Arias, Andreas Lind, Petter Johansson, Lars Hall, Daniel Richardson, Katsumi Watanabe, Stéphanie Dubal, Jean-Julien Aucouturier

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)

Abstract

We present an open-source software platform that transforms the emotional cues expressed by speech signals using audio effects such as pitch shifting, inflection, vibrato, and filtering. The emotional transformations can be applied to any audio file, and can also run in real time on live input from a microphone, with less than 20 ms of latency. We anticipate that this tool will be useful for the study of emotions in psychology and neuroscience, because it affords a high level of control over the acoustic and emotional content of experimental stimuli in a variety of laboratory situations, including real-time social situations. We also report the results of a series of validation experiments assessing the tool against several methodological requirements: that the transformed emotions be recognized at above-chance levels, remain valid across several languages (French, English, Swedish, and Japanese), and sound comparably natural to unprocessed speech.

Original language: English
Pages (from-to): 323-343
Number of pages: 21
Journal: Behavior Research Methods
Volume: 50
Issue number: 1
DOIs
Publication status: Published - Feb 1, 2018

Keywords

  • Emotional transformations
  • Infra-segmental cues
  • Nonverbal behavior
  • Real-time
  • Software
  • Voice

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Psychology (miscellaneous)
  • Psychology (all)
