The empathic companion: A character-based interface that addresses users' affective states

Helmut Prendinger*, Mitsuru Ishizuka

*Corresponding author for this work

Research output: Article › peer-review

148 Citations (Scopus)

Abstract

In this paper, we report on our efforts in developing affective character-based interfaces, i.e., interfaces that recognize and measure the user's affective information and address user affect by employing embodied characters. In particular, we describe the Empathic Companion, an animated interface agent that accompanies the user in the setting of a virtual job interview. This interface application takes a user's physiological data (skin conductance and electromyography) in real time, interprets them as emotions, and addresses the user's affective states in the form of empathic feedback. The Empathic Companion is conceived as an educational agent that supports job seekers in preparing for a job interview. We also present results from an exploratory study that aims to evaluate the impact of the Empathic Companion by measuring users' skin conductance and heart rate. While an overall positive effect of the Empathic Companion could not be shown, the outcome of the experiment suggests that empathic feedback has a positive effect on the interviewee's stress level while listening to the interviewer's questions.
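
For illustration only, the following minimal sketch (not taken from the paper) shows how real-time skin conductance and electromyography readings might be interpreted as coarse emotion labels in a two-dimensional arousal-valence space. The function name, baselines, thresholds, and the quadrant-to-emotion mapping are all assumptions for the sketch.

    # Hypothetical sketch: map normalized physiological readings to an emotion
    # label. Skin conductance is used as an arousal proxy; EMG (frown-muscle
    # activity) as a negative-valence proxy. Baselines are arbitrary units.

    def interpret_emotion(skin_conductance: float, emg: float,
                          sc_baseline: float = 2.0, emg_baseline: float = 1.0) -> str:
        """Return a coarse emotion label for one pair of sensor readings."""
        arousal_high = skin_conductance > sc_baseline   # elevated conductance -> aroused
        valence_negative = emg > emg_baseline           # elevated EMG -> negative valence

        if arousal_high and valence_negative:
            return "frustrated"   # high arousal, negative valence
        if arousal_high:
            return "joyful"       # high arousal, positive valence
        if valence_negative:
            return "sad"          # low arousal, negative valence
        return "relaxed"          # low arousal, positive valence

    if __name__ == "__main__":
        # Example: a stressed interviewee with elevated conductance and frown activity.
        print(interpret_emotion(skin_conductance=3.5, emg=1.8))  # -> "frustrated"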

Original language: English
Pages (from-to): 267-285
Number of pages: 19
Journal: Applied Artificial Intelligence
Volume: 19
Issue number: 3-4
DOI
Publication status: Published - Mar 2005
Published externally: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Artificial Intelligence
