Understanding nonverbal communication cues of human personality traits in human-robot interaction

Zhihao Shen*, Armagan Elibol, Nak Young Chong


Research output: Article, peer-reviewed

20 Citations (Scopus)


With the increasing presence of robots in daily life, there is a strong demand for strategies that achieve high-quality interaction between robots and users by enabling robots to understand users' mood, intention, and other internal states. During human-human interaction, personality traits strongly influence human behavior, decisions, and mood. We therefore propose an efficient computational framework that endows a robot with the capability of understanding a user's personality traits from the user's nonverbal communication cues, represented by three visual features (head motion, gaze, and body motion energy) and three vocal features (voice pitch, voice energy, and mel-frequency cepstral coefficients (MFCCs)). In this study, the Pepper robot serves as a communication robot that interacts with each participant by asking questions while extracting the nonverbal features from the participant's habitual behavior using its on-board sensors. Independently, each participant's personality traits are assessed with a questionnaire. We then train ridge regression and linear support vector machine (SVM) classifiers on the nonverbal features and the questionnaire-derived personality trait labels, and evaluate the classifiers' performance. The proposed models show promising binary classification performance in recognizing each of the Big Five personality traits of the participants from individual differences in nonverbal communication cues.
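The training pipeline described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the six-dimensional feature vectors (head motion, gaze, body motion energy, voice pitch, voice energy, mean MFCC) and the binary trait labels are synthesized here, and the regularization settings are assumed defaults rather than the authors' choices.

```python
import numpy as np
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# One row per participant; six nonverbal features standing in for
# head motion, gaze, body motion energy, voice pitch, voice energy,
# and mean MFCC. Values are synthetic placeholders.
n_participants = 40
X = rng.normal(size=(n_participants, 6))

# Binary labels for one Big Five trait (e.g., high vs. low extraversion),
# which in the paper come from a questionnaire; synthesized here.
y = rng.integers(0, 2, size=n_participants)

# Two linear models of the kind named in the abstract, each with
# feature standardization and 5-fold cross-validated accuracy.
results = {}
for name, clf in [("ridge", RidgeClassifier(alpha=1.0)),
                  ("linear_svm", LinearSVC(C=1.0))]:
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=5)
    results[name] = scores.mean()
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```

In practice, one such binary classifier would be trained per trait, giving five independent high/low decisions per participant.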

Journal: IEEE/CAA Journal of Automatica Sinica
Publication status: Published - November 2020

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Information Systems
  • Artificial Intelligence
