TY - GEN
T1 - Discrimination of emotion from movement and addition of emotion in movement to improve human-coexistence robot's personal affinity
AU - Matsumaru, Takafumi
PY - 2009
Y1 - 2009
AB - This paper presents the results of trials on the discrimination of emotion from movement and the addition of emotion to movement on a teddy bear robot, aiming both to express a robot's emotion through movement and to improve the robot's personal affinity. Four kinds of emotion were addressed: joy, anger, sadness, and fear. Two standpoints, a performer's and an observer's, were considered in establishing the emotional-movement data used for analysis: the movement data were collected from the performer's standpoint and sorted from the observer's standpoint. To discriminate the emotion contained in a movement from the movement data, both a method using Laban's feature quantities and a method using principal component analysis were tried. Discrimination using principal component analysis achieved a correct-discrimination rate of about 70% across all four emotions. The movement features by which each emotion can be interpreted were inferred from the coefficients of the discrimination function obtained through the principal component analysis, and from these features a design principle for adding emotion to a basic movement was defined. The verification experiment suggested that, for joy and anger, movements from which people can interpret the intended emotion with relatively high probability can be produced. For fear and sadness, since the movements expressing these emotions consist of small and subtle motions, it is difficult to distinguish their features and to produce clearly emotional movement.
UR - http://www.scopus.com/inward/record.url?scp=72849123247&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=72849123247&partnerID=8YFLogxK
U2 - 10.1109/ROMAN.2009.5326345
DO - 10.1109/ROMAN.2009.5326345
M3 - Conference contribution
AN - SCOPUS:72849123247
SN - 9781424450817
T3 - Proceedings - IEEE International Workshop on Robot and Human Interactive Communication
SP - 387
EP - 394
BT - RO-MAN 2009 - 18th IEEE International Symposium on Robot and Human Interactive Communication
T2 - 18th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2009
Y2 - 27 September 2009 through 2 October 2009
ER -