TY - JOUR
T1 - Analysis of implicit robot control methods for joint task execution
AU - Guinot, Lena
AU - Ando, Kozo
AU - Takahashi, Shota
AU - Iwata, Hiroyasu
N1 - Funding Information:
This work was supported by JST SPRING, Grant Number JPMJSP2128, Waseda University Global Robot Academia Institute, Waseda University Green Computing Systems Research Organization and by JST ERATO Grant Number JPMJER1701.
Publisher Copyright:
© 2023, The Author(s).
PY - 2023/12
Y1 - 2023/12
N2 - Body language is an essential component of communication. The amount of unspoken information it conveys during interpersonal interactions is an invaluable complement to speech alone and makes the process smoother and more sustainable. In contrast, existing approaches to human–machine collaboration and communication are not as intuitive. This is an issue that needs to be addressed if we aim to continue using artificial intelligence and machines to augment our cognitive or even physical capabilities. In this study, we analyse the potential of an intuitive communication method between biological and artificial agents, based on machines understanding and learning the subtle unspoken and involuntary cues found in human motion during the interaction process. Our work was divided into two stages: the first, analysing whether a machine using these implicit cues would produce the same positive effect as when they are manifested in interpersonal communication; the second, evaluating whether a machine could identify the cues manifested in human motion and learn (through the use of Long Short-Term Memory networks) to associate them with the appropriate command intended by its user. Promising results were obtained, showing improved work performance and reduced cognitive load on the user's side when relying on the proposed method, hinting at the potential of more intuitive, human-to-human-inspired communication methods in human–machine interaction.
AB - Body language is an essential component of communication. The amount of unspoken information it conveys during interpersonal interactions is an invaluable complement to speech alone and makes the process smoother and more sustainable. In contrast, existing approaches to human–machine collaboration and communication are not as intuitive. This is an issue that needs to be addressed if we aim to continue using artificial intelligence and machines to augment our cognitive or even physical capabilities. In this study, we analyse the potential of an intuitive communication method between biological and artificial agents, based on machines understanding and learning the subtle unspoken and involuntary cues found in human motion during the interaction process. Our work was divided into two stages: the first, analysing whether a machine using these implicit cues would produce the same positive effect as when they are manifested in interpersonal communication; the second, evaluating whether a machine could identify the cues manifested in human motion and learn (through the use of Long Short-Term Memory networks) to associate them with the appropriate command intended by its user. Promising results were obtained, showing improved work performance and reduced cognitive load on the user's side when relying on the proposed method, hinting at the potential of more intuitive, human-to-human-inspired communication methods in human–machine interaction.
KW - Human–robot interaction
KW - Human–machine cooperation
KW - Implicit control
KW - Machine learning
UR - http://www.scopus.com/inward/record.url?scp=85160340113&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85160340113&partnerID=8YFLogxK
U2 - 10.1186/s40648-023-00249-9
DO - 10.1186/s40648-023-00249-9
M3 - Article
AN - SCOPUS:85160340113
SN - 2197-4225
VL - 10
JO - ROBOMECH Journal
JF - ROBOMECH Journal
IS - 1
M1 - 12
ER -