TY - GEN
T1 - Development of a cooperative work method based on autonomous learning of implicit instructions
AU - Guinot, Lena
AU - Iwasaki, Yukiko
AU - Takahashi, Shota
AU - Iwata, Hiroyasu
N1 - Publisher Copyright:
© 2020 Association for Computing Machinery.
PY - 2020/5/27
Y1 - 2020/5/27
N2 - Cooperative work with wearable robotics designed as an “extended body” for the wearer has the potential to improve individual productivity regardless of the context. The final purpose of this research is to design a new communication method between the wearer and a wearable robot arm as they perform daily chores simultaneously. Among previous studies on wearable robot arms, very few quantify the magnitude of the impact of robot operation on attention distribution and psychological burden for the user. This paper presents an approach based on the idea that the robot arm could understand human intentions by reading implicit instruction cues nested in the natural motion flow of the operator performing a task. It describes a deep-learning approach, driven by Inertial Measurement Unit (IMU) sensor data, that enables the robot arm to learn these cues. The validity of the method was evaluated on three indexes: implicit instruction estimation accuracy, secondary task completion quality, and cognitive burden for the wearer. Results showed considerable improvement along all three axes compared with explicit operation methods (such as voice instructions), and better results than similar implicit instruction-based studies.
AB - Cooperative work with wearable robotics designed as an “extended body” for the wearer has the potential to improve individual productivity regardless of the context. The final purpose of this research is to design a new communication method between the wearer and a wearable robot arm as they perform daily chores simultaneously. Among previous studies on wearable robot arms, very few quantify the magnitude of the impact of robot operation on attention distribution and psychological burden for the user. This paper presents an approach based on the idea that the robot arm could understand human intentions by reading implicit instruction cues nested in the natural motion flow of the operator performing a task. It describes a deep-learning approach, driven by Inertial Measurement Unit (IMU) sensor data, that enables the robot arm to learn these cues. The validity of the method was evaluated on three indexes: implicit instruction estimation accuracy, secondary task completion quality, and cognitive burden for the wearer. Results showed considerable improvement along all three axes compared with explicit operation methods (such as voice instructions), and better results than similar implicit instruction-based studies.
KW - Deep learning
KW - IMU sensor
KW - Interface design
KW - Robot collaboration
KW - Robot control
UR - http://www.scopus.com/inward/record.url?scp=85123041939&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123041939&partnerID=8YFLogxK
U2 - 10.1145/3396339.3396403
DO - 10.1145/3396339.3396403
M3 - Conference contribution
AN - SCOPUS:85123041939
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 11th Augmented Human International Conference, AH 2020
PB - Association for Computing Machinery
T2 - 11th Augmented Human International Conference, AH 2020
Y2 - 27 May 2020 through 29 May 2020
ER -