TY - GEN
T1 - Touchless human-mobile robot interaction using a projectable interactive surface
AU - Agarwal, R.
AU - Sharma, P.
AU - Saha, S. K.
AU - Matsumaru, T.
N1 - Funding Information:
The research was supported by a grant from BRNS, India, towards setting up a Programme for Autonomous Robotics. The collaborative work was supported by the Graduate School of IPS, Waseda University, through its Grant for Special Research Project (2015B-3461, 2016B-203). The authors acknowledge J. P. Khatait for his perceptive criticism, and Mr. Vishal Abhishek and members of the Robotics Club for their help.
Publisher Copyright:
© 2016 IEEE.
PY - 2017/2/6
Y1 - 2017/2/6
N2 - This paper showcases the development of a mobile robot integrated with a Projectable Interactive Surface to facilitate its interaction with human users. The system was designed to interact with users of any physical attributes, such as height or arm span, without re-calibration, and in such a way that the human need not come into physical contact with the robot to give it instructions. The system uses a projector to render a virtual display on the ground, allowing large displays to be projected. A Microsoft Kinect integrated in the system performs the dual function of tracking the user's movements and mapping the surrounding environment. The gestures of the tracked user are interpreted, and an audio-visual signal is projected by the robot in response.
AB - This paper showcases the development of a mobile robot integrated with a Projectable Interactive Surface to facilitate its interaction with human users. The system was designed to interact with users of any physical attributes, such as height or arm span, without re-calibration, and in such a way that the human need not come into physical contact with the robot to give it instructions. The system uses a projector to render a virtual display on the ground, allowing large displays to be projected. A Microsoft Kinect integrated in the system performs the dual function of tracking the user's movements and mapping the surrounding environment. The gestures of the tracked user are interpreted, and an audio-visual signal is projected by the robot in response.
UR - http://www.scopus.com/inward/record.url?scp=85015437132&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85015437132&partnerID=8YFLogxK
U2 - 10.1109/SII.2016.7844085
DO - 10.1109/SII.2016.7844085
M3 - Conference contribution
AN - SCOPUS:85015437132
T3 - SII 2016 - 2016 IEEE/SICE International Symposium on System Integration
SP - 723
EP - 728
BT - SII 2016 - 2016 IEEE/SICE International Symposium on System Integration
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 IEEE/SICE International Symposium on System Integration, SII 2016
Y2 - 13 December 2016 through 15 December 2016
ER -