Embodied navigation for mobile robot by using direct 3D drawing in the air

Akihiro Osaki*, Tetsuji Kaneko, Yoshiyuki Miwa

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    3 Citations (Scopus)

    Abstract

    This paper proposes a method for navigating a mobile robot in a human environment through interaction with virtual 3D lines handwritten in the air by a user who coexists with the robot. The objective of this navigation is to let the user flexibly and easily instruct an appropriate route for a mobile robot, adapting to various situations through embodied interaction between robots and users. The developed system integrates an aerial 3D drawing interface with the robot control system. The mobile robots and the drawing hand position are measured simultaneously in real time by 6-DOF sensors. The 3D drawing interface allows a user not only to draw 3D lines in the air based on the sensed position in the real world, but also to push, grasp, and throw a drawn line manually on site. Each mobile robot is distinguished by an association with the color of a drawn line and individually searches for its own line in real time. With this method, the user can therefore set and select a specific route for each robot individually, and change the route by drawing or manipulating lines while the robots are moving. We conducted navigation experiments using two omni-wheel mobile robots and examined the aforementioned navigation functions. Additionally, we tried advanced navigation that can be achieved only by drawing, such as dog-walking-style guidance in which a virtual line is likened to a leash. As a result, this method enables a user to instruct complex paths easily and to change the route flexibly, through interaction with virtual lines, in response to changing environments.
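
    A minimal sketch of the line-following idea described above (not the authors' implementation): each robot is bound to one line colour, picks out "its" hand-drawn 3D polyline among all drawn lines, and steers toward the next waypoint of that line projected onto the floor plane. The class names, the colour tagging, the floor-plane projection, and all gains below are illustrative assumptions.

        import math
        from dataclasses import dataclass

        @dataclass
        class DrawnLine:
            color: str      # colour tag assigned when the line is drawn
            points: list    # (x, y, z) samples from the 6-DOF drawing device

        @dataclass
        class OmniRobot:
            color: str      # colour this robot is associated with
            x: float = 0.0
            y: float = 0.0
            next_idx: int = 0   # index of the waypoint currently being tracked

            def find_own_line(self, lines):
                """Select the drawn line whose colour matches this robot
                (assumption: at most one line per colour)."""
                return next((l for l in lines if l.color == self.color), None)

            def velocity_command(self, lines, speed=0.2, reach_tol=0.05):
                """Return a floor-plane velocity (vx, vy) toward the next waypoint
                of the robot's own line; (0, 0) when the line is finished or absent."""
                line = self.find_own_line(lines)
                if line is None or self.next_idx >= len(line.points):
                    return 0.0, 0.0
                wx, wy, _ = line.points[self.next_idx]   # drop z: project the aerial point to the floor
                dx, dy = wx - self.x, wy - self.y
                dist = math.hypot(dx, dy)
                if dist < reach_tol:                     # waypoint reached: advance along the line
                    self.next_idx += 1
                    return self.velocity_command(lines, speed, reach_tol)
                return speed * dx / dist, speed * dy / dist

        # Usage: two robots, each following only the line of its own colour.
        lines = [
            DrawnLine("red",  [(0.0, 0.0, 1.2), (0.5, 0.0, 1.1), (1.0, 0.5, 1.0)]),
            DrawnLine("blue", [(0.0, 1.0, 1.2), (0.5, 1.5, 1.1)]),
        ]
        red_robot = OmniRobot("red")
        print(red_robot.velocity_command(lines))   # velocity toward the red line's first unreached waypoint

    Because a line can be redrawn or manipulated while the robots move, a real controller would re-evaluate such a command on every sensor update; the per-colour association is simply what lets each robot ignore lines drawn for another robot.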

    Original language: English
    Title of host publication: Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
    Pages: 671-676
    Number of pages: 6
    DOIs
    Publication status: Published - 2008
    Event: 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN - Munich
    Duration: 2008 Aug 1 – 2008 Aug 3

    Other

    Other: 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
    City: Munich
    Period: 08/8/1 – 08/8/3

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Vision and Pattern Recognition
    • Human-Computer Interaction
