TY - GEN
T1 - BlindPilot: A Robotic Local Navigation Assistant for Blind People
T2 - 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI EA 2020
AU - Kayukawa, Seita
AU - Ishihara, Tatsuya
AU - Takagi, Hironobu
AU - Morishima, Shigeo
AU - Asakawa, Chieko
N1 - Funding Information:
This work was supported by JST ACCEL (JPMJAC1602) and JST-Mirai Program (JPMJMI19B2).
Publisher Copyright:
© 2020 Owner/Author.
PY - 2020/4/25
Y1 - 2020/4/25
N2 - Blind people face various local navigation challenges in their daily lives, such as identifying empty seats in crowded stations, navigating toward a seat, and stopping and sitting at the correct spot. Although voice navigation is a commonly used solution, it requires users to carefully follow frequent navigational sounds over short distances. We therefore present an assistive robot, BlindPilot, which guides blind users to landmark objects using an intuitive handle. BlindPilot employs an RGB-D camera to detect the positions of target objects and uses LiDAR to build a 2D map of the surrounding area. On the basis of these sensing results, BlindPilot generates a path to the object and guides the user safely along it. To evaluate our system, we also implemented a sound-based navigation system as a baseline and asked six blind participants to approach an empty chair using both systems. We observed that BlindPilot enabled users to approach a chair faster, with a greater feeling of security and less effort, than the baseline system.
AB - Blind people face various local navigation challenges in their daily lives, such as identifying empty seats in crowded stations, navigating toward a seat, and stopping and sitting at the correct spot. Although voice navigation is a commonly used solution, it requires users to carefully follow frequent navigational sounds over short distances. We therefore present an assistive robot, BlindPilot, which guides blind users to landmark objects using an intuitive handle. BlindPilot employs an RGB-D camera to detect the positions of target objects and uses LiDAR to build a 2D map of the surrounding area. On the basis of these sensing results, BlindPilot generates a path to the object and guides the user safely along it. To evaluate our system, we also implemented a sound-based navigation system as a baseline and asked six blind participants to approach an empty chair using both systems. We observed that BlindPilot enabled users to approach a chair faster, with a greater feeling of security and less effort, than the baseline system.
KW - Local navigation
KW - Robotic system
KW - Visual impairments
UR - http://www.scopus.com/inward/record.url?scp=85090230043&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090230043&partnerID=8YFLogxK
U2 - 10.1145/3334480.3382925
DO - 10.1145/3334480.3382925
M3 - Conference contribution
AN - SCOPUS:85090230043
T3 - Conference on Human Factors in Computing Systems - Proceedings
BT - CHI EA 2020 - Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
Y2 - 25 April 2020 through 30 April 2020
ER -