TY - JOUR
T1 - Guiding Blind Pedestrians in Public Spaces by Understanding Walking Behavior of Nearby Pedestrians
AU - Kayukawa, Seita
AU - Ishihara, Tatsuya
AU - Takagi, Hironobu
AU - Morishima, Shigeo
AU - Asakawa, Chieko
N1 - Publisher Copyright:
© 2020 ACM.
PY - 2020/9/4
Y1 - 2020/9/4
N2 - We present a guiding system to help blind people walk in public spaces while making their walking seamless with nearby pedestrians. Blind users carry a rolling suitcase-shaped system that has two RGBD cameras, an inertial measurement unit (IMU) sensor, and a light detection and ranging (LiDAR) sensor. The system senses the behavior of surrounding pedestrians, predicts risks of collisions, and alerts users to help them avoid collisions. It has two modes: the "on-path" mode, which helps users avoid collisions without changing their path by adapting their walking speed; and the "off-path" mode, which navigates an alternative path to go around pedestrians standing in the way. Auditory and tactile modalities have been commonly used for non-visual navigation systems, so we implemented two interfaces to evaluate the effectiveness of each modality for collision avoidance. A user study with 14 blind participants in public spaces revealed that participants could successfully avoid collisions with both modalities. We detail the characteristics of each modality.
AB - We present a guiding system to help blind people walk in public spaces while making their walking seamless with nearby pedestrians. Blind users carry a rolling suitcase-shaped system that has two RGBD cameras, an inertial measurement unit (IMU) sensor, and a light detection and ranging (LiDAR) sensor. The system senses the behavior of surrounding pedestrians, predicts risks of collisions, and alerts users to help them avoid collisions. It has two modes: the "on-path" mode, which helps users avoid collisions without changing their path by adapting their walking speed; and the "off-path" mode, which navigates an alternative path to go around pedestrians standing in the way. Auditory and tactile modalities have been commonly used for non-visual navigation systems, so we implemented two interfaces to evaluate the effectiveness of each modality for collision avoidance. A user study with 14 blind participants in public spaces revealed that participants could successfully avoid collisions with both modalities. We detail the characteristics of each modality.
KW - Visual impairments
KW - audio interface
KW - blind navigation
KW - collision prediction
KW - pedestrian avoidance
KW - tactile interface
UR - http://www.scopus.com/inward/record.url?scp=85092430864&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85092430864&partnerID=8YFLogxK
U2 - 10.1145/3411825
DO - 10.1145/3411825
M3 - Article
AN - SCOPUS:85092430864
SN - 2474-9567
VL - 4
JO - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
JF - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
IS - 3
M1 - 85
ER -