Guiding Blind Pedestrians in Public Spaces by Understanding Walking Behavior of Nearby Pedestrians

Seita Kayukawa, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, Chieko Asakawa

Research output: Contribution to journal › Article › peer-review

24 Citations (Scopus)


We present a guiding system that helps blind people walk in public spaces while keeping their walking seamless with nearby pedestrians. Blind users carry a rolling suitcase-shaped system equipped with two RGBD cameras, an inertial measurement unit (IMU) sensor, and a light detection and ranging (LiDAR) sensor. The system senses the behavior of surrounding pedestrians, predicts risks of collisions, and alerts users to help them avoid collisions. It has two modes: the "on-path" mode, which helps users avoid collisions without changing their path by adapting their walking speed, and the "off-path" mode, which navigates users along an alternative path around pedestrians standing in the way. Because auditory and tactile modalities have been commonly used in non-visual navigation systems, we implemented two interfaces to evaluate the effectiveness of each modality for collision avoidance. A user study with 14 blind participants in public spaces revealed that participants could successfully avoid collisions with both modalities. We detail the characteristics of each modality.
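The abstract does not spell out how collision risks are predicted from the sensed pedestrian behavior. As a rough, hypothetical illustration (all function names and thresholds are assumptions, not the paper's method), a minimal constant-velocity closest-point-of-approach check over tracked pedestrians could look like:

```python
import math

def time_to_closest_approach(rel_pos, rel_vel):
    """Time (s) at which two constant-velocity agents are closest.

    rel_pos: (dx, dy) pedestrian position relative to the user (m)
    rel_vel: (dvx, dvy) pedestrian velocity relative to the user (m/s)
    Returns 0.0 if the two are already moving apart.
    """
    vv = rel_vel[0] ** 2 + rel_vel[1] ** 2
    if vv == 0.0:
        return 0.0  # no relative motion: distance never changes
    t = -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / vv
    return max(t, 0.0)

def collision_risk(rel_pos, rel_vel, radius=0.5, horizon=3.0):
    """Flag a risk if the predicted minimum separation within `horizon`
    seconds drops below `radius` (combined body clearance, m).
    `radius` and `horizon` are illustrative values, not from the paper."""
    t = min(time_to_closest_approach(rel_pos, rel_vel), horizon)
    dx = rel_pos[0] + rel_vel[0] * t
    dy = rel_pos[1] + rel_vel[1] * t
    return math.hypot(dx, dy) < radius
```

On such a risk flag, an "on-path" system could slow the user, while an "off-path" system could plan a detour; the actual paper may use richer behavior models than constant velocity.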

Original language: English
Article number: 85
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Issue number: 3
Publication status: Published - 2020 Sept 4


Keywords

  • Visual impairments
  • audio interface
  • blind navigation
  • collision prediction
  • pedestrian avoidance
  • tactile interface

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Hardware and Architecture
  • Human-Computer Interaction


