Abstract
To realize a vision system for an autonomous mobile robot operating in a human living environment, the robot must observe human behaviors and react to them. In this paper, we propose a real-time, vision-based human tracking method for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region, or another region specific to a human, is extracted from the detected area using color information. Next, facial and head gestures are recognized. We implement the vision system on a mobile robot and experimentally show that the system can detect and track a human and his or her face in real time.
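As a rough illustration of the pipeline described above (motion detection followed by color-based face-region extraction), the sketch below uses OpenCV frame differencing and HSV skin-color thresholding. All thresholds, the camera index, and the function name are illustrative assumptions, not values or code from the paper, and the gesture-recognition stage is omitted.

```python
import cv2
import numpy as np

# Assumed HSV bounds for skin color; the paper's actual color model is not specified here.
SKIN_LOWER = np.array([0, 40, 60], dtype=np.uint8)
SKIN_UPPER = np.array([25, 180, 255], dtype=np.uint8)


def find_face_candidate(prev_gray, frame):
    """Return (new_gray, bounding box) of a skin-colored blob inside moving areas, or (new_gray, None)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 1. Detect moving areas by frame differencing (stand-in for the paper's motion detection).
    diff = cv2.absdiff(prev_gray, gray)
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    motion_mask = cv2.dilate(motion_mask, None, iterations=2)

    # 2. Detect skin-colored pixels in HSV space.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    skin_mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)

    # 3. Face candidate: largest skin-colored blob overlapping a moving area.
    candidate_mask = cv2.bitwise_and(motion_mask, skin_mask)
    contours, _ = cv2.findContours(candidate_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return gray, None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < 500:  # ignore tiny blobs; threshold is an assumption
        return gray, None
    return gray, cv2.boundingRect(largest)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # camera index 0 is an assumption
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        prev_gray, box = find_face_candidate(prev_gray, frame)
        if box is not None:
            x, y, w, h = box
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("face candidate", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # ESC to quit
            break
    cap.release()
    cv2.destroyAllWindows()
```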
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings - IEEE International Workshop on Robot and Human Interactive Communication |
| Pages | 442-449 |
| Number of pages | 8 |
| DOIs | |
| Publication status | Published - 2001 |
| Event | 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001 - Bordeaux and Paris; Duration: 2001 Sept 18 → 2001 Sept 21 |
Other

| Other | 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001 |
| --- | --- |
| City | Bordeaux and Paris |
| Period | 01/9/18 → 01/9/21 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence
- Human-Computer Interaction