Abstract
We propose a method for detecting the position, posture, and shape of a hand from multiple-viewpoint images. A simple elliptic model and a small number of reliable image features extracted from the multiple-viewpoint images are used to estimate the pose (position and normal axis) of a human hand, where the features are obtained by a distance transformation: the COG (center of gravity) position and its distance value are extracted in this process. These features are robust against changes in hand shape and yield stable pose estimates. A 'best view' is then selected from the estimation results, and hand shape recognition is performed in that view using a Fourier descriptor; this viewpoint selection overcomes the problem of self-occlusion. The system can serve as a user-interface device in a virtual environment, replacing glove-type devices and avoiding most of the disadvantages of contact-type devices.
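The abstract gives no implementation details; the sketch below is only one plausible reading of the two feature steps it names (a distance-transform/COG feature per view, and a contour Fourier descriptor for the selected best view), not the authors' code. It assumes binary (0/255) hand silhouettes as input and OpenCV ≥ 4; all function names are illustrative.

```python
# Minimal sketch, not the paper's implementation: one reading of the abstract's
# feature steps, assuming binary (0/255) hand silhouette masks and OpenCV >= 4.
import cv2
import numpy as np


def cog_feature(mask):
    """COG of the hand silhouette plus the distance-transform value at the COG.

    The abstract names these two quantities as the per-view features used for
    pose estimation; they change little when the fingers move, which is why
    they give stable pose estimates.
    """
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    m = cv2.moments(mask, binaryImage=True)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    return (cx, cy), float(dist[int(round(cy)), int(round(cx))])


def fourier_descriptor(mask, n_coeffs=16):
    """Normalized Fourier descriptor of the outer hand contour (best view only)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    boundary = max(contours, key=cv2.contourArea).squeeze(1)  # (N, 2) contour points
    z = boundary[:, 0] + 1j * boundary[:, 1]                  # complex boundary signal
    mags = np.abs(np.fft.fft(z))   # discard phase -> rotation/start-point invariance
    mags = mags / mags[1]          # normalize by first harmonic -> scale invariance
    return mags[2:n_coeffs + 2]    # skip DC term (translation) and the normalizer
```

In the pipeline the abstract describes, a feature like `cog_feature` would be computed in every camera view to estimate the hand pose, while the Fourier descriptor would be computed only in the selected best view, which is how self-occlusion is sidestepped.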
Original language | English |
---|---|
Pages | 264-267 |
Number of pages | 4 |
Publication status | Published - 1998 Jan 1 |
Externally published | Yes |
Event | Proceedings of the 1998 International Conference on Multimedia Computing and Systems, Austin, TX, USA; Duration: 1998 Jun 28 → 1998 Jul 1 |
ASJC Scopus subject areas
- Computer Science (all)
- Engineering (all)