Multiple-camera based hand pose estimation method using distance transformation

Akira Utsumi*, Tsutomu Miyasato, Fumio Kishino, Jun Ohya, Ryohei Nakatsu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


We describe a method for detecting hand position, posture, and finger bending using multiple camera images. Stable detection can be achieved using distance-transformed images. We detect the maximum point in each distance-transformed image as the center of gravity (COG) of the hand region and calculate its 3D position by stereo matching. The distance value at a COG point varies with the angle between the camera axis and the normal of the hand plane. The hand rotation angle can therefore be determined by maximum likelihood estimation from the distance values in all camera images. Using the detected position and posture, the best camera for hand shape detection can be selected; this camera selection makes hand shape detection simple and stable. The system can be used as a user interface device in a virtual environment, replacing glove-type devices and overcoming most of the disadvantages of contact-type devices.
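The core step described in the abstract, taking the maximum of a distance-transformed hand silhouette as the COG of the hand region, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a binary hand mask as input, uses a simple two-pass city-block (chamfer) distance transform, and the function names are ours.

```python
import numpy as np

def distance_transform(mask):
    """Two-pass city-block (chamfer) distance transform of a binary mask.

    Each foreground (hand) pixel receives its distance to the nearest
    background pixel; background pixels stay at 0.
    """
    h, w = mask.shape
    inf = h + w  # larger than any possible city-block distance
    d = np.where(mask, inf, 0).astype(int)
    # Forward pass: propagate distances from top-left.
    for y in range(h):
        for x in range(w):
            if d[y, x]:
                if y > 0:
                    d[y, x] = min(d[y, x], d[y - 1, x] + 1)
                if x > 0:
                    d[y, x] = min(d[y, x], d[y, x - 1] + 1)
    # Backward pass: propagate distances from bottom-right.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if y < h - 1:
                d[y, x] = min(d[y, x], d[y + 1, x] + 1)
            if x < w - 1:
                d[y, x] = min(d[y, x], d[y, x + 1] + 1)
    return d

def hand_cog(mask):
    """COG of the hand region: the pixel with the maximum distance value.

    Returns ((row, col), distance_value); the distance value is what the
    paper uses per camera to infer the hand's rotation angle.
    """
    d = distance_transform(mask)
    y, x = np.unravel_index(np.argmax(d), d.shape)
    return (y, x), d[y, x]
```

For example, on a 5x5 square silhouette centered in a 7x7 image, `hand_cog` returns the square's center with distance value 3. In the paper's setting, the same COG found in two or more camera images is stereo-matched to recover the 3D hand position.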

Original language: English
Pages (from-to): 2116-2125
Number of pages: 10
Journal: Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers
Issue number: 12
Publication status: Published - 1997
Externally published: Yes

ASJC Scopus subject areas

  • Media Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering
