Extracting facial motion parameters by tracking feature points

Takahiro Otsuka, Jun Ohya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)


A method for extracting facial motion parameters is proposed. The method consists of three steps. First, the feature points of the face, selected automatically in the first frame, are tracked in successive frames. Then, the feature points are connected with Delaunay triangulation so that the motion of each point relative to the surrounding points can be computed. Finally, muscle motions are estimated based on motions of the feature points placed near each muscle. The experiments showed that the proposed method can extract facial motion parameters accurately. In addition, the facial motion parameters are used to render a facial animation sequence.
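The triangulation step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the point coordinates, the `relative_motions` helper, and the choice of subtracting the mean neighbour displacement are all assumptions for demonstration.

```python
# Sketch of the paper's second step: connect tracked feature points with a
# Delaunay triangulation, then measure each point's motion relative to its
# neighbours. Coordinates and the relative-motion formula are illustrative
# assumptions, not the method from the paper.
import numpy as np
from scipy.spatial import Delaunay

def relative_motions(pts_prev, pts_curr):
    """For each point, subtract the mean displacement of its Delaunay
    neighbours, isolating local motion from surrounding motion."""
    tri = Delaunay(pts_prev)
    neighbours = [set() for _ in range(len(pts_prev))]
    for simplex in tri.simplices:        # each simplex is a triangle (3 indices)
        for i in simplex:
            neighbours[i].update(j for j in simplex if j != i)
    disp = pts_curr - pts_prev           # absolute displacement per point
    rel = np.empty_like(disp)
    for i, nb in enumerate(neighbours):
        rel[i] = disp[i] - disp[list(nb)].mean(axis=0)
    return rel

# Toy frame pair: four stationary "face" points and one moving centre point.
prev = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0], [1.0, 1.0]])
curr = prev.copy()
curr[4] += [0.5, 0.0]                    # only the centre point moves
rel = relative_motions(prev, curr)
```

Here the centre point's relative motion equals its full displacement, since its neighbours are stationary; this is the quantity the paper feeds into the muscle-motion estimate.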

Original language: English
Title of host publication: Advanced Multimedia Content Processing - 1st International Conference, AMCP 1998, Proceedings
Editors: Shojiro Nishio, Fumio Kishino
Publisher: Springer Verlag
Number of pages: 12
ISBN (Print): 3540657622, 9783540657620
Publication status: Published - 1999
Externally published: Yes
Event: 1st International Conference on Advanced Multimedia Content Processing, AMCP 1998 - Osaka, Japan
Duration: 1998 Nov 9 - 1998 Nov 11

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Other: 1st International Conference on Advanced Multimedia Content Processing, AMCP 1998

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)


