A similarity-based neural network for facial expression analysis

Kenji Suzuki*, Hiroshi Yamada, Shuji Hashimoto

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    14 Citations (Scopus)


    In this paper, we introduce a novel model for measuring human subjective evaluation using Relevance Learning based on a similarity-based multilayer perceptron. This work aims to achieve multidimensional perceptual scaling that associates the physical features of a face with its semantic vector in a low-dimensional space. Unlike a conventional multilayer perceptron, which learns from pairs of an input feature vector and a desired output, the proposed network obtains a nonlinear mapping between input feature vectors and outputs from a pair of objects and their desired relevance (distance). We conducted a facial expression analysis with both a psychological model of line-drawing images of facial expressions and a real image set. Regarding the construction of the semantic space, the proposed approach not only performs well compared with the conventional statistical method but can also project new data that were not used during the training phase. We present experimental results and discuss the obtained mapping function.
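    The pairwise training scheme the abstract describes — learning a nonlinear mapping from a pair of objects and their desired relevance (distance), rather than from input/output pairs — can be sketched as a Siamese-style two-layer perceptron with shared weights, trained by gradient descent on the squared distance error. The network size, toy data, and learning rate below are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    class RelevancePerceptron:
        """Shared two-layer perceptron f: R^n -> R^m, trained so that the
        Euclidean distance ||f(x_a) - f(x_b)|| between two mapped objects
        approximates their target relevance d_ab. (Illustrative sketch,
        not the authors' implementation.)"""

        def __init__(self, n_in, n_hidden, n_out, lr=0.02):
            self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
            self.b1 = np.zeros(n_hidden)
            self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
            self.b2 = np.zeros(n_out)
            self.lr = lr

        def embed(self, x):
            h = np.tanh(self.W1 @ x + self.b1)   # hidden activation
            return h, self.W2 @ h + self.b2      # low-dimensional embedding

        def train_pair(self, xa, xb, d):
            ha, ya = self.embed(xa)
            hb, yb = self.embed(xb)
            diff = ya - yb
            d_hat = np.linalg.norm(diff) + 1e-9  # guard against division by zero
            loss = (d_hat - d) ** 2
            # Gradient of the squared distance error w.r.t. each embedding;
            # the weights are shared, so both branches' gradients accumulate.
            g = 2.0 * (d_hat - d) * diff / d_hat
            gW1 = np.zeros_like(self.W1); gb1 = np.zeros_like(self.b1)
            gW2 = np.zeros_like(self.W2); gb2 = np.zeros_like(self.b2)
            for x, h, gy in ((xa, ha, g), (xb, hb, -g)):
                gh = (self.W2.T @ gy) * (1.0 - h ** 2)  # tanh' = 1 - h^2
                gW2 += np.outer(gy, h); gb2 += gy
                gW1 += np.outer(gh, x); gb1 += gh
            self.W1 -= self.lr * gW1; self.b1 -= self.lr * gb1
            self.W2 -= self.lr * gW2; self.b2 -= self.lr * gb2
            return loss

    # Toy data: 8 stimuli with 2-D latent coordinates; observed feature
    # vectors are a fixed nonlinear lift of those coordinates, and the
    # target relevance of a pair is the latent Euclidean distance.
    latent = rng.uniform(-1.0, 1.0, (8, 2))
    lift = rng.normal(0.0, 1.0, (6, 2))
    feats = np.tanh(latent @ lift.T)
    pairs = [(i, j) for i in range(8) for j in range(i + 1, 8)]

    net = RelevancePerceptron(n_in=6, n_hidden=16, n_out=2)
    history = []
    for epoch in range(800):
        history.append(np.mean([net.train_pair(feats[i], feats[j],
                                               np.linalg.norm(latent[i] - latent[j]))
                                for i, j in pairs]))
    ```

    Because the mapping is a plain feed-forward function, a new stimulus can be projected into the semantic space with a single `embed` call, which is the property the abstract highlights over purely statistical scaling methods.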

    Original language: English
    Pages (from-to): 1104-1111
    Number of pages: 8
    Journal: Pattern Recognition Letters
    Issue number: 9
    Publication status: Published - 2007 Jul 1


    Keywords

    • Facial expressions
    • Multidimensional perceptual scaling
    • Nonlinear mapping
    • Similarity-based neural network
    • Stable dynamic parameter adaptation

    ASJC Scopus subject areas

    • Computer Vision and Pattern Recognition
    • Signal Processing
    • Electrical and Electronic Engineering

