Independent component analysis minimizing convex divergence

Yasuo Matsuyama*, Naoto Katsumata, Ryo Kawamura

*Corresponding author for this work

    Research output: Article, peer-reviewed

    8 Citations (Scopus)

    Abstract

    A new class of learning algorithms for independent component analysis (ICA) is presented. Starting from theoretical discussions of the convex divergence, this information measure is minimized to derive new ICA algorithms. Since the convex divergence includes logarithmic information measures as special cases, the presented method yields faster algorithms than existing logarithmic ones. Another important feature of this paper's ICA algorithm is that it accepts supervisory information. This ability is utilized to reduce the permutation indeterminacy that is inherent in conventional ICA; by this method, the most important activation pattern can be found as the top one. The total algorithm is tested through applications to brain map distillation from functional MRI data. The derived algorithm is faster than logarithmic ones with little additional memory requirement, and can find task-related brain maps successfully on a conventional personal computer.
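    The abstract does not reproduce the definition of the convex divergence itself. As a rough illustration only, the sketch below shows the general convex (f-) divergence family, of which the Kullback-Leibler divergence behind the "logarithmic" ICA algorithms is a special case; the specific convex function and parameterization used in the paper are assumptions here, not taken from this page.

    D_f(p \,\|\, q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right)\, dx,
    \qquad f \text{ convex on } (0,\infty),\ \; f(1) = 0
    % Assumed special case: f(t) = t \log t recovers the Kullback-Leibler divergence
    % D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log\!\frac{p(x)}{q(x)}\, dx,
    % i.e. the logarithmic information measure mentioned in the abstract.

    In the usual ICA reading (an assumption about the standard setup, not a statement of this paper's particular derivation), p would be the joint density of the demixed outputs and q the product of their marginals, so that driving the divergence toward zero makes the outputs statistically independent.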

    Original language: English
    Pages (from-to): 27-34
    Number of pages: 8
    Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 2714
    Publication status: Published - 2003

    ASJC Scopus subject areas

    • Computer Science (all)
    • Biochemistry, Genetics and Molecular Biology (all)
    • Theoretical Computer Science

