Multiple kernel learning with gaussianity measures

Hideitsu Hino*, Nima Reyhani, Noboru Murata

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)


Kernel methods are known to be effective for nonlinear multivariate analysis. One of the main issues in the practical use of kernel methods is the selection of the kernel, and there have been many studies on kernel selection and kernel learning. Multiple kernel learning (MKL) is one of the promising kernel optimization approaches. Kernel methods are applied to various classifiers, including Fisher discriminant analysis (FDA). FDA gives the Bayes optimal classification axis if the data distribution of each class in the feature space is Gaussian with a shared covariance structure. Based on this fact, an MKL framework based on the notion of Gaussianity is proposed. As a concrete implementation, an empirical characteristic function is adopted to measure Gaussianity in the feature space associated with a convex combination of kernel functions, and two MKL algorithms are derived. From experimental results on some data sets, we show that the proposed kernel learning followed by FDA offers strong classification power.
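The two ingredients named in the abstract can be sketched concretely: a Gaussianity measure built from the empirical characteristic function (for a sample, compare its empirical characteristic function with that of a Gaussian having the same mean and variance), and a convex combination of precomputed kernel Gram matrices. The snippet below is an illustrative sketch of these ideas, not the paper's actual algorithm; the function names, the grid of frequencies `ts`, and the squared-deviation criterion are assumptions made for illustration.

```python
import numpy as np

def gaussianity_deviation(x, ts):
    """Illustrative Gaussianity measure: mean squared deviation of the
    empirical characteristic function of a 1-D sample from the
    characteristic function of a Gaussian with the sample's mean/variance.
    (A stand-in for the paper's measure, evaluated on a frequency grid ts.)"""
    x = np.asarray(x, dtype=float)
    mu, var = x.mean(), x.var()
    # empirical characteristic function: (1/n) sum_j exp(i t x_j)
    ecf = np.array([np.exp(1j * t * x).mean() for t in ts])
    # Gaussian characteristic function: exp(i mu t - var t^2 / 2)
    gcf = np.exp(1j * ts * mu - 0.5 * var * ts**2)
    return np.mean(np.abs(ecf - gcf) ** 2)

def combined_kernel(kernels, beta):
    """Convex combination sum_m beta_m K_m of precomputed Gram matrices,
    with beta_m >= 0 and sum_m beta_m = 1."""
    beta = np.asarray(beta, dtype=float)
    assert np.all(beta >= 0) and np.isclose(beta.sum(), 1.0)
    return sum(b * K for b, K in zip(beta, kernels))

# Toy check: a Gaussian sample should deviate less than a skewed one.
rng = np.random.default_rng(0)
ts = np.linspace(-3.0, 3.0, 31)
gauss_sample = rng.normal(size=500)
skew_sample = rng.exponential(size=500)
print(gaussianity_deviation(gauss_sample, ts),
      gaussianity_deviation(skew_sample, ts))
```

In an MKL loop of the kind the abstract describes, the weights `beta` would be optimized so that the data projected in the combined feature space look as Gaussian as possible per class, after which FDA is applied with the learned kernel.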

Original language: English
Pages (from-to): 1853-1881
Number of pages: 29
Journal: Neural Computation
Issue number: 7
Publication status: Published - 2012

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

