Learning the kernel matrix by maximizing a KFD-based class separability criterion

Dit Yan Yeung*, Hong Chang, Guang Dai

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

Abstract

The advantage of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix based on maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). Notably, optimizing this criterion does not require inverting the possibly singular within-class scatter matrix, a computational problem encountered by many LDA and KFD methods. We have conducted experiments on both synthetic data and real-world data from UCI and FERET, showing that our method consistently outperforms some previous kernel learning methods.
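To make the idea concrete, the following is a minimal sketch of a KFD-style class separability criterion evaluated directly from a kernel matrix. It uses a simplified trace-ratio form, tr(S_b)/tr(S_w), computed entirely from the Gram matrix; the paper's exact criterion and optimization differ, and the function names and RBF kernel choice here are illustrative assumptions. Note that, as in the abstract, no scatter matrix is ever inverted.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of a Gaussian RBF kernel (illustrative kernel choice).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def separability(K, y):
    """Between/within scatter trace ratio in the kernel-induced feature
    space, computed from K alone.  A simplified stand-in for the paper's
    KFD-based criterion: larger values mean better class separation, and
    no within-class scatter matrix has to be inverted."""
    n = len(y)
    # tr(S_t) = (1/n) sum_i K_ii - (1/n^2) sum_ij K_ij
    tr_total = np.trace(K) / n - K.sum() / n**2
    # tr(S_w) = (1/n) [ sum_i K_ii - sum_c (1/n_c) sum_{i,j in c} K_ij ]
    tr_within = np.trace(K)
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        tr_within -= K[np.ix_(idx, idx)].sum() / len(idx)
    tr_within /= n
    tr_between = tr_total - tr_within
    return tr_between / tr_within

# Two tight, well-separated classes: the criterion should be large.
X = np.array([[0.0], [0.1], [5.0], [5.1]])
y = np.array([0, 0, 1, 1])
J = separability(rbf_kernel(X), y)
```

For these well-separated clusters the ratio J is on the order of 100; shuffling the labels collapses it toward zero, which is exactly the signal a kernel-learning objective of this kind maximizes over candidate kernel matrices.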

Original language: English
Pages (from-to): 2021-2028
Number of pages: 8
Journal: Pattern Recognition
Volume: 40
Issue number: 7
DOIs
Publication status: Published - Jul 2007

Keywords

  • Face recognition
  • Fisher discriminant criterion
  • Kernel Fisher discriminant
  • Kernel learning
