Kernel eigenspace-based MLLR adaptation using multiple regression classes

Roger Hsiao*, Brian Mak

*Corresponding author for this work

Research output: Conference paper published in a book/conference proceeding (peer-reviewed)

6 Citations (Scopus)

Abstract

Recently, we have been investigating the application of kernel methods to improve the performance of eigenvoice-based adaptation methods by exploiting possible nonlinearity in their original working space. We proposed kernel eigenvoice (KEV) adaptation in [1] and kernel eigenspace-based MLLR (KEMLLR) adaptation in [2]. In KEMLLR, speaker-dependent MLLR transformation matrices are mapped to a kernel-induced high-dimensional feature space, and kernel principal component analysis (KPCA) is used to derive a set of eigenmatrices in that feature space. A new speaker is then represented by a linear combination of the leading eigenmatrices. In this paper, we further improve KEMLLR through the use of multiple regression classes and the quasi-Newton BFGS optimization algorithm.
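The KPCA step described in the abstract can be sketched roughly as follows: vectorize each speaker-dependent MLLR transform, compute a kernel matrix over the training speakers, center it in feature space, and take the leading eigenvectors as the "eigenmatrices"; a new speaker is then represented by its projection coefficients onto those components. This is only an illustrative sketch, not the authors' implementation: the RBF kernel, `gamma`, the number of components `m`, and the random data are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_speakers, dim = 20, 9  # e.g. 3x3 MLLR transforms, vectorized (illustrative sizes)
W = rng.standard_normal((n_speakers, dim))  # one vec(MLLR transform) per row

def rbf_kernel(A, B, gamma=0.1):
    # Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2); kernel choice is an assumption
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

K = rbf_kernel(W, W)

# Center the kernel matrix in feature space: Kc = K - 1K - K1 + 1K1, with 1 = (1/n) ones
n = K.shape[0]
one = np.full((n, n), 1.0 / n)
Kc = K - one @ K - K @ one + one @ K @ one

# Eigendecomposition of the centered kernel matrix; leading eigenvectors span
# the kernel eigenspace (the "eigenmatrices" of the abstract, implicitly).
vals, vecs = np.linalg.eigh(Kc)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

m = 5  # number of leading components to keep (illustrative)
alphas = vecs[:, :m] / np.sqrt(np.maximum(vals[:m], 1e-12))  # normalized dual coefficients

# A new speaker's kernel row, centered consistently, projects onto the leading
# components; the coefficients form the speaker's low-dimensional representation.
w_new = rng.standard_normal(dim)
k_new = rbf_kernel(w_new[None, :], W)
ones_row = np.full((1, n), 1.0 / n)
k_new_c = k_new - ones_row @ K - k_new @ one + ones_row @ K @ one
coeffs = k_new_c @ alphas
print(coeffs.shape)
```

In the actual KEMLLR method the adaptation coefficients are not obtained by projection but are estimated from the new speaker's adaptation data (here via BFGS); the sketch above only illustrates the eigenspace construction.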

Original language: English
Title of host publication: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05 - Proceedings - Image and Multidimensional Signal Processing, Multimedia Signal Processing
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: I985-I988
ISBN (Print): 0780388747, 9780780388741
DOIs
Publication status: Published - 2005
Event: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05 - Philadelphia, PA, United States
Duration: 18 Mar 2005 - 23 Mar 2005

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: I
ISSN (Print): 1520-6149

Conference

Conference: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05
Country/Territory: United States
City: Philadelphia, PA
Period: 18/03/05 - 23/03/05
