Kernel Methods and Component Analysis for Pattern Recognition

Name(s): Isaacs, Jason C., author
Foo, Simon Y., professor co-directing dissertation
Meyer-Baese, Anke, professor co-directing dissertation
Liu, Xiuwen, outside committee member
Chan-Hilton, Amy, outside committee member
Department of Electrical and Computer Engineering, degree granting department
Florida State University, degree granting institution
Type of Resource: text
Genre: Text
Issuance: monographic
Date Issued: 2007
Publisher: Florida State University
Place of Publication: Tallahassee, Florida
Physical Form: computer
online resource
Extent: 1 online resource
Language(s): English
Abstract/Description: Kernel methods, as alternatives to component analysis, are mathematical tools that provide a higher-dimensional representation for feature recognition and image analysis problems. In machine learning, the kernel trick is a method for converting a linear classification learning algorithm into a non-linear one by mapping the original observations into a higher-dimensional space, so that the use of a linear classifier in the new space is equivalent to a non-linear classifier in the original space. In this dissertation we present the performance results of several continuous distribution function kernels, lattice oscillation model kernels, Kelvin function kernels, and orthogonal polynomial kernels on select benchmarking databases. In addition, we develop methods to analyze the use of these kernels for projection analysis applications: principal component analysis, independent component analysis, and optimal projection analysis. We compare the performance results with known kernel methods on several benchmarks. Empirical results show that several of these kernels outperform previously suggested kernels on these data sets. Additionally, we develop a genetic algorithm-based kernel optimal projection analysis method which, through extensive testing, demonstrates a ten percent average improvement in performance over the kernel principal component analysis projection on all data sets. We also compare our kernel methods for kernel eigenface representations with previous techniques. Finally, we analyze the benchmark databases used here to determine whether we can aid in the selection of a particular kernel that would perform optimally based on the statistical characteristics of each database.
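The kernel trick and kernel principal component analysis described in the abstract can be sketched as follows. This is a minimal NumPy illustration using a standard Gaussian (RBF) kernel, not the dissertation's lattice oscillation, Kelvin function, or orthogonal polynomial kernels; the helper names (`rbf_kernel`, `kernel_pca`) and the gamma parameter are illustrative assumptions, not the author's implementation.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2).
    # Computes the full Gram matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    # Kernel PCA: eigendecompose the centered Gram matrix instead of
    # the covariance matrix, so the mapping to feature space is implicit.
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n  # center in feature space
    vals, vecs = np.linalg.eigh(Kc)                     # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]         # keep the largest
    vals, vecs = vals[idx], vecs[:, idx]
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))    # scale eigenvectors
    return Kc @ alphas                                  # projected coordinates

# Two concentric rings: not linearly separable in the original 2-D space,
# but separable after the implicit non-linear mapping.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 100)
r = np.concatenate([np.full(50, 1.0), np.full(50, 3.0)])
X = np.c_[r * np.cos(t), r * np.sin(t)] + rng.normal(0.0, 0.05, (100, 2))
Z = kernel_pca(X, n_components=2, gamma=1.0)
print(Z.shape)  # (100, 2)
```

Swapping `rbf_kernel` for any positive semi-definite kernel function, such as the families the dissertation benchmarks, changes only the Gram matrix computation; the centering and eigendecomposition steps are unchanged.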
Identifier: FSU_migr_etd-3861 (IID)
Submitted Note: A Dissertation submitted to the Department of Electrical and Computer Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
Degree Awarded: Spring Semester, 2007.
Date of Defense: April 12, 2007.
Keywords: Pattern Recognition, Optimal Projection Analysis, Optimal Component Analysis, KPCA, Statistical Analysis, KICA, Kernels
Bibliography Note: Includes bibliographical references.
Advisory Committee: Simon Y. Foo, Professor Co-Directing Dissertation; Anke Meyer-Baese, Professor Co-Directing Dissertation; Xiuwen Liu, Outside Committee Member; Amy Chan-Hilton, Outside Committee Member.
Subject(s): Electrical engineering
Computer engineering
Persistent Link to This Record: http://purl.flvc.org/fsu/fd/FSU_migr_etd-3861
Owner Institution: FSU

Citation:
Isaacs, J. C. (2007). Kernel Methods and Component Analysis for Pattern Recognition. Retrieved from http://purl.flvc.org/fsu/fd/FSU_migr_etd-3861