Nonlinear component analysis as a kernel eigenvalue problem

Cited by: 6065
Authors
Scholkopf, B [1]
Smola, A
Muller, KR
Affiliations
[1] Max Planck Inst Biol Cybernet, D-72076 Tubingen, Germany
[2] GMD First Forschungszentrum Informat Tech, D-12489 Berlin, Germany
Keywords
DOI
10.1162/089976698300017467
Chinese Library Classification (CLC) number
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance, the space of all possible five-pixel products in 16 x 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.
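
The following is a minimal sketch of the kernel PCA procedure summarized in the abstract, written in NumPy and assuming a polynomial kernel k(x, y) = (x . y)^degree in the spirit of the paper's polynomial feature-extraction experiments. It is not the authors' code; the toy data, the degree, and the function name kernel_pca are illustrative assumptions.

# Minimal kernel PCA sketch (illustrative; not the authors' original code).
# Assumes a polynomial kernel k(x, y) = (x . y)^degree; data and names are made up.
import numpy as np

def kernel_pca(X, degree=5, n_components=2):
    """Project the rows of X onto the leading nonlinear principal components."""
    n = X.shape[0]
    # Gram matrix of the polynomial kernel.
    K = (X @ X.T) ** degree
    # Center the kernel matrix in feature space:
    # Kc = K - 1n K - K 1n + 1n K 1n, where (1n)_ij = 1/n.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered, symmetric kernel matrix.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # Keep the components with the largest eigenvalues.
    idx = np.argsort(eigvals)[::-1][:n_components]
    lam, alpha = eigvals[idx], eigvecs[:, idx]
    # Normalize the expansion coefficients so that each feature-space
    # eigenvector has unit norm: lam_k * (alpha_k . alpha_k) = 1.
    alpha = alpha / np.sqrt(np.maximum(lam, 1e-12))
    # Nonlinear principal components of the training points.
    return Kc @ alpha

# Toy usage: 100 random 5-dimensional points, two components.
X = np.random.default_rng(0).standard_normal((100, 5))
print(kernel_pca(X).shape)  # (100, 2)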
Pages: 1299-1319
Number of pages: 21