SUPPORT-VECTOR NETWORKS

Cited by: 21738
Authors
CORTES, C
VAPNIK, V
Keywords
PATTERN RECOGNITION; EFFICIENT LEARNING ALGORITHMS; NEURAL NETWORKS; RADIAL BASIS FUNCTION CLASSIFIERS; POLYNOMIAL CLASSIFIERS
DOI
10.1007/BF00994018
CLC number
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space, in which a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case in which the training data can be separated without error. Here we extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network with various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
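The decision rule the abstract describes can be illustrated with a minimal sketch: a polynomial kernel stands in for the non-linear map to a high-dimensional feature space, and the classifier is a linear decision surface there. All numeric values below (support vectors, multipliers, bias) are hypothetical; in the paper they come from solving the soft-margin quadratic program, which this sketch does not do.

```python
# Minimal sketch of a support-vector classifier's decision rule.
# The support vectors, multipliers, and bias are made-up values,
# not the result of training.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def poly_kernel(u, v, degree=2):
    # K(u, v) = (u . v + 1)^degree: equivalent to a dot product in a
    # high-dimensional feature space, without building feature vectors.
    return (dot(u, v) + 1.0) ** degree

def decision(x, support_vectors, multipliers, labels, bias, degree=2):
    # f(x) = sum_i alpha_i * y_i * K(s_i, x) + b; the sign gives the class.
    f = sum(a * y * poly_kernel(s, x, degree)
            for s, a, y in zip(support_vectors, multipliers, labels))
    return 1 if f + bias > 0 else -1

# Hypothetical trained machine with two support vectors.
svs = [[1.0, 1.0], [-1.0, -1.0]]
alphas = [0.5, 0.5]
labels = [1, -1]
b = 0.0

print(decision([2.0, 2.0], svs, alphas, labels, b))    # 1
print(decision([-2.0, -2.0], svs, alphas, labels, b))  # -1
```

The point of the kernel form is that only inner products with the support vectors are ever computed, so the feature space can be very high-dimensional at no extra cost.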
Pages: 273-297
Page count: 25
References (14 in total)
[1] Aizerman M., 1964, AUTOMAT REM CONTR, V25, P821
[2] Anderson T.W., 1966, ANN MATH STAT, V33, P420
[3] Boser B., 1992, 5 ANN WORKSH COMP LE, V5, P144
[4] Bottou L., 1994, 12TH P INT C PATT RE
[5] Bromley J., 1991, AT T1135991081916TM
[6] Courant R., 1953, METHODS MATH PHYSICS
[7] Cun Y.L., 1985, COGNITIVA, V85, P599
[8] Fisher R.A., 1936, ANN EUGEN, V7, P111
[9] LeCun Y., 1989, P ADV NEURAL INFORM, P396
[10] Parker D.B., 1985, MIT TR47 CTR COMP RE