Choosing Multiple Parameters for Support Vector Machines

Cited by: 361
Authors
Olivier Chapelle
Vladimir Vapnik
Olivier Bousquet
Sayan Mukherjee
Affiliations
[1] LIP6
[2] AT&T Research Labs
[3] École Polytechnique
[4] MIT
Source
Machine Learning | 2002 / Vol. 46
Keywords
support vector machines; kernel selection; leave-one-out procedure; gradient descent; feature selection;
Abstract
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing estimates of the generalization error of SVMs with a gradient descent algorithm over the set of parameters. The usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement in generalization performance.
Pages: 131-159 (28 pages)
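Only the abstract survives on this record, but the idea it describes, replacing exhaustive grid search with gradient-based minimization of a generalization-error estimate, is concrete enough to sketch. The Python snippet below is a rough illustration under stated assumptions, not the paper's algorithm: it tunes the RBF kernel width gamma and the penalty C by minimizing the radius-margin bound R²‖w‖², using an approximate enclosing-sphere radius and a generic gradient-free optimizer in place of the paper's analytic gradients; the toy dataset and all starting values are made up for the example.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Hypothetical toy problem standing in for real data.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

def radius_margin_bound(log_params, X, y):
    """T = R^2 * ||w||^2, a smooth proxy for the leave-one-out error.

    R^2 is approximated by the largest squared distance to the centre of
    mass in feature space (the exact enclosing sphere would need a small
    QP; the approximation keeps this sketch short).
    """
    C, gamma = np.exp(log_params)            # optimise in log-space so C, gamma stay positive
    K = rbf_kernel(X, gamma=gamma)            # Gram matrix K_ij = exp(-gamma ||x_i - x_j||^2)

    clf = SVC(C=C, kernel="precomputed").fit(K, y)
    sv = clf.support_
    coef = clf.dual_coef_.ravel()             # alpha_i * y_i for the support vectors
    w2 = coef @ K[np.ix_(sv, sv)] @ coef      # ||w||^2 = sum_ij alpha_i alpha_j y_i y_j K_ij

    dist2 = np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()  # ||phi(x_i) - mean||^2
    R2 = dist2.max()
    return R2 * w2

# A gradient-free optimizer stands in for the analytic gradient descent of the paper.
res = minimize(radius_margin_bound, x0=np.log([1.0, 0.1]), args=(X, y),
               method="Nelder-Mead")
C_opt, gamma_opt = np.exp(res.x)
print(f"selected C={C_opt:.3g}, gamma={gamma_opt:.3g}, bound={res.fun:.3g}")
```

In the paper itself, the error estimate (a radius-margin or span bound) is differentiated analytically with respect to each kernel parameter, which is what makes tuning far more than two parameters, e.g. one scaling factor per input feature, tractable.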