An SVM Model Selection Method Based on Genetic Algorithm and Empirical Error Minimization

PP. 65-71

Keywords: support vector machine (SVM), kernel function, kernel parameters, empirical error, genetic algorithm


Abstract:

The generalization ability of a support vector machine (SVM) depends on the choice of kernel function form, kernel parameters, and the penalty factor, i.e., on model selection. Based on an analysis of how these parameters affect the classifier's recognition accuracy, this paper proposes an SVM parameter selection method based on a genetic algorithm and empirical error minimization. Experiments on 13 UCI data sets demonstrate the correctness and effectiveness of the proposed algorithm, which also shows good generalization performance.
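The abstract describes the method only at a high level. As a rough illustration of the general idea (not the authors' implementation), the minimal sketch below encodes the RBF kernel width gamma and the penalty factor C as a real-valued chromosome and lets a simple genetic algorithm minimize the empirical error measured on a held-out validation split. The data set, the GA operators, the search bounds, and the use of scikit-learn's SVC are all assumptions for illustration; the paper's actual error criterion and genetic operators may differ.

# Minimal sketch (assumed, not the paper's code): GA search over (C, gamma)
# of an RBF-SVM, with validation error as the empirical error to minimize.
import random
import numpy as np
from sklearn.datasets import load_breast_cancer   # stand-in for a UCI data set
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = random.Random(0)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def empirical_error(log2_C, log2_gamma):
    # Train an RBF-SVM with the decoded parameters and return its validation error.
    clf = SVC(C=2.0 ** log2_C, gamma=2.0 ** log2_gamma).fit(X_tr, y_tr)
    return 1.0 - clf.score(X_val, y_val)

def ga_select(pop_size=20, generations=15, bounds=((-5, 15), (-15, 3))):
    # Each individual is (log2 C, log2 gamma); the search runs on a log scale.
    pop = [tuple(rng.uniform(*b) for b in bounds) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: empirical_error(*ind))
        parents = scored[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = tuple((u + v) / 2 + rng.gauss(0, 0.5)   # blend crossover + mutation
                          for u, v in zip(a, b))
            child = tuple(min(max(g, lo), hi) for g, (lo, hi) in zip(child, bounds))
            children.append(child)
        pop = parents + children
    best = min(pop, key=lambda ind: empirical_error(*ind))
    return 2.0 ** best[0], 2.0 ** best[1]

if __name__ == "__main__":
    C, gamma = ga_select()
    print(f"selected C={C:.4g}, gamma={gamma:.4g}, "
          f"validation error={empirical_error(np.log2(C), np.log2(gamma)):.4f}")

Searching on a log scale keeps the GA's mutation step meaningful across the very wide ranges typically used for C and gamma; this mirrors common practice for SVM hyperparameter tuning rather than any detail stated in the abstract.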

