OALib Journal
ISSN: 2333-9721
A Coordinate Descent Algorithm for Large-Scale SVDD

pp. 950-957

Keywords: Support Vector Data Description (SVDD), convergence rate, coordinate descent, closed-form solution


Abstract:

Support Vector Data Description (SVDD) is an unsupervised learning algorithm with important applications in areas such as image recognition and information security. Coordinate descent is an effective approach to large-scale classification problems, offering a simple procedure and a fast convergence rate. This paper proposes an efficient dual coordinate descent algorithm for large-scale SVDD. The subproblem at each iteration admits a closed-form solution, and acceleration strategies together with simplified computations reduce the overall cost. Three subproblem selection rules are presented, and their respective strengths and weaknesses are analyzed and compared. The algorithm is validated experimentally on both synthetic and real-world large-scale data sets. Compared with LibSVDD, the proposed method is more efficient, solving the ijcnn data set of 10^5 samples in 1.4 s.
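The abstract does not spell out the update, but the general idea of dual coordinate descent for SVDD can be sketched as follows. This is a hedged illustration, not the paper's algorithm: it assumes a linear kernel, uses a simple random partner-selection rule (the paper compares three selection rules, which are not specified here), and the function name `svdd_dual_cd` and its parameters are invented for this sketch. The SVDD dual minimizes `a^T K a - sum_i a_i K_ii` subject to `0 <= a_i <= C` and `sum_i a_i = 1`; updating a pair `(a_i, a_j)` by `(+t, -t)` preserves the equality constraint, and the optimal step `t` along that direction has a closed-form solution.

```python
import numpy as np

def svdd_dual_cd(X, C=0.1, n_epochs=50, seed=0):
    """Pairwise dual coordinate descent for SVDD (minimal sketch).

    Solves  min_a  a^T K a - sum_i a_i * K_ii
            s.t.   0 <= a_i <= C,  sum_i a_i = 1,
    updating two coordinates per step (a_i += t, a_j -= t) so the
    equality constraint is preserved; t has a closed-form solution.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                    # linear kernel, precomputed for clarity
    diag = np.diag(K).copy()
    a = np.full(n, 1.0 / n)        # feasible start (requires C >= 1/n)
    Ka = K @ a                     # maintained incrementally below
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            j = int(rng.integers(n))   # random partner: one possible rule
            if i == j:
                continue
            # derivative and curvature of the objective along e_i - e_j
            g = 2.0 * (Ka[i] - Ka[j]) - diag[i] + diag[j]
            h = 2.0 * (K[i, i] + K[j, j] - 2.0 * K[i, j])
            if h <= 1e-12:             # flat direction (e.g. duplicates)
                continue
            t = -g / h                 # unconstrained closed-form step
            # clip so both a_i + t and a_j - t stay within [0, C]
            t = min(t, C - a[i], a[j])
            t = max(t, -a[i], a[j] - C)
            if t == 0.0:
                continue
            a[i] += t
            a[j] -= t
            Ka += t * (K[:, i] - K[:, j])   # O(n) gradient refresh
    return a
```

Because the clipped interval for `t` always contains 0, each pair update can only decrease the objective, which is what makes per-step closed-form solutions attractive; the choice of the pair `(i, j)` is exactly where the paper's three selection rules would differ.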

References

[1]  Tax D M J, Duin R P W. Support Vector Domain Description. Pattern Recognition Letters, 1999, 20(11/12/13): 1191-1199
[2]  Schölkopf B, Platt J, Shawe-Taylor J, et al. Estimating the Support of a High-Dimensional Distribution. Neural Computation, 2001, 13(7): 1443-1471
[3]  Tax D M J, Duin R P W. Support Vector Data Description. Machine Learning, 2004, 54: 45-66
[4]  Tsang I W, Kwok J T, Cheung P M. Core Vector Machines: Fast SVM Training on Very Large Data Sets. Journal of Machine Learning Research, 2005, 6: 363-392
[5]  Loosli G, Canu S. Comments on the "Core Vector Machines: Fast SVM Training on Very Large Data Sets". Journal of Machine Learning Research, 2007, 8: 291-301
[6]  Hsieh C J, Chang K W, Lin C J, et al. A Dual Coordinate Descent Method for Large-Scale Linear SVM // Proc of the 25th International Conference on Machine Learning. Helsinki, Finland, 2008: 408-415
[7]  Zhang T. Solving Large Scale Linear Prediction Problems Using Stochastic Gradient Descent Algorithms // Proc of the 21st International Conference on Machine Learning. Banff, Canada, 2004: 919-926
[8]  Shalev-Shwartz S, Singer Y, Srebro N. Pegasos: Primal Estimated Sub-Gradient Solver for SVM // Proc of the 24th International Conference on Machine Learning. Corvallis, USA, 2007: 807-814
[9]  Bordes A, Bottou L. SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent. Journal of Machine Learning Research, 2009, 10: 1737-1754
[10]  Joachims T. Training Linear SVMs in Linear Time // Proc of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Philadelphia, USA, 2006: 82-95
[11]  Smola A J, Vishwanathan S V N, Le Q. Bundle Methods for Machine Learning // Platt J C, Koller D, Singer Y, et al., eds. Advances in Neural Information Processing Systems. Cambridge, USA: MIT Press, 2007, XX: 1377-1384
[12]  Lin C J, Weng R C, Keerthi S S. Trust Region Newton Method for Large-Scale Logistic Regression. Journal of Machine Learning Research, 2008, 9: 627-650
[13]  Chang K W, Hsieh C J, Lin C J. Coordinate Descent Method for Large-Scale L2-Loss Linear SVM. Journal of Machine Learning Research, 2008, 9: 1369-1398
[14]  Wu Yichao, Liu Yufeng. Robust Truncated Hinge Loss Support Vector Machines. Journal of the American Statistical Association, 2007, 102(479): 974-983
[15]  Wu M, Ye J. A Small Sphere and Large Margin Approach for Novelty Detection Using Training Data with Outliers. IEEE Trans on Pattern Analysis and Machine Intelligence, 2009, 31(11): 2088-2092
[16]  Saha A, Tewari A. On the Finite Time Convergence of Cyclic Coordinate Descent Methods [EB/OL]. [2011-06-27]. http://arxiv.org/abs/1005.2146
[17]  Nesterov Y. Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems [EB/OL]. [2010-01-01]. http://130.104.5.100/cps/ucl/doc/core/documents/coredp2010_2web.pdf
