
Online Adaptive Least Squares Support Vector Regression Based on Recursive Reduction

DOI: 10.13195/j.kzyjc.2012.1791, pp. 50-56

Keywords: least squares support vector regression, online, adaptive, iterative strategy, reduction technique


Abstract:

When the traditional online least squares support vector machine is applied to regression problems for time-varying plants, its model tracking accuracy is low and its support vectors are insufficiently sparse. Combining an iterative strategy with a reduction technique, an online adaptive iteratively reduced least squares support vector machine is proposed. The method considers the joint constraining effect that a newly added sample and the historical data exert on the current model, and admits as the new support vector the sample that contributes most to the objective function, thereby sparsifying the support vector set and improving online prediction accuracy and speed. Comparative simulation analysis shows that the method is feasible and effective: it achieves higher regression accuracy than traditional methods while requiring the fewest support vectors.
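A minimal sketch of the selection rule described above is given below, under the following assumptions: an RBF kernel, the standard LSSVM squared-error objective min (1/2)||w||^2 + (C/2) Σ e_i^2, and a brute-force refit per candidate. The paper's recursive formulas update the solution incrementally rather than refitting, so this is an illustration of the greedy "largest objective contribution" criterion, not the authors' implementation; all identifiers (rbf_kernel, fit_reduced_lssvr, greedy_select) and parameter values are hypothetical.

    import numpy as np

    def rbf_kernel(X, Y, gamma=0.5):
        # Gaussian (RBF) kernel matrix between the rows of X and Y.
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def fit_reduced_lssvr(X, y, sv_idx, C=10.0, gamma=0.5):
        # Solve the reduced least-squares problem restricted to the support
        # vectors indexed by sv_idx; returns (alpha, b, training SSE).
        K = rbf_kernel(X, X[sv_idx], gamma)        # N x m kernel block
        A = np.hstack([K, np.ones((len(X), 1))])   # append the bias column
        reg = np.eye(A.shape[1]) / C               # ridge term from the (C/2)*sum(e_i^2) penalty
        reg[-1, -1] = 0.0                          # do not penalize the bias
        theta = np.linalg.solve(A.T @ A + reg, A.T @ y)
        resid = y - A @ theta
        return theta[:-1], theta[-1], float(resid @ resid)

    def greedy_select(X, y, m_max, C=10.0, gamma=0.5):
        # Grow the support set greedily: at each step admit the candidate
        # sample whose inclusion yields the largest drop in the objective.
        sv, rest = [], list(range(len(X)))
        while len(sv) < m_max and rest:
            best, best_sse = None, np.inf
            for j in rest:
                _, _, sse = fit_reduced_lssvr(X, y, sv + [j], C, gamma)
                if sse < best_sse:
                    best, best_sse = j, sse
            sv.append(best)
            rest.remove(best)
        return sv

    # Toy usage on synthetic data: keep at most 10 support vectors.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
    sv = greedy_select(X, y, m_max=10)
    alpha, b, _ = fit_reduced_lssvr(X, y, sv)
    y_hat = rbf_kernel(X, X[sv]) @ alpha + b       # reduced-model predictions

In the online setting described in the abstract, the same score would be reused at each time step: the newly arrived sample joins the candidate pool and competes with the historical data, so a point is retained as a support vector only if it constrains the current model more than the alternatives.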


