A Model Structure Optimization Method Based on Singular Value Decomposition and the PRESS Statistic

pp. 1273-1276

Keywords: singular value decomposition, model structure optimization, PRESS statistic, sparse basis selection


Abstract:

To address the basis-function selection problem for linear-in-parameter models, a model structure optimization algorithm is proposed that combines singular value decomposition (SVD) with the PRESS statistic. The candidate basis-function matrix is first partitioned into blocks, which reduces repeated comparisons among non-optimal columns. Each sub-block is then processed with a combined SVD and PRESS procedure that takes the model's generalization ability directly as the selection criterion and chooses basis functions adaptively. The SVD reduces the number of candidate basis functions while making the retained ones mutually orthogonal, which markedly simplifies the computation of the PRESS statistic. Simulation results show that the proposed method effectively simplifies the model structure while maintaining high prediction accuracy.
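The selection loop described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (press_statistic, select_basis_by_blocks), the number of blocks, and the energy threshold used to truncate each block's SVD are illustrative assumptions; only the overall structure — block-wise SVD followed by a PRESS-based accept/reject decision — follows the abstract.

```python
import numpy as np

def press_statistic(X, y):
    """Leave-one-out PRESS for a linear least-squares fit y ~ X @ w.

    PRESS = sum_i (e_i / (1 - h_ii))^2, where h_ii are the leverages,
    i.e. the diagonal of the hat matrix X (X^T X)^{-1} X^T.
    """
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ w
    # With the thin SVD X = U S V^T, the hat matrix is U U^T,
    # so the leverages are the row-wise sums of U squared.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    leverage = np.sum(U ** 2, axis=1)
    return np.sum((residuals / (1.0 - leverage)) ** 2)

def select_basis_by_blocks(Phi, y, n_blocks=4, energy=0.99):
    """Block-wise SVD + PRESS selection (illustrative sketch).

    Each block of candidate basis functions is compressed to the leading
    left-singular vectors that capture `energy` of its variance (these
    columns are orthonormal within the block); the reduced columns are
    accepted only if they lower PRESS, i.e. improve estimated generalization.
    """
    selected = np.empty((Phi.shape[0], 0))
    best_press = np.inf
    for block in np.array_split(Phi, n_blocks, axis=1):
        U, s, _ = np.linalg.svd(block, full_matrices=False)
        k = int(np.searchsorted(np.cumsum(s ** 2) / np.sum(s ** 2), energy)) + 1
        candidate = np.hstack([selected, U[:, :k]])
        press = press_statistic(candidate, y)
        if press < best_press:          # accept the block only if PRESS improves
            selected, best_press = candidate, press
    return selected, best_press
```

Because the retained columns are left-singular vectors, the leverages needed by PRESS reduce to row sums of squared entries of U rather than a full matrix inversion; under the assumptions above, this is the kind of simplification the abstract attributes to the SVD step.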

