
Tree Model Optimization Criterion without Using Prediction Error

DOI: 10.4236/ojs.2012.25061, pp. 478-483

Keywords: Cross-Validation, Model Optimization Criterion, One-SE Rule, Significance Level, Tree Model


Abstract:

Optimizing the number of splitting rules in a tree model by prediction error does not control the probability that splitting rules emerge whose predictor has no functional relationship with the target variable. To solve this problem, a new optimization method is proposed. Under this method, the probability that the predictors used in the splitting rules of the optimized tree model have no functional relationship with the target variable is held below 0.05. It is therefore fairly convincing that the tree model given by the new method represents knowledge contained in the data.
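The paper's own criterion is not reproduced here, but the underlying idea, accepting a splitting rule only when it passes a significance test at the 0.05 level rather than because it reduces cross-validated prediction error, can be illustrated with a hypothetical permutation-test check. The function names, the data, and the permutation-test formulation below are illustrative assumptions, not the authors' procedure:

```python
import random

def sse(values):
    """Sum of squared errors around the mean (0.0 for an empty list)."""
    if not values:
        return 0.0
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values)

def best_split_reduction(x, y):
    """Largest reduction in SSE achievable by one threshold split on x."""
    total = sse(y)
    pairs = sorted(zip(x, y))
    best = 0.0
    for i in range(1, len(pairs)):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # no threshold separates equal x values
        left = [p[1] for p in pairs[:i]]
        right = [p[1] for p in pairs[i:]]
        best = max(best, total - sse(left) - sse(right))
    return best

def split_p_value(x, y, n_perm=200, seed=0):
    """Permutation p-value for the best split on x: the fraction of random
    x/y pairings whose best split reduces SSE at least as much as the
    observed best split. Small p suggests a real relationship."""
    rng = random.Random(seed)
    observed = best_split_reduction(x, y)
    y_shuffled = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_shuffled)
        if best_split_reduction(x, y_shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Toy data: y is a step function of x1 plus noise; x2 is pure noise.
rng = random.Random(42)
x1 = [rng.random() for _ in range(60)]
x2 = [rng.random() for _ in range(60)]
y = [(2.0 if a > 0.5 else 0.0) + rng.gauss(0.0, 0.1) for a in x1]

p1 = split_p_value(x1, y)  # informative predictor: small p-value
p2 = split_p_value(x2, y)  # noise predictor: no evidence required
```

A split on x1 passes the 0.05 screen, while a split on x2 would typically be rejected; the point of such a criterion is that even the best-looking split on a noise predictor is compared against what noise alone can achieve.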

