Machine Learning has made tremendous, steadily evolving progress over the last decade. It is widely used to make predictions that inform high-value decisions. Many experts in economics rely on models derived from Machine Learning, and many companies use Neural Networks for bankruptcy prediction as a guide to avert potential failure. However, although a Neural Network can process a tremendous number of attribute factors, it frequently overfits as more data are taken in. K-Nearest Neighbor and Random Forest can yield better results from different perspectives. This paper identifies the more suitable algorithm for bankruptcy prediction by comparing the results of the two methods.
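To make the comparison concrete, the following is a minimal sketch (not the authors' code) of how K-Nearest Neighbor and Random Forest classifiers could be evaluated against each other on a bankruptcy dataset. It assumes scikit-learn is available and uses a synthetic stand-in for the firm-level financial data; the feature matrix, labels, and hyperparameters are illustrative placeholders.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a bankruptcy dataset: 1000 firms, 20 financial ratios,
# with bankrupt firms as the rare class (assumed proportions, for illustration only).
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

models = {
    # KNN is distance-based, so features are standardized before fitting.
    "k-nearest neighbor": make_pipeline(StandardScaler(),
                                        KNeighborsClassifier(n_neighbors=5)),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    # 5-fold cross-validated accuracy as one simple basis for comparison.
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```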