Comparative Analysis of Different Classifiers for the Wisconsin Breast Cancer Dataset

DOI: 10.4236/oalib.1100660, PP. 1-7

Subject Areas: Artificial Intelligence, Bioinformatics

Keywords: WBCD, Machine Learning, Classification, Naive Bayes, Neural Networks, SVM


Abstract

The Wisconsin Breast Cancer Dataset has been heavily cited as a benchmark dataset for classification. Neural network techniques, including standard feed-forward Neural Networks, Probabilistic Neural Networks, and Regression Neural Networks, have been shown to perform very well on this dataset. However, despite its obvious practical importance and its implications for cancer research, a thorough investigation of modern classification techniques on this dataset remains to be done. In this paper we examine how accurately classifiers such as Random Forests with varying numbers of trees, Support Vector Machines with different kernels, the Naive Bayes model, and neural networks classify the masses in the dataset as benign or malignant. Results indicate that Support Vector Machines with a Radial Basis Function kernel give the best accuracy of all the models attempted. This indicates that non-linearities are present in the dataset and that the Support Vector Machine does a good job of mapping the data into a higher-dimensional space in which the non-linearities fade away and the data becomes linearly separable by a large-margin classifier such as the Support Vector Machine. These results show that modern machine learning methods could provide improved accuracy for the early prediction of cancerous tumors.
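
For illustration, the sketch below (not taken from the paper) shows how such a comparison could be set up in Python with scikit-learn. Note that scikit-learn's built-in load_breast_cancer() ships the 30-feature diagnostic (WDBC) data rather than the 9-feature original WBCD, and the 10-fold cross-validation protocol and hyperparameters here are assumptions, so the exact accuracies will differ from those reported in the paper.

```python
# Minimal sketch of the classifier comparison described in the abstract.
# Assumptions: scikit-learn's bundled breast-cancer data (WDBC, not WBCD),
# 10-fold cross-validation, and illustrative hyperparameters.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

models = {
    "Random Forest (10 trees)":  RandomForestClassifier(n_estimators=10, random_state=0),
    "Random Forest (100 trees)": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM (linear kernel)":       make_pipeline(StandardScaler(), SVC(kernel="linear")),
    "SVM (RBF kernel)":          make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "Naive Bayes":               GaussianNB(),
    "Neural network (MLP)":      make_pipeline(StandardScaler(),
                                               MLPClassifier(hidden_layer_sizes=(20,),
                                                             max_iter=2000,
                                                             random_state=0)),
}

# Report mean and standard deviation of 10-fold cross-validated accuracy.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name:28s} {scores.mean():.3f} +/- {scores.std():.3f}")
```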

Cite this paper

Vig, L. (2014). Comparative Analysis of Different Classifiers for the Wisconsin Breast Cancer Dataset. Open Access Library Journal, 1, e660. doi: http://dx.doi.org/10.4236/oalib.1100660.

