The aim of this paper is to broaden the application of Stochastic Configuration Networks (SCNs) to the semi-supervised domain by exploiting the unlabeled data that is abundant in everyday settings, enhancing the classification accuracy of decentralized SCN algorithms while effectively protecting user privacy. To this end, we propose a decentralized semi-supervised learning algorithm for SCNs, called DMT-SCN, which introduces teacher and student models and combines them with the idea of consistency regularization to improve the speed of model iterations. To reduce the possible negative impact of unlabeled data on the model, we deliberately change the way noise is added to the unlabeled data. Simulation results show that the algorithm effectively utilizes unlabeled data to improve the classification accuracy of SCN training and remains robust across different simulated environments.
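The teacher-student consistency mechanism the abstract refers to follows the Mean Teacher idea: the teacher's weights track an exponential moving average (EMA) of the student's, and the training objective penalizes disagreement between the student's prediction on a noise-perturbed unlabeled input and the teacher's prediction on the clean input. The sketch below is a minimal NumPy illustration of that mechanism only; the function names, the single-layer softmax stand-in for an SCN, and all shapes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Row-wise softmax, numerically stabilized.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def ema_update(teacher_w, student_w, alpha=0.99):
    # Mean-Teacher update: teacher weights are an exponential
    # moving average of the student weights.
    return alpha * teacher_w + (1.0 - alpha) * student_w

def consistency_loss(student_w, teacher_w, x_unlabeled, noise_std=0.1):
    # Student sees a noise-perturbed copy of the unlabeled batch;
    # the teacher sees the clean inputs. The loss is the mean squared
    # disagreement between the two predicted distributions.
    noisy = x_unlabeled + rng.normal(0.0, noise_std, size=x_unlabeled.shape)
    p_student = softmax(noisy @ student_w)
    p_teacher = softmax(x_unlabeled @ teacher_w)
    return float(np.mean((p_student - p_teacher) ** 2))

x_u = rng.normal(size=(8, 4))        # a batch of 8 unlabeled samples, 4 features
student = rng.normal(size=(4, 3))    # student readout weights (3 classes), illustrative
teacher = np.zeros_like(student)     # teacher initialized separately

teacher = ema_update(teacher, student)           # teacher drifts toward the student
loss = consistency_loss(student, teacher, x_u)   # unsupervised penalty, always >= 0
```

No labels enter the consistency term, which is what lets the decentralized nodes exploit purely unlabeled local data; the supervised loss on labeled samples would be added on top in the full algorithm.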