2017

Optimization algorithm for big data mining based on the parameter server framework

DOI: 10.6040/j.issn.1672-3961.0.2016.339

Keywords: optimization algorithm, distributed system, big data, sample diversity, machine learning


Abstract:

Driven by the real-time requirements of big data mining and the diversity of data samples, an optimization algorithm for training machine learning models on big data was proposed. By analyzing the iterative computation of existing algorithms, the iteration was divided into a coarse-tuning phase and a fine-tuning phase according to the change of the model vector. During the fine-tuning phase, the vast majority of samples were found to have almost no influence on the result, so their gradients were not recomputed and the results of the previous iteration were reused instead, which reduced the computation load and improved efficiency. Experimental results showed that, in a distributed cluster environment, the algorithm reduced the computation of model training by about 35% while the accuracy of the trained model stayed within the normal range, effectively improving the real-time performance of big data mining.
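
The gradient-reuse idea can be illustrated with a short sketch. The following is a minimal single-machine Python example, assuming plain per-sample SGD on a logistic-regression objective; the thresholds phase_eps and reuse_eps, the small-gradient criterion for deciding which samples to skip, and all function names are illustrative assumptions rather than the paper's actual implementation.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.1, epochs=50, phase_eps=1e-2, reuse_eps=1e-3):
    # phase_eps and reuse_eps are assumed, illustrative thresholds.
    n, d = X.shape
    w = np.zeros(d)
    cached_grad = np.zeros((n, d))   # last computed gradient of every sample
    fine_phase = False               # start in the coarse-tuning phase

    for _ in range(epochs):
        w_before = w.copy()
        for i in range(n):
            if fine_phase and np.linalg.norm(cached_grad[i]) < reuse_eps:
                # Fine-tuning phase: this sample barely changed the model last
                # time, so reuse its cached gradient instead of recomputing it.
                g = cached_grad[i]
            else:
                # Recompute the gradient for influential samples
                # (logistic-regression loss on sample i).
                g = (sigmoid(X[i] @ w) - y[i]) * X[i]
                cached_grad[i] = g
            w -= lr * g

        # Switch from coarse- to fine-tuning once a full pass changes
        # the model vector only slightly.
        if np.linalg.norm(w - w_before) < phase_eps:
            fine_phase = True
    return w

# Toy usage: synthetic binary labels from a random linear separator.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = (X @ rng.normal(size=20) > 0).astype(float)
w = train(X, y)

In a parameter-server deployment, each worker would keep the cached gradients of its own data partition, so skipped samples translate directly into fewer gradient computations and fewer updates pushed per iteration.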

