Transfer SVM with a Shared Latent Space (共享隐空间迁移SVM)

DOI: 10.3724/SP.J.1004.2014.02276, PP. 2276-2287

Keywords: transfer learning, large-margin classifier, latent space, support vector machine


Abstract:

In machine learning, transfer learning has been shown to effectively use information from one domain to improve the classification accuracy of a model trained in another domain. Transfer learning generally assumes that related domains share certain latent factors, yet existing transfer learning methods have not fully exploited this shared component. This paper introduces a transfer learning method based on a low-dimensional shared latent space: building on the classical support vector machine (SVM) classification model, a transfer SVM incorporating the shared latent space is derived. Compared with previous related methods, the proposed model makes better use of the information carried by the latent space and thereby improves the generalization performance of the resulting classifier. Experimental results also verify the effectiveness of the proposed method.
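To make the general idea concrete, the following is a minimal, hypothetical sketch of a "project-then-classify" baseline for transfer learning with a shared low-dimensional latent space, not the specific model proposed in the paper: a shared subspace is estimated from pooled source and target features (PCA stands in here for whatever subspace-learning criterion one prefers), a linear SVM is trained on the projected source data, and the classifier is evaluated on the projected target data. All data, dimensions, and parameters below are illustrative assumptions.

```python
# Illustrative sketch only: a generic shared-latent-space + SVM baseline,
# NOT the model proposed in the paper. Data and parameters are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical source domain: two well-separated classes in 20 dimensions.
X_src = np.vstack([rng.normal(0.0, 1.0, (100, 20)),
                   rng.normal(2.0, 1.0, (100, 20))])
y_src = np.array([0] * 100 + [1] * 100)

# Hypothetical target domain: same task, shifted feature distribution.
X_tgt = np.vstack([rng.normal(0.5, 1.0, (50, 20)),
                   rng.normal(2.5, 1.0, (50, 20))])
y_tgt = np.array([0] * 50 + [1] * 50)

# Step 1: estimate a shared low-dimensional latent space from the pooled
# (unlabeled) source and target features.
latent = PCA(n_components=5).fit(np.vstack([X_src, X_tgt]))

# Step 2: train a large-margin classifier (linear SVM) on the source
# domain after projecting it into the shared latent space.
clf = LinearSVC(C=1.0).fit(latent.transform(X_src), y_src)

# Step 3: evaluate the classifier on the target domain in the same space.
print("target accuracy:", clf.score(latent.transform(X_tgt), y_tgt))
```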

