[1] Zhao Z, Wang L, Liu H, Ye J P. On similarity preserving feature selection. IEEE Transactions on Knowledge and Data Engineering, 2013, 25(3): 619-632
[2] Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 2003, 15(6): 1373-1396
[3] Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding. Science, 2000, 290(5500): 2323-2326
[4] Cai D, Zhang C Y, He X F. Unsupervised feature selection for multi-cluster data. In: Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD'10). New York, NY, USA: ACM, 2010. 333-342
[5] Zhao Z, Liu H. Semi-supervised feature selection via spectral analysis. In: Proceedings of the 2007 SIAM International Conference on Data Mining. Minneapolis, Minnesota: SIAM, 2007. 26-28
[6] Nie F P, Xiang S M, Jia Y Q, Zhang C S, Yan S C. Trace ratio criterion for feature selection. In: Proceedings of the 23rd National Conference on Artificial Intelligence. Chicago, Illinois, USA: AAAI, 2008. 671-676
[7] Efron B, Hastie T, Johnstone I, Tibshirani R. Least angle regression. The Annals of Statistics, 2004, 32(2): 407-499
[8] Duda R O, Hart P E, Stork D G. Pattern Classification. New York: Wiley, 2001.
[9] Peng H C, Long F H, Ding C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1238
[10] Zhao Z, Wang L, Liu H. Efficient spectral feature selection with minimum redundancy. In: Proceedings of the 24th AAAI Conference on Artificial Intelligence. Atlanta, Georgia, USA: AAAI, 2010. 673-678
[11] Chung F R K. Spectral Graph Theory. Providence, RI: American Mathematical Society, 1997.
[12] von Luxburg U. A tutorial on spectral clustering. Statistics and Computing, 2007, 17(4): 395-416
[13] Saul L K, Roweis S T. An Introduction to Locally Linear Embedding, Technical Report [Online], available: http://www.cs.toronto.edu/~roweis/lle/publications.html, March 1, 2014
[14] Guyon I, Elisseeff A. An introduction to variable and feature selection. The Journal of Machine Learning Research, 2003, 3: 1157-1182
[15] Bishop C M. Pattern Recognition and Machine Learning. New York: Springer, 2006.
[16] Bengio Y. Learning Deep Architectures for AI. Hanover, MA, USA: Now Publishers Inc., 2009.
[17] de la Torre F, Black M J. A framework for robust subspace learning. International Journal of Computer Vision, 2003, 54(1-3): 117-142
[18] Zhao Z, Morstatter F, Sharma S, Alelyani S, Anand A, Liu H. Advancing Feature Selection Research, Technical Report, Arizona State University, 2010 [Online], available: http://www.public.asu.edu/~zzhao15/, March 1, 2014
[19] He X F, Cai D, Niyogi P. Laplacian score for feature selection. In: Proceedings of the 2006 Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press, 2006. 507-514
[20] Zhao Z, Liu H. Spectral feature selection for supervised and unsupervised learning. In: Proceedings of the 24th International Conference on Machine Learning. New York, NY, USA: ACM, 2007. 1151-1157
[21] Kohavi R, John G H. Wrappers for feature subset selection. Artificial Intelligence, 1997, 97(1-2): 273-324
[22] Quinlan J R. C4.5: Programs for Machine Learning. San Francisco, CA, USA: Morgan Kaufmann, 1993.
[23] Koller D, Friedman N. Probabilistic Graphical Models: Principles and Techniques. Cambridge: MIT Press, 2009.
[24] Nie F P, Huang H, Cai X, Ding C. Efficient and robust feature selection via joint l2,1-norms minimization. In: Proceedings of the 2010 Advances in Neural Information Processing Systems. Vancouver, British Columbia, Canada, 2010. 1813-1821
[25] Jiang Y, Ren J T. Eigenvalue sensitive feature selection. In: Proceedings of the 28th International Conference on Machine Learning. New York, NY, USA: ACM, 2011. 89-96
[26] Belkin M, Niyogi P. Laplacian eigenmaps and spectral techniques for embedding and clustering. In: Proceedings of the 2001 Advances in Neural Information Processing Systems. Vancouver, British Columbia, Canada: MIT Press, 2001. 585-591
[27] Cai D. Spectral Regression: A Regression Framework for Efficient Regularized Subspace Learning [Ph.D. dissertation], Department of Computer Science, University of Illinois at Urbana-Champaign, USA, 2009.
[28] Cai D, Bao H J, He X F. Sparse concept coding for visual analysis. In: Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Providence, RI: IEEE, 2011. 2905-2910
[29] Qiao H, Zhang P, Zhang B, Zheng S W. Tracking feature extraction based on manifold learning framework. Journal of Experimental and Theoretical Artificial Intelligence, 2011, 23(1): 23-38