Kernel Local Linear Discriminate Method for Dimensionality Reduction and Its Application in Machinery Fault Diagnosis

DOI: 10.1155/2014/283750


Abstract:

Dimensionality reduction is a crucial task in machinery fault diagnosis. Recently, manifold learning has become a popular dimensionality reduction technology and has been used successfully in many fields. However, most of these techniques are not suitable for this task, because they are unsupervised in nature and fail to discover the discriminant structure in the data. To overcome these weaknesses, the kernel local linear discriminant (KLLD) algorithm is proposed. KLLD is a novel algorithm that combines the advantages of neighborhood preserving projections (NPP), the Floyd algorithm, the maximum margin criterion (MMC), and the kernel trick. KLLD has four advantages. First, KLLD is a supervised dimensionality reduction method that overcomes the out-of-sample problem. Second, the short-circuit problem can be avoided. Third, KLLD uses the between-class and within-class scatter matrices more efficiently. Last, the kernel trick is incorporated into KLLD to find a more precise solution. The main feature of the proposed method is that it attempts both to preserve the intrinsic neighborhood geometry of the data and to extract the discriminant information. Experiments have been performed to evaluate the new method. The results show that KLLD offers more benefits than traditional methods.

1. Introduction

As information collection technology becomes more advanced, a huge amount of data is produced while mechanical equipment is running. The sensitive information that reflects the running status of the equipment is submerged in a large amount of redundant data. Effective dimensionality reduction can solve this problem and is one of the key technologies for equipment condition monitoring and fault diagnosis. The nonlinear and nonstationary vibration signals generated by rolling bearings [1, 2] make the original high-dimensional feature space, which consists of the statistical characteristics of the signal, inseparable.
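The paper gives no code, but the maximum margin criterion that KLLD builds on can be sketched in a few lines of numpy. This is a minimal illustration of MMC itself, not of the full KLLD algorithm: it computes the between-class scatter S_b and within-class scatter S_w and projects onto the top eigenvectors of S_b − S_w, which, unlike Fisher discriminant analysis, avoids inverting S_w. The toy data and function name are assumptions for illustration.

```python
import numpy as np

def mmc_scatter(X, y):
    """Return the between-class scatter S_b and within-class scatter S_w.
    MMC seeks a projection W maximizing tr(W^T (S_b - S_w) W)."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))
    S_w = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all).reshape(-1, 1)
        S_b += len(Xc) * diff @ diff.T          # class-mean spread
        S_w += (Xc - mc).T @ (Xc - mc)          # spread inside each class
    return S_b, S_w

# Toy example: two classes separated along the first axis.
X = np.array([[0., 0.], [0., 1.], [4., 0.], [4., 1.]])
y = np.array([0, 0, 1, 1])
S_b, S_w = mmc_scatter(X, y)
vals, vecs = np.linalg.eigh(S_b - S_w)          # ascending eigenvalues
W = vecs[:, ::-1][:, :1]                        # top MMC direction
```

On this toy data the recovered direction is (±1, 0): the axis along which the class means are far apart and the within-class spread is zero.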
Traditional linear dimensionality reduction methods such as PCA and ICA not only assume a globally linear structure in the data but also use a linear transformation matrix to find the best low-dimensional projection. Classification information plays an important role; in nonlinear conditions, however, such as when the original high-dimensional feature space possesses a nonlinear structure, that information is difficult to obtain by linear methods. KPCA is a traditional nonlinear dimensionality reduction method, which achieves the task of dimensionality
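The KPCA baseline mentioned above can be sketched directly from its definition: build an RBF kernel matrix, center it in feature space, and keep the leading eigenvectors. This is a generic KPCA sketch, not the paper's KLLD; the gamma value and toy data are assumptions.

```python
import numpy as np

def kpca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel (gamma is an assumed kernel width).
    Returns the projected samples, one row per input sample."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = len(X)
    One = np.full((n, n), 1.0 / n)
    # Center the kernel matrix in feature space.
    Kc = K - One @ K - K @ One + One @ K @ One
    vals, vecs = np.linalg.eigh(Kc)             # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    # Scale eigenvectors by sqrt(eigenvalue) to get the projections.
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 1e-12))

X = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.]])
Z = kpca(X, n_components=2, gamma=0.5)
```

Because the mapping is defined only through the kernel matrix over the training samples, plain KPCA shares the out-of-sample limitation the paper discusses: embedding a new point requires extra machinery such as a Nyström extension.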

References

[1]  W. Y. Liu and J. G. Han, “Rolling element bearing fault recognition approach based on fuzzy clustering bispectrum estimation,” Shock and Vibration, vol. 20, pp. 213–225, 2013.
[2]  G. F. Wang, X. L. Feng, and C. Liu, “Bearing fault classification based on conditional random field,” Shock and Vibration, vol. 20, pp. 591–600, 2013.
[3]  S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.-R. Müller, “Fisher discriminant analysis with kernels,” in Proceedings of the 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99), pp. 41–48, August 1999.
[4]  Q.-S. Jiang, M.-P. Jia, J.-Z. Hu, and F.-Y. Xu, “Method of fault pattern recognition based on laplacian eigenmaps,” Journal of System Simulation, vol. 20, no. 20, pp. 5710–5713, 2008.
[5]  W. Yang, C. Sun, and L. Zhang, “A multi-manifold discriminant analysis method for image feature extraction,” Pattern Recognition, vol. 44, no. 8, pp. 1649–1657, 2011.
[6]  P. K. Yan, W. X. Zhang, and B. Turkbey, “Global structure constrained local shape prior estimation for medical image segmentation,” Computer Vision and Image Understanding, vol. 117, pp. 1017–1026, 2013.
[7]  V. D. Silva and J. B. Tenenbaum, “Global versus local methods in nonlinear dimensionality reduction,” Advances in Neural Information Processing Systems, vol. 15, pp. 705–712, 2003.
[8]  J. B. Tenenbaum, V. De Silva, and J. C. Langford, “A global geometric framework for nonlinear dimensionality reduction,” Science, vol. 290, no. 5500, pp. 2319–2323, 2000.
[9]  S. T. Roweis and L. K. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323–2326, 2000.
[10]  D. Cai, X. F. He, and K. Zhou, “Locally sensitive discriminant analysis,” in Proceedings of the 20th International Joint Conference on Artifical Intelligence (IJCAI'07), V. Manuela, Ed., pp. 708–713, 2007.
[11]  F. Li, B. P. Tang, and R. S. Yang, “Rotating machine fault diagnosis using dimension reduction with linear local tangent space alignment,” Measurement, vol. 46, pp. 2525–2539, 2013.
[12]  W. Zhang, W. Zhou, and B. Li, “Fault diagnosis approach based on fractal dimension LLE and Fisher discriminant,” Chinese Journal of Scientific Instrument, vol. 31, no. 2, pp. 325–333, 2010.
[13]  Z. Li, X. Yan, C. Yuan, J. Zhao, and Z. Peng, “A new method of nonlinear feature extraction for multi-fault diagnosis of rotor systems,” Noise & Vibration Worldwide, vol. 41, no. 10, pp. 29–37, 2010.
[14]  B. Li and Y. Zhang, “Supervised locally linear embedding projection (SLLEP) for machinery fault diagnosis,” Mechanical Systems and Signal Processing, vol. 25, no. 8, pp. 3125–3134, 2011.
[15]  Y. Lei, Z. He, and Y. Zi, “A new approach to intelligent fault diagnosis of rotating machinery,” Expert Systems with Applications, vol. 35, no. 4, pp. 1593–1600, 2008.
[16]  Y. W. Pang, L. Zhang, and Z. K. Liu, “Neighborhood preserving projections (NPP): a novel linear dimension reduction method,” in Advances in Intelligent Computing, vol. 3644 of Lecture Notes in Computer Science, pp. 117–125, Springer, Berlin, Germany, 2005.
[17]  H. F. Li, T. Jiang, and K. Zhang, “Efficient and robust feature extraction by maximum margin criterion,” IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 157–165, 2006.
[18]  S. L. Jiang, J. Y. Zhang, and J. Xu, “Discriminative analysis algorithm for linear local tangent spaces,” Journal of Huazhong University of Science and Technology, vol. 38, no. 10, pp. 66–69, 2010.
[19]  G. H. Wen, L. J. Jiang, and J. Wen, “Using locally estimated geodesic distances to improve Hessian local linear embedding,” CAAI Transactions on Intelligent Systems, vol. 3, pp. 429–435, 2008.
[20]  C.-C. Chang and C.-J. Lin, “LIBSVM: a library for support vector machines,” ACM Transactions on Intelligent Systems and Technology, vol. 2, no. 3, article 27, 2011.
[21]  O. Kouropteva, O. Okun, and A. Hadid, “Beyond locally linear embedding algorithm,” MVG-01-2002, Machine Vision Group, University of Oulu, 49, 2002.
