OALib Journal期刊
ISSN: 2333-9721
2017

Hybrid Pose Measurement Based on Fusion of IMU and Monocular Vision

DOI: 10.11784/tdxbz201604003

Keywords: inertial measurement unit (IMU), visual measurement, fusion, pose measurement, coordinate system calibration


Abstract:

Quick and accurate measurement of a moving object's pose is widely needed in aerospace, robotics, and related fields. Inertial pose measurement is fast but insufficiently accurate, whereas visual pose measurement is accurate but slow. This paper therefore proposes a pose-measurement method based on the fusion of an IMU and monocular vision. In the inertial branch, the object's attitude is computed with an Euler-angle iteration formula; in the visual branch, the object's pose is solved with the POSIT algorithm; a filter then fuses the two results, and the difference between the fused result and the inertial result is used to correct and update the drift-error curve caused by gyroscope drift. To calibrate the coordinate relationships in the measurement system, a double-vector orthogonalization method is proposed to calibrate the rotation matrix from the IMU frame to the object frame, using two sets of uniform-rotation data, and a three-image fast calibration method is proposed to calibrate the rotation matrix from the target frame to the object frame, using three images captured by the monocular camera. Experiments show that the proposed method achieves fast pose measurement with high accuracy.
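The inertial branch integrates gyroscope body rates into Euler angles. The paper's exact iteration formula is not reproduced on this page, so the sketch below assumes the standard Z-Y-X (yaw-pitch-roll) kinematic equations with simple forward-Euler integration; function names and the sample format are illustrative, not from the paper.

```python
import math

def euler_rates(phi, theta, p, q, r):
    """Map body rates (p, q, r) to Euler-angle rates for a Z-Y-X sequence."""
    phi_dot = p + math.tan(theta) * (q * math.sin(phi) + r * math.cos(phi))
    theta_dot = q * math.cos(phi) - r * math.sin(phi)
    psi_dot = (q * math.sin(phi) + r * math.cos(phi)) / math.cos(theta)
    return phi_dot, theta_dot, psi_dot

def integrate_imu(samples, dt, att=(0.0, 0.0, 0.0)):
    """Forward-Euler iteration of the attitude over a sequence of gyro samples."""
    phi, theta, psi = att
    for p, q, r in samples:
        dphi, dtheta, dpsi = euler_rates(phi, theta, p, q, r)
        phi += dphi * dt
        theta += dtheta * dt
        psi += dpsi * dt
    return phi, theta, psi
```

Because each step only accumulates scaled rate samples, any constant gyro bias grows linearly into the attitude, which is exactly the drift the fusion step has to correct.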
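The abstract names "a filter" for fusion and a drift-error correction driven by the difference between the fused and inertial results, without specifying the filter here. As a hedged stand-in (not the paper's actual filter), the sketch below uses a minimal complementary-filter blend: the fast IMU estimate dominates short-term, the slow-but-accurate vision estimate anchors the long-term value. `fuse`, `update_drift`, and `alpha` are assumed names and parameters for illustration.

```python
def fuse(att_imu, att_vision, alpha=0.98):
    """Blend per-axis Euler angles: high alpha trusts the fast IMU estimate,
    while the accurate vision estimate slowly pulls out accumulated drift."""
    return tuple(alpha * i + (1.0 - alpha) * v
                 for i, v in zip(att_imu, att_vision))

def update_drift(drift, att_fused, att_imu):
    """Update the drift-error curve with the difference between the fused
    result and the raw inertial result, as the abstract describes."""
    return tuple(d + (f - i) for d, f, i in zip(drift, att_fused, att_imu))
```

In a real implementation the blend weight would come from the filter's gain (e.g. a Kalman or H-infinity design) rather than a fixed `alpha`, and the drift curve would be fed back to correct subsequent inertial iterations.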

