OALib Journal期刊
ISSN: 2333-9721
Positioning Algorithm with Tightly Coupled Wheel Odometry and VIO in Off-Road Environments

DOI: 10.12677/sea.2025.142027, PP. 291-304

Keywords: Off-Road Environments, VIO, Tightly Coupled, Wheel Odometry, Hybrid Pre-Integration, Zero-Velocity Update, Chi-Squared Test


Abstract:

The declining localization performance of Visual-Inertial Odometry (VIO) in off-road environments is a significant challenge. To address this issue, a tightly coupled algorithm named HW-VIO (Hybrid Wheel-VIO) is proposed, combining wheel odometry and VIO. The method leverages the complementary properties of IMU and wheel odometry by introducing a hybrid pre-integration observation model, where zero-velocity updates from wheel odometry are employed to dynamically correct accelerometer and gyroscope biases in the IMU. To handle the frequent occurrence of outliers in wheel odometry measurements, a chi-squared test is applied to evaluate residuals from the hybrid pre-integration process, enabling robust identification and rejection of abnormal data. The algorithm is validated through experiments conducted in three off-road farmland scenarios with varying levels of difficulty. Results show that HW-VIO significantly improves localization accuracy, achieving an average accuracy improvement of 47%. Furthermore, ablation studies confirm the effectiveness of the hybrid pre-integration model, demonstrating a 50% improvement in localization accuracy compared to the W-VIO (Wheel-VIO) algorithm, which directly fuses wheel odometry.
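The two robustness mechanisms named in the abstract, zero-velocity bias correction and chi-squared gating of pre-integration residuals, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names, the diagonal-covariance assumption, the smoothing factor, and the 95% gate value are all assumptions made for this example.

```python
CHI2_95_DOF3 = 7.815  # 95% critical value of chi-squared with 3 degrees of freedom


def zupt_bias_update(bias, imu_meas, expected, alpha=0.05):
    """While wheel odometry reports zero velocity, the IMU should read a
    known constant (zero angular rate; gravity-only specific force), so the
    deviation from that constant is smoothed into the bias estimate."""
    return [b + alpha * (m - e - b) for b, m, e in zip(bias, imu_meas, expected)]


def chi2_gate(residual, variances, threshold=CHI2_95_DOF3):
    """Flag a wheel-odometry measurement as an outlier when the squared
    Mahalanobis distance of its pre-integration residual (diagonal
    covariance assumed) exceeds the chi-squared gate."""
    d2 = sum(r * r / v for r, v in zip(residual, variances))
    return d2 > threshold  # True -> discard this measurement


# Usage: a stationary gyro reading refines the gyro bias estimate ...
gyro_bias = zupt_bias_update([0.0, 0.0, 0.0], [0.02, -0.01, 0.005],
                             [0.0, 0.0, 0.0])

# ... the accelerometer, at rest, should read gravity only ...
accel_bias = zupt_bias_update([0.0, 0.0, 0.0], [0.10, 0.05, 9.90],
                              [0.0, 0.0, 9.81])

# ... and a wheel-slip spike in the velocity residual is gated out.
variances = [0.01, 0.01, 0.04]  # per-axis residual variance (assumed)
assert not chi2_gate([0.05, -0.02, 0.10], variances)  # nominal: kept
assert chi2_gate([0.50, 0.00, 0.00], variances)       # slip spike: rejected
```

The gate dimension would track the residual dimension in practice (e.g. a 6-DoF hybrid pre-integration residual would use the 6-DoF critical value); the 3-DoF constant here is purely illustrative.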
