Indoor Scene Reconstruction Based on an RGB-D Depth Camera

DOI: 10.11834/jig.20151010

Keywords: RGB-D depth camera, simultaneous localization and mapping (SLAM), camera pose estimation, 3D reconstruction


Abstract:

Objective: Reconstructing colored 3D scene models with real texture is an important research topic in computer vision. Because indoor scenes are complex and the sampled image sequences are long with irregular camera motion, existing 3D reconstruction algorithms suffer from limited reconstruction scale and poor reconstruction of local detail. Method: Building on the RGBD-SLAM algorithm, two improvements are proposed. First, plane information extracted from the depth map is incorporated into the frame-to-frame registration algorithm, improving its robustness and accuracy. Second, in the truncated signed distance function (TSDF) volumetric reconstruction stage, an exponential weight function is proposed, which reduces the influence of camera depth distortion on the reconstruction better than an ordinary weight function. Result: The proposed method produces better camera pose estimates than RGBD-SLAM, reducing the mean absolute trajectory error by 1.3 cm and yielding better reconstructions. Conclusion: The proposed method effectively improves the accuracy of camera pose estimation and can be applied to indoor scene reconstruction.
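The abstract does not give the exact form of the exponential weight function, so the following is only a minimal Python sketch of the idea: a per-measurement weight that decays exponentially with measured depth (consumer depth cameras are noisier at range), plugged into the standard Curless-Levoy running-average TSDF update. The function names and parameter values (d0, alpha, trunc) are hypothetical, not the paper's.

```python
import numpy as np

def exponential_weight(depth_m, d0=0.5, alpha=1.5):
    # Hypothetical exponential weight: confidence decays with distance,
    # since depth distortion grows with range. The exact form used in the
    # paper is not stated in the abstract; this is only an illustration.
    return np.exp(-alpha * np.maximum(depth_m - d0, 0.0))

def integrate_voxels(tsdf, weight, sdf_obs, depth_m, trunc=0.05):
    """One TSDF fusion step in the Curless-Levoy running-average style,
    with the per-measurement weight replaced by the exponential term.

    tsdf, weight : current per-voxel TSDF values and accumulated weights
    sdf_obs      : signed distance of each voxel to the new depth surface
    depth_m      : measured depth (m) at the pixel each voxel projects to
    """
    d = np.clip(sdf_obs / trunc, -1.0, 1.0)   # truncate and normalise SDF
    w = exponential_weight(depth_m)            # per-measurement weight
    tsdf_new = (weight * tsdf + w * d) / (weight + w + 1e-12)
    weight_new = weight + w
    return tsdf_new, weight_new

# Toy usage: three voxels near the surface, observed at increasing depths;
# the distant measurement contributes with a smaller weight.
tsdf0, w0 = np.zeros(3), np.zeros(3)
sdf_obs = np.array([0.01, -0.02, 0.04])
depth_m = np.array([0.8, 2.0, 4.0])
print(integrate_voxels(tsdf0, w0, sdf_obs, depth_m))
```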

