%0 Journal Article
%T A MAP Approach for Vision-based Self-localization of Mobile Robot
%A WANG Ke
%A WANG Wei
%A ZHUANG Yan
%J 自动化学报 (Acta Automatica Sinica)
%D 2008
%X An on-the-fly self-localization system is developed for a mobile robot operating in a 3D environment with elaborate 3D landmarks. The robot estimates its pose recursively through a MAP estimator that fuses information collected from odometry and a unidirectional camera. We build nonlinear models for these two sensors and maintain that the uncertainty of robot motion and inaccurate sensor measurements should be embedded and tracked throughout the system. We describe the uncertainty framework from a probabilistic-geometry viewpoint and use the unscented transform to propagate uncertainty through the given nonlinear functions. Considering the limited processing power of our robot, image features are extracted only in the vicinity of the corresponding projected features. In addition, data associations are evaluated by a statistical distance. Finally, a series of systematic experiments is conducted to demonstrate the reliable and accurate performance of our system.
%K Vision-based self-localization
%K MAP estimation
%K multi sensor fusion
%K unscented transformation
%K uncertainty propagation
%U http://www.alljournals.cn/get_abstract_url.aspx?pcid=5B3AB970F71A803DEACDC0559115BFCF0A068CD97DD29835&cid=8240383F08CE46C8B05036380D75B607&jid=E76622685B64B2AA896A7F777B64EB3A&aid=442CCB5A76862C3812D47E141FF681C7&yid=67289AFF6305E306&vid=339D79302DF62549&iid=0B39A22176CE99FB&sid=BA79719BCA7341D5&eid=43608FD2E15CD61B&journal_id=0254-4156&journal_name=自动化学报&referenced_num=0&reference_num=38
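Note: the following is a minimal, self-contained sketch (not taken from the paper) of how pose uncertainty can be propagated through a nonlinear sensor model with the unscented transform mentioned in the abstract. The scaling parameters, the polar observation model, and the function name unscented_propagate are illustrative assumptions, not the authors' implementation.

import numpy as np

def unscented_propagate(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear function f via sigma points."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)

    # 2n+1 sigma points around the mean (mean plus/minus columns of the matrix square root)
    sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])

    # weights for the transformed mean and covariance
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)

    y = np.array([f(s) for s in sigma])        # push each sigma point through f
    y_mean = wm @ y                            # transformed mean
    diff = y - y_mean
    y_cov = (wc[:, None] * diff).T @ diff      # transformed covariance
    return y_mean, y_cov

# Example: push a 2D robot position uncertainty through a range-bearing observation model.
pose_mean = np.array([1.0, 2.0])
pose_cov = np.diag([0.01, 0.02])
obs = lambda p: np.array([np.hypot(p[0], p[1]), np.arctan2(p[1], p[0])])
z_mean, z_cov = unscented_propagate(pose_mean, pose_cov, obs)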