OALib Journal (ISSN: 2333-9721)

2017

Adaptive grasping strategy of robot based on Gaussian process

DOI: 10.13700/j.bh.1001-5965.2016.0660

Keywords: Gaussian process, adaptive grasping, robot control, robot vision, learning from demonstration


Abstract:

When a robot grasps an object, the pose of the object may change frequently. To enable the robot to adapt to such pose changes during motion, an adaptive grasping strategy based on Gaussian processes is proposed. The method learns a mapping from the observation space to the joint space, so the robot learns directly from samples and no camera calibration or inverse kinematics computation is required. First, the robot is dragged to grasp the object while the object's observation variables and the corresponding robot joint angles are recorded. Second, a Gaussian process model is trained on the recorded samples, correlating the observation variables with the joint angles. Finally, when a new observation is acquired, the trained Gaussian process model yields the joint angles for the grasping operation. Experiments show that a UR3 robot can successfully grasp objects after training.
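The pipeline described in the abstract — record (observation, joint angle) pairs during kinesthetic demonstration, fit a Gaussian process, then predict joint angles for new observations — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the observation dimensionality, sample values, and the scikit-learn kernel choice are all assumptions made here for the example.

```python
# Hypothetical sketch of the learn-from-demonstration grasping pipeline:
# map object observation variables (here, assumed 2-D image coordinates)
# directly to robot joint angles with Gaussian process regression,
# bypassing camera calibration and inverse kinematics.
# All data below is synthetic; dimensions are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Demonstration phase: drag the robot to grasp the object at various poses,
# recording the observed object position and the 6 joint angles each time.
X_obs = rng.uniform(0.0, 640.0, size=(30, 2))           # observed positions (px)
q_demo = np.sin(X_obs @ rng.normal(size=(2, 6)) * 0.01)  # surrogate joint angles

# Training phase: fit one GP over all joints; an RBF kernel models the smooth
# observation-to-joint mapping, a white-noise term absorbs recording noise.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=100.0) + WhiteKernel(noise_level=1e-4),
    normalize_y=True,
)
gp.fit(X_obs, q_demo)

# Execution phase: a new observation yields predicted joint angles; the
# predictive uncertainty can flag object poses far from the demonstrations.
q_new, q_std = gp.predict(np.array([[320.0, 240.0]]), return_std=True)
print(q_new.shape)  # one 6-joint configuration for the new observation
```

A single multi-output GP is used here for brevity; fitting one independent GP per joint, each with its own length scale, is an equally plausible reading of the method.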

