
Research on Visual Servo Grasping of Household Objects for Nonholonomic Mobile Manipulator

DOI: 10.1155/2014/315396


Abstract:

This paper focuses on the problem of visual servo grasping of household objects by a nonholonomic mobile manipulator. Firstly, a new kind of artificial object mark based on QR (Quick Response) Code is designed, which can be affixed to the surface of household objects. Secondly, after modeling the vision-based autonomous mobile manipulation system as a generalized manipulator, the generalized manipulator's kinematic model is established, analytical inverse kinematic solutions of the generalized manipulator are derived, and a novel active-vision-based camera calibration method is proposed to determine the hand-eye relationship. Finally, a visual servo switching control law is designed to control the service robot to complete the object grasping operation. Experimental results show that the QR Code-based artificial object mark can overcome the difficulties caused by the variety of household objects and the complexity of the grasping operations, and that the proposed visual servo scheme enables the service robot to grasp and deliver objects efficiently.

1. Introduction

A classical mobile manipulator system (MMS) consists of a manipulator mounted on a nonholonomic mobile platform. This arrangement considerably extends the manipulator's workspace and is widely used in service robot applications [1, 2]. The development of MMS mainly involves two classical topics, namely, motion planning [3–8] and coordinated control [9–13], which are used to overcome the mobile platform's nonholonomic constraint and make the MMS move quickly and efficiently. When robots operate in unstructured environments, it is essential to include exteroceptive sensory information in the control loop. In particular, visual information provided by vision sensors such as charge-coupled device (CCD) cameras guarantees accurate positioning, robustness to calibration uncertainties, and reactivity to environmental changes.
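The "generalized manipulator" idea above — treating the mobile base and the arm as a single kinematic chain — can be sketched for a hypothetical planar two-link arm on a unicycle-type base. The link lengths and function name below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def generalized_fk(x, y, phi, q1, q2, l1=0.3, l2=0.25):
    """Forward kinematics of a 'generalized manipulator': a planar
    two-link arm (hypothetical link lengths l1, l2 in meters) mounted
    on a nonholonomic base at world pose (x, y, phi). Returns the
    end-effector position in the world frame."""
    # End-effector position expressed in the base frame
    bx = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    by = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    # Rotate by the base heading phi, then translate by the base position
    wx = x + bx * np.cos(phi) - by * np.sin(phi)
    wy = y + bx * np.sin(phi) + by * np.cos(phi)
    return wx, wy
```

Because the base pose and joint angles enter one kinematic map, inverse kinematics can be solved for base and arm jointly rather than sequentially, which is the practical payoff of the generalized-manipulator view.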
Much of the work related to CCD cameras and manipulators has focused on the manipulator's visual servo control, which specifies robotic tasks (such as object grasping and assembly) in terms of desired image features extracted from a target object. Overviews of visual servoing can be found in the literature [14–16]. In general, visual servo approaches can be divided into three kinds, namely, position-based visual servoing (PBVS) [17, 18], image-based visual servoing (IBVS) [19, 20], and hybrid visual servoing (HYBVS) [21–23]. In PBVS, the feedback signals in the vision loop are the intuitive relative 3D pose between current and desired cameras estimated by current and
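As a minimal sketch of the PBVS scheme just described — the standard proportional law from the visual servoing tutorials [14, 15], not this paper's switching controller — assume the pose error between the current and desired camera frames has already been estimated, with the rotation error in axis-angle form and `lam` a hypothetical control gain:

```python
import numpy as np

def pbvs_control_law(t_err, axis, angle, lam=0.5):
    """Classical proportional PBVS law: command camera velocities that
    make the pose error decay exponentially. t_err is the 3D translation
    error, (axis, angle) the axis-angle rotation error between the
    current and desired camera frames."""
    v = -lam * np.asarray(t_err, dtype=float)             # linear velocity
    omega = -lam * angle * np.asarray(axis, dtype=float)  # angular velocity
    return v, omega
```

At zero pose error the commanded velocity vanishes, so the desired camera pose is an equilibrium; the gain trades off convergence speed against sensitivity to pose-estimation noise.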

References

[1]  S. Ekvall, D. Kragic, and P. Jensfelt, “Object detection and mapping for service robot tasks,” Robotica, vol. 25, no. 2, Article ID 00323, pp. 175–187, 2007.
[2]  K. Severinson-Eklundh, A. Green, and H. Hüttenrauch, “Social and collaborative aspects of interaction with a service robot,” Robotics and Autonomous Systems, vol. 42, no. 3-4, pp. 223–234, 2003.
[3]  W. F. Carriker, P. K. Khosla, and B. H. Krogh, “Path planning for mobile manipulators for multiple task execution,” IEEE Transactions on Robotics and Automation, vol. 7, no. 3, pp. 403–408, 1991.
[4]  Q. Huang, K. Tanie, and S. Sugano, “Coordinated motion planning for a mobile manipulator considering stability and manipulation,” International Journal of Robotics Research, vol. 19, no. 8, pp. 732–742, 2001.
[5]  A. Mohri, S. Furuno, and M. Yamamoto, “Trajectory planning of mobile manipulator with end-effector's specified path,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 4, pp. 2264–2269, Maui, Hawaii, USA, November 2001.
[6]  K. Tchon, J. Jakubiak, and R. Muszynski, “Regular Jacobian motion planning algorithms for mobile manipulators,” in Proceedings of the 15th IFAC World Congress, vol. 15, Barcelona, Spain, 2002.
[7]  J. Vannoy and J. Xiao, “Real-time adaptive motion planning (RAMP) of mobile manipulators in dynamic environments with unforeseen changes,” IEEE Transactions on Robotics, vol. 24, no. 5, pp. 1199–1212, 2008.
[8]  S. Ide, T. Takubo, K. Ohara, Y. Mae, and T. Arai, “Real-time trajectory planning for mobile manipulator using model predictive control with constraints,” in Proceedings of the 8th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI '11), pp. 244–249, Incheon, Republic of Korea, November 2011.
[9]  J. H. Chung, S. A. Velinsky, and R. A. Hess, “Interaction control of a redundant mobile manipulator,” International Journal of Robotics Research, vol. 17, no. 12, pp. 1302–1309, 1998.
[10]  A. Mazur, “Hybrid adaptive control laws solving a path following problem for non-holonomic mobile manipulators,” International Journal of Control, vol. 77, no. 15, pp. 1297–1306, 2004.
[11]  S. Lin and A. A. Goldenberg, “Robust damping control of mobile manipulators,” IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 32, no. 1, pp. 126–132, 2002.
[12]  M. Mailah, E. Pitowarno, and H. Jamaluddin, “Robust motion control for mobile manipulator using resolved acceleration and proportional-integral active force control,” International Journal of Advanced Robotic Systems, vol. 2, no. 2, pp. 125–134, 2005.
[13]  M. Galicki, “Control of mobile manipulators in a task space,” IEEE Transactions on Automatic Control, vol. 57, no. 11, pp. 2962–2967, 2012.
[14]  S. Hutchinson, G. D. Hager, and P. I. Corke, “A tutorial on visual servo control,” IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651–670, 1996.
[15]  F. Chaumette and S. Hutchinson, “Visual servo control. I. Basic approaches,” IEEE Robotics and Automation Magazine, vol. 13, no. 4, pp. 82–90, 2006.
[16]  F. Chaumette and S. Hutchinson, “Visual servo control. II. Advanced approaches,” IEEE Robotics and Automation Magazine, vol. 14, no. 1, pp. 109–118, 2007.
[17]  W. J. Wilson, C. C. W. Hulls, and G. S. Bell, “Relative end-effector control using cartesian position based visual servoing,” IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 684–696, 1996.
[18]  B. Thuilot, P. Martinet, L. Cordesses, and J. Gallice, “Position based visual servoing: keeping the object in the field of vision,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '02), vol. 2, pp. 1624–1629, Washington, DC, USA, May 2002.
[19]  P. I. Corke and S. A. Hutchinson, “A new partitioned approach to image-based visual servo control,” IEEE Transactions on Robotics and Automation, vol. 17, no. 4, pp. 507–515, 2001.
[20]  R. Mahony, P. Corke, and T. Hamel, “Dynamic image-based visual servo control using centroid and optic flow features,” Journal of Dynamic Systems, Measurement and Control, Transactions of the ASME, vol. 130, no. 1, Article ID 011005, 2008.
[21]  E. Malis, F. Chaumette, and S. Boudet, “2-1/2-D visual servoing,” IEEE Transactions on Robotics and Automation, vol. 15, no. 2, pp. 238–250, 1999.
[22]  E. Malis and F. Chaumette, “2-1/2-D visual servoing with respect to unknown objects through a new estimation scheme of camera displacement,” International Journal of Computer Vision, vol. 37, no. 1, pp. 79–97, 2000.
[23]  E. Malis and F. Chaumette, “Theoretical improvements in the stability analysis of a new class of model-free visual servoing methods,” IEEE Transactions on Robotics and Automation, vol. 18, no. 2, pp. 176–186, 2002.
[24]  Y. Mezouar and F. Chaumette, “Path planning in image space for robust visual servoing,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '00), vol. 3, pp. 2759–2764, April 2000.
[25]  Y. Ma, J. Kosěcká, and S. S. Sastry, “Vision guided navigation for a nonholonomic mobile robot,” IEEE Transactions on Robotics and Automation, vol. 15, no. 3, pp. 521–536, 1999.
[26]  W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, “Adaptive tracking control of a wheeled mobile robot via an uncalibrated camera system,” IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 31, no. 3, pp. 341–352, 2001.
[27]  D. Amarasinghe, G. K. I. Mann, and R. G. Gosine, “Vision-based hybrid control scheme for autonomous parking of a mobile robot,” Advanced Robotics, vol. 21, no. 8, pp. 905–930, 2007.
[28]  R. F. Vassallo, H. J. Schneebeli, and J. Santos-Victor, “Visual servoing and appearance for navigation,” Robotics and Autonomous Systems, vol. 31, no. 1, pp. 87–97, 2000.
[29]  A. de Luca, G. Oriolo, and P. R. Giordano, “Image-based visual servoing schemes for nonholonomic mobile manipulators,” Robotica, vol. 25, no. 2, Article ID 00326, pp. 131–145, 2007.
[30]  N. Mansard, O. Stasse, F. Chaumette, and K. Yokoi, “Visually-guided grasping while walking on a humanoid robot,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3041–3047, April 2007.
[31]  Y. Wang, H. Lang, and C. W. de Silva, “A hybrid visual servo controller for robust grasping by wheeled mobile robots,” IEEE/ASME Transactions on Mechatronics, vol. 15, no. 5, pp. 757–769, 2010.
[32]  Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
