Sensors, 2011

A Multi-Sensorial Hybrid Control for Robotic Manipulation in Human-Robot Workspaces

DOI: 10.3390/s111009839

Keywords: direct visual servo, human-robot collaboration, visual servoing, tactile control


Abstract:

Autonomous manipulation in semi-structured environments where human operators can interact with the robots is an increasingly common task in robotic applications. This paper describes an intelligent multi-sensorial approach that addresses this problem by providing a multi-robot platform with a high degree of autonomy and the capability to perform complex tasks. The proposed sensorial system is composed of a hybrid visual servo control that efficiently guides the robot towards the object to be manipulated, an inertial motion capture system and an indoor localization system that prevent collisions between human operators and robots working in the same workspace, and a tactile sensing algorithm for correct manipulation of the object. The proposed controller employs the complete multi-sensorial system, combining the measurements of each sensor across the two phases of the robot task: a first phase in which the robot approaches the object to be grasped, and a second phase in which the object is manipulated. In both phases, the unexpected presence of humans is taken into account. The paper also presents results from several experimental setups that verify the validity of the proposed approach.
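
The abstract does not reproduce the control laws, but as a rough illustration of how such a two-phase scheme might be organized, the following Python sketch combines a standard image-based visual servoing velocity command (v = -lambda * L^+ * e) for the approach phase with a simple tactile force regulation for the manipulation phase, and scales both commands down when the human-tracking subsystem reports a nearby operator. Every class, function, gain, and threshold here is hypothetical and is not taken from the paper.

# Illustrative sketch only (not the authors' implementation): a two-phase
# controller that uses visual servoing while approaching the object and
# tactile feedback while manipulating it, slowing down near human operators.
from dataclasses import dataclass
import numpy as np


@dataclass
class SensorReadings:
    image_error: np.ndarray      # visual-feature error vector (e.g., 2N values for N image points)
    contact_forces: np.ndarray   # fingertip forces reported by the tactile sensors (N)
    human_distance: float        # closest human-robot distance from the mocap/localization system (m)
    grasped: bool                # whether the tactile system reports a stable grasp


class TwoPhaseHybridController:
    """Hypothetical two-phase controller combining visual and tactile feedback."""

    def __init__(self, lam=0.5, safe_distance=1.0, desired_force=2.0):
        self.lam = lam                      # visual-servo gain (assumed value)
        self.safe_distance = safe_distance  # metres; below this the robot is slowed
        self.desired_force = desired_force  # target fingertip force in newtons (assumed)

    def approach_velocity(self, s: SensorReadings, interaction_matrix_pinv: np.ndarray):
        # Classical image-based visual servoing law: v = -lambda * L^+ * e.
        return -self.lam * interaction_matrix_pinv @ s.image_error

    def manipulation_command(self, s: SensorReadings):
        # Simple force regulation on the fingertips: increase pressure when the
        # measured contact force is below the target, release when above it.
        return 0.1 * (self.desired_force - s.contact_forces)

    def step(self, s: SensorReadings, interaction_matrix_pinv: np.ndarray):
        # Human-safety override: scale the command down as the operator approaches.
        scale = min(1.0, s.human_distance / self.safe_distance)
        if not s.grasped:
            return "approach", scale * self.approach_velocity(s, interaction_matrix_pinv)
        return "manipulate", scale * self.manipulation_command(s)


if __name__ == "__main__":
    controller = TwoPhaseHybridController()
    readings = SensorReadings(
        image_error=np.array([5.0, -3.0, 2.0, 1.0]),
        contact_forces=np.array([1.5, 1.8, 2.2]),
        human_distance=0.6,
        grasped=False,
    )
    L_pinv = np.eye(6, 4) * 0.01  # placeholder pseudo-inverse of the interaction matrix
    phase, command = controller.step(readings, L_pinv)
    print(phase, command)

In this sketch the phase switch is driven by a single boolean grasp flag for simplicity; the controller described in the abstract instead combines the measurements of the whole multi-sensorial system throughout both phases.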

