In this paper, a global-state-space visual servoing scheme is proposed for uncalibrated, model-independent robotic manipulation. The scheme combines robust Kalman filtering (KF) with Elman neural network (ENN) learning. The global mapping between the vision space and the robotic workspace is learned by an ENN, and this learned mapping is shown to be an approximate estimate of the Jacobian over the global space. In the testing phase, a robust KF refines the ENN estimate to obtain the desired Jacobian, so that the robot converges precisely to the desired pose. Meanwhile, the ENN weights are updated (re-trained) with new input-output data pairs obtained from the KF cycle to ensure globally stable manipulation. The method therefore requires neither camera nor model parameters and avoids the performance degradation caused by camera-calibration and modeling errors. The performance of the proposed scheme is demonstrated through simulations and experiments on a six-degree-of-freedom robotic manipulator in an eye-in-hand configuration.
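To make the workflow concrete, the following is a minimal sketch (not the authors' implementation) of how an online Kalman-filter refinement of an image Jacobian can be combined with a learned initial estimate in an uncalibrated visual servoing loop. The dimensions, noise covariances, and the placeholder `enn_predict_jacobian` function are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Sketch of KF-based image-Jacobian refinement for uncalibrated visual servoing.
# Assumptions: m image-feature coordinates, n joints; the Jacobian J (m x n)
# is flattened into the KF state x = vec(J).

m, n = 2, 3  # toy dimensions: 2 image coordinates, 3 joints

def enn_predict_jacobian(q):
    """Placeholder for the Elman-network estimate of the Jacobian at joint vector q."""
    return np.eye(m, n)

q = np.zeros(n)
x = enn_predict_jacobian(q).ravel()   # KF state: vec(J), initialized from the learned mapping
P = np.eye(m * n)                     # state covariance
Q = 1e-4 * np.eye(m * n)              # process-noise covariance (random-walk model)
R = 1e-2 * np.eye(m)                  # measurement-noise covariance

def kf_update(x, P, dq, ds):
    """One KF cycle for the measurement model ds = J dq, i.e. ds = H x with H = kron(I, dq^T)."""
    H = np.kron(np.eye(m), dq.reshape(1, -1))   # m x (m*n) observation matrix
    P_pred = P + Q                              # prediction step
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (ds - H @ x)                # correct vec(J) with the innovation
    P_new = (np.eye(m * n) - K @ H) @ P_pred
    return x_new, P_new

# Servo loop sketch: drive image features s toward the desired s_star.
s, s_star = np.array([0.5, -0.3]), np.zeros(m)
lam, dt = 0.5, 0.1
for _ in range(100):
    J = x.reshape(m, n)
    dq = -lam * np.linalg.pinv(J) @ (s - s_star) * dt   # classical IBVS-style control law
    ds = J @ dq                                         # stand-in for the measured feature change
    x, P = kf_update(x, P, dq, ds)                      # refine the Jacobian estimate online
    q, s = q + dq, s + ds
    # The pair (dq, ds) together with the refined Jacobian could also be used
    # to re-train the ENN, as the abstract describes.
```

In this sketch the learned network only supplies the initial Jacobian; the KF then tracks it from observed joint and feature increments, which mirrors the paper's idea of refining a learned global estimate at run time.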