Assistive robotic applications require systems capable of interacting in the human world, a workspace that is highly dynamic and not always predictable. Mobile assistive devices face the additional, complex problem of deciding whether and when intervention should occur; before any trajectory assistance is given, the robotic device must therefore know where it is in real time, without unnecessary disruption or delay to the user. In this paper, we demonstrate a novel, robust method for determining room identity from floor features within a real-time computational frame, for autonomous and assistive robotics in the human environment. We utilize two inexpensive sensors: an optical mouse sensor for straightforward, rapid texture and pattern sampling, and a four-channel color photodiode light sensor for fast color determination. We show that floor texture and color data obtained with these two sensors in typical dynamic human environments compare favorably with data obtained from a standard webcam. Suitable data can be extracted from the two sensors 16 times faster than from a standard webcam, and these data are in a form that can be processed rapidly using readily available classification techniques, making them suitable for real-time system application. We achieved 95% correct classification accuracy in identifying the flooring of 133 rooms drawn from 35 flooring classes, suitable for fast, coarse global room localization, boundary-crossing detection, and, additionally, some degree of surface type identification.
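The classification stage described above can be illustrated with a minimal sketch. This is not the authors' implementation: the feature layout (four photodiode color channels plus two texture statistics from the mouse sensor), the room labels, and all numeric values below are hypothetical, and a simple k-nearest-neighbour vote stands in for whichever readily available classifier is used.

```python
# Illustrative sketch: nearest-neighbour room classification from a
# combined floor feature vector. Assumed (hypothetical) layout: four
# photodiode colour channels (R, G, B, clear) followed by two texture
# statistics derived from optical mouse sensor readings.
import math

def classify_room(sample, training_set, k=3):
    """Return the majority room label among the k nearest training vectors."""
    # Euclidean distance from the sample to every labelled training vector
    dists = sorted(
        (math.dist(sample, vec), label) for vec, label in training_set
    )
    # Majority vote over the k closest neighbours
    votes = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical training data: (feature vector, room label)
training = [
    ([0.82, 0.64, 0.48, 0.91, 12.0, 3.1], "kitchen"),   # glossy tile
    ([0.80, 0.66, 0.50, 0.89, 11.5, 2.8], "kitchen"),
    ([0.35, 0.30, 0.22, 0.40, 45.0, 9.6], "office"),    # dark carpet
    ([0.33, 0.28, 0.21, 0.38, 47.2, 10.1], "office"),
    ([0.60, 0.45, 0.30, 0.66, 30.0, 6.0], "hallway"),   # wood laminate
    ([0.58, 0.47, 0.31, 0.64, 29.1, 5.7], "hallway"),
]

print(classify_room([0.81, 0.65, 0.49, 0.90, 11.8, 3.0], training))  # -> kitchen
```

Because the feature vectors are short (six values here), each classification is a handful of distance computations, which is consistent with the real-time constraint the method targets.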