In this paper, we address the continuous growth of computer technology and the increasing importance of human-computer interaction. Touchless interaction is a technology under active development, and it introduces a new way of interacting with computers through object tracking. Nowadays most mobile devices use touchscreen technology; however, touchscreens are still not affordable enough for desktop systems. Designing a touchless device such as a mouse or keyboard using a webcam and computer vision techniques can be an alternative to touchscreen technology. Recent trends in technology aim to build highly interactive and easy-to-use applications as replacements for conventional devices. One such device is the touchless mouse, which we developed on the MATLAB platform. The overall objective is to apply image processing techniques to webcam video in order to track the movement of colored markers and convert that movement into mouse movements and operations that control the system. The sub-system implemented here allows a person to control the mouse with no input other than marker movements. We use three colored fingers as three color markers (red, green, and blue) to perform the mouse activities.
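As an illustration of the color-tracking step, the following MATLAB sketch locates the centroid of a red marker in a single webcam frame. It is a minimal example, not the exact implementation of this work: the adaptor name, threshold, and blob-size values are assumptions, and the Image Acquisition and Image Processing Toolboxes are required.

    vid = videoinput('winvideo', 1);            % assumed adaptor/device ID
    set(vid, 'ReturnedColorSpace', 'rgb');
    frame = getsnapshot(vid);                   % grab one RGB frame from the webcam

    % Emphasize red by subtracting the grayscale image from the red channel
    redDiff = imsubtract(frame(:,:,1), rgb2gray(frame));
    redBW   = im2bw(redDiff, 0.18);             % binarize; threshold tuned empirically
    redBW   = bwareaopen(redBW, 300);           % discard small noisy blobs

    stats = regionprops(redBW, 'Centroid');     % connected regions of the marker
    if ~isempty(stats)
        centroid = stats(1).Centroid;           % [x y] pixel position of the red marker
    end

The green and blue markers are handled analogously by subtracting the grayscale image from the green and blue channels, respectively.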
Our first goal was to reliably track the marker colors, and the second goal was to track the marker positions in the acquired image frame in order to perform the mouse operations. The webcam captures the marker information and triggers the associated actions. We use the java.awt.Robot class to perform the mouse operations using the data acquired from the image frame.
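A minimal sketch of the cursor-control step is shown below, assuming a 640x480 frame and a marker centroid obtained as in the tracking step above. MATLAB can instantiate java.awt.Robot directly; the coordinate mapping and the click trigger are illustrative assumptions rather than the exact scheme used in the system.

    import java.awt.Robot;
    import java.awt.event.InputEvent;

    robot  = Robot;                              % Java Robot instantiated from MATLAB
    screen = get(0, 'ScreenSize');               % [left bottom width height] in pixels

    % Map the 640x480 frame coordinates to screen coordinates,
    % mirroring x so the cursor follows the hand naturally
    x = round(screen(3) * (1 - centroid(1)/640));
    y = round(screen(4) * centroid(2)/480);
    robot.mouseMove(x, y);                       % move the system cursor

    % Example left click, triggered e.g. when a second marker is also detected
    robot.mousePress(InputEvent.BUTTON1_MASK);
    robot.mouseRelease(InputEvent.BUTTON1_MASK);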