In the current digital age, the adoption of natural interfaces between humans and machines is increasingly important. This trend is particularly significant in the education sector, where interactive tools and applications can ease the presentation and comprehension of complex concepts, stimulate collaborative work, and improve teaching practices. As an important step towards this vision, interactive whiteboards are gaining widespread adoption at various levels of education. Nevertheless, these solutions are usually expensive, which slows their acceptance, especially in countries with more fragile economies. In this context, we present the Low-Cost Interactive Whiteboard (LoCoBoard) project, an open-source interactive whiteboard built from low-cost hardware that is commonly available in our daily lives and easy to install: a webcam-equipped computer, a video projector, and an infrared pointing device. The detection software framework offers five different pointer-location algorithms, supports the Tangible User Interface Object (TUIO) protocol, and runs on multiple operating systems. We discuss the physical and logical structure of LoCoBoard in detail and compare its performance with that of similar systems. We believe that the proposed solution may represent a valuable contribution to easing access to interactive whiteboards and increasing their widespread use, with obvious benefits.

1. Introduction

Over the past decades, the evolution of computing power has been remarkable, but human-computer interaction (HCI) still relies, in most cases, on the traditional keyboard and mouse. The future of HCI should rely more on natural interfaces, such as haptics, speech, and gestures. In particular, the use of natural interfaces through Interactive Whiteboards (IWs) in education environments can ease the presentation and comprehension of complex concepts, allow collaborative work between teachers and students, and improve pedagogical practices. Although there is a wide range of commercial IW solutions, they are generally expensive and difficult for a large number of education institutions to afford and deploy. The main goal of this project is to develop an open-source, software-based IW solution based on hardware commonly accessible in our daily lives, that is, a video projector, a laptop with a webcam, and an Infra-Red (IR) pointing device. The LoCoBoard [1] prototype uses computer vision algorithms to process the images captured through the webcam and to interpret the user interactions. Each developed algorithm tries to
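As a rough illustration of this kind of vision-based detection (a minimal sketch, not the LoCoBoard implementation itself), the following C++ program uses OpenCV [19] to isolate the bright IR spot in each captured frame and report the centroid of the resulting blob; the camera index and the brightness threshold are assumptions chosen for illustration.

    #include <opencv2/opencv.hpp>
    #include <iostream>

    int main() {
        cv::VideoCapture cap(0);            // assumed: default webcam at index 0
        if (!cap.isOpened()) {
            std::cerr << "no camera found\n";
            return 1;
        }

        const double kThreshold = 230.0;    // assumed brightness cut-off for the IR spot
        cv::Mat frame, gray, mask;

        while (cap.read(frame)) {
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            // With an IR-pass filter on the lens, the pointer appears as the
            // brightest region in the image; keep only near-saturated pixels.
            cv::threshold(gray, mask, kThreshold, 255, cv::THRESH_BINARY);

            // Centroid of the bright blob via image moments.
            cv::Moments m = cv::moments(mask, /*binaryImage=*/true);
            if (m.m00 > 0) {
                std::cout << "pointer at ("
                          << m.m10 / m.m00 << ", "
                          << m.m01 / m.m00 << ")\n";
            }
        }
        return 0;
    }

In a complete system, the detected coordinates would then be mapped from camera space to screen space after a calibration step and forwarded to applications, for instance as TUIO events transported over OSC [18, 20].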
References
[1] C. Soares, LoCoBoard: Quadro Interactivo de Baixo Custo recorrendo a Algoritmos de Visão por Computador [M.S. thesis], Universidade Fernando Pessoa, Porto, Portugal, 2009.
[2] Y. Benezeth, P. M. Jodoin, B. Emile, H. Laurent, and C. Rosenberger, “Comparative study of background subtraction algorithms,” Journal of Electronic Imaging, vol. 19, no. 3, 2010.
[3] T. Bovermann, R. Bencina, E. Costanza, and M. Kaltenbrunner, “TUIO: a protocol for table-top tangible user interfaces,” in Proceedings of the 6th International Workshop on Gesture in Human-Computer Interaction and Simulation (GW '05), 2005.
[4] R. Chang, F. Wang, and P. You, “A survey on the development of multi-touch technology,” in Proceedings of the 2010 Asia-Pacific Conference on Wearable Computing Systems (APWCS '10), pp. 363–366, IEEE Computer Society, Washington, DC, USA, April 2010.
[5] D. Glover and D. Miller, “Running with technology: the pedagogic impact of the large-scale introduction of interactive whiteboards in one secondary school,” Journal of Information Technology for Teacher Education, vol. 10, no. 3, pp. 257–278, 2001.
[6] J. Walny, B. Lee, P. Johns, N. H. Riche, and S. Carpendale, “Understanding pen and touch interaction for data exploration on interactive whiteboards,” IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 12, pp. 2779–2788, 2012.
[7] NUI Group, Multi-Touch Technologies, 1st edition, 2009.
[8] J. Teichert, M. Herrlich, B. Walther-Franks et al., “Advancing large interactive surfaces for use in the real world,” Advances in Human-Computer Interaction, vol. 2010, Article ID 657937, 10 pages, 2010.
[9] P. I. S. Lei and A. K. Y. Wong, “The multiple-touch user interface revolution,” IT Professional, vol. 11, no. 1, pp. 42–49, 2009.
[10] A. D. Wilson, “PlayAnywhere: a compact interactive tabletop projection-vision system,” in Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05), pp. 83–92, ACM, New York, NY, USA, 2005.
[11] A. Agarwal, S. Izadi, M. Chandraker, and A. Blake, “High precision multi-touch sensing on surfaces using overhead cameras,” in Proceedings of the 2nd Annual IEEE International Workshop on Horizontal Interactive Human-Computer Systems (Tabletop '07), pp. 197–200, IEEE Computer Society, Washington, DC, USA, October 2007.
[12] J. C. Lee, “Hacking the Nintendo Wii Remote,” IEEE Pervasive Computing, vol. 7, no. 3, pp. 39–45, 2008.
[13] U. Schmidt, Wiimote Whiteboard, 2008, http://www.uweschmidt.org/wiimote-whiteboard.
[14] T. Vajk, P. Coulton, W. Bamford, and R. Edwards, “Using a mobile phone as a ‘Wii-like’ controller for playing games on a large public display,” International Journal of Computer Games Technology, vol. 2008, Article ID 539078, 6 pages, 2008.
[15] NUI Group, Touchlib: A Multi-Touch Development Kit, 2008, http://nuigroup.com/touchlib/.
[16] NUI Group, Community Core Vision (CCV), 2010, http://ccv.nuigroup.com/.
[17] M. Kaltenbrunner and R. Bencina, “reacTIVision: a computer-vision framework for table-based tangible interaction,” in Proceedings of the 1st International Conference on Tangible and Embedded Interaction (TEI '07), pp. 69–74, ACM, New York, NY, USA, February 2007.
[18] M. Wright, A. Freed, and A. Momeni, “OpenSound Control: state of the art 2003,” in Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME '03), pp. 153–160, National University of Singapore, Singapore, 2003.
[19] G. Bradski and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library, O'Reilly Media, 1st edition, 2008.
[20] R. Bencina, “oscpack: a simple C++ OSC packet manipulation library,” 2006, http://www.rossbencina.com/code/oscpack.
[21] R. Haralick and L. Shapiro, Computer and Robot Vision, vol. 1, Addison Wesley, 1992.