Motion estimation is a low-level vision task that is especially relevant because of its wide range of real-world applications. Many of the best motion estimation algorithms incorporate features found in the mammalian visual system, but these features demand considerable computational resources and are therefore rarely available in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) to produce a mid-level vision abstraction layer. The results are described through experiments that show the validity of the proposed system and include an analysis of the computational resources and performance of the applied algorithms.
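As a rough illustration of how two such low-level primitives can be fused into a compact mid-level descriptor, the following Python sketch pairs a basic Lucas-Kanade flow estimate with plain (variant, non-normalised) geometric moments computed over the flow magnitude. This is not the VLSI design described in the paper; the function names, the window size, and the use of raw rather than orthogonal variant moments are illustrative assumptions.

```python
# Illustrative sketch only (not the authors' VLSI pipeline): a basic
# Lucas-Kanade flow estimate is summarised by raw geometric moments of the
# flow magnitude. All names here are hypothetical.
import numpy as np

def lucas_kanade_flow(prev, curr, win=7):
    """Dense Lucas-Kanade flow between two grayscale float frames."""
    Ix = np.gradient(curr, axis=1)
    Iy = np.gradient(curr, axis=0)
    It = curr - prev
    half = win // 2
    h, w = curr.shape
    flow = np.zeros((h, w, 2))
    for y in range(half, h - half):
        for x in range(half, w - half):
            ix = Ix[y - half:y + half + 1, x - half:x + half + 1].ravel()
            iy = Iy[y - half:y + half + 1, x - half:x + half + 1].ravel()
            it = It[y - half:y + half + 1, x - half:x + half + 1].ravel()
            A = np.stack([ix, iy], axis=1)        # local gradient matrix
            ATA = A.T @ A
            if np.linalg.det(ATA) > 1e-6:         # skip flat / ill-conditioned patches
                flow[y, x] = np.linalg.solve(ATA, A.T @ (-it))
    return flow

def raw_moments(img, max_order=2):
    """Raw (variant) geometric moments m_pq of an image, up to max_order."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return {(p, q): float(np.sum((xs ** p) * (ys ** q) * img))
            for p in range(max_order + 1)
            for q in range(max_order + 1) if p + q <= max_order}

if __name__ == "__main__":
    ys, xs = np.mgrid[0:64, 0:64]
    frame0 = np.exp(-((xs - 30) ** 2 + (ys - 32) ** 2) / 50.0)  # smooth blob
    frame1 = np.exp(-((xs - 32) ** 2 + (ys - 32) ** 2) / 50.0)  # blob shifted 2 px in x
    v = lucas_kanade_flow(frame0, frame1)
    speed = np.linalg.norm(v, axis=2)             # flow magnitude (low-level primitive)
    m = raw_moments(speed)
    m00 = max(m[(0, 0)], 1e-12)                   # guard against an all-zero flow field
    cx, cy = m[(1, 0)] / m00, m[(0, 1)] / m00     # centroid of the moving region
    print(f"motion centroid: ({cx:.1f}, {cy:.1f})")
```

In this toy setting the moments simply locate the moving region; in the proposed system the moment-based features play the analogous role of a mid-level abstraction layer built on top of the optical-flow output.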