Sensors  2013 

Frame Synchronization of High-Speed Vision Sensors with Respect to Temporally Encoded Illumination in Highly Dynamic Environments

DOI: 10.3390/s130404102

Keywords: robot vision, vision chip, camera synchronization, visual feedback control, phase-locked loop, signal normalization, quadrature detection, intelligent coding, Manchester encoding


Abstract:

The authors propose a Manchester-encoding-inspired illumination modulation strategy to index vision frames that are temporally aligned to an LED reference signal. Based on signal normalization, the Manchester-encoded reference signal carries temporal information as a serial bit stream and can therefore timestamp each output vision frame. Both simulation and experimental results show satisfactory robustness against disturbances such as dynamic targets, fluctuating optical intensity, and unfixed cameras. The 1,000 Hz vision sensor is locked to 500 Hz temporally modulated LED illumination with a jitter of only 24 μs. This result is believed to be applicable to low-cost wireless vision sensor networks.
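The sketch below illustrates the general idea of Manchester-encoding a frame index into an LED on/off sequence and recovering it again; it is not the authors' implementation. The bit convention (0 as on-to-off, 1 as off-to-on), the 8-bit index width, and the assumption that each half-bit spans roughly one camera frame at 500 Hz modulation and 1,000 fps are illustrative choices only.

```python
def manchester_encode(frame_index: int, n_bits: int = 8) -> list[int]:
    """Return LED states (1 = on, 0 = off) at half-bit resolution.

    Assumption for illustration: with 500 Hz modulation and a 1,000 fps
    camera, each half-bit spans roughly one camera frame, so the decoded
    bits can serve as a timestamp for the output vision frames.
    """
    bits = [(frame_index >> (n_bits - 1 - i)) & 1 for i in range(n_bits)]
    states = []
    for b in bits:
        # The mid-bit transition carries both clock and data.
        states += [1, 0] if b == 0 else [0, 1]
    return states


def manchester_decode(states: list[int]) -> int:
    """Recover the frame index from consecutive half-bit LED samples."""
    value = 0
    for first, second in zip(states[0::2], states[1::2]):
        assert first != second, "invalid Manchester pair (no transition)"
        value = (value << 1) | (1 if (first, second) == (0, 1) else 0)
    return value


if __name__ == "__main__":
    sequence = manchester_encode(0x5A)
    assert manchester_decode(sequence) == 0x5A
    print(sequence)
```

Because every encoded bit contains a transition, the same sequence also provides the edges that a phase-locked loop can lock onto, which is consistent with the synchronization role described in the abstract.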

