OALib Journal
ISSN: 2333-9721

Eye Tracking Based Augmented Reality Human-Computer Interaction Method

DOI: 10.12677/CSA.2019.95115, pp. 1020-1028

Keywords: Eye Tracking, Deep Learning, Augmented Reality


Abstract:

Building on an existing optical see-through head-mounted display (OST-HMD), this paper proposes a gaze-based interaction method to improve human-computer interaction (HCI) in augmented reality. Inspired by the ResNet architecture, we propose a heterogeneous nested neural network (HNN) that uses 12 convolutional layers and 3 fully connected layers to locate 55 feature points of an eye; from these points we derive the gaze vector and the pupil center, and thus the user's line of sight. Two non-infrared cameras placed near the eyes track the movement of each eye, reducing the corneal-center error to 0.66 mm and the angular error to 0.89°. Finally, we compare HNN configurations with different channel counts and compare the HNN against ResNet. Experimental results show that, relative to traditional methods and plain convolutional neural networks, the proposed HNN markedly improves eye-tracking accuracy, laying a foundation for eye-movement interaction in augmented reality systems.
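The abstract does not spell out how the 55 feature points become a pupil center, or how the reported angular error is measured. As a rough illustration only, assuming the feature points include a 2-D pupil contour, the center can be estimated as the contour's centroid, and accuracy reported as the angle between an estimated and a ground-truth gaze vector; both helper names below are hypothetical, not from the paper:

```python
import math

def pupil_center(points):
    """Centroid of 2-D pupil-contour feature points (hypothetical layout)."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def angular_error_deg(v_est, v_true):
    """Angle in degrees between an estimated and a true 3-D gaze vector."""
    dot = sum(a * b for a, b in zip(v_est, v_true))
    norm = math.sqrt(sum(a * a for a in v_est)) * math.sqrt(sum(b * b for b in v_true))
    # Clamp to guard against rounding outside acos's [-1, 1] domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Square contour -> centroid at the square's center
print(pupil_center([(0, 0), (2, 0), (2, 2), (0, 2)]))  # → (1.0, 1.0)
# Two nearly parallel gaze vectors -> sub-degree angular error
print(angular_error_deg((0, 0, 1), (0, 0.0155, 1)))
```

This is only a sketch of the evaluation geometry; the paper's actual pipeline regresses the gaze vector and pupil center directly from the HNN's feature points.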

