Humanoid Robot Head Design Based on Uncanny Valley and FACS

DOI: 10.1155/2014/208924


Abstract:

Emotional robots have long been a focus of artificial intelligence (AI), and intelligent control of robot facial expressions is a hot research topic. This paper focuses on the design of a humanoid robot head, carried out in three steps. The first step is to address the uncanny valley for humanoid robots, that is, to find the relationship between human beings and robots so that the valley can be avoided. The second step is to establish the association between the human face and the robot head: by comparing human beings and robots, we analyze their similarities and differences and, using the Facial Action Coding System (FACS), explore the common basis and mechanisms that guide us in achieving humanoid expressions. On the basis of the previous two steps, the third step is to construct the robot head. Through a series of experiments we test the robot head, which can show several humanoid expressions, and through human-robot interaction we find that people are surprised by the robot head's expressions and feel happy.

1. Introduction

Robots play an increasingly important role in our life and work, assisting and even replacing humans in more and more fields. With the development of society, traditional robots can no longer meet human needs; humanoid robots [1–3] have emerged, and people expect them to be ever more humanlike. There are many famous humanoid robots, for instance, Kismet and Leonardo made at MIT in the USA, ROMAN made in Germany, and WE-robot and SAYA made in Japan, all of which can surprise people and evoke special feelings, yet none has gained a firm place in people's awareness. This paper divides these humanoid robots into two categories. The first is like a beautiful flower vase: people find it amazing and only like to watch its actions, for instance, smiling or anger; these robots are mostly made in Japan and have a striking appearance. The second is like a toy: people find it interesting to play with such robots, which can show some facial expressions; these robots are usually made in the USA and Europe. In order to avoid these shortcomings and simplify the whole design, this paper aims to build a humanoid robot head instead of a whole body, one that not only has a good appearance but can also show some expressions. The paper is organized as follows: in Section 1 we introduce two kinds of humanoid robots. In Section 2 we introduce the concept of the uncanny valley. In Section 3 we discuss the Facial Action Coding System (FACS). In Section 4 we design the hardware and software of the robot head and build it according to the previous design.
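To make the role of FACS more concrete, the sketch below shows one simple way an Action Unit (AU) description of an expression could be translated into actuator targets for a robot head. This is only an illustrative assumption, not the implementation described in the paper: the AU combinations follow common FACS/EMFACS descriptions of prototypical expressions, while all servo names and angles are hypothetical.

```python
# Illustrative sketch: mapping FACS Action Units (AUs) to hypothetical servo
# positions for a robot head. AU combinations follow common FACS/EMFACS
# descriptions of prototypical expressions; servo names and angles are
# invented for this example and do not come from the paper.

# Prototypical expressions as combinations of FACS Action Units (simplified).
EXPRESSION_AUS = {
    "happiness": [6, 12],        # cheek raiser + lip corner puller
    "surprise":  [1, 2, 5, 26],  # brow raisers + upper lid raiser + jaw drop
    "sadness":   [1, 4, 15],     # inner brow raiser + brow lowerer + lip corner depressor
    "anger":     [4, 5, 23],     # brow lowerer + upper lid raiser + lip tightener
}

# Hypothetical mapping from an AU to the servos that realize it and their
# target angles in degrees (neutral pose assumed to be 90 degrees everywhere).
AU_TO_SERVOS = {
    1:  {"brow_inner": 110},
    2:  {"brow_outer": 115},
    4:  {"brow_inner": 70},
    5:  {"eyelid_upper": 120},
    6:  {"cheek": 105},
    12: {"mouth_corner_left": 120, "mouth_corner_right": 120},
    15: {"mouth_corner_left": 70, "mouth_corner_right": 70},
    23: {"lip_press": 100},
    26: {"jaw": 130},
}

def servo_targets(expression: str) -> dict:
    """Combine the servo targets of all AUs that make up a given expression."""
    targets = {}
    for au in EXPRESSION_AUS[expression]:
        targets.update(AU_TO_SERVOS.get(au, {}))
    return targets

if __name__ == "__main__":
    for expr in EXPRESSION_AUS:
        print(expr, servo_targets(expr))
```

In an actual head, each AU would instead map onto the specific actuators and linkages of the mechanical design discussed in Section 4, and the target values would be tuned on the physical face rather than fixed by hand.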

References

[1]  M. A. Goodrich and A. C. Schultz, “Human-robot interaction: a survey,” Foundations and Trends in Human-Computer Interaction, vol. 1, no. 3, pp. 203–275, 2007.
[2]  H. Yan, M. H. Ang Jr., and A. N. Poo, “A survey on perception methods for human-robot interaction in social robots,” International Journal of Social Robotics, vol. 6, no. 1, pp. 85–119, 2014.
[3]  T. Nomura, “Comparison on negative attitude toward robots and related factors between Japan and the UK,” in Proceedings of the 5th ACM International Conference on Collaboration Across Boundaries: Culture, Distance & Technology (CABS '14), pp. 87–90, ACM, Kyoto, Japan, August 2014.
[4]  M. Mori, K. F. MacDorman, and N. Kageki, “The uncanny valley,” IEEE Robotics and Automation Magazine, vol. 19, no. 2, pp. 98–100, 2012.
[5]  H. Brenton, M. Gillies, D. Ballin et al., “The uncanny valley: does it exist?” in Proceedings of the Conference on Human Computer Interaction, Workshop on Human Animated Character Interaction, 2005.
[6]  P. Ekman and W. V. Friesen, Manual for the Facial Action Coding System, Consulting Psychologists Press, 1978.
[7]  P. Ekman and W. V. Friesen, Facial Action Coding System: Investigator's Guide, Consulting Psychologists Press, 1978.
[8]  J. Yan, Z. Wang, and S. Zheng, “Cognitive emotion research of humanoid expression robot,” in Foundations and Practical Applications of Cognitive Systems and Information Processing, pp. 413–425, Springer, Berlin, Germany, 2014.
[9]  F. De la Torre and J. F. Cohn, “Facial expression analysis,” in Visual Analysis of Humans, pp. 377–409, Springer, London, UK, 2011.
[10]  Singular Inversions, FaceGen Modeller (Version 3.3) [computer software], Toronto, Canada, 2008.
[11]  N. Endo, S. Momoki, M. Zecca et al., “Development of whole-body emotion expression humanoid robot,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '08), pp. 2140–2145, Pasadena, Calif, USA, May 2008.
[12]  K. Malchus, P. Stenneken, P. Jaecks, C. Meyer, O. Damm, and B. Wrede, “The role of emotional congruence in human-robot interaction,” in Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI '13), pp. 191–192, IEEE, Tokyo, Japan, March 2013.
[13]  R. C. Arkin and L. Moshkina, Affect in Human-Robot Interaction, Georgia Institute of Technology, Atlanta, Ga, USA, 2014.
[14]  T. C. Wee, “Mechanical design and optimal control of humanoid robot (TPinokio),” The Journal of Engineering, vol. 1, no. 1, 2014.
[15]  R. Zhu, J. Ren, Z. Chen et al., “Design and development of mechanical structure and control system for tracked trailing mobile robot,” TELKOMNIKA Indonesian Journal of Electrical Engineering, vol. 11, no. 2, pp. 694–703, 2013.
[16]  S. M. Sajadi, S. H. Mahdioun, and A. A. Ghavifekr, “Design of mechanical structure and tracking control system for 5 DOF surgical robot,” in Proceedings of the 21st Iranian Conference on Electrical Engineering (ICEE '13), pp. 1–6, Mashhad, Iran, May 2013.
