Immersive Robot Control in Virtual Reality to Command Robots in Space Missions

DOI: 10.4236/jsea.2018.117021, PP. 341-347

Keywords: Human Robot Interaction, Virtual Reality, Exoskeleton

Abstract:

We present an approach that allows a single operator to remotely control a semi-autonomous robot team under low-bandwidth conditions. Our approach uses virtual reality and autonomous robots to create an immersive user interface for multi-robot control. This saves a significant amount of bandwidth because there is no need to transfer a constant stream of camera images: the virtual environment used for control only has to be transferred to the control station once and is updated only when the map becomes outdated. In addition, the camera position can easily be changed in virtual reality to give the operator a better overview of the robots' situation. The components of this approach can readily be transferred to applications on Earth, e.g. semi-autonomous robots in hazardous areas or underwater applications.
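The bandwidth-saving idea can be illustrated with a minimal sketch: the operator's VR station keeps a cached copy of the environment model and receives only compact robot pose updates, requesting a new map transfer only when the robots report that the cached map is outdated. The sketch below is a simplified illustration under assumed names and message formats (VRControlStation, receive_pose_update, the hash-based staleness check), not the authors' implementation.

import hashlib
import json


class VRControlStation:
    """Operator-side state: cached environment model plus latest robot poses."""

    def __init__(self):
        self.map_data = None     # full environment model, transferred once
        self.map_hash = None     # fingerprint used to detect a stale map
        self.robot_poses = {}    # robot_id -> (x, y, z, yaw)

    def receive_map(self, map_data: bytes):
        # Transferred once (or again when outdated); this is the only large payload.
        self.map_data = map_data
        self.map_hash = hashlib.sha256(map_data).hexdigest()

    def receive_pose_update(self, message: str):
        # A pose update is a few dozen bytes, versus a continuous camera stream.
        update = json.loads(message)
        self.robot_poses[update["robot_id"]] = tuple(update["pose"])

    def map_is_current(self, remote_map_hash: str) -> bool:
        # The robot team advertises the hash of its current map; the station
        # requests a re-transfer only when the hashes differ.
        return self.map_hash == remote_map_hash


# Usage: one simulated exchange between the robot team and the control station.
station = VRControlStation()
station.receive_map(b"<serialized 3D map of the work site>")
station.receive_pose_update(json.dumps(
    {"robot_id": "rover_1", "pose": [12.4, -3.1, 0.0, 1.57]}))

if not station.map_is_current("hash-of-updated-map-from-the-robots"):
    print("Map outdated, requesting re-transfer of the environment model.")

In this scheme the steady-state traffic consists of small pose and status messages, while the environment model is transmitted only on the rare occasions when the map changes, which is what makes single-operator control feasible over a low-bandwidth space link.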

