A 3D gyroscope measures angular velocities around its three intrinsic orthogonal axes, enabling angular orientation estimation. Because the measured angular velocities represent simultaneous rotations, it is not appropriate to treat them sequentially: rotations are not commutative in general, and each possible rotation sequence yields a different angular orientation, none of which is the correct result of the simultaneous rotation. However, every angular orientation can be represented by a single rotation. This paper presents an analytic derivation of the axis and angle of the single rotation equivalent to three simultaneous rotations around orthogonal axes when the measured angular velocities, or their proportions, are approximately constant. Based on the resulting expressions, a vector called the simultaneous orthogonal rotations angle (SORA) is defined, whose components equal the angles of the three simultaneous rotations around the coordinate-system axes. The orientation and magnitude of this vector equal the equivalent single-rotation axis and angle, respectively. As long as the orientation of the actual rotation axis is constant, the SORA allows the angular orientation of a rigid body to be computed in a single step, avoiding the iterative infinitesimal-rotation approximation. Test measurements confirm the validity of the SORA concept. SORA is simple and well suited for real-time calculation of angular orientation from gyroscope angular-velocity measurements. Moreover, given its demonstrated simplicity, SORA can also be used in general angular-orientation notation.
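As a minimal illustration of the idea described above, the sketch below forms the SORA vector from three angular velocities assumed constant over an interval, takes its direction as the equivalent single-rotation axis and its magnitude as the rotation angle, and builds the corresponding rotation matrix via Rodrigues' rotation formula. The function name, tolerance, and NumPy dependency are illustrative choices, not part of the paper.

```python
import numpy as np

def sora_rotation_matrix(omega, dt):
    """Rotation matrix from the SORA vector.

    omega : angular velocities (rad/s) about the sensor's x, y, z axes,
            assumed constant over the interval dt (hypothetical example).
    The SORA vector's direction gives the equivalent single-rotation axis,
    and its magnitude gives the rotation angle.
    """
    phi = np.asarray(omega, dtype=float) * dt   # SORA vector (rotation angles)
    angle = np.linalg.norm(phi)                 # equivalent single-rotation angle
    if angle < 1e-12:                           # negligible rotation
        return np.eye(3)
    x, y, z = phi / angle                       # unit rotation axis
    K = np.array([[0.0, -z,  y],
                  [ z, 0.0, -x],
                  [-y,  x, 0.0]])               # cross-product (skew) matrix
    # Rodrigues' rotation formula: R = I + sin(a) K + (1 - cos(a)) K^2
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

# Example: rotation about z alone at pi/2 rad/s for 1 s rotates x onto y.
R = sora_rotation_matrix([0.0, 0.0, np.pi / 2], 1.0)
print(R @ np.array([1.0, 0.0, 0.0]))
```

Because the angles enter through a single axis-angle pair, the orientation is obtained in one step rather than by composing many infinitesimal rotations, which is the computational advantage the abstract highlights.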