We demonstrate a method to encode complex human gestures acquired from inertial sensors for activity recognition. Gestures are encoded as a stream of symbols which represent the change in orientation and displacement of the body limbs over time. The first novelty of this encoding is that it enables the reuse of previously developed single-channel template matching algorithms even when multiple sensors are used simultaneously. The second novelty is that it encodes changes in the orientation of limbs, which is important in some activities, such as sport analytics. We demonstrate the method using our custom inertial platform, BlueSense. Using a set of five BlueSense nodes, we implemented a motion tracking system that displays a 3D human model and shows the corresponding movement encoding in real time.
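The idea of turning multi-sensor motion into a single symbol stream can be illustrated with a minimal sketch. Note that this is a hypothetical simplification, not the paper's actual encoding: it quantizes successive orientation changes per channel into symbols and interleaves the per-sensor streams so a single-channel template matcher could process them. The function names, threshold, and symbol alphabet are illustrative assumptions.

```python
# Hypothetical sketch of symbol encoding; NOT the paper's actual scheme.

def encode_deltas(angles, threshold=5.0):
    """Map successive orientation changes (in degrees) to symbols:
    '+' = increase, '-' = decrease, '0' = roughly unchanged."""
    symbols = []
    for prev, curr in zip(angles, angles[1:]):
        delta = curr - prev
        if delta > threshold:
            symbols.append('+')
        elif delta < -threshold:
            symbols.append('-')
        else:
            symbols.append('0')
    return symbols

def merge_channels(channels):
    """Interleave per-sensor symbol streams into one stream, so an
    existing single-channel template matcher can consume data from
    multiple sensors without modification."""
    return [s for step in zip(*channels) for s in step]

# Example: pitch angles from two limb sensors over four time steps.
pitch_a = [0.0, 10.0, 10.0, 2.0]
pitch_b = [0.0, -8.0, -8.0, -8.0]
stream = merge_channels([encode_deltas(pitch_a), encode_deltas(pitch_b)])
```

A template gesture would then be matched against `stream` with any standard single-channel string- or template-matching algorithm.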
History
Publication status
Published
File Version
Accepted version
Journal
EWSN ’18 Proceedings of the 2018 International Conference on Embedded Wireless Systems and Networks