1.5.1 Human motion encoders in action recognition
This problem has, in fact, been extensively studied in the field of human action recognition. For instance, Ren et al. [297] employed the silhouette of a dancer, extracting local features from it to represent his/her performance and control animated human characters. Wang et al. [372] obtained the contour of a walker from his/her silhouette to represent the walking motion. A spatio‐temporal silhouette representation, the silhouette energy image (SEI), and variability action models were used by Ahmad et al. [19] to represent and classify human actions. In both visual‐based and non‐visual‐based human action recognition, differential features such as velocity and acceleration, motion statistics and their spectra, together with a variety of clustering and smoothing methods, have been used to identify motion types. A two‐stage dynamic model was established by Kristan et al. [182] to track the centre of gravity of subjects in images. Velocity was employed as one of the features by Yoon et al. [390] to represent hand movement for the purpose of classification. Further, Panahandeh et al. [272] collected acceleration and rotation data from an inertial measurement unit (IMU) mounted on a pedestrian's chest and classified the activities with a continuous hidden Markov model. Ito [154] estimated human walking motion by monitoring the acceleration of the subject with 3D acceleration sensors. Moreover, angular features, especially the joint angle and angular velocity, have been used to monitor and reconstruct articulated rigid body models corresponding to action states and types. Zhang et al. [397] fused various raw data into the angular velocity and orientation of the upper arm to estimate its motion. Donno et al. [97] collected angle and angular velocity data from a goniometer to monitor the motion of human joints. Angle features were also utilised by Gu et al. [129] to recognise human motions for instructing a robot. Amft et al. [26] detected feeding phases by constructing a hidden Markov model using the angle of lower‐arm rotation as a feature. Apart from the above, only a few studies have considered trajectory shape features such as curvature and torsion. For example, Zhou et al. [401] extracted the trajectories of the upper limb and classified its motion by computing the similarity between these trajectories.
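To make the trajectory shape features mentioned above concrete, the sketch below estimates curvature and torsion along a sampled 3D trajectory from the standard differential‐geometry formulas, κ = |r′ × r″|/|r′|³ and τ = (r′ × r″)·r‴/|r′ × r″|². This is only an illustrative minimal example using finite differences; it is not the specific method of Zhou et al. [401] or any other cited work, and the function name `curvature_torsion` is introduced here purely for illustration.

```python
import numpy as np

def curvature_torsion(points, dt=1.0):
    """Estimate curvature and torsion along a sampled 3D trajectory.

    points : (N, 3) array of positions sampled at a fixed interval dt.
    Returns two length-N arrays (kappa, tau). Finite differences are
    noise-sensitive, so real motion-capture or IMU-derived trajectories
    would normally be smoothed before applying this.
    """
    r1 = np.gradient(points, dt, axis=0)   # first derivative  r'
    r2 = np.gradient(r1, dt, axis=0)       # second derivative r''
    r3 = np.gradient(r2, dt, axis=0)       # third derivative  r'''

    cross12 = np.cross(r1, r2)                     # r' x r''
    speed = np.linalg.norm(r1, axis=1)             # |r'|
    cross_norm = np.linalg.norm(cross12, axis=1)   # |r' x r''|

    eps = 1e-12  # guard against division by zero on straight segments
    kappa = cross_norm / np.maximum(speed**3, eps)
    tau = np.einsum('ij,ij->i', cross12, r3) / np.maximum(cross_norm**2, eps)
    return kappa, tau

# Sanity check: a circular helix has constant curvature a/(a^2+b^2)
# and torsion b/(a^2+b^2); with a=1, b=0.5 these are 0.8 and 0.4.
t = np.linspace(0, 4 * np.pi, 400)
helix = np.column_stack([np.cos(t), np.sin(t), 0.5 * t])
kappa, tau = curvature_torsion(helix, dt=t[1] - t[0])
print(kappa[200], tau[200])   # approximately 0.8 and 0.4
```

Because curvature and torsion are invariant to rigid-body rotation and translation of the trajectory, profiles of this kind can be compared across repetitions of a movement regardless of where or in which orientation it was performed, which is what makes them attractive as motion descriptors.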