Research

 

Human Motion Analysis for Physical Exercises Assessment in Kinaesthetic Rehabilitation


This work is part of the European project KERAAL, in which we develop a robot coach system for assisting patients suffering from low-back pain in their rehabilitation. We propose a probabilistic approach based on a Gaussian Mixture Model learned on a Riemannian space that combines both the position and the orientation of joints acquired from physiotherapist demonstrations. This model is then used to assess patients' movements at multiple levels, considering different body parts and temporal segments. The approach is currently being evaluated in a real clinical study in which patients use our system for part of their in-hospital rehabilitation.
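
As a rough illustration of the assessment pipeline, the sketch below fits one Gaussian Mixture Model per body part on feature vectors extracted from the demonstrations and scores a patient's movement by mean log-likelihood per body part and temporal segment. It is a simplified, hypothetical sketch: it assumes the joint orientations have already been mapped to a Euclidean tangent space, and the part slices, segment bounds, and number of components are illustrative values, not those used in KERAAL.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def fit_reference_models(demo_frames, part_slices, n_components=5):
        """Fit one GMM per body part on the corresponding feature columns.

        demo_frames: (n_frames, n_features) array of physiotherapist poses,
        e.g. concatenated joint positions and tangent-space orientations.
        part_slices: dict mapping a body-part name to its feature columns.
        """
        return {part: GaussianMixture(n_components=n_components,
                                      covariance_type="full")
                      .fit(demo_frames[:, cols])
                for part, cols in part_slices.items()}

    def assess(models, patient_frames, part_slices, segments):
        """Mean log-likelihood for every (body part, temporal segment) pair;
        higher scores mean the patient stays closer to the demonstrations."""
        scores = {}
        for part, cols in part_slices.items():
            for seg_name, (t0, t1) in segments.items():
                scores[(part, seg_name)] = models[part].score(
                    patient_frames[t0:t1, cols])
        return scores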
  

Motion Unit Decomposition of Skeleton Sequences


In this work, we extend our previous analysis of human action recognition by segmenting a motion stream into a set of relevant temporal intervals called "Motion Units". To that end, we investigate the deformation of the human pose along the sequence so as to identify temporal key points where the deformation between local poses is low. These key points correspond to transitions between two consecutive motion units. Such a decomposition allows us to understand a motion sequence more accurately, as demonstrated by experiments in several contexts such as action segmentation and recognition.
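
To make the idea concrete, here is a minimal sketch of one possible decomposition: the per-frame pose deformation is measured as the mean joint displacement between consecutive frames, smoothed, and its local minima below a threshold are taken as the key points separating motion units. The window size and threshold are hypothetical parameters chosen for illustration, not the criterion used in the paper.

    import numpy as np

    def motion_units(poses, window=5, threshold=None):
        """Split a skeleton sequence into motion units.

        poses: (n_frames, n_joints, 3) array of joint positions.
        Returns a list of (start, end) frame intervals.
        """
        # Per-frame deformation: mean joint displacement between frames.
        disp = np.linalg.norm(np.diff(poses, axis=0), axis=2).mean(axis=1)
        # Moving-average smoothing to ignore frame-to-frame jitter.
        kernel = np.ones(window) / window
        smooth = np.convolve(disp, kernel, mode="same")
        if threshold is None:
            threshold = smooth.mean()
        # Key points: local minima of the deformation signal below threshold.
        keys = [t for t in range(1, len(smooth) - 1)
                if smooth[t] < smooth[t - 1]
                and smooth[t] <= smooth[t + 1]
                and smooth[t] < threshold]
        bounds = [0] + keys + [len(poses) - 1]
        return [(bounds[i], bounds[i + 1]) for i in range(len(bounds) - 1)]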
  

Human Action Recognition by Shape Analysis of Motion Trajectories


Recognizing human actions in a 3D video sequence is an important open problem at the heart of many research domains. We propose a new framework that allows compact representation, quick comparison, and accurate recognition of human actions in video sequences from depth sensors. We consider the human motion trajectory and analyze its shape within a Riemannian manifold. The resulting metric, representing the shape similarity between two motion trajectories, is employed for classification. Experiments on several action datasets show high performance, above 87% recognition accuracy.
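
One common way to realize such a shape comparison is the square-root velocity representation of a trajectory; the sketch below uses it as a stand-in for the manifold-based metric and classifies a query action with a nearest-neighbour rule. It is an assumption-laden illustration: the actual framework may use a different shape representation, and this simplified version ignores rotation alignment and time warping and assumes trajectories of equal length.

    import numpy as np

    def srvf(traj):
        """Square-root velocity representation of a motion trajectory.

        traj: (n_frames, d) array, e.g. a flattened skeleton per frame.
        """
        vel = np.gradient(traj, axis=0)
        speed = np.linalg.norm(vel, axis=1, keepdims=True)
        return vel / np.sqrt(np.maximum(speed, 1e-8))

    def shape_distance(traj_a, traj_b):
        """L2 distance between SRVFs, a proxy for shape dissimilarity."""
        qa, qb = srvf(traj_a), srvf(traj_b)
        return np.linalg.norm(qa - qb) / np.sqrt(len(qa))

    def classify(query, gallery):
        """1-nearest-neighbour action recognition with the shape metric.

        gallery: list of (trajectory, label) pairs, each trajectory of the
        same length as the query.
        """
        return min(gallery, key=lambda item: shape_distance(query, item[0]))[1]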