
Please use this identifier to cite or link to this item: http://idr.iitbbs.ac.in/jspui/handle/2008/1680
Title: A novel framework of continuous human-activity recognition using Kinect
Authors: Saini R.
Kumar P.
Roy P.P.
Dogra D.P.
Keywords: BLSTM-NN
Continuous activity
Depth sensors
HMM
Kinect
Issue Date: 2018
Citation: 10
Abstract: Automatic human activity recognition is being studied widely by researchers for various applications. However, the majority of existing works are limited to the recognition of isolated activities, even though human activities are inherently continuous, with spatial and temporal transitions between segments. Therefore, there is scope to develop a robust, continuous Human Activity Recognition (HAR) system. In this paper, we present a novel coarse-to-fine framework for continuous HAR using Microsoft Kinect. The activity sequences are captured as 3D skeleton trajectories consisting of the 3D positions of 20 joints estimated from depth data. The recorded sequences are first coarsely segmented into two groups: activities performed while sitting and activities performed while standing. Next, the activities within the segmented sequences are recognized at a fine level. Activity classification in both stages is performed using a Bidirectional Long Short-Term Memory Neural Network (BLSTM-NN) classifier. A total of 1110 continuous activity sequences have been recorded using a combination of 24 isolated human activities. Recognition rates of 68.9% and 64.45% have been recorded using the BLSTM-NN classifier when tested with and without length modeling, respectively. Results for isolated activity recognition are also reported. Finally, the performance has been compared with existing approaches. © 2018 Elsevier B.V.
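
For illustration, the following is a minimal sketch of the kind of bidirectional LSTM sequence classifier the abstract describes, written in PyTorch. Only the 20-joint skeleton input and the 24 activity classes come from the abstract; the hidden size, pooling choice, and all other details are assumptions for demonstration, not the authors' published configuration.

    # Illustrative sketch only: a BLSTM classifier over Kinect skeleton
    # sequences, approximating the BLSTM-NN stage described in the abstract.
    # Hidden size and final-step pooling are assumptions, not the paper's setup.
    import torch
    import torch.nn as nn

    NUM_JOINTS = 20            # Kinect skeleton: 20 joints (from the abstract)
    FEATS = NUM_JOINTS * 3     # x, y, z coordinates per joint
    NUM_CLASSES = 24           # 24 isolated activities (from the abstract)

    class BLSTMClassifier(nn.Module):
        def __init__(self, hidden=128):
            super().__init__()
            self.blstm = nn.LSTM(FEATS, hidden, batch_first=True,
                                 bidirectional=True)
            self.fc = nn.Linear(2 * hidden, NUM_CLASSES)

        def forward(self, x):
            # x: (batch, time, FEATS) flattened 3D joint positions per frame
            out, _ = self.blstm(x)
            # Classify from the last time step's bidirectional hidden state
            return self.fc(out[:, -1, :])

    if __name__ == "__main__":
        model = BLSTMClassifier()
        clips = torch.randn(4, 90, FEATS)   # 4 dummy clips of 90 frames each
        logits = model(clips)
        print(logits.shape)                 # torch.Size([4, 24])

The same classifier shape could serve both stages described in the abstract: the coarse stage with two output classes (sitting vs. standing) and the fine stage with the per-group activity labels.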
URI: http://dx.doi.org/10.1016/j.neucom.2018.05.042
Appears in Collections: Research Publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.