Please use this identifier to cite or link to this item: http://10.1.7.192:80/jspui/handle/123456789/6720
Title: Human Gesture Analysis for Action Recognition
Authors: Sonani, Kaveri
Keywords: Computer 2014
Project Report 2014
Computer Project Report
Project Report
14MCEN
14MCEN26
NT
NT 2014
CE (NT)
Issue Date: 1-Jun-2016
Publisher: Institute of Technology
Series/Report no.: 14MCEN26;
Abstract: Human gesture comprises different components of visual action, such as the motion of the hands and legs, which must be analyzed for action recognition in video surveillance. Human activity recognition identifies the different kinds of activities performed by a human, such as walking, dancing, jumping, running, and waving a hand. This master's thesis deals with human gesture analysis for action recognition using the Microsoft Kinect sensor to build a physiotherapy application. The Kinect generates a depth image from an RGB image and a human skeleton from the depth image. The generated skeleton includes twenty different joints and their 3D coordinates; the proposed work requires only twelve of these coordinates to be analyzed. In this method, the therapist records an exercise and patients are required to mimic that exercise at home. The physiotherapist can track their progress as well as recognize the patients' actions. If a patient does not perform an exercise properly, the system gives suggestions based on the angles between joints of the human skeleton. We design a codebook for each action, which contains the different key posture frames of that action. To find the match between two frames, we use the concept of star distance. We evaluate our proposed system on a large number of scenarios and analyze it with a Hidden Markov Model to recognize the action.
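The feedback step described in the abstract relies on angles computed between joints of the Kinect skeleton. A minimal sketch of that computation, assuming ordinary vector geometry on the 3D joint coordinates; the joint names, coordinate values, recorded angle, and tolerance below are illustrative assumptions, not values taken from the thesis:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the 3D joints a-b-c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm_ba = math.sqrt(sum(x * x for x in ba))
    norm_bc = math.sqrt(sum(x * x for x in bc))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    cos_angle = max(-1.0, min(1.0, dot / (norm_ba * norm_bc)))
    return math.degrees(math.acos(cos_angle))

# Hypothetical example: elbow angle from shoulder, elbow, and wrist joints.
shoulder = (0.0, 1.4, 0.0)
elbow = (0.3, 1.1, 0.0)
wrist = (0.6, 1.4, 0.0)
angle = joint_angle(shoulder, elbow, wrist)  # 90.0 degrees here

# Hypothetical feedback rule: flag the repetition when the patient's
# angle deviates from the therapist's recorded angle beyond a tolerance.
recorded_angle = 90.0
tolerance = 15.0
within_range = abs(angle - recorded_angle) <= tolerance
```

In a real pipeline the three coordinates would come from the twelve tracked skeleton joints per frame, and a rule like the one above would drive the suggestions given to the patient.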
URI: http://hdl.handle.net/123456789/6720
Appears in Collections:Dissertation, CE (NT)

Files in This Item:
File: 14MCEN26.pdf
Description: 14MCEN26
Size: 3.46 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.