ERU - 2016
Permanent URI for this collection: http://192.248.9.226/handle/123/19525
Browsing ERU - 2016 by Subject "Action detection"
Now showing 1 - 1 of 1
- item: Conference-Abstract: Human action detection using space-time interest points (Engineering Research Unit, Faculty of Engineering, University of Moratuwa, 2016-04) Sriashalya, S; Ramanan, A; Jayasekara, AGBP; Amarasinghe, YWR
  The bag-of-features (BoF) approach for human action classification uses spatio-temporal features to assign the visual words of a codebook. The space-time interest points (STIP) feature detector captures the temporal extent of features, allowing fast and slow movements to be distinguished. This study compares the relative performance of action classification on KTH videos using the STIP feature detector combined with histograms of gradient orientations (HOG) and histograms of optical flow (HOF) descriptors. The extracted descriptors are clustered using the K-means algorithm, and the resulting feature sets are classified with two classifiers: nearest neighbour (NN) and support vector machine (SVM). In addition, the study compares action-specific and global codebooks within the BoF framework. Furthermore, less discriminative visual words are removed from the initially constructed codebook using a likelihood ratio measure, yielding a compact codebook. Test results show that STIP with HOF performs better than with HOG descriptors, and that a simple linear SVM outperforms the NN classifier. Action-specific codebooks, when merged, perform better than a globally constructed codebook for action classification on videos.
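
  As a rough illustration of the pipeline the abstract describes, the following is a minimal Python sketch of a BoF action classifier: pooled spatio-temporal descriptors are clustered with K-means into a codebook, each video is encoded as a visual-word histogram, and the histograms are classified with a linear SVM. It assumes descriptor extraction has already been done by a STIP detector; the random arrays, the 162-dimensional descriptor size, and the scikit-learn components are placeholder assumptions, not the authors' actual data or implementation.

  ```python
  # Sketch of a bag-of-features (BoF) action-classification pipeline:
  # K-means codebook over STIP-style descriptors, histogram encoding,
  # and a linear SVM classifier. Placeholder data, not KTH videos.
  import numpy as np
  from sklearn.cluster import KMeans
  from sklearn.svm import LinearSVC

  def build_codebook(descriptors, k=500, seed=0):
      """Cluster pooled training descriptors into k visual words."""
      return KMeans(n_clusters=k, random_state=seed, n_init=10).fit(descriptors)

  def encode(video_descriptors, codebook):
      """Represent one video as a normalised histogram of visual-word counts."""
      words = codebook.predict(video_descriptors)
      hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
      return hist / max(hist.sum(), 1.0)

  # Toy usage: 20 "videos", each with 200 descriptors of 162 dimensions
  # (assumed 72-D HOG + 90-D HOF), and 6 action labels standing in for
  # the KTH classes.
  rng = np.random.default_rng(0)
  train_videos = [rng.normal(size=(200, 162)) for _ in range(20)]
  train_labels = np.arange(20) % 6
  test_videos = [rng.normal(size=(200, 162)) for _ in range(5)]

  codebook = build_codebook(np.vstack(train_videos), k=100)
  X_train = np.array([encode(v, codebook) for v in train_videos])
  X_test = np.array([encode(v, codebook) for v in test_videos])

  clf = LinearSVC().fit(X_train, train_labels)
  print(clf.predict(X_test))
  ```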