ACTION RECOGNITION BASED ON SPARSE MOTION TRAJECTORIES


Jargalsaikhan I., Little S., Direkoglu C., O'Connor N. E.

20th IEEE International Conference on Image Processing (ICIP), Melbourne, Australia, 15–18 September 2013, pp. 3982–3985. (Full Text)

  • Publication Type: Conference Paper / Full Text
  • DOI: 10.1109/icip.2013.6738820
  • City: Melbourne
  • Country: Australia
  • Page Numbers: pp. 3982–3985

Abstract

We present a method that extracts effective features from videos for human action recognition. The proposed method analyses the 3D volumes along the sparse motion trajectories of a set of interest points in the video scene. To represent human actions, we build a Bag-of-Features (BoF) model from the extracted features, and finally a support vector machine is used to classify human activities. Evaluation shows that the proposed features are discriminative and computationally efficient. Our method achieves state-of-the-art performance on the standard human action recognition benchmarks, namely the KTH and Weizmann datasets.
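The recognition pipeline described in the abstract (trajectory features → Bag-of-Features encoding → SVM classification) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the trajectory descriptor extraction is replaced by a hypothetical stand-in that returns random vectors, and the codebook size, SVM kernel, and all other parameters are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's feature extraction: each video
# yields a variable-length set of descriptors computed along its sparse
# motion trajectories. Here we simulate them with random vectors.
def extract_trajectory_descriptors(n_trajectories, dim=16):
    return rng.normal(size=(n_trajectories, dim))

videos = [extract_trajectory_descriptors(int(rng.integers(20, 40)))
          for _ in range(20)]
labels = np.array([i % 2 for i in range(20)])  # two toy action classes

# 1) Learn a visual codebook over all descriptors via k-means
#    (k=8 is an arbitrary toy choice).
codebook = KMeans(n_clusters=8, n_init=10, random_state=0).fit(np.vstack(videos))

# 2) Encode each video as a normalized histogram of codeword
#    occurrences -- the Bag-of-Features representation.
def bof_histogram(descriptors, codebook):
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()

X = np.array([bof_histogram(v, codebook) for v in videos])

# 3) Classify the per-video histograms with a support vector machine.
clf = SVC(kernel="rbf").fit(X, labels)
predictions = clf.predict(X)
```

In practice the codebook is built on training descriptors only, and the SVM kernel and codebook size are tuned by cross-validation; this sketch only shows how the three stages fit together.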