ACTION RECOGNITION BASED ON SPARSE MOTION TRAJECTORIES


Jargalsaikhan I., Little S., Direkoglu C., O'Connor N. E.

20th IEEE International Conference on Image Processing (ICIP), Melbourne, Australia, 15-18 September 2013, pp. 3982-3985, (Full Text Paper)

  • Publication Type: Conference Paper / Full Text Paper
  • DOI: 10.1109/icip.2013.6738820
  • Publication City: Melbourne
  • Publication Country: Australia
  • Pages: pp. 3982-3985
  • Affiliated with Middle East Technical University Northern Cyprus Campus: No

Abstract

We present a method that extracts effective features from videos for human action recognition. The proposed method analyses the 3D volumes along the sparse motion trajectories of a set of interest points detected in the video scene. To represent human actions, we build a Bag-of-Features (BoF) model from the extracted features, and a support vector machine is then used to classify human activities. Evaluation shows that the proposed features are discriminative and computationally efficient. Our method achieves state-of-the-art performance on the standard human action recognition benchmarks, namely the KTH and Weizmann datasets.
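As a rough illustration of the Bag-of-Features step described above, the sketch below quantizes a set of local trajectory descriptors against a codebook and builds a normalized word histogram, which would then be fed to an SVM. This is a minimal, hypothetical sketch, not the paper's implementation: the codebook and descriptor values here are toy data, whereas in practice the codebook would be learned (e.g. by k-means) from training descriptors.

```python
import numpy as np

def bof_histogram(descriptors, codebook):
    """Assign each local descriptor to its nearest codeword and
    return a normalized Bag-of-Features histogram over the codebook."""
    # pairwise distances, shape (n_descriptors, n_codewords)
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dists.argmin(axis=1)                     # nearest codeword per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()                         # L1-normalize

# toy example: 4 descriptors quantized against 3 codewords in 2-D
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
descriptors = np.array([[0.1, 0.0], [0.9, 1.1], [1.0, 0.9], [5.2, 4.8]])
h = bof_histogram(descriptors, codebook)
# h -> [0.25, 0.5, 0.25]
```

In a full pipeline, one such histogram per video clip becomes the fixed-length feature vector on which the SVM classifier is trained.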