Online Dynamic Gesture Recognition for Human Robot Interaction

Dan Xu, Xinyu Wu, Yen Lun Chen*, Yangsheng Xu

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

Abstract

This paper presents an online dynamic hand gesture recognition system using an RGB-D camera, which can automatically recognize hand gestures against complicated backgrounds. For background subtraction, we use a model-based method to perform human detection and segmentation in the depth map. Since a robust hand tracking approach is crucial to the performance of hand gesture recognition, our system uses both color and depth information in the hand tracking process. To extract spatio-temporal hand gesture sequences from the trajectory, a reliable gesture spotting scheme based on detecting changes of static postures is proposed. Discrete HMMs with Left-Right Banded (LRB) topology are then utilized to model and classify gestures based on multi-feature representation and quantization of the hand gesture sequences. Experimental evaluations on two self-built databases of dynamic hand gestures show the effectiveness of the proposed system. Furthermore, we develop a human-robot interactive system whose performance is demonstrated through interactive experiments in a dynamic environment.
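To illustrate the classification scheme named in the abstract, the sketch below shows a discrete HMM with Left-Right Banded (LRB) topology, in which each state may only self-loop or advance to the next state, together with the scaled forward algorithm used to score an observation sequence against a gesture model. This is a minimal illustration, not the authors' implementation; the state count, self-transition probability, and codebook size are hypothetical.

```python
import numpy as np

def lrb_transition_matrix(n_states, self_prob=0.6):
    """Left-Right Banded transition matrix: each state either
    stays (self-loop) or advances to the next state."""
    A = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        A[i, i] = self_prob
        A[i, i + 1] = 1.0 - self_prob
    A[-1, -1] = 1.0  # last state is absorbing
    return A

def forward_log_likelihood(obs, A, B, pi):
    """Scaled forward algorithm: log P(obs | model) for a discrete HMM.
    obs: sequence of quantized feature symbols (codebook indices);
    A: transition matrix; B: emission matrix; pi: initial distribution."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    log_lik = np.log(c)
    alpha /= c  # normalize to avoid numerical underflow
    for t in range(1, len(obs)):
        alpha = (alpha @ A) * B[:, obs[t]]
        c = alpha.sum()
        log_lik += np.log(c)
        alpha /= c
    return log_lik

def classify(obs, models):
    """Pick the gesture model with the highest sequence likelihood."""
    scores = {name: forward_log_likelihood(obs, A, B, pi)
              for name, (A, B, pi) in models.items()}
    return max(scores, key=scores.get)
```

In this scheme, one HMM is trained per gesture class, and a spotted gesture sequence is assigned to the model that maximizes the forward likelihood.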

Original language: English
Pages (from-to): 583-596
Number of pages: 14
Journal: Journal of Intelligent and Robotic Systems: Theory and Applications
Volume: 77
Issue number: 3-4
DOIs
Publication status: Published - Mar 2014
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2014, Springer Science+Business Media Dordrecht.

Keywords

  • Dynamic gesture spotting
  • Hand gesture recognition
  • Human-robot interaction

