Document details

A biological and real-time framework for hand gestures and head poses

Author(s): Saleiro, Mário; Farrajota, Miguel; Terzic, Kasim; Rodrigues, J. M. F.; du Buf, J. M. H.

Date: 2013

Persistent ID: http://hdl.handle.net/10400.1/3444

Origin: Sapientia - Universidade do Algarve

Project/scholarship: info:eu-repo/grantAgreement/FCT/3599-PPCDT/RIPD%2FADA%2F109690%2F2009/PT; info:eu-repo/grantAgreement/FCT/SFRH/SFRH%2FBD%2F79812%2F2011/PT; info:eu-repo/grantAgreement/FCT/SFRH/SFRH%2FBD%2F71831%2F2010/PT; info:eu-repo/grantAgreement/EC/FP7/270247/EU;

Subject(s): Hand gestures; Head pose; Biological framework


Description

Human-robot interaction is an interdisciplinary research area that aims at the development of social robots. Since social robots are expected to interact with humans and understand their behavior through gestures and body movements, cognitive psychology and robot technology must be integrated. In this paper we present a biological and real-time framework for detecting and tracking hands and heads. This framework is based on keypoints extracted by means of cortical V1 end-stopped cells. Detected keypoints and the cells’ responses are used to classify the junction type. Through the combination of annotated keypoints in a hierarchical, multi-scale tree structure, moving and deformable hands can be segregated and tracked over time. By using hand templates with lines and edges at only a few scales, a hand’s gestures can be recognized. Head tracking and pose detection are also implemented, and these can be integrated with detection of facial expressions in the future. Through combinations of head poses and hand gestures, a large number of commands can be given to a robot.
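The junction-type classification mentioned in the abstract can be illustrated with a toy example. The sketch below is not the paper's cortical machinery (which derives keypoints from V1 end-stopped cell responses); it is a minimal stand-in that classifies a pixel of a binary line drawing by counting how many line branches meet there, giving the same label vocabulary one would annotate keypoints with (endpoint, L, T, X). All function and variable names are illustrative.

```python
def classify_junction(img, r, c):
    """Classify pixel (r, c) of a binary line drawing by counting
    foreground 4-neighbours: 1 branch = endpoint, 2 = line or L
    (a fuller version would check collinearity to tell them apart),
    3 = T-junction, 4 = X-junction."""
    h, w = len(img), len(img[0])
    n = 0
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < h and 0 <= cc < w and img[rr][cc]:
            n += 1
    return {1: "endpoint", 2: "line/L", 3: "T", 4: "X"}.get(n, "isolated")

# A small cross: four line branches meet at the centre pixel.
cross = [[0, 1, 0],
         [1, 1, 1],
         [0, 1, 0]]
print(classify_junction(cross, 1, 1))  # X
print(classify_junction(cross, 0, 1))  # endpoint
```

In the paper's framework, such labels come from the relative responses of end-stopped cells at the keypoint rather than from pixel counting, and the annotated keypoints are then combined across scales into the hierarchical tree used for segregation and tracking.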

Document Type: Conference object
Language: English
Contributor(s): Sapientia
