The Influence of Speed and Position in Dynamic Gesture Recognition for Human-Robot Interaction
Overview
published in
- Journal of Sensors
publication date
- February 2019
volume
- 2019
International Standard Serial Number (ISSN)
- 1687-725X
Electronic International Standard Serial Number (EISSN)
- 1687-7268
abstract
- Human communication relies on several aspects beyond speech. One of them is gestures, which express intentions, interests, feelings, or ideas and complement speech. Social robots need to interpret these messages to allow a more natural Human-Robot Interaction. In this sense, our aim is to study the effect of position and speed features on dynamic gesture recognition. We use 3D information to extract the user's skeleton and calculate the normalized positions of all its joints; from the temporal variation of these positions, we calculate their speeds. Our three datasets comprise 1355 samples from 30 users and cover 14 gestures common in HRI that involve upper body movements. A set of classification techniques is evaluated on these three datasets to determine which features perform best. Results indicate that the union of speed and position achieves the best results among the three possibilities, with an F-score of 0.999. The best-performing combination for detecting dynamic gestures in real time is finally integrated into our social robot with a simple HRI application, as a proof-of-concept test of how the proposal behaves in a realistic scenario.
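The feature pipeline the abstract describes (joint positions normalized for invariance to the user's location and size, and speeds derived from their temporal variation) can be illustrated with a minimal sketch. This is not the authors' code: the reference joint, frame rate, joint count, and array shapes below are illustrative assumptions, with NumPy used for the finite differences.

```python
import numpy as np

def normalize_positions(skeleton, ref_joint=0):
    """Express 3D joint positions relative to a reference joint (assumed
    here to be the torso, index 0) and scale by the skeleton's extent,
    so the features do not depend on where the user stands or their size."""
    centered = skeleton - skeleton[:, ref_joint:ref_joint + 1, :]
    scale = np.linalg.norm(centered, axis=-1).max()
    return centered / max(scale, 1e-8)

def joint_speeds(positions, fps=30.0):
    """Approximate per-joint speed as the frame-to-frame displacement of
    the normalized positions divided by the frame interval."""
    deltas = np.diff(positions, axis=0)           # (T-1, J, 3)
    return np.linalg.norm(deltas, axis=-1) * fps  # (T-1, J)

# Toy example: a gesture clip of T frames with J tracked joints.
T, J = 60, 15
clip = np.random.rand(T, J, 3)                    # stand-in for sensor data
pos = normalize_positions(clip)
spd = joint_speeds(pos)

# Position-only, speed-only, or combined per-frame feature vectors,
# mirroring the three feature sets compared in the paper.
combined = np.concatenate([pos[:-1].reshape(T - 1, -1), spd], axis=1)
print(combined.shape)                             # (59, J*3 + J)
```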
Classification
subjects
- Robotics and Industrial Informatics
keywords
- dynamic gesture recognition; social robots; human-robot interaction