Modeling of human upper body for sign language recognition
| Main Authors: | , , , |
| --- | --- |
| Format: | Conference or Workshop Item |
| Language: | English |
| Published: | 2011 |
| Subjects: | |
| Online Access: | http://irep.iium.edu.my/32510/1/modeling_human_upper_body.pdf http://irep.iium.edu.my/32510/ http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6138561 |
Summary: Sign Language Recognition (SLR) systems must classify not only the hand motion trajectory but also facial features, the Human Upper Body (HUB), and the position of the hands relative to other HUB parts. The head, face, forehead, shoulders, and chest are crucial parts that carry much of the positioning information of hand gestures used in gesture classification. As its main contribution, this paper introduces a fast and robust search algorithm for HUB parts, based on head size, suited to real-time implementation. Scaling of the extracted parts under changes in body orientation is achieved using a partial estimate of face size, and the extracted parts are tracked in both front and side views using CAMSHIFT [24]. The resulting performance makes the system applicable to real-time applications such as SLR systems.
Keywords: Human upper body detection
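
The pipeline the abstract describes (locate HUB parts relative to the detected head size, scale them from a face-size estimate, and track the extracted parts with CAMSHIFT) can be illustrated with OpenCV. The sketch below is not the authors' implementation: it substitutes OpenCV's Haar cascade face detector for the paper's head-size-based search, and the region ratios in `hub_regions_from_face` are hypothetical placeholders rather than values from the paper.

```python
# Minimal sketch, assuming a webcam feed: derive candidate Human Upper Body
# (HUB) regions from a detected face box and track one of them with CAMSHIFT.
# All scale factors below are illustrative assumptions, not values from the paper.
import cv2
import numpy as np

# Haar cascade face detector as a stand-in for the paper's head-size-based search.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def hub_regions_from_face(face, frame_shape):
    """Estimate HUB part boxes (forehead, shoulders, chest) from the face box."""
    x, y, w, h = (int(v) for v in face)
    H, W = frame_shape[:2]

    def clip(box):
        bx, by, bw, bh = box
        bx, by = max(bx, 0), max(by, 0)
        return bx, by, max(1, min(bw, W - bx)), max(1, min(bh, H - by))

    return {
        "forehead":  clip((x, y, w, int(0.3 * h))),
        "shoulders": clip((x - w, y + h, 3 * w, int(0.6 * h))),
        "chest":     clip((x - w // 2, y + int(1.6 * h), 2 * w, int(1.5 * h))),
    }


def make_hue_histogram(frame, box):
    """Hue histogram of a region, used as the CAMSHIFT back-projection model."""
    x, y, w, h = box
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
    hist = cv2.calcHist([hsv], [0], mask, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist


cap = cv2.VideoCapture(0)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track_window, hist = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if track_window is None:
        # Initialization: find a face, derive HUB regions, pick one to track.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.1, 5)
        if len(faces) > 0:
            regions = hub_regions_from_face(faces[0], frame.shape)
            track_window = regions["chest"]  # track the chest region as an example
            hist = make_hue_histogram(frame, track_window)
    else:
        # Tracking: back-project the hue model and update the window with CAMSHIFT.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        pts = cv2.boxPoints(rot_rect).astype(np.int32)
        cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow("HUB tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In a fuller SLR front end, a separate color model would presumably be built for each extracted part, and the region boxes would be rescaled whenever the face-size estimate changes with body orientation, as the abstract indicates.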