Automated hand gesture recognition for enhancing sign language communication
| Main Author: | |
|---|---|
| Format: | Final Year Project / Dissertation / Thesis |
| Published: | 2024 |
| Subjects: | |
| Online Access: | http://eprints.utar.edu.my/6525/1/20ACB02030_FYP.pdf http://eprints.utar.edu.my/6525/ |
| Summary: | This paper introduces a novel approach to enhancing communication between individuals who are deaf or hard of hearing and those unfamiliar with sign language. The project addresses this challenge by developing a mobile application that harnesses the smartphone camera, coupled with a deep learning model, to interpret hand gestures and provide real-time contextual information to users. It emphasizes the widespread adoption of smartphones and the practical applicability of mobile applications in real-life scenarios. The paper also proposes a methodology leveraging Google's MediaPipe, which outperformed traditional approaches such as transfer learning with pre-trained object detection models during deep learning model development. Of particular importance is the seamless integration of the deep learning model with the mobile application, enabling real-time, on-device detection and recognition. |
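
The summary describes a MediaPipe-based pipeline for gesture recognition. As background for readers unfamiliar with that approach, the sketch below shows one common way to turn MediaPipe Hands output (21 `(x, y, z)` landmarks per detected hand) into a feature vector for a gesture classifier. This is a minimal illustration, not the thesis's actual code; the function name and the wrist-relative normalization are assumptions.

```python
# Hedged sketch: MediaPipe Hands reports 21 landmarks per hand as
# normalized (x, y, z) coordinates. A gesture classifier typically
# consumes them as a flat vector, made translation-invariant by
# subtracting the wrist position (landmark index 0).

def landmarks_to_feature_vector(landmarks):
    """Flatten 21 (x, y, z) landmarks into a 63-dim wrist-relative vector.

    `landmarks` is a sequence of 21 (x, y, z) tuples; landmark 0 is the
    wrist in MediaPipe's hand landmark model.
    """
    wx, wy, wz = landmarks[0]
    vec = []
    for x, y, z in landmarks:
        vec.extend((x - wx, y - wy, z - wz))
    return vec

# Dummy landmarks in lieu of a real camera frame: the wrist at the image
# center and 20 points spread slightly to its right.
dummy = [(0.5, 0.5, 0.0)] + [(0.5 + i * 0.01, 0.4, 0.0) for i in range(20)]
features = landmarks_to_feature_vector(dummy)  # len(features) == 63
```

In a real application the landmarks would come from `mediapipe`'s hand-landmark solution per camera frame, and the resulting vector would be fed to the trained classifier before rendering the recognized gesture in the UI.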
