Automated sign language translation using deep learning


Bibliographic Details
Main Author: Wong, Jia Kang
Format: Final Year Project / Dissertation / Thesis
Published: 2025
Subjects:
Online Access:http://eprints.utar.edu.my/7244/1/fyp_CS_2025_WJK.pdf
http://eprints.utar.edu.my/7244/
_version_ 1854094496239714304
author Wong, Jia Kang
author_facet Wong, Jia Kang
author_sort Wong, Jia Kang
building UTAR Library
collection Institutional Repository
content_provider Universiti Tunku Abdul Rahman
content_source UTAR Institutional Repository
continent Asia
country Malaysia
description This project develops a system for automated static-gesture sign language translation using deep learning. With the growing demand for accessible communication tools, particularly for the hearing-impaired community, the need for reliable sign language translation systems is increasing. The main challenge addressed is the recognition and translation of static sign language gestures into text, a task less complex than handling dynamic gestures that involve movement. The methodology processes images of static sign language gestures using hand landmark detection with MediaPipe. The landmarks are then normalized and fed into a deep learning model, trained on the processed dataset images, to predict the corresponding sign. The model architecture consists of multiple dense layers with batch normalization and dropout to ensure robust learning. The system is integrated into a user-friendly application that offers real-time sign language translation through a webcam feed, with features such as dynamic confidence-threshold adjustment, translation history tracking, and a sign language dictionary. The results show that the system accurately recognizes and translates static sign language gestures with high confidence, as validated on the test dataset. The system is efficient, easy to use, and adaptable for future enhancements. This project demonstrates the potential of deep learning to bridge communication gaps for the hearing-impaired community and lays the groundwork for future work on dynamic sign language translation.
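The description above outlines the pipeline: MediaPipe hand landmarks are extracted from each frame, normalized, and then classified by a dense network. As a minimal sketch of the normalization step only — the record does not specify the exact scheme, so the wrist-relative translation and scaling below, and the function name `normalize_landmarks`, are assumptions for illustration:

```python
def normalize_landmarks(landmarks):
    """Normalize 2-D hand landmarks (e.g. 21 MediaPipe points as (x, y)
    tuples): translate so the wrist (index 0) sits at the origin, then
    scale so the largest absolute coordinate equals 1."""
    wx, wy = landmarks[0]
    # Make all coordinates relative to the wrist landmark.
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    # Scale by the largest magnitude (guard against a degenerate all-zero hand).
    scale = max(max(abs(x), abs(y)) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]


# Toy example with three landmarks; real MediaPipe output has 21 per hand.
points = [(0.5, 0.5), (0.7, 0.5), (0.5, 0.9)]
normalized = normalize_landmarks(points)
```

Normalization of this kind makes the classifier invariant to where the hand appears in the frame and to its apparent size, which fits the abstract's goal of robust recognition from a live webcam feed.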
format Final Year Project / Dissertation / Thesis
id my-utar-eprints.7244
institution Universiti Tunku Abdul Rahman
publishDate 2025
record_format eprints
spelling my-utar-eprints.7244 2025-12-29T10:18:29Z Automated sign language translation using deep learning Wong, Jia Kang T Technology (General) TD Environmental technology. Sanitary engineering 2025-06 Final Year Project / Dissertation / Thesis NonPeerReviewed application/pdf http://eprints.utar.edu.my/7244/1/fyp_CS_2025_WJK.pdf Wong, Jia Kang (2025) Automated sign language translation using deep learning. Final Year Project, UTAR. http://eprints.utar.edu.my/7244/
spellingShingle T Technology (General)
TD Environmental technology. Sanitary engineering
Wong, Jia Kang
Automated sign language translation using deep learning
title Automated sign language translation using deep learning
title_full Automated sign language translation using deep learning
title_fullStr Automated sign language translation using deep learning
title_full_unstemmed Automated sign language translation using deep learning
title_short Automated sign language translation using deep learning
title_sort automated sign language translation using deep learning
topic T Technology (General)
TD Environmental technology. Sanitary engineering
url http://eprints.utar.edu.my/7244/1/fyp_CS_2025_WJK.pdf
http://eprints.utar.edu.my/7244/
url_provider http://eprints.utar.edu.my