Indoor navigation for visually impaired

Bibliographic Details
Main Author: Ng, Wei Yu
Format: Final Year Project / Dissertation / Thesis
Published: 2025
Subjects: T Technology (General)
Online Access:http://eprints.utar.edu.my/7216/1/fyp_CS_2025_NWY.pdf
http://eprints.utar.edu.my/7216/
author Ng, Wei Yu
building UTAR Library
collection Institutional Repository
content_provider Universiti Tunku Abdul Rahman
content_source UTAR Institutional Repository
continent Asia
country Malaysia
description The increasing prevalence of visual impairments globally has highlighted the importance of effective assistive technologies to enable independent navigation for the visually impaired in indoor environments. Traditional indoor navigation systems often suffer from high costs, signal interference, and reliance on pre-installed infrastructure, making them inaccessible to many. This project proposes a cost-effective and infrastructure-free indoor navigation system that leverages a smartphone's camera and advanced computer vision. The system's core is a novel localization approach that uses a DINOv2 vision transformer to generate robust semantic embeddings from real-time images. For superior accuracy, initial embedding comparisons are verified by a Vision-Language Model (VLM), which provides contextual understanding of the scene. To handle dynamic environments, the system integrates YOLOv8-seg and Stable Diffusion 2.0 to detect and remove people from the camera feed, ensuring reliable performance. Implemented as a Flutter mobile application with a high-performance FastAPI and Supabase backend, the system provides users with step-by-step guidance along pre-recorded visual routes, delivering clear instructions via audio feedback and a hands-free voice assistant. By integrating state-of-the-art deep learning models into an accessible platform, this project addresses the limitations of existing solutions and contributes meaningfully to advancing assistive technology for the visually impaired community.
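The embedding-based localization described above can be sketched as a nearest-neighbour search over pre-recorded route keyframes. This is a minimal illustration, not the project's code: the tiny 4-dimensional vectors stand in for DINOv2 image embeddings (ViT-S/14 produces 384-dimensional ones), and the gallery labels, threshold value, and function names are assumptions for the example.

```python
import numpy as np

def localize(query_emb, gallery_embs, gallery_labels, threshold=0.6):
    """Return the best-matching keyframe label and its cosine similarity,
    or (None, score) when no keyframe clears the threshold."""
    # L2-normalise so a dot product equals cosine similarity
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q
    best = int(np.argmax(sims))
    if sims[best] < threshold:
        return None, float(sims[best])
    return gallery_labels[best], float(sims[best])

# Toy gallery of three pre-recorded keyframe embeddings (dim 4 for brevity;
# real DINOv2 embeddings would be 384-dim and come from route recording)
gallery = np.array([[1.0, 0.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.0]])
labels = ["lobby", "corridor", "lift"]
label, score = localize(np.array([0.9, 0.1, 0.0, 0.0]), gallery, labels)
```

In the described system, a match found this way would then be passed to the VLM for contextual verification rather than trusted outright.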
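The person-removal step can likewise be sketched with plain arrays. The abstract says YOLOv8-seg detects people and Stable Diffusion 2.0 inpaints them; as a simplified stand-in this sketch only blanks the masked pixels, assuming boolean per-person masks have already been produced by the segmentation model. All names here are illustrative.

```python
import numpy as np

def mask_out_people(frame, person_masks):
    """Blank all pixels covered by any person mask.

    Simplified stand-in for inpainting: the described system fills these
    regions with Stable Diffusion 2.0 so dynamic content (people) cannot
    influence the localization embedding; here we just zero them out.
    """
    combined = np.zeros(frame.shape[:2], dtype=bool)
    for m in person_masks:  # boolean HxW masks, e.g. from YOLOv8-seg
        combined |= m
    cleaned = frame.copy()
    cleaned[combined] = 0
    return cleaned

# A white 4x4 frame with a pretend person in the centre 2x2 region
frame = np.full((4, 4, 3), 255, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
cleaned = mask_out_people(frame, [mask])
```

Blanking (rather than inpainting) is enough to demonstrate the data flow; a real implementation would pass the masked frame to an inpainting model before computing embeddings.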
format Final Year Project / Dissertation / Thesis
id my-utar-eprints.7216
institution Universiti Tunku Abdul Rahman
publishDate 2025
record_format eprints
spelling my-utar-eprints.7216 2025-12-29T08:02:34Z Indoor navigation for visually impaired Ng, Wei Yu T Technology (General) 2025-06 Final Year Project / Dissertation / Thesis NonPeerReviewed application/pdf http://eprints.utar.edu.my/7216/1/fyp_CS_2025_NWY.pdf Ng, Wei Yu (2025) Indoor navigation for visually impaired. Final Year Project, UTAR. http://eprints.utar.edu.my/7216/
title Indoor navigation for visually impaired
topic T Technology (General)
url_provider http://eprints.utar.edu.my