Visdom: Smart guide robot for visually impaired people


Bibliographic Details
Main Author: Lee, Zhen Ting
Format: Final Year Project / Dissertation / Thesis
Published: 2025
Subjects: T Technology (General)
Online Access:http://eprints.utar.edu.my/6171/1/fyp_CS_2025_LZT.pdf
http://eprints.utar.edu.my/6171/
Building: UTAR Library
Collection: Institutional Repository
Content Provider: Universiti Tunku Abdul Rahman
Content Source: UTAR Institutional Repository
Continent: Asia
Country: Malaysia
Description: This report details the development of an autonomous guide robot system for visually impaired individuals using the ROS 2 framework and the Wheeltec R550 Robot platform. Key features include the SMAC Hybrid A* algorithm for global path planning, the Regulated Pure Pursuit Controller for local trajectory execution and dynamic obstacle avoidance, and gmapping for SLAM functionality. The system architecture integrates ROS 2 on a Raspberry Pi, with TCP/IP connectivity enabling remote operation. An Android mobile application, developed using Java and the java.net.Socket library, provides an intuitive and accessible user interface for seamless interaction with the robot. The app incorporates TalkBack accessibility support, enabling visually impaired users to navigate the interface using screen reader feedback. Once a destination is selected, the robot guides the user using a combination of voice prompts (e.g., "Heading to kitchen", "Be careful to your left") and physical feedback through a mounted guiding stick. The project has demonstrated successful results in mapping and navigation within static indoor environments. Core functionalities such as path planning, autonomous movement, voice feedback, and app-to-robot communication have been thoroughly tested and optimized. The integration of the mobile interface and real-time voice prompting significantly enhances user accessibility and confidence. This system represents a significant advancement in assistive technology, offering improved mobility and independence for visually impaired individuals. The project showcases the practical application of autonomous navigation technologies in real-world scenarios, highlighting the potential of robotics and AI in improving quality of life for people with visual impairments.
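The description names java.net.Socket as the basis for the app-to-robot TCP/IP link but does not specify a message format. The following is a minimal sketch of such a link, assuming a hypothetical line-based "GOTO &lt;room&gt;" command and "ACK &lt;room&gt;" reply; the report's actual protocol may differ. A loopback stand-in for the robot side is included so the sketch is self-contained.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class GuideRobotClient {

    // Open a TCP connection, send one destination command, and return the
    // robot's single-line reply. Command/reply format is an assumption.
    static String sendDestination(String host, int port, String room) throws Exception {
        try (Socket sock = new Socket(host, port);
             PrintWriter out = new PrintWriter(sock.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(sock.getInputStream()))) {
            out.println("GOTO " + room);   // hypothetical command format
            return in.readLine();          // hypothetical reply, e.g. "ACK kitchen"
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the ROS 2 side: accept one connection and echo an ACK.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();
            Thread robot = new Thread(() -> {
                try (Socket conn = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(conn.getInputStream()));
                     PrintWriter out = new PrintWriter(conn.getOutputStream(), true)) {
                    String line = in.readLine();             // "GOTO kitchen"
                    out.println("ACK " + line.substring(5)); // strip "GOTO " prefix
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            robot.start();
            System.out.println(sendDestination("127.0.0.1", port, "kitchen"));
            robot.join();
        }
    }
}
```

On the robot side, the received destination would be handed to the Nav2 stack (SMAC Hybrid A* for the global plan, Regulated Pure Pursuit for local control) rather than echoed back as here.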
Record ID: my-utar-eprints.6171
Institution: Universiti Tunku Abdul Rahman
Record Format: eprints
Date: 2025-01
Status: NonPeerReviewed
File: application/pdf, http://eprints.utar.edu.my/6171/1/fyp_CS_2025_LZT.pdf
Citation: Lee, Zhen Ting (2025) Visdom: Smart guide robot for visually impaired people. Final Year Project, UTAR. http://eprints.utar.edu.my/6171/
Topic: T Technology (General)
URL Provider: http://eprints.utar.edu.my