Abstract:
Visually impaired individuals often depend on assistive devices such as white canes, frequently augmented with ultrasonic sensors, for navigation. However, these devices face significant limitations, particularly in detecting specific hazards such as potholes and deep trenches on walkways. This gap in functionality increases the risk of accidents and impedes safe, independent navigation for the visually impaired. This research developed a prototype blind assistive system equipped with an array of ultrasonic sensors and a Raspberry Pi integrated with Firebase for IoT capabilities. AI models, trained on collected datasets of road images and ultrasonic sensor readings, were deployed on the Raspberry Pi. Testing in real-world scenarios was conducted to validate the prototype's effectiveness. The results showed that the AI model detected potholes with an accuracy of 93%. The prototype could detect both large and small potholes using the ultrasonic sensors and a camera, but faced challenges when potholes were filled with water or located in complex environments.
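To illustrate the sensing-and-alert pipeline the abstract describes (ultrasonic ranging on a Raspberry Pi with alerts pushed to Firebase), the sketch below shows one possible arrangement. It is not the authors' code: the GPIO pin numbers, the ground-distance threshold, and the Firebase project URL are illustrative assumptions, and the Firebase write uses the public Realtime Database REST endpoint rather than any SDK used in the paper.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# read an HC-SR04 ultrasonic sensor on a Raspberry Pi and post a
# pothole alert to a Firebase Realtime Database via its REST API.
import time
import requests
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24                      # assumed GPIO pins (BCM numbering)
GROUND_CM = 80.0                         # assumed sensor-to-ground distance
POTHOLE_MARGIN_CM = 15.0                 # extra depth suggesting a pothole
FIREBASE_URL = "https://example-project.firebaseio.com"  # placeholder URL

def measure_distance_cm():
    """Trigger the sensor and convert the echo pulse width to centimetres."""
    GPIO.output(TRIG, True)
    time.sleep(10e-6)                    # 10 microsecond trigger pulse
    GPIO.output(TRIG, False)

    start = stop = time.time()
    while GPIO.input(ECHO) == 0:         # wait for echo to go high
        start = time.time()
    while GPIO.input(ECHO) == 1:         # measure how long it stays high
        stop = time.time()
    return (stop - start) * 34300 / 2    # speed of sound ~343 m/s, round trip

def main():
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)
    try:
        while True:
            distance = measure_distance_cm()
            if distance > GROUND_CM + POTHOLE_MARGIN_CM:
                # Reading well beyond normal ground level: report a
                # possible pothole to the Firebase backend.
                requests.post(
                    f"{FIREBASE_URL}/alerts.json",
                    json={"type": "pothole",
                          "depth_cm": round(distance, 1),
                          "ts": time.time()},
                )
            time.sleep(0.2)
    finally:
        GPIO.cleanup()

if __name__ == "__main__":
    main()
```

In the described prototype this threshold check would be complemented by the camera-based AI model; the sketch only covers the ultrasonic branch of the system.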
Date of Conference: 13-14 August 2024
Date Added to IEEE Xplore: 04 September 2024