
An AI-Based Visual Aid With Integrated Reading Assistant for the Completely Blind


Abstract:

Blindness prevents a person from gaining knowledge of the surrounding environment and makes unassisted navigation, object recognition, obstacle avoidance, and reading tasks a major challenge. In this work, we propose a novel visual aid system for the completely blind. Because of its low cost, compact size, and ease of integration, a Raspberry Pi 3 Model B+ is used to demonstrate the functionality of the proposed prototype. The design incorporates a camera and sensors for obstacle avoidance and advanced image processing algorithms for object detection. The distance between the user and an obstacle is measured by the camera as well as ultrasonic sensors. The system includes an integrated reading assistant, in the form of an image-to-text converter followed by auditory feedback. The entire setup is lightweight and portable and can be mounted onto a regular pair of eyeglasses without additional cost or complexity. Experiments are carried out with 60 completely blind individuals to evaluate the performance of the proposed device with respect to the traditional white cane. The evaluations are performed in controlled environments that mimic real-world scenarios encountered by a blind person. Results show that, compared with the white cane, the proposed device enables greater accessibility, comfort, and ease of navigation for the visually impaired.
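The paper's implementation details are not reproduced on this page; the sketch below is only an illustration of the kind of pipeline the abstract describes, combining ultrasonic ranging for obstacle warnings with an image-to-text reading assistant and auditory feedback on a Raspberry Pi. The GPIO pin assignments, the 1 m warning threshold, and the library choices (RPi.GPIO, pytesseract, pyttsx3, Pillow) are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the pipeline described in the abstract.
# Pin numbers, thresholds, and libraries are illustrative assumptions.

import time
import RPi.GPIO as GPIO          # ultrasonic sensor I/O on the Raspberry Pi
import pytesseract               # OCR wrapper (Tesseract assumed installed)
import pyttsx3                   # offline text-to-speech for auditory feedback
from PIL import Image

TRIG, ECHO = 23, 24              # example GPIO pins for an HC-SR04-style sensor

def setup():
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)

def distance_cm():
    """Estimate obstacle distance from the ultrasonic echo time."""
    GPIO.output(TRIG, True)
    time.sleep(1e-5)             # 10 microsecond trigger pulse
    GPIO.output(TRIG, False)
    start = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    stop = time.time()
    while GPIO.input(ECHO) == 1:
        stop = time.time()
    # speed of sound ~34300 cm/s; halve for the round trip
    return (stop - start) * 34300 / 2

def speak(text):
    """Deliver auditory feedback to the user."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

def read_aloud(image_path):
    """Reading assistant: convert a captured image of text to speech."""
    text = pytesseract.image_to_string(Image.open(image_path))
    speak(text if text.strip() else "No readable text found")

if __name__ == "__main__":
    setup()
    try:
        d = distance_cm()
        if d < 100:              # assumed warning threshold of 1 m
            speak(f"Obstacle ahead at {d:.0f} centimetres")
        read_aloud("capture.jpg")  # image assumed captured beforehand by the camera module
    finally:
        GPIO.cleanup()
```

In a deployed device the ranging loop and the reading assistant would run continuously rather than once, and the camera-based distance estimate mentioned in the abstract would be fused with the ultrasonic reading; those details are not specified on this page.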
Published in: IEEE Transactions on Human-Machine Systems ( Volume: 50, Issue: 6, December 2020)
Page(s): 507 - 517
Date of Publication: 20 October 2020

