DOI: https://doi.org/10.15368/theses.2014.107
Available at: https://digitalcommons.calpoly.edu/theses/1262
Date of Award
6-2014
Degree Name
MS in Computer Science
Department/Program
Computer Science
Advisor
Lynne Slivovsky
Abstract
The most common navigation aid employed by visually impaired people is the white cane, but technology has recently given rise to a varied set of more sophisticated navigation aids. While these new aids can provide more assistance than a white cane, they tend to be expensive because of the small market they serve, which in turn limits their accessibility. In an effort to produce a technologically advanced yet accessible navigation aid, this thesis proposes an Android application that uses a smartphone's camera to detect obstacles in the user's path and notify the user about them. With the smartphone mounted on a harness worn by the user, the Walking Assistant application operates by capturing images as the user walks, finding features of objects within each frame, and determining how those features move from image to image. If an object is found to be moving toward the user, the Walking Assistant vibrates the smartphone to alert the user to the object's presence. The user can control the Walking Assistant with either touch or voice commands. In real-world tests, the Walking Assistant correctly identified obstacles 42.1% of the time while generating false positive identifications only 15.0% of the time. Its accuracy could be further improved by implementing additional features, such as a fuzzy-decision-based thresholding system or image stabilization.
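The frame-to-frame feature tracking described above can be illustrated with a minimal sketch, assuming OpenCV's Java bindings for Android; this is not code from the thesis. The class name LoomingDetector, the method approachingObstacle, the feature count and quality parameters (200, 0.01, 7), the spreadRatioThreshold parameter, and the centroid-spread heuristic are all illustrative assumptions rather than the application's actual design.

    import org.opencv.core.Mat;
    import org.opencv.core.MatOfByte;
    import org.opencv.core.MatOfFloat;
    import org.opencv.core.MatOfPoint;
    import org.opencv.core.MatOfPoint2f;
    import org.opencv.core.Point;
    import org.opencv.imgproc.Imgproc;
    import org.opencv.video.Video;

    /** Hypothetical looming check between two consecutive grayscale camera frames. */
    public final class LoomingDetector {

        /** Returns true when tracked features spread apart, i.e. an object appears to grow. */
        public static boolean approachingObstacle(Mat prevGray, Mat gray, double spreadRatioThreshold) {
            // Detect corner features in the earlier frame.
            MatOfPoint corners = new MatOfPoint();
            Imgproc.goodFeaturesToTrack(prevGray, corners, 200, 0.01, 7);
            if (corners.empty()) {
                return false;
            }
            // Track those features into the newer frame with pyramidal Lucas-Kanade optical flow.
            MatOfPoint2f prevPts = new MatOfPoint2f(corners.toArray());
            MatOfPoint2f nextPts = new MatOfPoint2f();
            MatOfByte status = new MatOfByte();
            MatOfFloat err = new MatOfFloat();
            Video.calcOpticalFlowPyrLK(prevGray, gray, prevPts, nextPts, status, err);

            Point[] oldPts = prevPts.toArray();
            Point[] newPts = nextPts.toArray();
            byte[] tracked = status.toArray();

            // An object moving toward the camera appears to expand, so the average distance
            // of the tracked features from their centroid should grow from frame to frame.
            double spreadOld = meanSpread(oldPts, tracked);
            double spreadNew = meanSpread(newPts, tracked);
            return spreadOld > 0 && spreadNew > spreadRatioThreshold * spreadOld;
        }

        /** Mean distance of successfully tracked points from their centroid. */
        private static double meanSpread(Point[] pts, byte[] tracked) {
            double cx = 0, cy = 0;
            int n = 0;
            for (int i = 0; i < pts.length; i++) {
                if (tracked[i] == 1) { cx += pts[i].x; cy += pts[i].y; n++; }
            }
            if (n == 0) return 0;
            cx /= n; cy /= n;
            double sum = 0;
            for (int i = 0; i < pts.length; i++) {
                if (tracked[i] == 1) sum += Math.hypot(pts[i].x - cx, pts[i].y - cy);
            }
            return sum / n;
        }
    }

In this sketch, an obstacle is reported when the tracked features spread apart between consecutive frames, one simple proxy for an object growing larger in the image as it approaches the camera; the thesis's actual decision logic and thresholds may differ.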