Scope

This project is a case study of how blind and low-vision people explore unfamiliar environments when given descriptive feedback that focuses on high-level information, safety, and points of interest in and around an object.

Background

This study builds primarily on a qualitative study (Banovic et al.) of the preferences of blind and low-vision people in this context. Since then, numerous studies have explored the navigation perspective, but to the best of my knowledge there has been no attempt to experiment with the high-level, safety, and points-of-interest information that could help them make an informed decision.

Technology

Exploration has been attempted through object identification, tactile interfaces, audio systems, and external human assistance, but each has its limitations. For my study, given the problem of unfamiliar textures and the goal of promoting independent exploration, I chose localized object recognition with descriptive feedback delivered through audio.
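To illustrate what such descriptive feedback could look like, the sketch below translates object detections into a short, audio-ready description. The function names and the detection format (label, normalized horizontal position, approximate distance) are hypothetical assumptions for this example, not the project's actual code or data model.

```python
# Hypothetical sketch: turn object detections into descriptive, audio-ready text.
# The (label, x_center in [0, 1], distance in meters) tuple format is an
# assumption for illustration only.

def direction(x_center: float) -> str:
    """Map a normalized horizontal position to a coarse spoken direction."""
    if x_center < 0.33:
        return "to your left"
    if x_center > 0.66:
        return "to your right"
    return "ahead"

def describe(detections) -> str:
    """Build a short high-level description, nearest objects first."""
    parts = []
    for label, x, dist in sorted(detections, key=lambda d: d[2]):
        parts.append(f"{label} {direction(x)}, about {dist:.1f} m away")
    return "; ".join(parts) if parts else "no obstacles detected"

# Example: a door ahead and a chair nearby on the left.
print(describe([("door", 0.5, 3.0), ("chair", 0.2, 1.2)]))
# prints: chair to your left, about 1.2 m away; door ahead, about 3.0 m away
```

Sorting by distance puts the most safety-relevant objects first, so a text-to-speech engine would announce nearby obstacles before distant points of interest.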

Lo-fi Prototype

Based on the background research and limitations identified, I performed a brainstorming activity and sketched basic prototype designs that served as inspiration for application development and as the baseline for product iterations.

The application requires the user to upload their schedule and set a timer for the task. It also asks for the total permitted break time.

Python

Used for interfacing and transfer learning.

Android OS

Used for deployment and testing of the prototype.

YOLOv5

Used for training the object recognition model.

Get in Touch!
