Scope

This project is a case study to understand the challenges that blind and low-vision people face during social interactions. These challenges inform the development of a prototype that detects emotions using a combination of body language, facial gestures, and tonality.

Background

A literature review of previously published academic research articles was performed. Approximately 30 papers from ACM and IEEE were analyzed to gather the challenges reported and the technologies used in prototypes that address them. A sticky-note chart summarizes the baseline limitations of current practices in emotion detection and classification.

Lo-fi Prototype

Based on the background research and limitations identified, I performed a brainstorming activity and sketched basic prototype designs that served as inspiration for application development and as the baseline for product iterations.

By tracking arm movements and overall body language, together with facial gestures and tone of voice, the application predicts the emotion of the person facing the camera.
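
As a minimal sketch of this idea, the snippet below shows one way a late-fusion pipeline could combine per-emotion scores from the three modalities. The emotion list, fusion weights, and extractor functions are hypothetical placeholders, not the prototype's actual models.

```python
# Minimal late-fusion sketch (hypothetical): each modality yields a score
# vector over a fixed emotion set, and a weighted average picks the label.
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def body_language_scores(pose_frames: np.ndarray) -> np.ndarray:
    """Placeholder: map tracked arm/body keypoints to per-emotion scores."""
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))  # uniform stub

def facial_gesture_scores(face_frames: np.ndarray) -> np.ndarray:
    """Placeholder: map facial landmarks/expressions to per-emotion scores."""
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))  # uniform stub

def tonality_scores(audio_clip: np.ndarray) -> np.ndarray:
    """Placeholder: map prosodic features (pitch, energy) to per-emotion scores."""
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))  # uniform stub

def predict_emotion(pose_frames, face_frames, audio_clip,
                    weights=(0.3, 0.5, 0.2)) -> str:
    """Fuse the three modality score vectors with a weighted average."""
    fused = (weights[0] * body_language_scores(pose_frames)
             + weights[1] * facial_gesture_scores(face_frames)
             + weights[2] * tonality_scores(audio_clip))
    return EMOTIONS[int(np.argmax(fused))]

if __name__ == "__main__":
    # Dummy arrays stand in for camera frames and a short microphone clip.
    print(predict_emotion(np.zeros((30, 17, 2)),
                          np.zeros((30, 68, 2)),
                          np.zeros(16000)))
```

In such a design, the weights could later be tuned (or replaced by a learned fusion layer) once the individual modality models exist.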

In this scenario, a blind or low-vision person holds their phone out and uses it to converse with an able-bodied person.
