Deafblindness is a condition that hinders an individual's interaction with the people and environment that surround them. Forms of communication for the deafblind generally rely on touch-based interaction, or haptics. Examples include social-haptic communication, which conveys supplementary environmental information, such as facial expressions, through touch gestures, and the block alphabet and Braille, which convey words and phrases.
The project's goal was to develop a prototype system that communicates environmental information to the user through haptic communication. This was achieved with facial expression recognition software and a vest fitted with a 10-by-10 matrix of vibration motors on the back. The software, developed in C++, analyses the facial image acquired by the camera in real time, classifying it as anger, disgust, fear, happiness, sadness, surprise, or a neutral expression. Non-neutral facial expressions are conveyed to the user through distinct vibration patterns produced by the motor matrix in the interior of the vest, simulating social-haptic communication. The system also converts text input from a keyboard into vibration patterns that simulate block-alphabet and Braille letters, allowing it to convey words and phrases.