Blind people often struggle with touchscreens because, unlike traditional buttons or braille labels, they provide no haptic feedback. blindAR enables blind people to use devices with touchscreens, such as coffee machines or train-ticket machines, autonomously. The application runs on the Microsoft HoloLens (augmented reality glasses).
The software identifies touchscreens, tracks the user's index finger, and accepts voice input to assist with touchscreen use. Two interaction approaches are available:
- After scanning the screen, the content is read out and the user can select an option. blindAR then guides the user's finger to the selected item.
- Alternatively, the content is read out as the finger hovers over the respective item. When users hear the desired information, they simply move the finger forward to touch the screen.
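The two approaches above can be illustrated with a minimal sketch. This is not blindAR's actual implementation; the item model, coordinate convention (normalised screen coordinates), and function names (`item_under_finger`, `guidance_hint`) are all assumptions made for illustration.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class ScreenItem:
    label: str
    x: float  # assumed: screen position, normalised to 0..1
    y: float

def item_under_finger(finger, items, radius=0.1):
    """Hover approach: return the item closest to the tracked
    fingertip, or None if nothing lies within `radius`."""
    best = min(items, key=lambda it: hypot(it.x - finger[0], it.y - finger[1]))
    if hypot(best.x - finger[0], best.y - finger[1]) <= radius:
        return best
    return None

def guidance_hint(finger, target, tol=0.05):
    """Scan-and-guide approach: produce a spoken hint that moves the
    finger toward the selected item, one axis at a time."""
    dx = target.x - finger[0]
    dy = target.y - finger[1]
    if abs(dx) > tol:
        return "move right" if dx > 0 else "move left"
    if abs(dy) > tol:
        return "move down" if dy > 0 else "move up"
    return "touch now"
```

For example, with two ticket-machine buttons, hovering near one would read out its label, while selecting the other by voice would yield a sequence of directional hints ending in "touch now".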
Both approaches improve the everyday lives of blind people by helping them use touchscreens independently.
Development team: Gabriel Bauer, Judith Bauer, Magnus Berendes, Stefan Blos, Jalpa Parmar, Aniol Serra Juhé
Scrum Master: Mathias Maurer
Partner: Machine Learning and Data Analytics Lab