AI, Morse code and live video—how technologies power accessibility apps for users around the world
Samsung’s Good Vibes app uses Morse code to enable two-way communication between the visually and hearing challenged and their caretakers
Earlier this month, South Korean technology company Samsung introduced two apps in India designed to make communication easier for the hearing and visually challenged. While Relumino is a visual aid application that magnifies and minimizes images and adjusts colour, brightness and contrast (with the help of a Samsung Gear VR headset), Good Vibes uses Morse code to enable two-way communication between the visually and hearing challenged and their caretakers.
These are the latest in a series of accessibility apps, like Google’s Lookout and Microsoft’s Seeing AI for the visually challenged. Good Vibes digitizes Morse code as a combination of long and short vibrations sent through an Android device, turning Morse input into text or voice, and vice versa.
Good Vibes essentially has two interfaces. A user composes a message in Morse code by tapping the phone’s screen. Every letter of the English alphabet is a combination of dots and dashes that can be entered with a short tap or a long press on the phone’s screen. A short tap denotes a dot, a long press denotes a dash, while a two-finger tap denotes a space. A long press with two fingers deletes a letter. Once users have typed a message, they need to flip the phone to send it.
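The tap scheme described above can be sketched as a small decoder. This is purely illustrative: the duration threshold, the event format and the treatment of the two-finger gestures (tap as a letter separator, long press as deleting the letter in progress) are assumptions, not Samsung’s actual implementation.

```python
# Illustrative sketch of the tap-to-text scheme described in the article.
# Thresholds and gesture semantics are assumptions for demonstration only.

MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

LONG_PRESS_MS = 400  # assumed cutoff between a short tap (dot) and a long press (dash)

def classify_tap(duration_ms: int, fingers: int) -> str:
    """Map one touch event to a symbol: '.', '-', ' ' (separator) or 'DEL'."""
    if fingers == 2:
        return "DEL" if duration_ms >= LONG_PRESS_MS else " "
    return "-" if duration_ms >= LONG_PRESS_MS else "."

def decode_taps(events) -> str:
    """Turn a list of (duration_ms, fingers) touch events into text."""
    letters, current = [], ""
    for duration_ms, fingers in events:
        symbol = classify_tap(duration_ms, fingers)
        if symbol == " ":       # two-finger tap closes the current letter
            if current:
                letters.append(MORSE_TO_CHAR.get(current, "?"))
            current = ""
        elif symbol == "DEL":   # two-finger long press discards the letter in progress
            current = ""
        else:
            current += symbol
    if current:                 # flush the last letter
        letters.append(MORSE_TO_CHAR.get(current, "?"))
    return "".join(letters)
```

For example, a short tap followed by a long press spells `.-`, which decodes to “A”: `decode_taps([(100, 1), (500, 1)])` returns `"A"`.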
“During the numerous workshops and R&D sessions (with users and their caretakers) held for the app, we worked on details like how long a vibration should be, what should be the gap between one pulse and another, because the user should be able to piece the whole message together and figure out what is being said to them," says Trivikram Thakore, vice-president, Samsung India, as we speak on the sidelines of a demonstration of the apps at the National Association for the Blind in Delhi.
Conceptualized by Cheil Worldwide India in 2018, the Good Vibes app has been developed in collaboration with Sense International India, a non-profit that supports the development of services and training for the hearing and visually challenged. The app is currently available only on the Samsung Galaxy Store but will be rolled out on the Google Play Store soon.
“Using Morse code was the best way forward for the app. It is confusing for people like you and me to understand Morse. But for the hearing and visually challenged, Morse is easy to grasp," explains Parag Namdeo of Sense International India.
Once the message is sent, the receiver can choose to get it in text or voice format. The receiver replies through a standard chat-messaging interface, again in text or voice, and on hitting send the reply is delivered back to the sender as Morse code vibrations. What if the receiver has a basic Android phone? “It’s not a problem. Any Android phone should be able to use (or be compatible with) the app. You need a smartphone: that’s it. It won’t work on a feature phone. We have tested this across multiple devices," explains Thakore, adding that Good Vibes has been in the works for almost a year.
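The return leg of this flow, delivering a text reply as long and short vibrations, can be sketched as building a vibration pattern of the kind Android’s `Vibrator` API accepts (alternating off/on durations in milliseconds). All the timing values here are assumptions chosen for illustration; the article notes that Samsung tuned pulse lengths and gaps through workshops with users.

```python
# Sketch of turning a text reply into an Android-style vibration pattern:
# a list of alternating [off, on, off, on, ...] durations in milliseconds.
# All timings below are illustrative assumptions, not Samsung's tuned values.

CHAR_TO_MORSE = dict(zip(
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
    [".-", "-...", "-.-.", "-..", ".", "..-.", "--.", "....", "..", ".---",
     "-.-", ".-..", "--", "-.", "---", ".--.", "--.-", ".-.", "...", "-",
     "..-", "...-", ".--", "-..-", "-.--", "--.."],
))

DOT_MS, DASH_MS = 150, 450        # assumed short and long pulse lengths
GAP_MS, LETTER_GAP_MS = 150, 600  # assumed gaps within and between letters

def text_to_vibration_pattern(text: str) -> list:
    """Build an off/on duration pattern spelling `text` in Morse code."""
    pattern = [0]   # Android vibration patterns begin with an off-duration
    first = True
    for ch in text.upper():
        code = CHAR_TO_MORSE.get(ch)
        if code is None:
            continue            # skip characters outside the sketch's table
        for j, sym in enumerate(code):
            if not first:
                # longer silence between letters than between pulses of one letter
                pattern.append(LETTER_GAP_MS if j == 0 else GAP_MS)
            pattern.append(DASH_MS if sym == "-" else DOT_MS)
            first = False
    return pattern
```

For example, `text_to_vibration_pattern("AE")` yields `[0, 150, 150, 450, 600, 150]`: a dot and a dash for “A”, a longer pause, then a dot for “E”.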
In March, Google introduced the Artificial Intelligence-powered app Lookout, which helps blind people or those with low vision identify objects and text in their surroundings. Lookout works on principles similar to Google Lens, using a device’s camera and sensors to recognize text and objects. According to an official blog on Lookout’s launch, once users open the app, they just need to keep the phone pointed forward and it will keep describing everything around them as they move through a space.
While Lookout is currently available only in the US on select Android devices, other accessibility apps, like Live Transcribe and Sound Amplifier, are available in India and can be downloaded free of cost from the Play Store. While Live Transcribe converts speech into text in real time for the other person to read, Sound Amplifier works with wired headphones to “customize frequencies to augment important sound, like the voices of the people you are with, and filter out background noise," according to an official blog on the app’s release.
Microsoft’s Seeing AI works on similar lines. Released in 2017, it is a camera app that narrates the world around a user with visual impairment. It uses optical character recognition (OCR) technology to read out short texts and provides audio guidance to recognize the people around the user, even their emotions. Other features include recognizing currency, barcodes, colours, etc. Seeing AI is currently available only on the App Store in more than 70 countries, including India.
While these apps put the power in the hands of the user, Be My Eyes, founded by Denmark-based Hans Jørgen Wiberg, connects blind and low-vision individuals with sighted volunteers across the world through a live video interaction. With the app, a user with low vision can request assistance from sighted volunteers who have Be My Eyes installed on their devices. As of March, two million people had signed up as sighted volunteers for the app, which is available on both Android and iOS in more than 150 countries and 180 languages.