Though a growing number of assistive apps are available for Android, the most popular platform for them is, hands down, iOS. Thanks to built-in capabilities like the VoiceOver screen reader, the iPhone attracted far more accessibility-conscious users early on, and most assistive apps since then have been developed for it. In the past few years, apps for counting money, reading light intensity, and reading barcodes have popped up everywhere, but AI and fast mobile internet have been the real game-changers.

The Machines: Artificial Intelligence

Text-to-speech and color recognition are useful, but the next generation of seeing-eye apps is powered by artificial intelligence. Apps like Microsoft’s Seeing AI and Envision go beyond simple scanning and recognition, using machine learning and neural networks to unlock a whole new set of tools for visually impaired users. Seeing AI, for example, can recognize people, estimate their age, and give you an idea of how they’re feeling, and its development team is working on describing what’s going on in a scene as well. It may only be a matter of time before smartphones can provide continuous real-time narration of a user’s surroundings. There is no shortage of companies incorporating AI and machine learning into accessibility, with projects like TapTapSee, Aipoly Vision, and Google’s Lookout releasing apps that use phone cameras and sensors to decode the world.
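
To make the camera-plus-AI idea concrete, here is a minimal sketch, assuming Apple’s Vision and AVFoundation frameworks, of how an iOS app might recognize text in a camera frame and read it aloud. It illustrates the general technique only, not how Seeing AI or Envision are actually implemented, and the speakText(in:) helper is invented for the example.

```swift
import Vision
import AVFoundation
import UIKit

// Kept alive for the lifetime of the app so speech isn't cut off mid-utterance.
let synthesizer = AVSpeechSynthesizer()

// Hypothetical helper: finds text in a camera frame and speaks it aloud.
func speakText(in image: CGImage) {
    // Ask the Vision framework to find and transcribe text in the image.
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Take the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        // Hand the result to the speech synthesizer for audio feedback.
        let utterance = AVSpeechUtterance(string: lines.joined(separator: ". "))
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate

    // Run the request on the supplied frame.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```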

The Humans: Seeing Eyes

As good as AI currently is, it’s not foolproof, and it probably won’t be much help in more complex situations. When a problem requires human input, there are apps that make it easy for visually impaired individuals to get in touch with a sighted volunteer. The most popular is Be My Eyes, which crowdsources over a million volunteers to help out 80,000 users. As long as there is a decent internet connection, the volunteer can see through the phone’s camera and relay information to the user. Aira is an even more sophisticated service, using video-camera glasses (similar to Google Glass) to stream a video feed to an assistance agent. It comes with a high monthly fee, but the agents are trained, and at least Google Glass technology came in handy somewhere.
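
The core mechanism behind these services is pairing a help request with an available sighted volunteer. Below is a toy sketch of what such a matching structure could look like; every type and method name is hypothetical, since neither Be My Eyes nor Aira publishes its backend.

```swift
import Foundation

// Hypothetical model of volunteer matchmaking: pair an incoming help request
// with the first available volunteer who speaks the caller's language.
struct Volunteer {
    let id: UUID
    let languages: Set<String>
}

struct HelpRequest {
    let userID: UUID
    let language: String
}

final class VolunteerPool {
    private var available: [Volunteer] = []

    func checkIn(_ volunteer: Volunteer) {
        available.append(volunteer)
    }

    /// Returns a volunteer who shares the caller's language, removing them
    /// from the pool so they aren't matched twice, or nil if nobody fits.
    func match(_ request: HelpRequest) -> Volunteer? {
        guard let index = available.firstIndex(where: { $0.languages.contains(request.language) }) else {
            return nil
        }
        return available.remove(at: index)
    }
}
```

A real service would also weigh factors like response time, ratings, and time zones, but the queue-and-match idea is the heart of it.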

Getting Around: Navigation Apps

While navigation capabilities are increasingly being built into other apps, there is still a whole class of apps specifically meant to help blind and visually impaired people get where they need to go. One of the most popular is BlindSquare, an app that not only gives you “turn left,” “turn right” directions but also describes your surroundings and points of interest. Microsoft’s Soundscape does something similar, as do ViaOpta Nav, Seeing Eye GPS, and several others. They offer features like directions by vibration and voice memos that play back automatically when you reach a saved location, and as AI becomes more deeply integrated, they will likely provide even more real-time information.
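
As a rough illustration of the location-triggered memo feature, here is a minimal sketch using Apple’s Core Location geofencing together with speech synthesis. The class and its methods are invented for the example, not taken from BlindSquare or Soundscape, and a real app would also handle permission prompts, background modes, and iOS’s limit on monitored regions.

```swift
import CoreLocation
import AVFoundation

// Plays a saved voice memo when the user walks into the area it was recorded for.
final class MemoPlayer: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let synthesizer = AVSpeechSynthesizer()
    private var memos: [String: String] = [:]   // region identifier -> memo text

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
    }

    /// Saves a memo and asks the system to watch a small circular region around it.
    func addMemo(_ text: String, at coordinate: CLLocationCoordinate2D) {
        let id = UUID().uuidString
        memos[id] = text
        let region = CLCircularRegion(center: coordinate, radius: 30, identifier: id)
        region.notifyOnEntry = true
        locationManager.startMonitoring(for: region)
    }

    // Called by the system when the user enters a monitored region.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard let text = memos[region.identifier] else { return }
        synthesizer.speak(AVSpeechUtterance(string: text))
    }
}
```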

Conclusion: The Future of Assistive Tech

As technology continues to get smarter, smaller, and more widespread, options for blind and visually impaired users will continue to expand. AI and machine learning are two of the most promising fields, but the internet of things is also an interesting development: a network of sensors surrounding and interacting with a blind user would open up a lot of possibilities. Opening these doors will benefit not only users but society as a whole, giving talented and unique individuals more opportunities to help build the future.