Microsoft’s Seeing AI app helps the blind ‘see’ everything around them

Microsoft has released an iPhone app designed to translate the visual world into an audible experience for those with vision impairments.

Making use of artificial intelligence for image recognition and natural language processing, Seeing AI uses the smartphone’s camera to capture what’s happening around the user and relays it through speech.

Point the app at a person and it will use facial recognition software to tell you that individual’s gender, age and emotion. Aim it at something that contains text – like a street sign or a document – and it will read the words aloud.
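Microsoft hasn’t detailed Seeing AI’s internals, but the face-description feature closely mirrors what the company’s public Face API in Cognitive Services returns. The Python sketch below is an illustration of that kind of call, not the app’s confirmed implementation; the subscription key and the “westus” regional endpoint are placeholders.

```python
# Illustrative only: Seeing AI's internals are not public. This sketch assumes
# Microsoft's Cognitive Services Face API (v1.0 "detect" endpoint); the
# subscription key and the "westus" region are placeholders.
import requests

SUBSCRIPTION_KEY = "your-face-api-key"  # placeholder
ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"


def describe_people(image_path):
    """Return age, gender and dominant emotion for each face in a photo."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    response = requests.post(
        ENDPOINT,
        params={"returnFaceAttributes": "age,gender,emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    people = []
    for face in response.json():
        attrs = face["faceAttributes"]
        # The service scores several emotions; report the highest-scoring one.
        emotion = max(attrs["emotion"], key=attrs["emotion"].get)
        people.append(
            {"age": attrs["age"], "gender": attrs["gender"], "emotion": emotion}
        )
    return people
```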

The app is also able to relay product information from scanned barcodes, while features are being developed for identifying currency – useful for handing over cash in a shop – and for describing wider scenes.

For example, in Microsoft’s promo video the user takes a picture and the app says the image shows a young girl throwing a frisbee in the park.
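That scene-description capability looks a lot like the captioning offered by Microsoft’s Computer Vision API, which returns a natural-language description of a photo along with a confidence score. As a rough illustration only – again with a placeholder key and region, and no claim that this is how Seeing AI itself is built – such a call could look like this:

```python
# Illustrative only: a caption request against Microsoft's Computer Vision API
# (v1.0 "describe" endpoint). Key and region are placeholders, and this is not
# presented as Seeing AI's actual implementation.
import requests

SUBSCRIPTION_KEY = "your-computer-vision-key"  # placeholder
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v1.0/describe"


def describe_scene(image_path):
    """Return the highest-confidence natural-language caption for a photo."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # The API returns one or more candidate captions with confidence scores.
    captions = response.json()["description"]["captions"]
    best = max(captions, key=lambda c: c["confidence"])
    return best["text"]
```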

The app is the result of work from Microsoft Research and neatly couples several of the division’s core areas: its site lists AI, computer vision and human language technologies as focus areas, and Seeing AI ticks off all three. Other projects the initiative is working on include a Minecraft-based testing ground for AI and a means for neural networks to write their own software code.

Seeing AI certainly seems to have the potential to become a useful tool for those with vision impairments, although the app would presumably require some level of vision to line the camera up with the object in question – particularly if it’s being used to scan documents. Speaking to Alphr, the Royal National Institute of Blind People (RNIB) said the technologies showcased by Microsoft’s app could offer a “revolution” in access, but that its usefulness would hinge on the camera’s ability to guide users itself:

“Artificial intelligence, computer vision, and human language technologies offer a revolution in access for blind and partially sighted people,” said Robin Spinks, senior strategy manager at RNIB. “Orientation and placement are critical factors for a successful deployment of computer vision. You’ll do well if you can see where to direct the camera but the real trick is for the camera to self-direct enabling a person with no useful sight to benefit equally.

“RNIB is working with blue chip companies around the world to leverage the potential of these technologies and we warmly welcome the arrival of Seeing AI.”

Those living in the US, Canada, Hong Kong, India, New Zealand and Singapore can try the iOS app now, while users in the UK will have to wait until it is made available.
