Facebook harnesses Artificial Intelligence to read photos to the blind

According to the World Health Organisation, around 285 million people worldwide are visually impaired – roughly 4% of the world's population. The internet has become far more accessible over the last few years, but it still has some way to go, and photographs are one of the biggest obstacles.


Image alt tags allow website owners to get around this problem to some degree: a short description attached to the image is read aloud to visually impaired visitors by screen-reader software. But alt text isn't always used effectively, and most people don't even know it exists. That wasn't a problem when the majority of people simply consumed content, but now that everyone is uploading photos all the time, few – if any – are bothering to make their uploads accessible.
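For anyone who hasn't met alt text before, it's a single attribute on an image tag. A minimal sketch (the filename and description here are invented for illustration):

```html
<!-- Without alt text, a screen reader can only announce the filename. -->
<img src="beach-sunset.jpg">

<!-- With alt text, the screen reader speaks the description instead. -->
<img src="beach-sunset.jpg"
     alt="Two people walking along a beach at sunset">
```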

Nobody is more aware of this problem than Facebook. Across Instagram, WhatsApp, Messenger and, of course, Facebook itself, people upload more than two billion photos per day. Asking people to describe their images manually, as Twitter is now trying to encourage, seems like a non-starter with that kind of volume, so the company has decided to let Artificial Intelligence do the majority of the heavy lifting.

The technology uses machine learning to identify the contents of photographs uploaded to the service, then generates a description that can be read aloud to the user. Machine learning works by repeatedly showing the software labelled examples – pictures of, say, cats – until it learns what a cat looks like and can tag new photos automatically. The Verge explains that Facebook's system is deliberately starting small but will grow over time: it can currently identify various objects (“car”, “boat”, “plane”), backdrops (“sunset”, “ocean”, “snow”) and even types of food (“sushi”, “pizza”).
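The "show it lots of examples" idea can be sketched in a few lines of code. This is purely an illustrative toy, not Facebook's approach: real systems use deep neural networks on raw pixels, whereas here each "image" is stood in for by a made-up two-number feature vector, and classification is done with a simple nearest-centroid rule.

```python
# Toy nearest-centroid classifier: learn one "average" point per label
# from labelled examples, then tag new points by the closest average.

def centroid(points):
    """Average of a list of (x, y) feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def train(labelled_examples):
    """Build a model: one centroid per label from (features, label) pairs."""
    by_label = {}
    for features, label in labelled_examples:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(pts) for label, pts in by_label.items()}

def predict(model, features):
    """Tag new features with the label of the nearest centroid."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(model, key=lambda label: dist2(model[label], features))

# Hypothetical training data: feature vectors for "cat" and "boat" photos.
examples = [
    ((1.0, 1.2), "cat"), ((0.8, 1.0), "cat"), ((1.1, 0.9), "cat"),
    ((4.0, 3.8), "boat"), ((4.2, 4.1), "boat"), ((3.9, 4.0), "boat"),
]
model = train(examples)
print(predict(model, (1.0, 1.0)))  # a new "photo" near the cat examples
```

The same principle scales up: with millions of labelled photos and far richer features, the model's tags become accurate enough to generate usable descriptions.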

The feature is rolling out on iOS today, and will soon follow on Android and desktop. It may sound like a small thing, but this kind of inclusion can make a huge difference. As Matt King, a visually impaired Facebook engineer, told The Verge: “Inclusion is really powerful and exclusion is really painful. The impact of doing something like this is really telling people who are blind, your ability to participate in the social conversation that’s going on around the world is really important to us. It’s saying as a person, you matter, and we care about you. We want to include everybody – and we’ll do what it takes to include everybody.”

