How Microsoft is using AI to empower people with disabilities


Microsoft, AI, Disability

More than one billion people worldwide have a disability, which can make accessing technology difficult, if not impossible. Even in developed countries, people with a disability are 20% less likely to own a tablet, smartphone or computer, or to have access to high-speed internet. But could all this be about to change?

In support of the eighth annual Global Accessibility Awareness Day, a campaign that highlights the need for inclusive technology, Microsoft has awarded grants to AI projects working to make the world more inclusive. The grants are part of a five-year initiative to invest $25 million in AI-based accessibility tools. This year, seven recipients will receive access to the Azure AI platform through Azure compute credits and Microsoft engineering support.

Over the next year, the recipients will work on projects such as a nerve-sensing wristband that detects micro-movements of the hands and arms and translates them into actions like a mouse click. Another project seeks to develop a wearable cap that reads a person's EEG data and sends it to the cloud to provide seizure warnings and alerts. Other tools will rely on speech recognition, AI-powered chatbots and apps for people with vision impairment.

As well as supplying grants and support, Microsoft has been developing its own AI solutions to make technology more accessible and inclusive for people with disabilities. Over the last few years, Microsoft has put to work solutions such as real-time speech-to-text transcription, visual recognition services and predictive text functionality. These offer enormous potential by enabling people with vision, hearing, cognitive, learning and mobility disabilities, as well as mental health conditions, to do more in three specific scenarios: employment, modern life and human connection.

We’re really excited to see what developments come from Microsoft’s grants, but in the meantime, let’s take a look at the technology Microsoft has already built and the impact it’s having on people’s lives.

Seeing AI app

According to the Royal National Institute of Blind People, more than two million people in the UK live with sight loss, and almost half of blind and partially sighted people feel "moderately" or "completely" cut off from the people and things around them. Microsoft has done something to try to change this: it has developed a free app designed to help blind and partially sighted people by narrating the world around them. The app uses artificial intelligence to recognise objects, people and text via a phone or tablet's camera and describes them to the user.

The app is already changing lives and empowering blind and partially sighted people to experience the world in a new way. It uses AI facial recognition to identify family and friends, describing both the person and their emotions. It will also locate and scan product barcodes to enable the user to identify products more easily.

Microsoft is continually working to improve the app and is currently fine-tuning its AI scene-identification skills so that, in the future, the app will accurately describe what is going on around the user, such as a car pulling out of a parking space or a cyclist approaching.
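For developers curious how this kind of scene description works, here is a minimal sketch using Azure's Computer Vision "describe" endpoint, which offers captioning similar in spirit to Seeing AI. The app itself is closed source, and the endpoint, key and image URL below are placeholders:

```python
import requests

# Placeholders: substitute your own Azure Computer Vision resource and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-subscription-key>"
IMAGE_URL = "https://example.com/street-scene.jpg"  # hypothetical image

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# The service returns one or more candidate captions with confidence scores,
# which an app like Seeing AI could then read aloud to the user.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} (confidence: {caption['confidence']:.0%})")
```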

Microsoft Translator

Microsoft Translator not only breaks down language barriers, it empowers people who are deaf or hard of hearing with real-time captioning of conversations. 

The system uses an advanced form of automatic speech recognition to convert raw spoken language – ums, stutters and all – into fluent, punctuated text. The removal of disfluencies and addition of punctuation leads to higher-quality translations into the more than 60 languages that the translator technology supports.
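As an illustration of the underlying idea, here is a minimal sketch of real-time speech translation using the Azure Speech SDK (pip install azure-cognitiveservices-speech). Microsoft Translator's live-captioning pipeline is more sophisticated, and the subscription key and region below are placeholders:

```python
import azure.cognitiveservices.speech as speechsdk

# Placeholders: substitute your own Azure Speech resource key and region.
config = speechsdk.translation.SpeechTranslationConfig(
    subscription="<your-key>", region="<your-region>")
config.speech_recognition_language = "en-US"
config.add_target_language("fr")  # one of the 60+ supported languages

recognizer = speechsdk.translation.TranslationRecognizer(
    translation_config=config)

# Listens to the default microphone and returns one utterance as fluent,
# punctuated text, plus its translation.
result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.TranslatedSpeech:
    print("Recognised:", result.text)
    print("French:", result.translations["fr"])
```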

Microsoft has already piloted the technology at the Rochester Institute of Technology in New York with the aim of supporting students in the classroom who are deaf or hard of hearing.

Student feedback was overwhelmingly positive, with one student saying that, for the first time, he could receive information at the same time as his hearing peers, which allowed him to keep up with the class and learn to spell scientific terms correctly.

Eye Control

Eye Control empowers people with limited mobility to operate an on-screen mouse, keyboard and text-to-speech experience using only their eyes.

The introduction of Eye Control was inspired by former NFL player Steve Gleason, who lives with motor neurone disease. He challenged employees at a Microsoft hackathon to find ways to overcome some of the constraints the disease forced him to live with.

One of the teams who took part in the hackathon went away and developed a wheelchair Gleason could control using the movement of his eyes. This technology has already proven revolutionary for many users with limited mobility, but this is just the beginning.

Birmingham City University has been awarded one of the AI for Accessibility grants from Microsoft to not only solve a mobility challenge but also make careers in technology accessible to everyone.

Over the next few years, the university will attempt to build a system that makes it easier for people with limited mobility to gain employment in web development and computer programming. Building on the existing Eye Control technology, the university will use Azure Cognitive Services, such as speech-to-text, to enable computer programmers to code with just their eyes and voice.
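As a rough illustration of the concept (not the university's actual system), here is a hypothetical sketch that maps spoken commands recognised by Azure speech-to-text onto code snippets; the command vocabulary is invented for this example:

```python
import azure.cognitiveservices.speech as speechsdk

# Hypothetical mapping from spoken commands to code snippets.
SNIPPETS = {
    "define function": "def function_name():\n    pass",
    "for loop": "for item in items:\n    pass",
    "if statement": "if condition:\n    pass",
}

# Placeholders: substitute your own Azure Speech resource key and region.
speech_config = speechsdk.SpeechConfig(
    subscription="<your-key>", region="<your-region>")
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)

# Recognise one spoken command from the default microphone and emit the
# corresponding snippet, which an editor could insert at the gaze position.
result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    command = result.text.lower().strip(" .!?")
    print(SNIPPETS.get(command, f"# unrecognised command: {command}"))
```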

To find out what other technologies Microsoft is working on to make the world more accessible to all, click here.


Posted by Helen Thomas