
Artificial intelligence app helps blind people


A team of university students from the University of Auckland and Auckland University of Technology has created a smartphone app that uses artificial intelligence to help blind people make better visual sense of the world.

The app, designed for Windows phones, allows visually impaired users to take a photo of their surroundings, for example a piece of clothing, and have the item verbally described to them. All the user needs to do is tap anywhere on the phone’s screen to capture a photo. Images are compressed to around 50KB to 70KB and sent to the app’s server for analysis, where artificial intelligence detects colours, text, darkness and brightness and sends back a verbal description of the photo.
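A minimal sketch of that capture, compress and upload flow is below. The upload URL, endpoint and response format are assumptions for illustration; the article does not document MobileEye’s actual API.

```python
import io
import requests
from PIL import Image

UPLOAD_URL = "https://example.com/mobileeye/analyse"  # hypothetical endpoint

def compress_photo(path, target_kb=60):
    """Re-encode a photo as JPEG, lowering quality until it is roughly 50-70KB."""
    img = Image.open(path).convert("RGB")
    for quality in range(85, 10, -5):
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=quality)
        if buf.tell() <= target_kb * 1024:
            return buf.getvalue()
    return buf.getvalue()  # smallest attempt if the target size was never reached

def describe_photo(path):
    """Send the compressed image to the analysis server and return its description."""
    data = compress_photo(path)
    resp = requests.post(UPLOAD_URL, files={"image": ("photo.jpg", data, "image/jpeg")})
    resp.raise_for_status()
    return resp.json().get("description", "")
```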

“There are many things that computer or artificial intelligence can do, so it’s just using computer algorithms you can get some information about an image. [For example], what are the most significant colours in an image or is there any printed text that you can read out or detect darkness and brightness and so on,” Aakash Polra, MobileEye’s team leader, told Computerworld Australia.
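A rough sketch of the kind of analysis Polra describes, finding the most significant colours and judging overall darkness or brightness, is shown below. The palette size, thresholds and labels are illustrative assumptions, not MobileEye’s algorithm.

```python
from PIL import Image, ImageStat

def dominant_colours(path, n=3):
    """Return the n most common colours after reducing the image to a small palette."""
    img = Image.open(path).convert("RGB").resize((64, 64))
    # Quantise to a limited palette so near-identical shades count as one colour.
    quantised = img.quantize(colors=16).convert("RGB")
    counts = quantised.getcolors(64 * 64)  # list of (count, (r, g, b))
    counts.sort(reverse=True)
    return [rgb for _, rgb in counts[:n]]

def brightness_description(path):
    """Label an image as dark, dim or bright from its mean greyscale value (0-255)."""
    grey = Image.open(path).convert("L")
    mean = ImageStat.Stat(grey).mean[0]
    if mean < 60:
        return "dark"
    if mean < 150:
        return "dim"
    return "bright"
```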

In cases where the image cannot be analysed automatically, it is sent to a human ‘helper’, such as a friend on Facebook or a volunteer who assists blind people, who identifies the image and sends a verbal description back to the blind user.

“Humans answer the images where the [app] can’t, so it’s a mixture of human intelligence and computer intelligence,” Polra said.
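A simple sketch of that mix of human and computer intelligence follows: if the automatic analysis cannot produce a confident answer, the photo is routed to a human helper. The function names and confidence threshold are assumptions for illustration.

```python
def describe_with_fallback(photo, auto_analyse, ask_human_helper, min_confidence=0.6):
    """Try the automatic pipeline first; fall back to a human helper when it is unsure."""
    description, confidence = auto_analyse(photo)
    if description and confidence >= min_confidence:
        return description              # computer intelligence was enough
    return ask_human_helper(photo)      # otherwise a person describes the photo
```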

The app took around seven to eight months to develop, with trials carried out to uncover its limitations and bugs; it was recently trialled by around 20 users from the Royal New Zealand Foundation of the Blind. There are also plans to extend the app to other countries, including Australia.

“We have been in constant touch with the users and getting their constant feedback and improving the product as we go,” Polra said.

Throughout the trials, Polra said, issues that had not been evident at the concept phase were raised. For example, one user took a photo of a shoe and the app simply described it as a shoe.

“She said ‘I already know that because I can touch and feel that it’s a shoe, but I want to know what colour it is? What are the patterns on it?’” Polra said.

“We realised what kind of information is required by the blind users and as we talked with more and more people we learnt more about how the product can be useful in their daily lives and what features we needed to add, so we started adding them, developing them and testing them as well.

“That actually helped make the product more useful to actual users rather than just concepts.”

Team MobileEye is one of six finalists competing in the 2012 Microsoft Imagine Cup, which challenges students around the world to develop unique products using technology. The winner is to be announced tonight. The other finalists include:

- uCHAMPsys from Taiwan with a ubiquitous Cloud-based health assessment, management and promotion system

- quadSquad from Ukraine, with a mobile device that continuously recognises sign language phonemes and transforms sign language into a form of verbal communication

- Coccolo from Japan with a system that enables LED lamps to communicate with each other and automatically dim if there is more light in the room than needed

- wi-GO from Portugal with a robotic cart designed to improve the mobility of people with special needs by using motors and sensors powered by Kinect

- Symbiosis from Greece, which provides therapy for Alzheimer’s patients through augmented reality and provides support to all people affected by Alzheimer’s, such as caregivers and doctors

Australia's Team StethoCloud, which created a digital stethoscope, placed in the top 20.

