A next-generation hearing aid that can “see” is being developed at the University of Stirling in Scotland, where a research team led by a computer scientist is designing an aid to help users in noisy environments. The new hearing device will use a miniaturized camera that can lip-read, process visual information in real time, and seamlessly fuse and switch between audio and visual cues. According to a University of Stirling announcement, Amir Hussain, PhD, is leading the ambitious joint research project, which has received nearly £500,000 in funding from the UK Government’s Engineering and Physical Sciences Research Council (EPSRC) and industry.

“This exciting world-first project has the potential to significantly improve the lives of millions of people who have hearing difficulties,” said Hussain. “Existing commercial hearing aids are capable of working on an audio-only basis, but the next-generation audio-visual model we want to develop will intelligently track the target speaker’s face for visual cues, like lip reading. These will further enhance the audio sounds that are picked up and amplified by conventional hearing aids.”
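As a rough illustration of the fusing and switching between audio and visual cues described above, the sketch below shows one hypothetical way a device might blend an audio-derived enhancement gain with a lip-reading-derived gain according to the estimated acoustic signal-to-noise ratio. The function names, parameters, and gain values are illustrative assumptions only, not details of the Stirling design.

```python
import numpy as np

def fuse_av_gains(audio_gain, visual_gain, snr_db,
                  low_snr=0.0, high_snr=15.0):
    """Blend audio-derived and lip-reading-derived enhancement gains.

    audio_gain, visual_gain : per-frequency-band gain estimates in [0, 1]
    snr_db                  : estimated acoustic signal-to-noise ratio (dB)

    In quiet conditions (high SNR) the audio estimate is trusted; in very
    noisy conditions (low SNR) the visual estimate dominates. This is a
    purely illustrative sketch of audio-visual cue fusion, not the
    project's actual algorithm.
    """
    # Map SNR onto a 0..1 weight for the audio branch.
    w_audio = np.clip((snr_db - low_snr) / (high_snr - low_snr), 0.0, 1.0)
    return w_audio * np.asarray(audio_gain) + (1.0 - w_audio) * np.asarray(visual_gain)

# Example: in a noisy scene (2 dB SNR) the fused gains lean on the visual cues.
audio_gain = np.array([0.9, 0.7, 0.4, 0.2])   # hypothetical per-band gains
visual_gain = np.array([0.6, 0.8, 0.7, 0.5])
print(fuse_av_gains(audio_gain, visual_gain, snr_db=2.0))
```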

Hussain is also collaborating with Jon Barker, PhD, at the University of Sheffield, who has developed biologically inspired approaches for separating speech sources that will complement the audio-visual enhancement techniques being developed at Stirling. Other project partners include the MRC/CSO Institute of Hearing Research (Scottish Section) and hearing aid manufacturer Phonak.

Link to complete article:

http://www.hearingreview.com/2015/06/researchers-develop-hearing-aid-can-see-visual-cues/


Content provided by HearingReview