Google is continuing its work in the disability space by revealing a new tool aimed at helping individuals with speech impairments communicate. Called Project Relate, the app was designed to help folks communicate more easily through text and a synthetic vocal assistant. 

Users are able to tap into the technology to get a real-time transcription of their speech. This can then be used to text or show others what the user would like to say. It also has a repeat function, which allows a user to speak into the app and have a synthetic voice repeat their statement. The tool is able to connect to a user's Google Home, allowing users to give the device specific commands, such as turning off the lights.

Google is now soliciting users in Australia, Canada, New Zealand and the United States to test the app and give feedback. 

“Project Relate is a continuation of years of research from both Google’s Speech and Research teams, made possible by over a million speech samples recorded by participants of our research effort,” Julie Cattiau, product manager in Google AI, said in a blog post introducing the technology.

WHY IT MATTERS 

Roughly 7.5 million people in America have difficulties using their voice, according to the National Institute on Deafness and Other Communication Disorders. The institute cites spasmodic dysphonia as a common voice disorder, caused by involuntary spasms in the muscles of the voice box, or larynx.

Google is pitching this technology as a way for individuals whose speech has been impacted by stroke, ALS, cerebral palsy or traumatic brain injury to communicate more easily.

THE LARGER TREND 

This isn’t Google’s first move into the accessibility space. In September, the company announced a new Android feature, called Camera Switches, that lets individuals with speech and motor impairments navigate their smartphones through self-selected facial gestures and eye movements.

Apple is also looking to boost accessibility. Earlier this year it announced a number of new features for people with mobility, vision, hearing and cognitive disabilities. One example is Apple Watch’s AssistiveTouch, which uses the device's built-in gyroscope and accelerometer to detect subtle muscle movements and let individuals navigate via gestures.
