
Eyeglasses transformed to allow those without speech capability to communicate

COLUMBIA, Mo. – They look like a pair of everyday eyeglasses, but University of Missouri students have turned them into a device that allows people without speech capability to communicate.

“For people with amyotrophic lateral sclerosis (Lou Gehrig’s disease), traumatic brain injury, stroke or spinal cord injury, speech can be impossible, leaving them essentially trapped in their own bodies,” said Nathan Granneman, a recent MU graduate from Milan, Mo.

The eyeglasses are equipped with tiny sensor switches on flexible wires that attach to the user’s cheek and chin muscles. Activated by the slightest facial twitch, the switches relay signals to an on-screen computer keyboard, allowing the user to type a message that a synthetic voice then speaks aloud.
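The twitch-to-text step described above resembles single-switch "scanning" input, a common technique in assistive communication. Below is a minimal sketch of that idea, not the students' actual software; the class name, character set, and scan order are all assumptions for illustration. A highlight cycles through the characters, and a switch closure (a detected twitch) selects the currently highlighted one.

```python
# Hypothetical sketch of single-switch scanning input; not the MU team's code.
CHARS = "abcdefghijklmnopqrstuvwxyz "  # assumed character set

class ScanningKeyboard:
    def __init__(self, chars=CHARS):
        self.chars = chars
        self.pos = 0    # currently highlighted character
        self.text = []  # message typed so far

    def tick(self, switch_closed):
        """Advance one scan step; a twitch selects the highlighted character."""
        if switch_closed:
            self.text.append(self.chars[self.pos])
            self.pos = 0  # restart the scan after each selection
        else:
            self.pos = (self.pos + 1) % len(self.chars)

    def message(self):
        return "".join(self.text)

def type_word(word, kb=None):
    """Simulate the switch events needed to type `word`."""
    kb = kb or ScanningKeyboard()
    for ch in word:
        target = kb.chars.index(ch)
        while kb.pos != target:
            kb.tick(False)   # wait for the highlight to reach the character
        kb.tick(True)        # facial twitch: select it
    return kb.message()
```

With this scheme, a user needs only one reliable muscle movement to spell out any message, which is why a single well-placed sensor can suffice.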

While there are many assistive speech communication devices on the market, the technology has limitations, and no single device suits all users. The eyeglass design allows customization: the flexible wires extend down from the temples to whichever facial muscles the user can best control.

The design project was a semester-long assignment in a senior capstone course, which requires students to apply what they have learned in classrooms and laboratories to a real-life situation, said John Viator, assistant professor of biological engineering.

The device had to be aesthetically pleasing, affordable, easy to use and capable of carrying multiple sensors, Viator said.

In testing, the students found the device comfortable for a client to wear over an extended period and 96 percent accurate.

The production cost of the device is estimated at about $130. A suggested retail price of $260 would keep it competitive with similar products. Neither figure includes the price of the computer.

While the current model of the device allows for chin and cheek sensors, future versions could incorporate eyebrow twitches, eye gaze sensors and sip and puff switches.

Other students involved in the project were Emily Longwith of Columbia, Caleb Rich of Keaney, Mo., and Andrea Erler of Wildwood, Mo.