Author: James Vincent
It seems like voice interfaces are going to be a big part of the future of computing, popping up in phones, smart speakers, and even household appliances. But how useful is this technology for people who don’t communicate using speech? Are we creating a system that locks out certain users?
These were the questions that inspired software developer Abhishek Singh to create a mod that lets Amazon’s Alexa assistant understand some simple sign language commands. In a video, Singh demonstrates how the system works. An Amazon Echo is connected to a laptop, with a webcam (and some back-end machine learning software) decoding Singh’s gestures into text and speech.
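The article doesn’t detail Singh’s implementation, but the general pipeline it describes is easy to picture: a webcam feeds frames to a gesture classifier, the recognized words are assembled into text, and a text-to-speech step speaks the command aloud so the Echo can hear it. Here is a minimal, purely illustrative sketch of that loop in Python; the classify_gesture() model and the "end" gesture are hypothetical stand-ins, not part of Singh’s mod.

```python
# Illustrative sketch only: webcam frame -> gesture label -> spoken text
# that a nearby Echo could hear. Not Singh's actual implementation.

import cv2          # pip install opencv-python
import pyttsx3      # pip install pyttsx3 (offline text-to-speech)


def classify_gesture(frame):
    """Hypothetical placeholder for a trained sign-language classifier.

    A real system would run the frame through a machine learning model
    and return a recognized word, or None if no gesture is detected.
    """
    return None  # no model is bundled with this sketch


def main():
    camera = cv2.VideoCapture(0)   # default webcam
    tts = pyttsx3.init()           # text-to-speech engine
    sentence = []

    try:
        while True:
            ok, frame = camera.read()
            if not ok:
                break

            word = classify_gesture(frame)
            if word == "end":                 # hypothetical "end of command" gesture
                command = " ".join(sentence)
                print("Alexa,", command)      # show the decoded text
                tts.say("Alexa, " + command)  # speak it aloud for the Echo
                tts.runAndWait()
                sentence.clear()
            elif word:
                sentence.append(word)
    finally:
        camera.release()


if __name__ == "__main__":
    main()
```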
“Seamless design needs to be inclusive in nature.”
Speaking to The Verge, Singh...