Author: Dami Lee
In a developer blog post published today, Alexa AI director of applied science Ruhi Sarikaya detailed the advances in machine learning technologies that have allowed Alexa to better understand users through contextual clues. According to Sarikaya, these improvements have played a role in reducing user friction and making Alexa more conversational.
Since this fall, Amazon has been working on self-learning techniques that teach Alexa to automatically recover from its own errors. The system, previously in beta, launched in the US this week. It doesn’t require any human annotation, and, according to Sarikaya, it uses customers’ “implicit or explicit contextual signals to detect unsatisfactory interactions or failures of...