Edition 69, December 2021

Machine learning to make sign language more accessible


Inclusive Ideas

Welcome to SignTown! An interactive experience where you can learn sign language with a little help from AI. Like how to order at a restaurant ('milk tea?'). Or how to check into a hotel and request shampoo or soap.


How does it work? All it takes is a webcam and machine learning to detect your body poses, facial expressions, and hand movements. 
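The post doesn't name the underlying library, but Google's open-source MediaPipe Holistic model performs exactly this kind of combined body, face, and hand tracking from a single camera feed. As a minimal sketch of the idea, assuming MediaPipe's Python API and OpenCV for webcam capture (this is an illustration, not SignTown's actual code):

    import cv2
    import mediapipe as mp

    mp_holistic = mp.solutions.holistic
    mp_drawing = mp.solutions.drawing_utils

    cap = cv2.VideoCapture(0)  # default webcam
    with mp_holistic.Holistic(min_detection_confidence=0.5,
                              min_tracking_confidence=0.5) as holistic:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR.
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            # Draw the three streams the article mentions: body pose,
            # facial landmarks, and both hands. draw_landmarks() is a
            # no-op when a stream wasn't detected in this frame.
            mp_drawing.draw_landmarks(frame, results.pose_landmarks,
                                      mp_holistic.POSE_CONNECTIONS)
            mp_drawing.draw_landmarks(frame, results.face_landmarks,
                                      mp_holistic.FACEMESH_CONTOURS)
            mp_drawing.draw_landmarks(frame, results.left_hand_landmarks,
                                      mp_holistic.HAND_CONNECTIONS)
            mp_drawing.draw_landmarks(frame, results.right_hand_landmarks,
                                      mp_holistic.HAND_CONNECTIONS)
            cv2.imshow('Holistic tracking', frame)
            if cv2.waitKey(5) & 0xFF == 27:  # Esc to quit
                break
    cap.release()
    cv2.destroyAllWindows()

Each frame yields per-landmark coordinates for the pose, face, and hands, and it is those landmark streams, rather than raw pixels, that a sign recognizer can then consume.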

Google has spent over twenty years helping to make information accessible and useful in more than 150 languages. And our work is definitely not done, because the internet changes so quickly: about 15% of the searches we see every day are entirely new. And when it comes to other types of information beyond words, in many ways technology hasn't even begun to scratch the surface of what's possible. Take one example: sign language.

The task is daunting. There are as many sign languages as there are spoken languages around the world. That’s why, when we began exploring how we could better support sign language, we started small by researching and experimenting with what machine learning models could recognize. 

After testing with a database of videos for Japanese Sign Language and Hong Kong Sign Language, we launched SignTown: an interactive desktop application that works with a web browser and camera.

SignTown is an interactive web game built to help people learn about sign language and Deaf culture. It uses machine learning to detect how well the user performs the signs taught in the game.
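The article doesn't describe SignTown's recognition model, but one simple way to score an attempted sign against a reference recording is to compare their landmark sequences with dynamic time warping (DTW), which tolerates signers moving at different speeds. A hypothetical sketch using NumPy (the feature names, sizes, and DTW approach are illustrative assumptions, not SignTown's production method, which would use a trained classifier):

    import numpy as np

    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        """DTW distance between two (frames, features) landmark sequences."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # skip a frame of a
                                     cost[i, j - 1],      # skip a frame of b
                                     cost[i - 1, j - 1])  # match the frames
        return cost[n, m] / (n + m)  # normalize by path length

    def classify(attempt, references):
        """Return the reference sign closest to the player's attempt."""
        scores = {name: dtw_distance(attempt, ref)
                  for name, ref in references.items()}
        return min(scores, key=scores.get), scores

    # Toy usage with random stand-ins for landmark sequences; real input
    # would be per-frame hand/pose landmark vectors from the tracker
    # (e.g. 21 hand landmarks x 3 coordinates = 63 features per frame).
    rng = np.random.default_rng(0)
    refs = {'milk tea': rng.normal(size=(40, 63)),
            'shampoo': rng.normal(size=(35, 63))}
    attempt = refs['milk tea'] + rng.normal(scale=0.1, size=(40, 63))
    best, scores = classify(attempt, refs)
    print(best, scores)

A lower DTW score means the attempt's motion more closely matches the reference, which is the kind of signal a game can turn into "try again" or "well done" feedback.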


Originally featured at https://blog.google/outreach-initiatives/accessibility/ml-making-sign-language-more-accessible/