M.S. AAI Capstone Chronicles 2024
A.S.LINGUIST
Introduction
This project aims to develop an application that can interpret live sign language
questions and return both textual and sign language answers on general topics, thereby offering
communication support to deaf and mute individuals.
It has been estimated that about 72 million people worldwide rely on sign language,
which is the most common means of communication among people with hearing and/or speech
impairments (Vaidhya and Deiva Preetha, 2022). However, there are more than 300 different
sign languages in the world (Sign Language, n.d.), and each is known and used by only a small
group of people. This can seriously limit the possibilities and opportunities of people with this
kind of disability, who often need to be accompanied by family members or educators who can
understand their needs and relay them to the outside world. Thus, the ultimate goal of our
project is to develop a technology that responds to the communication needs of deaf and mute
people who know American Sign Language (ASL), the dominant sign language of deaf
communities in North America (American Sign Language, 2021). The proposed application
could benefit ASL users both in communicating with people who do not know their language
and in holding simple dialogues with a chatbot.
For this project, we developed two different machine learning models: a convolutional
neural network (CNN) serving as a sign language interpreter, and a chatbot model to carry on
textual conversations. Accordingly, we adopted two different datasets. The first is an ASL
alphabet dataset from Kaggle (ASL Alphabet, 2021), used to train the CNN model. The second
dataset, also from Kaggle, contains conversations for building a chatbot (3K
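To make the first component concrete, the sketch below shows what a small CNN classifier of the kind described could look like in Keras. The framework choice, input resolution, layer sizes, and class count (29, matching the 26 letters plus the "space", "delete", and "nothing" labels commonly found in the Kaggle ASL Alphabet dataset) are illustrative assumptions, not the project's actual configuration.

```python
# Minimal sketch of a CNN sign-language classifier (assumed setup:
# images resized to 64x64 RGB, 29 output classes, one-hot labels).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 29      # assumption: 26 letters + space, delete, nothing
IMG_SIZE = (64, 64)   # assumption: images resized before training

model = models.Sequential([
    layers.Input(shape=(*IMG_SIZE, 3)),
    # Three convolution/pooling stages extract increasingly abstract
    # hand-shape features from the input image.
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    # Dense head maps the features to one of the 29 sign classes.
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In a pipeline like the one described, each frame captured from the camera would be preprocessed to the same input size and passed to the classifier, whose predicted letters are then assembled into the text that feeds the chatbot component.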