Sign Language Translation Using AI & ML
Ankita Gandhi, Jaykumar Rameshbhai Makwana*, Viraj Rajendra Bhalodiya, Sharad Kalpeshkumar Patel and Priyank Kiritbhai Upadhyay
Abstract
Sign languages are central to communication within the deaf community, yet limited awareness and knowledge of these languages restrict communication, interaction, and, ultimately, social integration. In this paper, we describe an AI-centric system for real-time sign language translation that supports English, Indian, Turkish, and Arabic signs and alphabets. The system recognizes and responds to hand gestures for alphabets and numbers in these languages, laying the groundwork for full-fledged communication. It leverages advanced machine learning algorithms while remaining portable, making it accessible for everyday use and immediately deployable. Key features include a user-friendly interface, integration with most e-learning platforms to offer sign language lessons, and continuous improvement through adaptive learning driven by user feedback. Collaboration with the deaf community, language experts, and teachers is part of the effort to fine-tune the system and increase its practical impact. The project demonstrates the potential of ML and AI technologies to break down communication barriers for the deaf, advancing full accessibility, inclusivity, and active participation in social, educational, and professional environments.

