Unlocking Sign Language Translation: Computer Science Research Team Explores the Power of AI

TL;DR:

  • Computer science professor Andrea Salgian and Ben Guerrieri are working on a project to develop a Google Translate-like tool for American Sign Language (ASL).
  • They aim to use computer vision and machine learning to create a program that allows ASL users to sign to a camera and receive direct translations.
  • The initial focus is on recognizing static gestures, such as letters in the ASL alphabet, with no hand movement.
  • The program will act as a dictionary at first, but the goal is to develop automated translation capabilities.
  • MediaPipe, a machine-learning framework developed by Google, underpins the research.
  • Ben Guerrieri ’26, a computer science major, is actively involved in developing translator algorithms for the project.
  • The research also extends to visual gesture recognition in other areas like musical conducting and exercising.
  • The project aims to enhance accessibility and enable communication with people who do not know ASL.
  • The collaboration demonstrates the potential of AI in bridging communication gaps and promoting inclusivity.

Main AI News:

Computer science professor Andrea Salgian and Ben Guerrieri are collaborating on an ambitious project aimed at expanding the capabilities of translation services. While platforms like Google Translate have revolutionized communication across spoken and written languages, Salgian and Guerrieri are focusing on bridging the gap for users of American Sign Language (ASL). By harnessing computer vision and machine learning, they aim to develop a program that serves as a Google Translate equivalent for ASL, allowing users to sign directly to a camera and receive instant translations.

Initially, the program will focus on recognizing static gestures, such as those representing letters in the ASL alphabet that do not involve hand movement. Salgian describes this stage as akin to building a dictionary, where users can look up individual signs. However, the ultimate goal is to create an automated translation system that can interpret and translate complete phrases and sentences.

Salgian’s research relies heavily on MediaPipe, a machine-learning framework developed by Google. This tool uses computer vision techniques to detect the positions of a person’s joints in real time from a camera feed. By tracking the user’s hand landmarks and extracting the corresponding gestures, the program can match them against ASL signs and produce a translation.
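MediaPipe’s hand tracker reports a fixed set of (x, y, z) hand landmarks per frame; one straightforward way to turn those landmarks into a "dictionary lookup" for static signs is nearest-neighbor matching against stored templates. The sketch below illustrates that idea only — it is not the team’s actual algorithm, and the landmark coordinates and two-letter sign dictionary are entirely made up for demonstration (real MediaPipe output has 21 landmarks per hand; three are used here for brevity).

```python
import math

def normalize(landmarks):
    """Translate so the first landmark (the wrist in MediaPipe's ordering)
    is the origin, then scale by the farthest point, so matching tolerates
    differences in hand position and size."""
    ox, oy, oz = landmarks[0]
    shifted = [(x - ox, y - oy, z - oz) for x, y, z in landmarks]
    scale = max(math.sqrt(x * x + y * y + z * z) for x, y, z in shifted) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in shifted]

def pose_distance(a, b):
    """Sum of Euclidean distances between corresponding landmarks."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def classify(landmarks, dictionary):
    """Return the sign whose stored template is closest to the observed pose."""
    pose = normalize(landmarks)
    return min(dictionary,
               key=lambda sign: pose_distance(pose, normalize(dictionary[sign])))

# Hypothetical templates for two static letters (3 landmarks each).
SIGNS = {
    "A": [(0.0, 0.0, 0.0), (0.10, -0.20, 0.0), (0.15, -0.25, 0.0)],
    "B": [(0.0, 0.0, 0.0), (0.05, -0.50, 0.0), (0.05, -0.70, 0.0)],
}

# A hypothetical observed hand pose, offset and slightly perturbed from "A".
observed = [(0.30, 0.40, 0.0), (0.41, 0.21, 0.0), (0.46, 0.16, 0.0)]
print(classify(observed, SIGNS))  # prints "A"
```

A real system would use all 21 landmarks and likely a trained classifier rather than raw distances, but the structure — detect landmarks, normalize, match to a dictionary of static signs — mirrors the dictionary-first approach Salgian describes.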

Ben Guerrieri, a computer science major, joined Salgian’s project early on during his time at TCNJ (The College of New Jersey). Guerrieri is actively involved in the research, focusing on the development of translator algorithms. He finds the project both intellectually stimulating and personally rewarding, as he gets to witness the real-time results of their incremental algorithmic advancements.

Salgian’s interest in visual gesture recognition extends beyond ASL translation. Her ongoing research also explores applications in fields such as musical conducting and exercising. However, she finds ASL particularly compelling because of its potential for enhancing accessibility: enabling communication with people who do not know ASL but want to understand it would be an invaluable achievement.

Salgian and Guerrieri’s collaborative effort to create a Google Translate-like tool for American Sign Language holds tremendous promise. By leveraging computer vision and machine learning, they aim to empower ASL users and facilitate cross-linguistic understanding. The project not only showcases the potential of AI in bridging communication gaps but also highlights the profound impact technology can have on accessibility and inclusivity.

Conclusion:

The development of a Google Translate-like tool for American Sign Language (ASL) has significant implications for the market. By leveraging computer vision and machine learning technologies, this innovative solution addresses a critical communication gap for ASL users.

The market potential for such a tool is substantial, as it caters to the needs of millions of individuals who rely on ASL as their primary means of communication. This advancement not only enhances accessibility and inclusivity but also opens up new opportunities for businesses to engage with ASL speakers in various sectors, such as customer service, education, and healthcare.

Furthermore, the success of this project highlights the transformative power of artificial intelligence in breaking down language barriers and fostering cross-linguistic understanding. As the market continues to embrace technological advancements, organizations that recognize and leverage the potential of ASL translation tools will be well-positioned to meet the needs of a diverse customer base and gain a competitive edge in the evolving landscape of communication services.
