
New method improves AI translation of sign language

Adding data such as hand and facial expressions, as well as skeletal information on the position of the hands relative to the body, to the information on the general movements of the signer’s upper body improves word recognition. Credit: Osaka Metropolitan University

Nations around the world have developed sign languages to fit local communication styles, and each language consists of thousands of signs. This has made sign languages difficult to learn and understand.

Using artificial intelligence to automatically translate the signs into words, known as word-level sign language recognition, has now gained a boost in accuracy through the work of an Osaka Metropolitan University-led research group. The findings were published in IEEE Access.

Previous methods have focused on capturing information about the signer's general movements. Accuracy has suffered because subtle differences in hand shape, and in the position of the hands relative to the body, can change a sign's meaning.

Graduate School of Informatics Associate Professor Katsufumi Inoue and Associate Professor Masakazu Iwamura worked with colleagues at the Indian Institute of Technology Roorkee to improve AI recognition accuracy. They added data such as hand and facial expressions, as well as skeletal information on the position of the hands relative to the body, to the information on the general movements of the signer's upper body.
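The general idea of combining several input streams can be illustrated with a toy sketch. The code below is not the authors' model; it is a minimal, hypothetical late-fusion classifier in which a global upper-body motion stream, a local hand/face stream, and a skeletal stream are each projected into a shared space, concatenated, and passed to a softmax over a made-up vocabulary of 100 signs. All dimensions and weights here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature sizes for each stream (illustrative assumptions).
D_GLOBAL, D_LOCAL, D_SKEL, N_WORDS = 64, 32, 16, 100

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

class MultiStreamClassifier:
    """Toy late-fusion classifier: one random linear projection per
    stream, concatenated features, then a softmax over the vocabulary."""

    def __init__(self):
        self.w_global = rng.standard_normal((D_GLOBAL, 32)) * 0.1
        self.w_local = rng.standard_normal((D_LOCAL, 32)) * 0.1
        self.w_skel = rng.standard_normal((D_SKEL, 32)) * 0.1
        self.w_out = rng.standard_normal((32 * 3, N_WORDS)) * 0.1

    def predict(self, global_feat, local_feat, skel_feat):
        # Fuse the three streams by concatenating their projections.
        fused = np.concatenate([
            global_feat @ self.w_global,  # upper-body motion stream
            local_feat @ self.w_local,    # hand/face appearance stream
            skel_feat @ self.w_skel,      # skeletal (hand-body) stream
        ])
        return softmax(fused @ self.w_out)

model = MultiStreamClassifier()
probs = model.predict(rng.standard_normal(D_GLOBAL),
                      rng.standard_normal(D_LOCAL),
                      rng.standard_normal(D_SKEL))
print(probs.shape)  # one probability per word in the vocabulary
```

In a real system each stream would come from a learned feature extractor (e.g. a video network for appearance and a pose estimator for the skeleton) and the weights would be trained, but the fusion step itself is this simple: extra streams add complementary evidence before classification.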

“We were able to improve the accuracy of word-level sign language recognition by 10–15% compared to conventional methods,” Professor Inoue said. “In addition, we expect that the method we have proposed can be applied to any sign language, hopefully leading to improved communication with speaking- and hearing-impaired people in various countries.”

More information:
Mizuki Maruyama et al, Word-Level Sign Language Recognition With Multi-Stream Neural Networks Focusing on Local Regions and Skeletal Information, IEEE Access (2024). DOI: 10.1109/ACCESS.2024.3494878

Provided by
Osaka Metropolitan University


Citation:
Reading signs: New method improves AI translation of sign language (2025, January 15)
retrieved 15 January 2025
from

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

