Static-gesture word recognition in Bangla sign language using convolutional neural network

Kulsum Ara Lipi, Sumaita Faria Karim Adrita, Zannatul Ferdous Tunny, Abir Hasan Munna, Ahmedul Kabir

Abstract


Sign language is the primary means of communication for people with hearing impairments. Bangla sign language (BSL) is the standard for hearing-impaired communication in Bangladesh and parts of India. Although Bangla is one of the most widely spoken languages in the world, research on BSL recognition remains scarce, and the few studies conducted so far have focused on detecting BSL alphabets. To the best of our knowledge, no work on detecting BSL words has been conducted to date, owing to the unavailability of a BSL word dataset. In this research, a small static-gesture word dataset has been developed, and a deep learning-based method is introduced that can detect BSL static-gesture words from images. The dataset, “BSLword”, contains 30 static-gesture BSL words with 1,200 images for training. Training is performed using a multi-layered convolutional neural network with the Adam optimizer; OpenCV is used for image processing and TensorFlow is used to build the deep learning models. The system recognizes BSL static-gesture words with 92.50% accuracy on the word dataset.
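The abstract names the main ingredients of the method: a multi-layered CNN classifier over 30 static-gesture word classes, trained with the Adam optimizer in TensorFlow. The sketch below illustrates that kind of architecture; the layer counts, filter sizes, and 64x64 input resolution are assumptions for illustration, not the authors' exact network.

```python
# Hypothetical sketch of a multi-layered CNN for 30-class static-gesture
# word classification, as described in the abstract. Layer sizes and the
# input resolution are assumed, not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 30   # 30 static-gesture BSL words in the "BSLword" dataset
IMG_SIZE = 64      # assumed input image resolution

def build_model():
    model = models.Sequential([
        layers.Input(shape=(IMG_SIZE, IMG_SIZE, 3)),
        # Stacked convolution + pooling blocks extract gesture features
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        # Dense head maps features to the 30 word classes
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    # The abstract specifies the Adam optimizer for training.
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
```

A model like this would be fit on the 1,200 training images (e.g. with `model.fit`), with OpenCV handling image loading and resizing beforehand.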

Keywords


BSL; BSL word dataset; convolutional neural network; static-gesture signs


DOI: http://doi.org/10.12928/telkomnika.v20i5.24096


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

TELKOMNIKA Telecommunication, Computing, Electronics and Control
ISSN: 1693-6930, e-ISSN: 2302-9293
Universitas Ahmad Dahlan, 4th Campus
Jl. Ringroad Selatan, Kragilan, Tamanan, Banguntapan, Bantul, Yogyakarta, Indonesia 55191
Phone: +62 (274) 563515, 511830, 379418, 371120
Fax: +62 274 564604