Real-time Indian sign language recognition using transfer learning with VGG16

Sumit Kumar, Ruchi Rani, Sanjeev Kumar Pippal, Ulka Chaudhari

Abstract


Communication is considerably harder for people with hearing and speech disabilities than for those without, so sign language plays a crucial role in bridging this gap. Previous attempts to solve this problem with deep learning techniques, including convolutional neural networks (CNNs), support vector machines (SVMs), and K-nearest neighbours (KNN), either achieve low accuracy or cannot be employed in real time. This work addresses both limitations while extending the task to classifying characters of Indian sign language (ISL). Our system recognizes 23 ISL hand gestures through a purely camera-based approach, eliminating expensive hardware such as sensor gloves and keeping the solution economical. Using a pre-trained VGG16 CNN optimized with the Adam optimizer and a cross-entropy loss function, the system achieves 97.5% accuracy on the training dataset. These results demonstrate the effectiveness of transfer learning for ISL classification and its potential for real-world applications.
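
As a rough illustration of the approach described in the abstract, the sketch below shows how a pre-trained VGG16 backbone can be adapted for 23-class gesture classification with the Adam optimizer and a cross-entropy loss. It assumes a TensorFlow/Keras setup; the input resolution, classification-head size, and learning rate are illustrative assumptions, not the authors' exact configuration.

# Minimal sketch of VGG16-based transfer learning for 23-class ISL gesture
# recognition, assuming a TensorFlow/Keras environment. Input size, dense-layer
# width, dropout rate, and learning rate are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 23            # ISL hand gestures classified in the paper
IMG_SHAPE = (224, 224, 3)   # standard VGG16 input resolution (assumed)

# Load VGG16 pre-trained on ImageNet and freeze its convolutional base,
# so only the new classification head is trained.
base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
base.trainable = False

# Small classification head on top of the frozen features.
model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),            # assumed head size
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one output per gesture
])

# Adam optimizer and cross-entropy loss, as stated in the abstract.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)

model.summary()

For real-time use, camera frames would be resized to the assumed 224x224 input, normalized, and passed to model.predict on a per-frame basis; the exact preprocessing pipeline is described in the full paper.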

Keywords


convolutional neural networks; pre-trained models; real-time; sign language; VGG16 model



DOI: http://doi.org/10.12928/telkomnika.v22i6.26498



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

TELKOMNIKA Telecommunication, Computing, Electronics and Control
ISSN: 1693-6930, e-ISSN: 2302-9293
Universitas Ahmad Dahlan, 4th Campus