Alvaro A. Teran-Quezada, Victor Lopez-Cabrera, Jose Carlos Rangel and Javier E. Sanchez-Galan
Convolutional neural networks (CNN) have provided great advances for the task of sign language recognition (SLR). However, recurrent neural networks (RNN) in the form of long short-term memory (LSTM) have become a means for providing solutions to problem...
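The CNN-plus-LSTM pairing described above is commonly realized by encoding each video frame with a CNN and feeding the resulting feature sequence to an LSTM. The following is a minimal sketch of that generic pattern, not the authors' actual architecture; the layer sizes, class count, and the `CnnLstmSLR` name are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class CnnLstmSLR(nn.Module):
    """Illustrative CNN+LSTM pipeline for sign clips: a small CNN
    encodes each frame, an LSTM aggregates features over time."""

    def __init__(self, num_classes=10, feat_dim=64, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)       # hidden state per frame
        return self.head(out[:, -1])    # classify from the last state

model = CnnLstmSLR()
logits = model(torch.randn(2, 8, 3, 64, 64))  # 2 clips of 8 frames
print(logits.shape)  # torch.Size([2, 10])
```

The key design point is that the CNN runs independently on every frame (time folded into the batch dimension), while the LSTM alone carries temporal context.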
Diksha Kumari and Radhey Shyam Anand
The deaf and hearing-impaired community expresses their emotions, communicates with society, and enhances the interaction between humans and computers using sign language gestures. This work presents a strategy for efficient feature extraction that uses ...
Amrutha K, Prabu P and Ramesh Chandra Poonia
Sign language is a natural, structured, and complete form of communication to exchange information. Non-verbal communicators, also referred to as hearing impaired and hard of hearing (HI&HH), consider sign language an elemental mode of communication ...
Ayanabha Jana and Shridevi S. Krishnakumar
The proposed research deals with constructing a sign gesture recognition system to enable improved interaction between sign and non-sign users. With respect to this goal, five types of features are utilized: hand coordinates, convolutional features, convo...
Angela C. Caliwag, Han-Jeong Hwang, Sang-Ho Kim and Wansu Lim
Sign language aids in overcoming the communication barrier between hearing-impaired individuals and those with normal hearing. However, not all individuals with normal hearing are skilled at using sign language. Consequently, deaf and hearing-impaired in...
Rung-Ching Chen, William Eric Manongga and Christine Dewi
Hand gestures and poses allow us to perform non-verbal communication. Sign language is becoming more important with the increase in the number of deaf and hard-of-hearing communities. However, learning to understand sign language is very difficult and al...
Tomasz Kapuscinski and Dawid Warchol
In this paper, a method for the recognition of static hand postures based on skeletal data was presented. A novel descriptor was proposed. It encodes information about distances between particular hand points. Five different classifiers were tested, incl...
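A descriptor built from distances between hand points, as described above, can be sketched as the vector of pairwise Euclidean distances between skeletal landmarks. This is a generic illustration of that idea, not the paper's exact descriptor; the 21-keypoint layout and the max-distance normalization are assumptions.

```python
import numpy as np

def distance_descriptor(landmarks):
    """Encode a static hand posture as the vector of pairwise
    Euclidean distances between its skeletal points.

    landmarks: (N, 3) array of 3-D hand joint coordinates,
    e.g. the 21 keypoints produced by a typical hand tracker.
    Returns a scale-normalized vector of N*(N-1)/2 distances.
    """
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)          # (N, N) matrix
    iu = np.triu_indices(len(landmarks), k=1)       # upper triangle only
    vec = dists[iu]
    return vec / vec.max()  # divide by the largest distance

desc = distance_descriptor(np.random.rand(21, 3))
print(desc.shape)  # (210,)
```

Because only relative distances are kept and the vector is rescaled, the descriptor is invariant to hand translation, rotation, and overall scale, which makes it a convenient input for standard classifiers.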
Akm Ashiquzzaman, Hyunmin Lee, Kwangki Kim, Hye-Young Kim, Jaehyung Park and Jinsul Kim
Current deep learning convolutional neural network (DCNN)-based hand gesture detectors with acute precision demand incredibly high-performance computing power. Although DCNN-based detectors are capable of accurate classification, the sheer computing pow...
Shwetha V, Dr. Vijaya Laxmi, Dhanin Anoop Asarpota and Himanshu Verma
A sizable population around the world has some form of hearing or speaking disability. This creates a communication barrier among them and the rest of the world. Sign language was introduced to bridge this gap. The objective is to design a glove that can...
Qifan Xue, Xuanpeng Li, Dong Wang and Weigong Zhang
Sign language recognition (SLR) is a bridge linking the hearing impaired and the general public. Some SLR methods using wearable data gloves are not portable enough to provide daily sign language translation service, while visual SLR is more flexible to ...