
Storytelling App for Children with Hearing Impairment using Natural Language Processing (NLP)


Palavi Bhole
Smt. Indira Gandhi College of Engineering
Navi Mumbai, India
palavibhole@gmail.com

Samiksha Kale
Smt. Indira Gandhi College of Engineering
Navi Mumbai, India
kalesamiksha61@gmail.com

Ajaykumar Gujja
Smt. Indira Gandhi College of Engineering
Navi Mumbai, India
ajaygujja5@gmail.com

Prof. Deepti Chandran
Smt. Indira Gandhi College of Engineering
Navi Mumbai, India
dvcnr@yahoo.co.in
Abstract—When we were kids, our teachers, parents, grandparents, etc. used to read us a lot of fantastic stories. They did that for the time when we couldn't read. Unfortunately, not all of us are blessed with the ability to hear. Children with hearing impairments might not have had a chance to know such stories, at least in their childhood. This project is based on an app which will narrate children's stories to hearing-impaired children by taking stories in the form of text as input and giving images of sign language gestures and speech as output. In this paper we present a platform which translates written English into Indian Sign Language.

Keywords—Hearing impairment, Natural language processing, Application, Storytelling

I. INTRODUCTION
Several computational works dealing with the translation of sign languages from and into their spoken counterparts have been developed in recent years. For instance, (Barberis et al., 2011) describes a study targeting the Italian Sign Language, (Lima et al., 2012) targets LIBRAS, the Brazilian Sign Language, and (Zafrulla et al., 2011) the American Sign Language. Some of the current research focuses on sign language recognition (as the latter), some on translating text (or speech) into a sign language (like the previously mentioned work dedicated to Italian). Some works aim at recognizing words (again, like the latter), others only letters (such as the work about LIBRAS). Only a few systems perform the two-sided translation, which is the case of the platform implemented by the Microsoft Asia group (Chai et al., 2013) and the Virtual Sign Translator (Escudeiro et al., 2013). Unfortunately, sign languages are neither universal nor a mere mimic of their countries' spoken counterparts. For instance, Brazilian Sign Language is not related to the Portuguese one. Therefore, few or no resources can be reused when one moves from one (sign) language to another. The aforementioned Virtual Sign Translator targets LGP, as do the works described in (Bento, 2013) and (Gameiro et al., 2014). However, to the best of our knowledge, none of these works explored how current Natural Language Processing (NLP) tasks can be applied to help the translation process of written English into ISL, which is one of the focuses of this paper [1].
II. ANALYSIS

Our study is based on LGP videos from different sources, such as the Spread the Sign initiative, and images of hand configurations presented in an ISL dictionary.

A. Objectives
1. To build an app which will narrate children's stories to hearing-impaired children by taking stories in the form of text as input and giving images of sign language gestures as output.
2. To convert text into speech.
3. To convert speech into images of sign gestures.
4. To use NLP for converting text to sign language.

B. Scope
1. The Android application will be based on Java or Kotlin.
2. The concept of NLP is used.
3. The app will narrate children's stories to hearing-impaired children by taking stories in the form of text as input and giving images of sign language gestures as output.
4. The target audience is not limited to children with hearing impairment.

C. Challenges
1. The speech and the images of gestures should be in sync.
2. The system should be able to handle a large database.
3. The system should be able to translate words that have no gesture letter-by-letter.

III. DESIGN

A. Flowchart

B. Algorithm
Dependency parsing is the task of extracting a dependency parse of a sentence that represents its grammatical structure and defines the relationships between "head" words and the words which modify those heads.

Example: (the example parse figure is not reproduced here)

Relations among the words are illustrated above the sentence with directed, labelled arcs from heads to dependents (+ indicates the dependent) [2].
Given an n-word sentence:
1. for i := 1 to n do
2. begin
3.   for j := i − 1 down to 1 do
4.   begin
5.     if no word has been linked as head of word i, then
6.       if the grammar permits, link word j as head of word i;
7.     if word j is not a dependent of some other word, then
8.       if the grammar permits, link word j as dependent of word i
9.   end
10. end [3]
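The numbered steps above are Covington's exhaustive left-to-right dependency-parsing algorithm [3]. A minimal sketch follows, written in Python for brevity (the app itself is planned in Java or Kotlin); the `toy_permits` grammar check and its sample sentence are purely illustrative stand-ins for real grammar rules, not part of the paper's implementation.

```python
def covington_parse(words, permits):
    """Exhaustive left-to-right dependency parsing (Covington, 2001).

    words   -- list of tokens w1..wn
    permits -- permits(head, dep) -> bool, a stand-in grammar check
    Returns a dict mapping each dependent's 1-based index to its
    head's index, mirroring the pseudocode's word numbering.
    """
    head_of = {}                          # dependent index -> head index
    n = len(words)
    for i in range(1, n + 1):             # 1. for i := 1 to n do
        for j in range(i - 1, 0, -1):     # 3. for j := i-1 down to 1 do
            # 5-6. link word j as head of word i if i is still headless
            if i not in head_of and permits(words[j - 1], words[i - 1]):
                head_of[i] = j
            # 7-8. link word j as dependent of word i if j is still headless
            if j not in head_of and permits(words[i - 1], words[j - 1]):
                head_of[j] = i
    return head_of

# Toy grammar: determiners and adjectives attach to a following noun,
# and nouns attach to a verb (purely illustrative).
def toy_permits(head, dep):
    rules = {("dog", "the"), ("dog", "lazy"), ("barks", "dog")}
    return (head, dep) in rules

print(covington_parse(["the", "lazy", "dog", "barks"], toy_permits))
# "the" and "lazy" attach to "dog"; "dog" attaches to "barks"
```

Because every earlier word j is reconsidered for each new word i, the algorithm runs in O(n²) grammar checks, which is acceptable for sentence-length inputs.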
C. Text to speech conversion
Text to speech (TTS) makes an Android device read text and convert it to audio output via the speaker. Android TTS supports multiple languages. TTS is a simple but powerful feature; it can also be used effectively in mobile apps dedicated to visually impaired people [4]. Text to speech conversion can be done by using speech APIs like the Google speech API.
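Keeping the synthesized speech and the gesture images in sync (the first challenge listed above) requires some notion of when each word will be spoken. The sketch below is an assumption-laden illustration rather than the paper's implementation: it assumes a fixed speaking rate (Android TTS exposes a comparable rate setting) and scales each word's duration by its length to produce a display schedule for the gesture images.

```python
def speech_schedule(sentence, words_per_minute=150):
    """Return (word, start_seconds, end_seconds) tuples estimating
    when each word of the sentence will be spoken, so the matching
    gesture image can be shown during that interval."""
    words = sentence.split()
    per_word = 60.0 / words_per_minute   # average seconds per word
    schedule = []
    t = 0.0
    for w in words:
        # Longer words get proportionally more time than short ones.
        dur = per_word * max(len(w), 2) / 5.0
        schedule.append((w, round(t, 2), round(t + dur, 2)))
        t += dur
    return schedule

for word, start, end in speech_schedule("the lazy dog barks"):
    print(f"{start:5.2f}-{end:5.2f}s  show gesture for '{word}'")
```

In the actual app, progress callbacks from the TTS engine would be a more reliable sync source than this estimate; the schedule is only a fallback heuristic.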
D. Speech to sign language conversion
A dependency parser is used for analysing the grammatical structure of the sentence and establishing relationships between words. Then, the Indian Sign Language form of the input sentence will be generated using Indian Sign Language grammar rules. The generation of sign language will be done with signing images.
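The gesture-generation step, together with the letter-by-letter fallback for words that have no stored gesture (the third challenge listed earlier), can be sketched as follows. The `GESTURE_DB` contents and file paths are hypothetical; a real deployment would query the app's gesture-image database.

```python
# Hypothetical mapping from words to stored ISL gesture images.
GESTURE_DB = {"dog": "gestures/dog.png", "barks": "gestures/barks.png"}

def gestures_for(sentence, db=GESTURE_DB):
    """Return the ordered list of gesture-image files for a sentence.

    Words present in the database map to a single gesture image;
    any other word is fingerspelled with one alphabet image per letter.
    """
    images = []
    for word in sentence.lower().split():
        if word in db:
            images.append(db[word])
        else:
            images.extend(f"alphabet/{ch}.png" for ch in word if ch.isalpha())
    return images

print(gestures_for("The dog barks"))
# ['alphabet/t.png', 'alphabet/h.png', 'alphabet/e.png',
#  'gestures/dog.png', 'gestures/barks.png']
```

The resulting image sequence is what would be displayed alongside the synthesized speech.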

E. User interface

IV. CONCLUSION
This is an innovative project based on an app which will narrate children's stories to hearing-impaired children by taking stories in the form of text as input and giving images of Indian Sign Language gestures and speech as output. In this paper we present a platform which translates written English into Indian Sign Language. We will use a dependency parsing algorithm and text-to-speech synthesis to implement the project.

REFERENCES
[1] I. Almeida, L. Coheur and S. Candeias, "Coupling natural language processing and animation synthesis in Portuguese sign language translation," in Proceedings of the 2015 Workshop on Vision and Language (VL'15), Lisbon, Portugal, September 2015, pp. 94–103.
[2] http://nlpprogress.com/english/dependency_parsing.html
[3] M. A. Covington, "A Fundamental Algorithm for Dependency Parsing," in Proceedings of the 39th Annual ACM Southeast Conference, 2001, pp. 95–102.
[4] https://javapapers.com/android/android-text-to-speech-tutorial