Our game is designed as two modules: a letter recognition module and an object collection game module. In the following sections, we discuss the design of these two modules in turn.

A. Letter Recognition Module

The purpose of the letter recognition module is to drive the user to read and write letters by giving a visual representation of an item associated with each letter. The user will first write down a single capital letter and will then scan the letter with the mobile device's camera. The system will then display the letter's corresponding model in AR space. Next to the model, a text spelling of the object will appear. By providing the user with this visual context, we believe the user's interest will be stimulated, motivating them to pursue more independent action. To further tap into this interest, we have developed an accompanying game module that reinforces the context established in the first module.
B. Object Collection Game Module

In this module, we store the objects generated in the first module in a list. For example, if the user scanned the letter "T" and the system displayed a "tree" model, the game would store the tree object in the list. The objects from the list will populate the game world, along with several other randomly selected objects. The player will be tasked with collecting the objects they scanned in the first module. We anticipate that adding a game element that interacts with objects from the first module will enhance engagement and reinforce the user's interest in exploring more vocabulary.
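As a compact illustration of this data flow (the game itself is implemented in Unity 3D; the object names and library contents below are hypothetical), the following Python sketch shows how the scanned objects can be kept in a list and then combined with randomly chosen distractors to populate the game world.

    import random

    # Hypothetical pool of distractor objects available in the 3D library.
    OBJECT_LIBRARY = ["tree", "turtle", "apple", "ball", "car", "dog", "house"]

    def populate_world(scanned_objects, num_distractors=3):
        # Return the objects to spawn: every scanned target plus a few
        # randomly selected distractors that are not targets.
        distractor_pool = [o for o in OBJECT_LIBRARY if o not in scanned_objects]
        distractors = random.sample(distractor_pool,
                                    min(num_distractors, len(distractor_pool)))
        return list(scanned_objects) + distractors

    # Example: the user scanned "T" and the system displayed a "tree" model.
    targets = ["tree"]
    print(populate_world(targets))  # e.g. ['tree', 'dog', 'apple', 'car']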
Fig. 1 A mockup for the game UI

B. Recognition with AR

At the scan screen, the system will scan in the capital letter and generate a list of objects that start with the scanned letter. The objects will be rendered in AR space along with their text names. From there, the player will be presented with a "play" button, at which point they will be transferred to the object collection game proper.

Fig. 2 SIFT feature visualization

During the scanning process, the system will first extract visual features from the captured image. In our project, we chose the Scale-Invariant Feature Transform (SIFT) [16] technique, a widely used and robust visual feature method. We compiled several letter templates in our system and extracted the features of each template; a visualization of the features of one template is shown in Fig. 2. When a letter is scanned, we match its features against the features of each template and define the best match as the recognized letter.
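Our application performs this matching inside the Unity and Vuforia toolchain; the following Python sketch, which uses OpenCV and hypothetical file names, only illustrates the underlying idea of SIFT template matching. It extracts SIFT descriptors from each letter template, matches the scanned letter's descriptors against every template with Lowe's ratio test, and reports the template with the most good matches as the recognized letter.

    import cv2

    def sift_descriptors(path, sift):
        # Load a grayscale template or scan and compute its SIFT descriptors.
        image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, descriptors = sift.detectAndCompute(image, None)
        return descriptors

    def recognize_letter(scan_path, template_paths):
        # Return the template letter whose features best match the scanned image.
        sift = cv2.SIFT_create()
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        scan_desc = sift_descriptors(scan_path, sift)
        best_letter, best_count = None, -1
        for letter, path in template_paths.items():
            template_desc = sift_descriptors(path, sift)
            # Lowe's ratio test keeps only distinctive feature correspondences.
            pairs = matcher.knnMatch(scan_desc, template_desc, k=2)
            good = [p for p in pairs
                    if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
            if len(good) > best_count:
                best_letter, best_count = letter, len(good)
        return best_letter

    # Hypothetical file layout: one template image per capital letter.
    templates = {"A": "templates/A.png", "T": "templates/T.png"}
    print(recognize_letter("scans/input.png", templates))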
We used Unity 3D to create our augmented reality software. For image processing, we used the Vuforia 5 SDK. The device camera will scan the letter and parse it through the Vuforia text recognition library. The game will then compare the result against a cloud database and retrieve the appropriate item name (such as "tree" for "T").
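The cloud lookup in the last step amounts to mapping a recognized letter to an item name. The snippet below is a purely local, illustrative stand-in for that mapping (the actual application queries a cloud database; the table contents here are assumptions):

    # Local, illustrative stand-in for the cloud database lookup:
    # recognized capital letter -> item name to display.
    LETTER_TO_ITEM = {"T": "tree", "A": "apple", "B": "ball"}

    def lookup_item(recognized_letter):
        # Return the item name for a recognized letter, or None if unknown.
        return LETTER_TO_ITEM.get(recognized_letter.upper())

    print(lookup_item("T"))  # prints "tree"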
Fig. 3 A running example of the AR software

After the letter is identified, the next step is to overlay it with our AR element. The Unity 3D game engine can easily integrate AR components into a mobile interface. As seen in Fig. 3, our application successfully detected the location of the written letter "T". The AR module also displays the model stably: the model's position and orientation stayed constant when we tested several different viewing angles and lighting conditions.
C. Object Collection Game

Based on the items scanned in the AR portion, the player will be prompted to collect a certain number of items in the game portion. The player will be placed within a bounded world as a car and will collect items by touching them. Whenever the player collects a correct item, a counter on the UI will increment. Whenever the player collects an incorrect item, the game will notify the player with the message "No, that is a [item name]." Upon collecting all the required items, the player will be presented with a victory screen, at which point they may choose to return to the main menu or scan another letter.

Fig. 4 Player (tank) approaching to collect the car
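As a sketch of the collection loop just described (the real implementation handles touch events inside Unity; the class and messages below are hypothetical), the following Python shows the core bookkeeping: incrementing the counter on a correct pickup, emitting the corrective message on a wrong one, and detecting the victory condition.

    class CollectionGame:
        # Minimal bookkeeping for one object collection round.

        def __init__(self, required_items):
            self.remaining = set(required_items)  # items the player still needs
            self.collected = 0                    # value shown on the UI counter

        def touch(self, item_name):
            # Handle the player touching an object; return the UI feedback text.
            if item_name in self.remaining:
                self.remaining.remove(item_name)
                self.collected += 1
                return "Victory!" if not self.remaining else "Collected " + item_name + "!"
            return "No, that is a " + item_name + "."

    game = CollectionGame(["tree", "turtle"])
    print(game.touch("car"))     # "No, that is a car."
    print(game.touch("tree"))    # "Collected tree!"
    print(game.touch("turtle"))  # "Victory!"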
V. FUTURE WORK

As a future design, we plan to implement a "read" button, which will pronounce the name of the object being displayed. Another planned feature is a button labeled "show another object", which would allow the user to cycle through several models beginning with the corresponding letter, such as "turtle" or "tree" for the letter "T". All models will be displayed in AR space and will have accompanying pronunciation audio. We also intend to expand the number of objects in our 3D library and to animate the 3D models to further enhance user interest.

Future versions will also have enhanced functionality. The user will have control over the level of vocabulary complexity active in the learning phase. For example, a three-year-old scanning the letter "A" will trigger "apple", while for a five-year-old the same "A" will bring up "alligator". We also want to provide an interface through which the user's teacher or parents can load in specific vocabulary, so that ABC3D can be integrated into the user's everyday education.
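As a rough sketch of how this planned vocabulary-complexity control could be represented (the age bands, words, and loading function below are assumptions, not features of the current system), an age-tiered table keyed by letter could serve both the automatic selection and the teacher- or parent-supplied vocabulary:

    # Illustrative age-tiered vocabulary: letter -> ordered (min_age, word) tiers.
    VOCABULARY = {
        "A": [(3, "apple"), (5, "alligator")],
        "T": [(3, "tree"), (5, "turtle")],
    }

    def word_for(letter, age):
        # Pick the most advanced word whose age tier the learner has reached.
        tiers = VOCABULARY.get(letter.upper(), [])
        suitable = [word for min_age, word in tiers if age >= min_age]
        return suitable[-1] if suitable else None

    def load_custom_vocabulary(letter, min_age, word):
        # Hook for a teacher or parent to add a word for a given letter and age.
        VOCABULARY.setdefault(letter.upper(), []).append((min_age, word))
        VOCABULARY[letter.upper()].sort()

    print(word_for("A", 3))  # "apple"
    print(word_for("A", 5))  # "alligator"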
We want to carry out a user study with three- to five-year-olds and their parents. The subjects will be balanced on both gender and current literacy level. We will have one group of subjects use ABC3D and the other group use traditional education methods such as books. We will then measure and compare 1) the subjects' improvement in their level of literacy, and 2) their motivation and interest in continuing to learn.
VI. CONCLUSION

Our approach draws upon Hidi's concept of interest as a motivational variable for learning [10]. ABC3D is consistent with Labbo and Kuhn's work in that it creates cognitive chains by capturing user interest through a process of constant, self-sustaining engagement [13]. Our system highlights the exploration of trends in media and technology as agents of education. With the expansion of technological accessibility in classrooms [5], we propose that ABC3D explores a new medium from which to approach education, drawing both on the platforms of interactive media and on the situated experiences of augmented reality.

VII. REFERENCES

[1] Alexander, Jonathan. "Gaming, Student Literacies, and the Composition Classroom: Some Possibilities for Transformation." College Composition and Communication, 2009, 35–63.
[2] Armbruster, B. B., and T. H. Anderson. Content Area Textbooks (Reading Education Report No. 23). Urbana, IL: Center for the Study of Reading, 1981.
[3] Black, Rebecca W., and Constance Steinkuehler. "Literacy in Virtual Worlds." Handbook of Adolescent Literacy Research (2009): 271–286.
[4] Brabham, Edna G., Bruce A. Murray, and Shelly Hudson Bowden. "Reading Alphabet Books in Kindergarten: Effects of Instructional Emphasis and Media Practice." Journal of Research in Childhood Education 20, no. 3 (March 2006): 219–34. doi:10.1080/02568540609594563.
[5] Burnett, C. "Technology and Literacy in Early Childhood Educational Settings: A Review of Research." Journal of Early Childhood Literacy 10, no. 3 (September 1, 2010): 247–70. doi:10.1177/1468798410372154.
[6] Chen, Jie-Qi, and Charles Chang. “Using Computers in Early
Childhood Classrooms Teachers’ Attitudes, Skills and Practices.”
Journal of Early Childhood Research 4, no. 2 (2006): 169–88.
[7] Clements, Douglas H., and Julie Sarama. 2002. “The Role of
Technology in Early Childhood Learning”. Teaching Children
Mathematics 8 (6). National Council of Teachers of Mathematics:
340–43. http://www.jstor.org/stable/41197828.
[8] Gee, James Paul. What Video Games Have to Teach Us About
Learning and Literacy. New York, NY, USA: Palgrave Macmillan,
2007.
[9] Hannon, Brenda, and Meredyth Daneman. “A New Tool for
Measuring and Understanding Individual Differences in the
Component Processes of Reading Comprehension.” Journal of
Educational Psychology 93, no. 1 (2001): 103–28. doi:10.1037/0022-
0663.93.1.103.
[10] Hidi, Suzanne. “Interest: A Unique Motivational Variable.”
Educational Research Review 1, no. 2 (January 2006): 69–82.
doi:10.1016/j.edurev.2006.09.001.
[11] Kerawalla, Lucinda, Rosemary Luckin, Simon Seljeflot, and Adrian
Woolard. “‘Making It Real’: Exploring the Potential of Augmented
Reality for Teaching Primary School Science.” Virtual Reality 10, no.
3–4 (2006): 163–74.
[12] Knobel, Michele, and Colin Lankshear, eds. A New Literacies
Sampler. New Literacies and Digital Epistemologies, v. 29. New
York: P. Lang, 2007.
[13] Labbo, Linda D., and Melanie R. Kuhn. “Weaving Chains of Affect
and Cognition: A Young Child’s Understanding of CD-ROM Talking
Books.” Journal of Literacy Research 32, no. 2 (2000): 187–210.
[14] Langer, Judith A., and Arthur N. Applebee. “Reading and Writing
Instruction: Toward a Theory of Teaching and Learning.” Review of
Research in Education, 1986, 171–94.
[15] Lankshear, Colin, and Michele Knobel. "New Technologies in Early Childhood Literacy Research: A Review of Research." Journal of Early Childhood Literacy 3, no. 1 (2003): 59–82.
[16] Lowe, David G. "Distinctive Image Features from Scale-Invariant Keypoints." International Journal of Computer Vision 60, no. 2 (2004): 91–110.
[17] MacArthur, Charles, Steve Graham, and Jill Fitzgerald. Handbook of Writing Research. Guilford Press, 2008.
[18] McQuillan, Jeff, and Julie Au. "The Effect of Print Access on Reading Frequency." Reading Psychology 22, no. 3 (July 2001): 225–48. doi:10.1080/027027101753170638.
[19] Richards, Heraldo V., Ayanna F. Brown, and Timothy B. Forde. "Culturally Responsive Instruction." Accessed November 13, 2015. http://www.twinriversusd.org/depts/ci/mathematics/files/Curriculum_Package_appendix.pdf.
[20] Squire, Kurt, and Eric Klopfer. "Augmented Reality Simulations on Handheld Computers." Journal of the Learning Sciences 16, no. 3 (June 13, 2007): 371–413. doi:10.1080/10508400701413435.
[21] Steinkuehler, Constance, and Elizabeth King. "Digital Literacies for the Disengaged: Creating After School Contexts to Support Boys' Game-Based Literacy Skills." Edited by Constance Steinkuehler. On the Horizon 17, no. 1 (January 30, 2009): 47–59. doi:10.1108/10748120910936144.
[22] "Teens, Social Media & Technology Overview 2015." Pew Research Center, 2015. http://www.pewinternet.org/2015/04/09/teens-social-media-technology-2015/.
[23] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. National Assessment of Educational Progress (NAEP), 2015.
[24] Vasquez, Vivian. "Resistance, Power-Tricky and Colorless Energy." Popular Culture, New Media and Digital Literacy in Early Childhood, 2005, 201–17.
[25] Wu, Hsin-Kai, Silvia Wen-Yu Lee, Hsin-Yi Chang, and Jyh-Chong Liang. "Current Status, Opportunities and Challenges of Augmented Reality in Education." Computers & Education 62 (2013): 41–49.