
SCAN QR CODE PATTERN MODEL USING VR

A MINI PROJECT REPORT


Submitted by

CELCIYA L 510423401001
In partial fulfillment for the award of the degree of

MASTER OF ENGINEERING
IN
APPLIED ELECTRONICS

ARUNAI ENGINEERING COLLEGE


Thiruvannamalai 606 603

ANNA UNIVERSITY: CHENNAI 600 025


AUGUST 2024

ANNA UNIVERSITY: CHENNAI 600 025

BONAFIDE CERTIFICATE

Certified that this mini project report "SCAN QR CODE PATTERN MODEL USING VR" is the bonafide work of "Ms. KIRUTHIGA S", who carried out the mini project work under my supervision.

SIGNATURE
Dr. S. ELANGO, Prof/ECE
HEAD OF THE DEPARTMENT
Department of ECE
Arunai Engineering College
Thiruvannamalai - 606 603.

SIGNATURE
Mr. S. A. SAMSON, Asst. Prof/ECE
SUPERVISOR
Department of ECE
Arunai Engineering College
Thiruvannamalai - 606 603.

Submitted for the university examination held on ____________________ at ARUNAI ENGINEERING COLLEGE, THIRUVANNAMALAI.

INTERNAL EXAMINER                                EXTERNAL EXAMINER


ACKNOWLEDGEMENT

I would like to express my sincere gratitude to the Vice Chairman, Er. E. V. KUMARAN, M.E., for the successful completion of this project.

I am fortunate to convey heartfelt thanks to our Registrar, Dr. R. SATHIYASEELAN, M.E., Ph.D., and our Principal, Dr. R. RAVICHANDARAN, M.E., Ph.D., for extending all facilities.

I am very grateful to our HOD, Dr. S. ELANGO, M.E., Ph.D., Head of the Department of Electronics and Communication Engineering, for his constant encouragement.

I express my wholehearted thanks and gratitude to my guide, Mr. S. A. SAMSON, M.E., for his esteemed guidance and for spending his valuable time with me for the successful completion of my project.

Finally, warm thanks to all our department staff members, parents and friends who gave moral support for the successful completion of this project. It is my honour to dedicate this project to them.

Sincerely,

KIRUTHIGA S

ABSTRACT

MindAR is an open-source web augmented reality library, used particularly on mobile devices. Using your device's camera, augmented reality adds sound, video, graphics, and other sensor-based inputs to real-world objects using computer vision-based recognition algorithms. It is a good way to render real-world data and make it interactive, making virtual objects feel like they belong in the real world. This report has various sections, including alphabets, fruits, animals, human organs, tools, and a conversation section. A 3D character is shown in the real world through a camera and teaches the user the name of each object. For instance, if the user selects the fruit option, the character begins by introducing the section, then displays a 3D model of various fruits and utters the name of each one. If, on the other hand, the user selects the conversation option, the character engages in conversation with the user, acting as a waiter and asking questions like "What would you like to eat or drink?" This helps the user figure out how to behave and speak in that situation. The design, modeling, and animation of models and characters are all part of the process described in this report. Several software packages and frameworks, including Autodesk Maya, Unity3D, and augmented reality foundations, are used to accomplish this.

TABLE OF CONTENTS

ABSTRACT
LIST OF FIGURES
LIST OF ABBREVIATIONS
1 INTRODUCTION
  1.1 INTRODUCTION
  1.2 OBJECTIVES
  1.3 EXISTING SYSTEM
  1.4 PROPOSED SYSTEM
  1.5 ADVANTAGES
2 LITERATURE SURVEY
3 PROBLEM DEFINITION
4 INSTALLATION
  4.1 AFRAME INSTALLATION
    4.1.1 HTML SCRIPT
    4.1.2 NPM
  4.2 THREE.js INSTALLATION
    4.2.1 HTML SCRIPT
    4.2.2 NPM
  4.3 COMPILE TARGET IMAGES
  4.4 BUILD THE PAGE
  4.5 WEB SERVER
  4.6 AR TECHNOLOGY
    4.6.1 CHALLENGES AND FUTURE OF AR
  4.7 VR TECHNOLOGY
    4.7.1 CHALLENGES AND FUTURE OF VR
  4.8 3-D ASSETS
    4.8.1 ADDING ASSETS
    4.8.2 CONSTRUCT THE SCENE
    4.8.3 EVENT HANDLING
  4.9 KEY FEATURES
  4.10 SCOPE OF FUTURE WORK
5 IMAGE TRACKING
  5.1 OUTPUTS
6 SOURCE CODE
7 CONCLUSION
8 REFERENCE

LIST OF FIGURES

1. 4.5 WEB SERVER
2. 4.6 AR TECHNOLOGY
3. 4.7 VR TECHNOLOGY
4. 5 IMAGE TRACKING

LIST OF ABBREVIATIONS

QR      Quick Response
VR      Virtual Reality
AR      Augmented Reality
GUI     Graphical User Interface
API     Application Programming Interface
SDK     Software Development Kit
UID     Unique Identifier
FPS     Frames Per Second
POI     Point of Interest
HMD     Head Mounted Display
FOV     Field of View
3D      Three-Dimensional
2D      Two-Dimensional
OCR     Optical Character Recognition
GPS     Global Positioning System
WiFi    Wireless Fidelity
BLE     Bluetooth Low Energy
ML      Machine Learning
AI      Artificial Intelligence
IoT     Internet of Things
UI      User Interface
UX      User Experience
JSON    JavaScript Object Notation
XML     eXtensible Markup Language
HTTP    Hyper Text Transfer Protocol
HTTPS   Hyper Text Transfer Protocol Secure
URL     Uniform Resource Locator
MVP     Minimum Viable Product
IDE     Integrated Development Environment
QRVR    Quick Response Virtual Reality
ARQRC   Augmented Reality QR Code
MRQRC   Mixed Reality QR Code
VRQRM   Virtual Reality QR Code Model
3DQR    Three-Dimensional Quick Response
CV      Computer Vision
CHAPTER 1
INTRODUCTION

1.1 INTRODUCTION
Virtual reality (VR) is the use of computer modeling and simulation that enables a person to interact with an artificial three-dimensional (3D) visual or other sensory environment. The MindAR library used in this project supports image tracking and face tracking (for location or fiducial-marker tracking, see AR.js). It is written end-to-end in JavaScript, from the underlying computer vision engine to the frontend, and utilizes the GPU (through WebGL) and web workers for performance. It is developer friendly and easy to set up; with the A-Frame extension, you can create an app with only about 10 lines of code.

As part of this new era in entertainment and engagement – the Rock and Bear
AR Project! In a world where technology constantly evolves, we present a
groundbreaking fusion of augmented reality (AR) and animated gifs to
revolutionize the way we experience digital content. Imagine holding a simple card
in your hands, and with a flick of your wrist, a captivating gif video materializes
before your very eyes, bringing characters to life, and stories to unfold in stunning
detail. Our project aims to redefine the boundaries of traditional entertainment by
seamlessly integrating the digital realm with the physical world.

Interactive GIF videos: with our innovative AR technology, static images on cards transform into dynamic GIF videos, offering an immersive and interactive viewing experience. Engaging content: whether it is showcasing the adventures of the Rock and Bear duo or exploring fantastical worlds, the project offers a diverse range of content to captivate audiences of all ages. User-friendly interface: simple and intuitive controls allow users to navigate through the AR experience effortlessly, ensuring that everyone can enjoy the magic of Rock and Bear with ease. Customizable options: from personalized messages to customizable GIF animations, the project offers endless possibilities for creativity, making each interaction unique and memorable. Impact: the Rock and Bear AR Project represents more than just a technological innovation; it signifies a shift towards interactive storytelling and immersive entertainment. By combining the nostalgia of traditional cards with the cutting-edge capabilities of AR technology, we aim to inspire creativity, spark imagination, and forge meaningful connections between people and the digital world.

Join us on this extraordinary journey as we embark on a quest to bring the magic of Rock and Bear to life like never before. Let your imagination run wild, and prepare to be enchanted by the endless possibilities of augmented reality! Combining Quick Response (QR) codes with Virtual Reality (VR) technology opens up exciting possibilities for interactive and immersive experiences. This mini-project, titled "Scan QR Code Pattern Model Using VR", aims to explore and demonstrate the innovative use of VR to enhance the functionality and usability of QR codes. In recent years, the use of QR codes has become ubiquitous across various industries due to their ability to store and share information quickly and efficiently. Traditional methods of scanning QR codes involve using smartphone cameras or dedicated scanning devices. However, with the rapid advancements in VR technology, a novel approach to scanning QR codes is emerging. This approach leverages VR environments to enhance user interaction and engagement while scanning QR codes.
1.2 OBJECTIVES
1. Increase user engagement: improve engagement and satisfaction through immersive VR environments triggered by QR codes.
2. Streamline information access: provide quick and efficient access to complex information by scanning QR codes that lead to detailed VR content.
3. Innovate in education: utilize QR codes to transport students to virtual classrooms, labs, and historical sites, making learning more interactive and effective.
4. Improve training and simulation: use QR codes to access VR training modules and simulations in fields such as healthcare, engineering, and the military.
5. Foster innovation and creativity: encourage the development of new applications and creative uses for QR code and VR integration.
6. Increase accessibility and inclusivity: make advanced VR experiences accessible to a broader audience by simplifying access through QR code scanning.
7. Enhance data visualization and interaction: utilize QR codes to present data in an interactive VR environment, making complex information easier to understand and manipulate.
8. Support remote and hybrid experiences: facilitate remote and hybrid experiences in education, training, and collaboration by integrating QR codes and VR.
1.3 EXISTING SYSTEM
Existing systems can be inconvenient: QR codes require a smartphone with the ability to scan the code. Some modern phones have this ability built into their camera, but often the user has to download an app, and some users rely on non-smart phones or simply do not have their phone with them. QR codes also require an internet connection in order to function; people with otherwise-compatible smartphones may have low signal or no access to Wi-Fi, preventing them from accessing whatever is behind the QR code. Google Cardboard is an affordable VR platform that uses a simple viewer and a smartphone, and various AR apps can scan QR codes to trigger VR experiences. Microsoft HoloLens is a mixed reality headset that supports both AR and VR experiences and can use QR codes to initiate VR content. Unity is a powerful game development platform, and Vuforia is an AR platform that can be integrated with Unity to create QR code-triggered VR experiences: developers create VR applications where scanning a QR code with a mobile device triggers a transition to a VR environment, with Vuforia handling the QR code recognition and tracking. These platforms allow developers to create apps where scanning a QR code initiates a VR or AR experience, viewable on compatible devices such as iPhones, iPads, and Android smartphones. Users can scan QR codes with the Blippar or Zappar apps to access VR content; these platforms provide tools for creating and deploying such experiences. QR codes can also provide quick access to specific virtual rooms in Hubs: users scan the QR code with their device to enter a VR environment. However, creating and maintaining applications that integrate QR codes with VR can be complex and resource-intensive.
1.4 PROPOSED SYSTEM
The proposed methodology offers several benefits. Fast: QR codes are tailored to smartphone usage, and many people nowadays always have a smartphone with them; scanning is faster than typing in a URL, making responses more likely. Cost-effective: QR codes do not require any special training to make or use, and since they can store a lot of information, you can reduce your printing and marketing costs by having them redirect to an online page; while printed ads and brochures have their own advantages, sometimes it is easier to save the big content for a webpage. Reliable: QR codes are scannable from multiple angles and even hold their information if the code is partly damaged, making them much easier to use than a barcode. Customizable and trackable: a QR code can hold a lot of information, so you are rarely limited by the length of your URL; plus, since it links to an online platform, you may be able to track users and gain data that way.

1.5 ADVANTAGES

1. Enhanced User Engagement


2. Simplified Access to Information
3. Innovative Learning and Training
4. Improved Marketing and Retail Strategies
5. Enhanced Data Visualization
6. Greater Accessibility and Inclusivity
7. Cost-Effective Solutions
8. Enhanced Remote Collaboration
9. Personalization and Customization
10. Environmental Benefits

CHAPTER 2
2. LITERATURE SURVEY
[1] The impact of QR codes on purchase intention and customer satisfaction
Published: November 29, 2018
Authors: Md Shamim Hossain, Xiaoyan Zhou, Mst Farjana Rahman
Methodology: Covariance-based structural equation modeling

Description: This paper examines the impact of QR codes on perceived flow, the effect of perceived flow on purchase intention and customer satisfaction, and finally the combined impact of perceived flow and customer satisfaction on purchase intention. The Stimulus-Organism-Response theoretical model was used in this research. Data were collected via online questionnaires from 420 valid respondents who had purchased a product online via QR code, and the covariance-based structural equation modeling approach was used to analyze the structural model and measurements. The study showed that QR codes have a great impact on purchase intention and customer satisfaction. The findings confirmed that QR codes influence perceived flow, which in turn influenced online shoppers' satisfaction and, finally, purchase intention. These findings are practically significant for both marketers and customers: marketers should use QR codes as an embedded advertising tool with particular URLs, leading to more profitability for the shopping agency, while customers can also learn about their own behavior regarding QR codes in the online shopping context. This research has real implications in the fields of internet shopping, online marketing, retailing, marketing promotion, and consumer behavior.
[2] Evaluation of Deep Learning Techniques for QR Code Detection
Published: 2019
Authors: Leonardo Blanger, Nina S. T. Hirata
Methodology: Deep Learning

Description: This paper employs deep learning models for detecting QR codes in natural scenes. A series of different model configurations is evaluated in terms of Average Precision, and an architecture modification that allows detection aided by object-subpart annotations is proposed. This modification is implemented in the best-scoring model, which is compared to a traditional technique, achieving a substantial improvement in the considered metrics. The dataset used in the evaluation, with bounding-box annotations for both QR codes and their Finder Patterns (FIPs), is made publicly available. This dataset is significantly bigger than the options known to be available at the moment, so it is expected to provide a common benchmark for QR code detection in natural scenes.
[3] To limit physical contact and interaction time between doctor and patient in the COVID-19 era
Published: 2020
Authors: Andrea Faggiano, Stefano Carugo
Methodology: Smartphone Technology

Description: During the pandemic, all health care workers must wear personal protective equipment, which complicates interactions with patients. Collecting patients' information and medical history requires a longer time and the use of tools (i.e. pen and paper) that could facilitate infection. As already experienced at the Hospital Universitario Gonzalez (Mexico), the use of a survey accessible via QR code could reduce doctor-patient interaction time. Posters containing a QR code linked to a survey that collects patients' data (e.g. symptoms, risk factors and medical history) can be placed in the waiting rooms of emergency services and general practitioners' clinics. This would provide immediate information about the patient before the medical examination, allowing staff to quickly prioritize patients' needs. The exposure of health care workers to potentially infected patients, and vice versa, is also reduced.

[4] An Efficient Machine Learning-Based Model for QR Codes
Published: October 2023
Authors: Ahmad B. Wardak, Adnan M. Abu-Mahfouz, Tariq Umer, Mirsat Yesiltepe and Sadaf Waziry
Methodology: Machine Learning

Description: Granting smart-device consumers information simply and quickly is what drives quick response (QR) codes and mobile marketing to go hand in hand. QR codes boost marketing campaigns and objectives and allow one to approach, engage, influence, and convert a wider target audience by connecting offline to online platforms. However, restricted printing technology and flexible surfaces introduce noise when printing QR code images; moreover, noise is often unavoidable during the gathering and transmission of digital images. Therefore, this paper proposes an automatic and accurate noise detector to identify the type of noise present in QR code images. Noisy images are hazardous to the training of neural networks and other techniques, decreasing the classification performance of the networks. By scanning a QR code, we can receive accurate information in real time. The typical QR code, composed of black and white modules, is unsightly and difficult to read; in recent years, there has been an increase in the usage of graphic QR codes in product packaging and marketing activities.
[5] A Large-Scale VR Panoramic Dataset of QR Codes and Improved Detection
Published: 2023
Authors: Zhu Zehao, Guangtao Zhai, Jiahe Zhang
Methodology: Deep Learning

Description: With the rapid development of mobile payment and scanning technology, QR codes have become widespread in both consumer and enterprise domains. However, there is a lack of research on detecting QR codes in panoramic video, owing to the lack of high-quality datasets. To fill this gap, this work proposes a large-scale panoramic QR code dataset to facilitate relevant research. The dataset is by far the largest in terms of image quantity, and compared with existing datasets it is closer to realistic settings and can support a variety of research problems. In addition to the dataset, the authors propose a QR code detection approach for complex environments based on deep learning, improving the accuracy of QR code detection. Nowadays, the QR code is used in many popular fields, such as payment and social networking, so it is particularly important to quickly and accurately detect the position of a QR code in real, complex scenes. Traditional QR code detection methods mainly use hand-engineered features; however, the QR code photos we take may be blurred due to pixel density, distance, and other problems, and may even exhibit rotations and deformations because of the complex scene. Under such circumstances, traditional QR code detection methods may not be applicable.
CHAPTER 3
3. PROBLEM DEFINITION
Developing a seamless and user-friendly system for scanning QR code pattern models using VR means addressing the following challenges:
1. Integration complexity: integrating QR code scanning functionality with VR applications and ensuring compatibility across different devices and platforms.
2. User experience: designing an intuitive user interface that seamlessly transitions users from QR code scanning to immersive VR experiences, ensuring a smooth and engaging user journey.
3. Technical limitations: overcoming technical challenges, such as latency issues, data transfer rates, and device compatibility, to provide a seamless and high-quality VR experience.
4. Content accessibility: ensuring that QR code-triggered VR content is easily accessible and available to a wide range of users, including those with limited technical expertise.
5. Data security: implementing robust security measures to protect user data and prevent unauthorized access to VR content triggered by QR codes.
6. Scalability and performance: designing a system that can handle a large number of simultaneous users scanning QR codes and accessing VR content without compromising performance.
7. User engagement and retention: implementing features that enhance user engagement and encourage repeat interactions with QR code-triggered VR experiences.
8. Cost effectiveness: developing a system that provides value for money in terms of development, deployment, and maintenance costs, ensuring a positive return on investment for businesses and organizations.
CHAPTER 4
4. INSTALLATION
Since v1.2.0, MindAR has migrated to ES Modules, which aligns with Three.js v137 and onwards; for prior versions of MindAR, please refer to Installation-v1.1.x. A MindAR project can be run directly from a plain static HTML file. MindAR comes with different types of tracking capabilities, including image tracking and face tracking; to minimize library size, each of these is built independently. Moreover, MindAR provides native support for both three.js and A-Frame, which are also built independently, so altogether there are 2 x 2 = 4 sets of distributions. There are generally two ways to install the library: through an HTML script tag or through NPM.
4.1 AFRAME INSTALLATION:
MindAR comes with an A-Frame extension that allows you to construct a 3D scene easily. In short, you place an <a-scene> block inside <body>; this is the main part of the application, and most of the time you can just copy this block of code as a template to start. Two things here relate to MindAR. Within <a-scene> there is a property mindar-image="imageTargetSrc: ./targets.mind;", which tells the engine where the compiled .mind file you built earlier is located. There is also an <a-entity> with a property mindar-image-target="targetIndex: 0", which tells the engine to detect and track a particular image target. The targetIndex is always 0 if your targets.mind contains only a single image.
4.1.1 HTML SCRIPT:
Image Tracking
<script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/mind-ar@1.2.5/dist/mindar-image-aframe.prod.js"></script>
Face Tracking
<script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/mind-ar@1.2.5/dist/mindar-face-aframe.prod.js"></script>
4.1.2 NPM:
> npm i mind-ar --save
> npm i aframe --save
Image Tracking
import 'aframe';
import 'mind-ar/dist/mindar-image-aframe.prod.js';
Face Tracking
import 'aframe';
import 'mind-ar/dist/mindar-face-aframe.prod.js';
4.2 THREE.js INSTALLATION:
To align with Three.js's official installation pattern of using ES modules and import maps since v137, the MindAR three.js version follows a similar pattern. Since MindAR v1.2.0, three.js has been an external dependency, so you can choose your own three.js version; the minimum supported version is v137.
4.2.1 HTML SCRIPT:

Image Tracking

<script type="importmap">
{
  "imports": {
    "three": "https://unpkg.com/three@0.160.0/build/three.module.js",
    "three/addons/": "https://unpkg.com/three@0.160.0/examples/jsm/",
    "mindar-image-three": "https://cdn.jsdelivr.net/npm/mind-ar@1.2.5/dist/mindar-image-three.prod.js"
  }
}
</script>

and then in your application:

<script type="module">
import * as THREE from 'three';
import { MindARThree } from 'mindar-image-three';
</script>

Face Tracking

<script type="importmap">
{
  "imports": {
    "three": "https://unpkg.com/three@0.160.0/build/three.module.js",
    "three/addons/": "https://unpkg.com/three@0.160.0/examples/jsm/",
    "mindar-face-three": "https://cdn.jsdelivr.net/npm/mind-ar@1.2.5/dist/mindar-face-three.prod.js"
  }
}
</script>

and then in your application:

<script type="module">
import * as THREE from 'three';
import { MindARThree } from 'mindar-face-three';
</script>

4.2.2 NPM:

> npm i mind-ar --save

Image Tracking

import {MindARThree} from 'mind-ar/dist/mindar-image-three.prod.js';

Face Tracking

import {MindARThree} from 'mind-ar/dist/mindar-face-three.prod.js';
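Once imported, a typical usage pattern looks like the following sketch, based on the image-tracking examples in the MindAR documentation; the container element id (ar-container) and the blue plane are illustrative:

```html
<div id="ar-container" style="width: 100vw; height: 100vh; position: relative;"></div>
<script type="module">
import * as THREE from 'three';
import { MindARThree } from 'mindar-image-three';

// MindARThree wires up the renderer, scene and camera for us
const mindarThree = new MindARThree({
  container: document.querySelector('#ar-container'),
  imageTargetSrc: './targets.mind',
});
const { renderer, scene, camera } = mindarThree;

// attach content to target 0; it is shown only while the image is tracked
const anchor = mindarThree.addAnchor(0);
const plane = new THREE.Mesh(
  new THREE.PlaneGeometry(1, 0.552),
  new THREE.MeshBasicMaterial({ color: 0x0000ff })
);
anchor.group.add(plane);

// start the camera and the render loop
await mindarThree.start();
renderer.setAnimationLoop(() => renderer.render(scene, camera));
</script>
```

This is the three.js counterpart of the A-Frame template in section 4.1: the anchor plays the role of the mindar-image-target entity.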

4.3 COMPILE TARGET IMAGES:

Before working on the webpage, we first need to preprocess (a.k.a. compile) the target images: we scan the images and extract interesting locations (a.k.a. feature points) so we can detect and track the images later. This preprocessing step takes time, so we do it beforehand to reduce the loading time when users actually use the AR app. MindAR comes with a friendly compilation tool for this: the Image Targets Compiler.

14
4.4 BUILD THE PAGE:
Now that you have the targets.mind file ready, we can start building the webpage. First, create a clean folder for your project, let's say mindar-project. Put the targets.mind file there and create a blank HTML file, let's say index.html. The folder should then contain two files:
./targets.mind
./index.html
4.5 WEB SERVER:
Although it is a simple HTML page, you probably cannot run it simply by opening the file in a browser, because the page requires camera access, and browsers only grant camera access in a secure context (HTTPS, or localhost). One way around this is to set up a localhost server that serves the webpage. If you are a web developer, you probably already have some sort of localhost server on your machine; if not, you can try the Web Server for Chrome extension, which launches a simple web server you can use to open the index.html built in the last section. If you successfully launch the page, the camera will start, and after you point it at the image target, you will see a blue rectangle stuck on top of it.

4.5 WEB SERVER
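As an alternative to the Chrome extension, any static file server works. A minimal sketch, assuming Python 3 and curl are available; the placeholder page here stands in for the real index.html from section 4.4:

```shell
# create a throwaway project folder with a placeholder page
mkdir -p mindar-project
echo '<html><body>MindAR placeholder</body></html>' > mindar-project/index.html

# serve it on localhost:8080 (camera access is allowed on localhost)
python3 -m http.server 8080 --directory mindar-project >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1

# verify the page is reachable, then stop the server
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/index.html
kill $SERVER_PID
```

Tools such as http-server from NPM work the same way; the only requirement is that the page is served over localhost or HTTPS.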

4.6 AR TECHNOLOGY:
Augmented Reality (AR) technology blends digital information with the physical environment in real time, providing an interactive and immersive experience. It overlays virtual elements, such as images, videos, or 3D models, onto the user's view of the real world through a device such as a smartphone, tablet, or AR glasses. AR enhances the user's perception of reality by adding contextual information, making it useful for various applications across industries.
4.6.1 CHALLENGES AND FUTURE OF AR:
1. Hardware limitations: current AR devices can be bulky, expensive, and have a limited field of view, hindering widespread adoption.
2. Content creation: developing high-quality AR content requires specialized skills and tools, limiting the availability of diverse and engaging experiences.
3. Privacy and security: AR raises concerns about privacy and data security, especially regarding the collection and use of personal information in AR applications.
4. Integration with AI and IoT: future advancements in AR will likely involve integration with artificial intelligence (AI) and the Internet of Things (IoT) to create more intelligent and context-aware experiences.
5. Advancements in display technology: improvements in display technology, such as transparent displays and light-field displays, could lead to more natural and immersive AR experiences.
6. Standardization and interoperability: establishing industry standards and ensuring interoperability between different AR platforms will be crucial for the widespread adoption and growth of AR technology.
Overall, AR technology has the potential to transform how we interact with the world around us, offering new ways to learn, work, shop, and communicate. As the technology continues to evolve and overcome its challenges, AR is expected to become increasingly integrated into our daily lives, opening up endless possibilities for innovation and creativity.

4.6 AR TECHNOLOGY
4.7 VR TECHNOLOGY:
Virtual Reality (VR) technology immerses users in a completely digital environment, simulating a realistic experience that can be interactive and engaging. VR typically involves the use of a head-mounted display (HMD) and other sensory devices to create a sense of presence in the virtual world.
4.7.1 CHALLENGES AND FUTURE OF VR:
1. Motion sickness: some users experience motion sickness or discomfort in VR, especially with certain types of movement or visual effects.
2. Hardware limitations: current VR headsets can be bulky, expensive, and have limited field of view, resolution, and refresh rate.
3. Content quality: developing high-quality VR content requires significant resources and expertise, limiting the availability of diverse and engaging experiences.
4. Accessibility: VR technology is not yet widely accessible to all due to cost, technical requirements, and physical limitations.
5. Social acceptance: VR technology raises concerns about privacy, social isolation, and the blurring of lines between reality and virtual reality.
6. Advancements in display technology: improvements such as higher resolution and wider field of view could lead to more realistic and comfortable VR experiences.
7. Integration with other technologies: VR is expected to integrate with artificial intelligence, augmented reality, and the Internet of Things, creating new possibilities for interactive and immersive experiences.
Overall, VR technology has the potential to transform various industries and aspects of daily life by providing immersive and interactive experiences that were previously only possible in science fiction. As the technology continues to evolve and become more accessible, it is expected to play an increasingly significant role in entertainment, education, training, and many other fields.

4.7 VR TECHNOLOGY

4.8 3-D ASSETS:
It's an augmented reality app, so it's no fun without some 3D assets. You are likely using your desktop computer to go through this tutorial; in that case, you can run the webpage on your computer, which hopefully is equipped with a webcam. Then, you can use your mobile phone to open the target image and put your phone screen in front of your desktop webcam to see the effect. If you don't have two devices, you can also print the image out and test it on paper. Make sure you get it working before going to the next section, in which we will start doing the interesting stuff.
4.8.1 ADDING ASSETS:
The first thing we need to do is to add some assets to the scene. In AFRAME, we do
this with <a-assets>. Add this block of code inside the <a-scene> element:

<a-assets>
<img id="card" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.2.5/examples/image-tracking/assets/card-example/card.png" />
<a-asset-item id="avatarModel" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.2.5/examples/image-tracking/assets/card-example/softmind/scene.gltf"></a-asset-item>
</a-assets>

The first one is actually our target image. The second one is a 3D model in glTF
format. AFRAME supports essentially all standard 3D formats, so you can replace it
with a model of your choice later.

4.8.2 CONSTRUCT THE SCENE:
Now we can replace the dull rectangular plane from the earlier example with the
image asset:

<a-plane src="#card" position="0 0 0" height="0.552" width="1" rotation="0 0 0"></a-plane>

Also, we will add an animated 3D model on top of the image:

<a-gltf-model rotation="0 0 0" position="0 0 0.1" scale="0.005 0.005 0.005" src="#avatarModel" animation="property: position; to: 0 0.1 0.1; dur: 1000; easing: easeInOutQuad; loop: true; dir: alternate"></a-gltf-model>

The scale of this 3D model is normalized to the range -1 to 1, so we set an
appropriately small scale of 0.005. We also add an animation that makes the model
oscillate between 0 and 0.1 along the y-axis. We will not go into the details of
the animation here; it is standard AFRAME functionality.
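For reference, the attributes of that animation string are properties of the standard AFRAME animation component; an annotated copy (same values as above, with explanatory comments added) looks like this:

```html
<!-- Annotated copy of the animation used above (standard AFRAME
     animation component; the values mirror the example exactly):
       property: position    -> the attribute being animated
       to: 0 0.1 0.1         -> end value (y goes from 0 to 0.1)
       dur: 1000             -> one pass takes 1000 ms
       easing: easeInOutQuad -> acceleration curve of the motion
       loop: true            -> repeat indefinitely
       dir: alternate        -> reverse direction on every pass -->
<a-gltf-model rotation="0 0 0" position="0 0 0.1" scale="0.005 0.005 0.005"
  src="#avatarModel"
  animation="property: position; to: 0 0.1 0.1; dur: 1000;
             easing: easeInOutQuad; loop: true; dir: alternate"></a-gltf-model>
```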

Finally, we also modify some rendering properties inside <a-scene> (optional):

color-space="sRGB" renderer="colorManagement: true, physicallyCorrectLights"

These settings enable color management and physically correct lighting. You can
skip them and still see the effect, but the rendering generally looks better with
them.

4.8.3 EVENT HANDLING:
MindAR fires the following events:

arReady - After arSystem.start() (or autostart), the AR engine needs some time to
boot up; this event is fired once it is ready. You can listen for it on the scene
element:

const sceneEl = document.querySelector('a-scene');
sceneEl.addEventListener("arReady", (event) => {
// console.log("MindAR is ready")
});

arError - Sometimes the AR engine fails to start. There can be many reasons, but
the most likely one is that the camera failed to start. When this happens, this
event is fired:

const sceneEl = document.querySelector('a-scene');
sceneEl.addEventListener("arError", (event) => {
// console.log("MindAR failed to start")
});
targetFound and targetLost - These events are fired when the image target is
detected or lost. You can listen for them on the corresponding <a-entity>:

<a-entity id="example-target" mindar-image-target="targetIndex: 0"></a-entity>

// detect target found
const exampleTarget = document.querySelector('#example-target');
exampleTarget.addEventListener("targetFound", event => {
console.log("target found");
});

// detect target lost
exampleTarget.addEventListener("targetLost", event => {
console.log("target lost");
});
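Beyond logging, an application usually needs to know at any given moment whether the target is currently in view. A tiny state holder like the following can bridge these events to the rest of the code (this helper is purely illustrative and not part of the MindAR API):

```javascript
// Turns MindAR's targetFound / targetLost events into queryable state.
// Illustrative helper -- not part of MindAR itself.
class TargetTracker {
  constructor() {
    this.visible = false; // is the target currently in view?
    this.sightings = 0;   // how many times it has been (re)found
  }
  onFound() {
    if (!this.visible) this.sightings += 1;
    this.visible = true;
  }
  onLost() {
    this.visible = false;
  }
}

// Browser wiring (matches the listeners shown above):
//   const tracker = new TargetTracker();
//   exampleTarget.addEventListener("targetFound", () => tracker.onFound());
//   exampleTarget.addEventListener("targetLost",  () => tracker.onLost());
```

Other parts of the scene logic can then simply check tracker.visible instead of registering their own listeners.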
click - To make the content interactive, you will likely want to detect when the
user clicks or touches a certain element. This is actually standard AFRAME
functionality, but we include it here for reference. First, add the following
cursor and raycaster to the <a-camera> element (far sets the maximum ray length;
any sufficiently large value works):

<a-camera position="0 0 0" look-controls="enabled: false" cursor="fuse: false; rayOrigin: mouse;" raycaster="far: 10000; objects: .clickable"></a-camera>

Then add the class clickable to the object you want to detect. The name does not
literally have to be clickable; it only has to match the selector specified in the
raycaster above:

<a-plane id="example-plane" class="clickable" color="blue" opacity="0.5" position="0 0 0" height="0.552" width="1" rotation="0 0 0"></a-plane>

Then it's ready. You can listen for the click event like this:

// detect click event
const examplePlane = document.querySelector('#example-plane');
examplePlane.addEventListener("click", event => {
console.log("plane click");
});
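Putting the pieces together, the listeners above can be collected into one script block at the end of the page body (a sketch; it assumes the #example-target and #example-plane ids used in the snippets above):

```html
<script>
  // AR engine lifecycle events fire on the <a-scene> element.
  const sceneEl = document.querySelector("a-scene");
  sceneEl.addEventListener("arReady", () => console.log("MindAR is ready"));
  sceneEl.addEventListener("arError", () => console.log("MindAR failed to start"));

  // Target visibility events fire on the tracked <a-entity>.
  const exampleTarget = document.querySelector("#example-target");
  exampleTarget.addEventListener("targetFound", () => console.log("target found"));
  exampleTarget.addEventListener("targetLost", () => console.log("target lost"));

  // Click events fire on elements matched by the raycaster's selector.
  const examplePlane = document.querySelector("#example-plane");
  examplePlane.addEventListener("click", () => console.log("plane click"));
</script>
```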
The MindAR examples cover the following use cases:

Minimal - The bare minimal example.
Basic - A very typical use case: detect a single image target and show some 3D
objects on top.
Multiple Targets - Detect multiple targets, either tracking one of them at a time
or tracking several simultaneously.
Custom UI - Customize the UI, including the Loading and Scanning screens.
Events Handling - An advanced topic: interact with all the available APIs of
MindAR.
Interactive - A comprehensive example showing how to create an interactive
application by utilizing all the available features.
4.9 KEY FEATURES:
Because it has the potential to make our lives safer and more hassle-free, the
entire world is eagerly anticipating the augmented reality of the future. The
impossible can be made possible with AR. Exploration and invention have always been
the methods by which humans seek to advance themselves, and this inherent quality
can be seen in augmented reality. By allowing customers to access real-time online
information about products simply by scanning their QR codes, this work establishes
that the use of QR codes in shopping malls can make shopping significantly faster
and more efficient. As a next step, a checkout system will need to be incorporated
into the developed prototype to give customers an entirely new shopping experience,
from choosing products to locating them and checking out, thereby avoiding long
lines and improving the retail concept.
4.10 SCOPE OF FUTURE WORK:
As for the future of QR codes, experts agree that it is customers who will
eventually decide when they disappear, as happens with many apps that nobody ends
up downloading. This study will therefore investigate a variety of strategies for
enhancing the customer experience. The system's floor directions also help
customers become more familiar with the shopping mall. In addition, the system will
enable the mall to produce accurate statistical reports and perform dependable data
mining on consumer and product information. Customers are also encouraged to
explore the capabilities of their smartphones. Much will depend on the added value
the QR code gives the user, as well as the ease and incentive they have to use it.

CHAPTER – 5
5. IMAGE TRACKING:
BASIC:


This is a very typical example that detects and tracks one target image and
displays a 3D effect on top. We have a step-by-step tutorial in Quick Start; if you
are new to MindAR, please check that out first to understand some basic principles.

You can use the following target images for testing:

5.1 OUTPUTS:

CHAPTER – 6

6. SOURCE CODE:

<html>
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <script src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.0.0/dist/mindar-image.prod.js"></script>
  <script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
  <script src="https://cdn.jsdelivr.net/gh/donmccurdy/aframe-extras@v6.1.1/dist/aframe-extras.min.js"></script>
  <script src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.0.0/dist/mindar-image-aframe.prod.js"></script>
</head>
<body>
  <a-scene mindar-image="imageTargetSrc: https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.0.0/examples/image-tracking/assets/band-example/band.mind; maxTrack: 2" color-space="sRGB" renderer="colorManagement: true, physicallyCorrectLights" vr-mode-ui="enabled: false" device-orientation-permission-ui="enabled: false">
    <a-assets>
      <a-asset-item id="bearModel" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.0.0/examples/image-tracking/assets/band-example/bear/scene.gltf"></a-asset-item>
      <a-asset-item id="raccoonModel" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.0.0/examples/image-tracking/assets/band-example/raccoon/scene.gltf"></a-asset-item>
      <img id="card" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.1.5/examples/image-tracking/assets/card-example/card.png" />
      <a-asset-item id="avatarModel" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.1.5/examples/image-tracking/assets/card-example/softmind/scene.gltf"></a-asset-item>
    </a-assets>

    <a-camera position="0 0 0" look-controls="enabled: false"></a-camera>

    <a-entity mindar-image-target="targetIndex: 0">
      <a-plane src="#card" position="0 0 0" height="0.552" width="1" rotation="0 0 0"></a-plane>
      <a-gltf-model rotation="0 0 0" position="0 0 0.1" scale="0.005 0.005 0.005" src="#avatarModel" animation="property: position; to: 0 0.1 0.1; dur: 1000; easing: easeInOutQuad; loop: true; dir: alternate"></a-gltf-model>
    </a-entity>

    <a-entity mindar-image-target="targetIndex: 0">
      <a-gltf-model rotation="0 0 0" position="0 -0.25 0" scale="0.05 0.05 0.05" src="#raccoonModel" animation-mixer></a-gltf-model>
    </a-entity>

    <a-entity mindar-image-target="targetIndex: 1">
      <a-gltf-model rotation="0 0 0" position="0 -0.25 0" scale="0.05 0.05 0.05" src="#bearModel" animation-mixer></a-gltf-model>
    </a-entity>
  </a-scene>
</body>
</html>

IMAGE.HTML:

<html>
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  <script src="https://cdn.jsdelivr.net/gh/donmccurdy/aframe-extras@v7.0.0/dist/aframe-extras.min.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/mind-ar@1.2.5/dist/mindar-image-aframe.prod.js"></script>
</head>
<body>
  <a-scene mindar-image="imageTargetSrc: https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.2.5/examples/image-tracking/assets/band-example/band.mind;" color-space="sRGB" renderer="colorManagement: true, physicallyCorrectLights" vr-mode-ui="enabled: false" device-orientation-permission-ui="enabled: false">
    <a-assets>
      <a-asset-item id="bearModel" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.2.5/examples/image-tracking/assets/band-example/bear/scene.gltf"></a-asset-item>
      <a-asset-item id="raccoonModel" src="https://cdn.jsdelivr.net/gh/hiukim/mind-ar-js@1.2.5/examples/image-tracking/assets/band-example/raccoon/scene.gltf"></a-asset-item>
    </a-assets>

    <a-camera position="0 0 0" look-controls="enabled: false"></a-camera>

    <a-entity mindar-image-target="targetIndex: 0">
      <a-gltf-model rotation="0 0 0" position="0 -0.25 0" scale="0.05 0.05 0.05" src="#raccoonModel" animation-mixer></a-gltf-model>
    </a-entity>

    <a-entity mindar-image-target="targetIndex: 1">
      <a-gltf-model rotation="0 0 0" position="0 -0.25 0" scale="0.05 0.05 0.05" src="#bearModel" animation-mixer></a-gltf-model>
    </a-entity>
  </a-scene>
</body>
</html>

CHAPTER – 7
CONCLUSION
We studied QR code technology, its benefits, its application areas, and its impact
on the marketing and technological world. QR codes were initially developed for
inventory tracking, but nowadays they find applications in many new areas such as
marketing, advertising, secure payment systems, and the education industry.
Adoption of QR codes has grown rapidly in recent years and the number of users has
increased exponentially, thanks to features such as high data storage capacity,
fast scanning, error correction, direct marking, and ease of use. The integration
of QR codes with Virtual Reality (VR) technology offers a promising avenue for
creating immersive and interactive experiences across various fields. By leveraging
the simplicity and accessibility of QR codes to trigger and navigate VR content,
this approach addresses several challenges and enhances user engagement.

The combination of QR codes and VR technology holds significant potential for
transforming how we interact with digital content. This integration offers numerous
advantages, including enhanced user engagement, simplified access to information,
innovative educational and training applications, improved marketing strategies,
and cost-effectiveness. By addressing the current challenges and leveraging ongoing
technological advancements, the adoption of QR-code-triggered VR experiences is
poised to grow, unlocking new possibilities for immersive and interactive
experiences across various industries.

REFERENCES
[1] AZUMA, R. T. 1997. A survey of augmented reality. In Teleoperators and Virtual
Environments, 355–385.

[2] CHEN, F.-C. 2007. Designing a Personalized Mobile Shopping System for Cell
Phones by QR Code. Master’s thesis, Tatung University.

[3] FIALA, M. 2007. Webtag: A worldwide internet-based AR system. In 6th
International Symposium on Mixed and Augmented Reality.

[4] PARIKH, T. S., AND LAZOWSKA, E. D. 2006. Designing an architecture for
delivering mobile information services to the rural developing world. In
Proceedings of the Seventh IEEE Workshop on Mobile Computing Systems and
Applications, vol. 21, 31–33.

[5] PSYTEC INC., 1999. Psytec Inc. QR Code.

[6] REKIMOTO, J., AND AYATSUKA, Y. 2000. CyberCode: Designing augmented reality
environments with visual tags. In Proceedings of DARE 2000 on Designing Augmented
Reality Environments.

[7] SEINO, K., KUWABARA, S., MIKAMI, S., TAKAHASHI, Y., YOSHIKAWA, M., NARUMI, H.,
KOGANEZAKI, K., WAKABAYASHI, T., AND NAGANO, A. 2004. Development of the
traceability system which secures the safety of fishery products using the QR Code
and a digital signature. In Proceedings of Marine Technology Society/IEEE
TECHNO-OCEAN, vol. 1, 476–481.

[8] TAKAO, N., 2007. libdecodeqr. http://trac.koka-in.org/libdecodeqr.

[9] TATENO, K., KITAHARA, I., AND OHTA, Y. 2007. A nested marker for augmented
reality. In Proceedings of IEEE Virtual Reality Conference (VR ’07), 259–262.