
HCI Assignment

Punjab University – College of Information & Technology


Submitted By: Hafiz Furqan Ahmad | BSE-F22-M503


Submitted To: Madam Mudassira
Topic: Advancements in HCI
Dated: Oct 11, 2024
Table of Contents
Introduction
1. Virtual and Augmented Reality (VR/AR)
   1.1 Technology Category
   1.2 How it Helps Users
   1.3 Technology Description
   1.4 Technology Timeline
   1.5 Presented By (Researchers/Companies)
2. Brain-Computer Interfaces (BCI)
   2.1 Technology Category
   2.2 How it Helps Users
   2.3 Technology Description
   2.4 Technology Timeline
   2.5 Presented By (Researchers/Companies)
3. Leap Motion Controller
   3.1 Technology Category
   3.2 How it Helps Users
   3.3 Technology Description
   3.4 Launch and Evolution
   3.5 Presented By (Researchers/Companies)
References
Latest Advancements in HCI and
Their Impact on the Community
Introduction
Human-Computer Interaction (HCI) has come a
long way, changing how people use computers
and digital devices. It started with command-line
interfaces where users had to type commands,
and later moved to graphical user interfaces
(GUIs) with icons and windows, making
computers easier to use. Today, HCI has evolved
even further with new technologies like Virtual and
Augmented Reality (VR/AR), Brain-Computer
Interfaces (BCI), and Touchless Interaction,
which make interacting with digital systems more natural, immersive, and efficient.
These advancements are improving industries like healthcare, education, and
entertainment. VR/AR allows users to experience virtual worlds or add digital elements
to the real world, making tasks like gaming or medical training more interactive. BCI
enables direct communication between the brain and devices, helping people with
disabilities and advancing medical technology. Touchless Interaction, driven by the
need for safer, contactless technology during the COVID-19 pandemic, uses
gestures, voice, and eye-tracking to control systems without touching them. This
paper explores these three HCI technologies and how they are changing our daily
lives and communities.

1. Virtual and Augmented Reality (VR/AR)


1.1 Technology Category: Hardware and Software
Virtual Reality (VR) and Augmented Reality (AR) are
technologies that bridge both hardware and software
domains. On the hardware side, VR requires
devices like headsets and motion controllers to
fully immerse users in virtual environments, while
AR relies on cameras, sensors, and AR-enabled
devices (like smartphones or smart glasses) to
overlay digital content on the real world. Software
plays a crucial role in processing inputs, generating
virtual environments, and seamlessly blending digital and physical spaces through
sophisticated algorithms and AI.

1.2 How it Helps Users


VR and AR significantly enhance user experiences by making interactions more
immersive, intuitive, and engaging. In VR, users are transported to entirely digital
worlds where they can interact with 3D environments, while AR enriches the real world
by overlaying relevant digital information, images, and animations. These technologies
benefit users across various fields: in healthcare, VR assists in pain management and
medical training, while AR aids surgeons by providing real-time patient data; in
gaming, VR delivers immersive experiences, while AR provides real-time overlays of
game data and statistics; in education, both VR and AR facilitate hands-on learning
through interactive simulations.

1.3 Technology Description


Virtual Reality (VR) creates entirely synthetic
environments that replace a user’s real-world
surroundings, offering a fully immersive
experience where users can interact with a
virtual world through headsets and motion
sensors. Augmented Reality (AR), on the other
hand, overlays digital content—such as 3D
images, data, or animations—onto the user’s
view of the real world, blending physical and
digital elements seamlessly. Both technologies
rely on advanced hardware (such as VR
headsets, AR-enabled smartphones, and smart glasses) and software to detect user
inputs like gestures, movement, or voice commands, enabling real-time interaction
with digital elements.
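To make the overlay step concrete, the short Python sketch below shows how a single 3D anchor point can be projected onto the camera image, which is the core calculation behind placing digital content over a live view of the real world. It is only an illustration: the camera intrinsics, pose, and anchor point are invented values, and production AR platforms perform this step (together with tracking) internally.

# Illustrative sketch: project a 3D anchor point onto the camera image, the
# core step behind overlaying digital content on a live camera view. All
# camera parameters below are invented for demonstration purposes.
import numpy as np

# Assumed camera intrinsics: focal lengths (fx, fy) and principal point (cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Assumed camera pose (world -> camera): identity rotation, camera 2 m back.
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])

def project(point_world):
    """Project a 3D world point to 2D pixel coordinates (pinhole model)."""
    p_cam = R @ point_world + t        # world coordinates -> camera coordinates
    u, v, w = K @ p_cam                # apply intrinsics
    return u / w, v / w                # perspective divide

# A virtual label anchored in front of the camera, slightly to the right.
print(project(np.array([0.1, 0.0, -1.5])))   # -> roughly (480.0, 240.0) pixels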

Virtual and Augmented Reality (VR/AR) technologies have made significant impacts
across various fields, particularly in healthcare, gaming, and education. In the medical
field, AR systems, such as Microsoft’s HoloLens, allow surgeons to view 3D
anatomical models and critical patient data during operations, enhancing precision
and safety. In gaming, VR creates
immersive experiences, enabling
players to feel as if they are truly
inside the game world, while AR
enhances live sports broadcasts
with real-time data and interactive
features. For instance, teams can collaborate using AR technology, visualizing
complex data in real time, which is valuable for industries requiring detailed analysis and teamwork. In
education, AR brings textbooks to life, allowing students to explore interactive 3D
models, making learning more engaging and effective. As these technologies
continue to evolve, they are poised to reshape how we interact with our surroundings
and the digital content that enriches our experiences.

1.4 Technology Timeline


The concept of Virtual Reality dates back to the
1960s, but significant advancements have
been made in recent years. VR gained mainstream
attention in 2016 with the launch of headsets like
Oculus Rift and HTC Vive, marking a major
breakthrough in immersive technology. Augmented
Reality became widely popularized in 2016 with the
release of the mobile game Pokémon Go, which
demonstrated the potential of AR for mainstream
audiences. More recently, Apple's Vision Pro headset, announced in 2023 and released
in early 2024, has marked a pivotal moment in AR/VR evolution, blending the two technologies
to create mixed-reality experiences that merge physical and digital worlds seamlessly.

1.5 Presented By (Researchers/Companies)


Many tech companies and research institutions have driven the development of VR
and AR technologies. Oculus, a subsidiary of Meta (formerly Facebook), and HTC
have been instrumental in advancing VR technology. Microsoft’s HoloLens has
played a key role in AR, particularly in industrial and healthcare applications. Apple’s
Vision Pro, introduced in 2023, represents one of the latest innovations in mixed
reality, combining AR and VR for a more integrated experience. Other significant
contributors to VR/AR technologies include Google, with AR tools for education and
mobile applications, and companies like Medivis, which specialize in AR solutions for
medical professionals.

2. Brain-Computer Interfaces (BCI)
2.1 Technology Category: Hardware and Software
Brain-Computer Interfaces (BCIs)
represent a convergence of advanced
hardware and software technologies. On
the hardware side, BCIs involve devices
that detect and process brain activity,
including non-invasive tools like
electroencephalography (EEG) headsets
or more invasive methods such as
implanted electrodes. The software side
consists of algorithms and machine
learning models that interpret brain signals and translate them into actionable
commands for controlling external devices like computers, prosthetics, or other
systems.

2.2 How it Helps Users


BCIs help users by offering direct brain-based control over digital systems, bypassing
the need for traditional physical input devices like keyboards or joysticks. For
individuals with motor disabilities, BCIs allow them to communicate and interact with
their environment using brain signals. In medical applications, BCIs help patients
control prosthetics or exoskeletons and provide diagnostic insights into neurological
disorders. Beyond medical uses, BCIs have the potential to enhance everyday
interactions with technology, enabling users to control computers, gaming systems, or
even smartphones through thought alone.

2.3 Technology Description


BCIs work by capturing brain signals, typically
in the form of electrical activity from neurons,
and converting them into commands that
machines can understand. Non-invasive BCIs
use methods like EEG to record brainwaves
from the scalp, while invasive BCIs involve
implanting electrodes in the brain to obtain
more accurate readings. The system’s
software interprets these signals, allowing
users to control external devices such as
robotic limbs, communication interfaces,
or virtual environments. BCIs are capable of
real-time responses, enabling users to execute commands like typing messages,
moving prosthetics, or even navigating digital environments with their thoughts.
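As a rough idea of what the interpreting software does, the Python sketch below filters one EEG channel into the 8-12 Hz (mu/alpha) band, measures the band power, and turns it into a simple binary command. The signal, the threshold, and the decision rule are synthetic placeholders; real BCIs rely on calibrated machine-learning classifiers rather than a fixed threshold.

# Illustrative sketch of the decoding step in a non-invasive (EEG-based) BCI:
# band-pass filter a channel, estimate mu/alpha-band power, and map it to a
# command. Signal and threshold are synthetic; real systems use trained models.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz, typical for consumer EEG headsets

def band_power(eeg, low=8.0, high=12.0):
    """Average power of the signal in the given frequency band."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    return np.mean(filtfilt(b, a, eeg) ** 2)

# One second of synthetic EEG: background noise plus a strong 10 Hz rhythm.
t = np.arange(FS) / FS
eeg = 0.5 * np.random.randn(FS) + 2.0 * np.sin(2 * np.pi * 10.0 * t)

# Hypothetical decision rule: strong mu-band power -> "rest", weak -> "move"
# (mu power typically drops when a user imagines moving a limb).
command = "rest" if band_power(eeg) > 1.0 else "move"
print(command)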

2.4 Technology Timeline


While research into brain-computer communication began in the 1970s, significant
advancements in BCIs have occurred over the last two decades. In 2019, Elon Musk’s
Neuralink made headlines by announcing plans for brain implants aimed at restoring
mobility and communication for individuals with neurological disorders. More recently,
in 2021, Neuralink demonstrated a monkey controlling a computer game using a BCI.
Other companies have also advanced the field: Paradromics is developing high-data-rate
implantable neural interfaces, while CTRL-Labs (acquired by Meta) has focused on
non-invasive, wrist-worn neural interfaces for commercial applications in communication and
gaming. These developments signal the rapid progress of BCIs from experimental
technology to practical and accessible tools.

2.5 Presented By (Researchers/Companies)


Several pioneering companies and research institutions have driven BCI innovation.
Neuralink, founded by Elon Musk, is one of the most prominent names in BCI
research, with a focus on developing brain implants that could merge human cognition
with artificial intelligence. Other key players include Paradromics, which is developing
high-data-rate neural interfaces for medical and commercial applications, and CTRL-
Labs, a company acquired by Meta that focuses on non-invasive BCI technology for
human-computer interaction. In the academic world, universities like the University of
California, San Francisco (UCSF), and agencies such as DARPA have made
significant contributions to BCI development, particularly in neuroprosthetics and
rehabilitation.

3. Leap Motion Controller
3.1 Technology Category: Hardware (with supporting Software)
The Leap Motion Controller is a small, rectangular piece of hardware equipped with
infrared cameras and motion sensors. The hardware’s main job is to detect and
track hand and finger movements in 3D space. Here’s how it works:
• Infrared Cameras: The device emits infrared light, which is invisible to the
human eye. The cameras then capture the reflections of this light from the
user's hands and fingers, allowing the device to create a detailed model of the
hand’s position, orientation, and movement.
• Motion Sensors: These sensors detect precise movements, from broad hand
gestures to subtle finger twitches. The device is designed to recognize the
position of each finger individually, enabling highly accurate tracking in real
time.
The physical hardware is small and compact, making it easy to connect to a computer
via USB. This makes it versatile and portable, allowing users to employ it for various
applications, from gaming to virtual reality.

Supporting Software: While the hardware captures the movement, it’s the software
that interprets and translates these inputs into actions that the computer or application
can understand. Here’s how the software aspect
works:
• Motion Tracking Algorithms: The Leap
Motion software uses sophisticated
algorithms to interpret the raw data from
the sensors, converting it into actionable
information. For instance, if a user
pinches their fingers together in the air,
the software can recognize that as a
specific gesture and trigger an action like
clicking a virtual button.
• Gesture Recognition: The software
allows the user to configure specific
gestures and assign them to commands. For example, swiping left or right could
scroll a webpage, or opening and closing a hand could grab and release a
virtual object.
• Integration with Application: Leap Motion’s software development kit (SDK)
enables developers to integrate the controller’s capabilities into different
applications. This is especially useful in fields like virtual reality (VR) and
augmented reality (AR), where natural hand movements enhance immersion.
Developers can also create custom applications or games that fully utilize the
Leap Motion’s hardware.

Example of Hardware and Software Working Together: In a VR application, the


hardware captures the user's hand movements, while the software maps these
movements into the virtual space. For example, if a user reaches out in front of them,
the system tracks their hand’s position and represents it as a 3D hand in the virtual
world. The software further interprets gestures, allowing the user to pick up virtual
objects or interact with elements in the VR environment.
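The Python sketch below mimics, in a very simplified way, what the supporting software layer does with one frame of tracking data: it checks whether the thumb and index fingertips are close enough together to count as a pinch, and maps the palm position from the device's millimetre coordinates into the virtual scene. It does not use the real Leap Motion SDK; the coordinate values, threshold, and mapping are assumptions chosen only to illustrate the idea.

# Illustrative sketch (not the actual Leap Motion SDK): recognise a pinch from
# tracked fingertip positions and map the palm into virtual-world coordinates.
import numpy as np

PINCH_THRESHOLD_MM = 25.0  # assumed fingertip distance that counts as a pinch

def is_pinching(thumb_tip, index_tip):
    """Pinch if the thumb and index fingertips are close together."""
    gap = np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip))
    return gap < PINCH_THRESHOLD_MM

def to_virtual_space(palm_mm, scale=0.001, origin=(0.0, 1.0, -0.5)):
    """Map a palm position in device millimetres to metres in the VR scene."""
    return np.asarray(palm_mm) * scale + np.asarray(origin)

# One hypothetical frame of tracking data (millimetres, device coordinates).
thumb_tip = (10.0, 180.0, 5.0)
index_tip = (28.0, 190.0, 2.0)
palm = (0.0, 200.0, 0.0)

if is_pinching(thumb_tip, index_tip):
    print("grab virtual object at", to_virtual_space(palm))
else:
    print("open hand at", to_virtual_space(palm))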

3.2 How it Helps Users
The Leap Motion Controller enables users to
interact with their computers and virtual
environments using natural hand movements
and gestures, eliminating the need for physical
contact with traditional input devices like
keyboards and mice. Users can manipulate
virtual objects by simply
moving their hands in front of the device,
offering a highly immersive experience. This
technology is particularly beneficial in gaming,
3D design, virtual reality (VR), and augmented
reality (AR) applications, where precision and
intuitive interaction are essential. By providing a
touchless, gesture-based interface, the Leap Motion Controller enhances user
engagement and allows for more fluid and dynamic interactions with digital content.

3.3 Technology Description


The Leap Motion Controller is a small, USB-connected device that uses infrared
sensors and cameras to track the precise movements of hands and fingers. It captures
motion in three-dimensional space, allowing for high-fidelity gesture tracking.
Unlike traditional input methods such as a mouse, keyboard, or touchscreen, the Leap
Motion can detect subtle finger motions, such as pinching, grabbing, or swiping, and
translate them into on-screen actions.

3.4 Launch and Evolution


The Leap Motion Controller was first introduced in 2013, and its technology has
continued to evolve over the years. Initially, it was aimed at consumers, particularly in
gaming and creative fields, but it has since found applications in professional sectors
like healthcare, robotics, and education.

3.5 Presented By (Researchers/Companies)


The Leap Motion Controller was developed by Leap Motion, Inc., a company based in
San Francisco. The company has been instrumental in advancing gesture recognition
technology and making touchless interaction more accessible and precise.

References
• “VR and AR: Key Technologies and Applications in Healthcare”, MedTech
Insights, 2023.
• “Apple Vision Pro and the Future of Mixed Reality”, TechCrunch, 2023.
• “The Evolution of VR and AR in Education and Gaming”, Digital Trends, 2022.
• “Brain-Computer Interface: The Future of Neural Technology”, MIT Technology
Review, 2021.
• “Neuralink and the Brain’s Frontier”, Wired, 2021.
• “CTRL-Labs and the Future of Non-Invasive BCIs”, TechCrunch, 2022.
