Hyundai Wearable Robotics For Walking Assistance Offer A Full Spectrum of Mobility
"Using LabVIEW and the LabVIEW RIO architecture allowed us to reduce development and test
time for our new robot control algorithm to just one week, compared to one month with a text-
based approach. We are able to prototype with software and hardware faster and adapt to rapidly
changing control requirements."
The Challenge:
Developing a system for a wearable walking-assistance robot that runs complex control
algorithms, captures data from multiple sensors simultaneously, and performs real-time control
of multiple actuators.
The Solution:
Using the LabVIEW RIO platform, including a CompactRIO embedded system and a
Single-Board RIO real-time controller with an FPGA control architecture, to acquire data from
various sensors and to control peripheral units, high-speed communication devices, and
actuators; and using LabVIEW software to acquire reliable data through real-time analysis and
to apply various robot control algorithms, dramatically reducing development time.
Author(s):
DongJin Hyun, PhD - Hyundai Motor Company
The Central Advanced Research and Engineering Institute at Hyundai Motor Company develops
future mobility technologies. Rather than provide conventional vehicle products to customers,
this research center creates new mobility devices with a wide range of speeds for a variety of
people, including the elderly and the disabled. As our society ages, there is a greater need for
systems that can aid mobility. Thus, we are developing wearable exoskeleton robots with NI
embedded controllers for the elderly and patients with spinal cord injuries to use.
In the field of wearable robotics, the physical interface between the human body and the robot
raises engineering challenges in mechanical design, control architecture, and actuation
algorithm design. The space and weight allowed for electrical devices are extremely limited
because a wearable robot must be put on like a suit. Additionally, the robot's overall control
sampling rate must be fast enough that it does not impede human motion and can react properly
to external forces. Also, many questions remain about human augmentation and assistance
control algorithms for wearable robots, even though robotics researchers' efforts have already
produced successful wearable robot demonstrations. Given these constraints, our group's main
requirements for the robots' main controller were compact size and low weight, a sufficiently
fast control sampling rate, and the flexibility to accommodate changing sensors and control
methods.
System Configuration
The real-time control and FPGA hardware environment ensures reliability and stability by
providing I/O compatible with a wide range of robotic control devices. For instance, while
building our wearable robots, the overall control architecture changed drastically several times
due to sensor replacements or changes in the control communication method. However, the
unique onboard combination of real-time controller and FPGA features in NI products let our
group manage these changes promptly, which shortened our development time.
In addition, adopting the compact sbRIO-9651 System on Module (SOM) device helped us
reduce the robot’s weight to less than 10 kg while maximizing battery efficiency through a low-
power base system configuration.
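The division of labor described above, with deterministic I/O handled close to the hardware and control algorithms running on the real-time processor, can be illustrated with a minimal fixed-rate control loop. This is a hedged sketch in Python, not the actual LabVIEW implementation; names such as `read_sensors`, `compute_torques`, and `write_actuators` are hypothetical placeholders:

```python
import time

CONTROL_PERIOD_S = 0.001  # hypothetical 1 kHz control rate


def read_sensors():
    # Placeholder: on the real system the FPGA would sample encoders,
    # torque sensors, and IMUs and hand the values to the RT loop.
    return {"hip_angle": 0.0, "knee_angle": 0.0}


def compute_torques(state):
    # Placeholder for the walking-assistance control algorithm.
    return {"hip": 0.0, "knee": 0.0}


def write_actuators(torques):
    # Placeholder: commands would go to motor drivers via the FPGA.
    pass


def control_loop(iterations):
    # Run sense -> compute -> actuate at a fixed period, sleeping
    # until each absolute deadline so the rate stays constant even
    # if one iteration runs slightly long.
    next_deadline = time.monotonic()
    for _ in range(iterations):
        state = read_sensors()
        torques = compute_torques(state)
        write_actuators(torques)
        next_deadline += CONTROL_PERIOD_S
        time.sleep(max(0.0, next_deadline - time.monotonic()))
```

Scheduling against an absolute deadline, rather than sleeping a fixed amount after each iteration, is what keeps the loop rate steady; on the actual hardware the FPGA clock provides this determinism.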
As robots take on more complex tasks, the number of sensors and actuators grows significantly
and the complexity of the control algorithms grows even faster. Simultaneously processing data
from multiple sensors while sending commands to multiple actuators therefore becomes one of
the most important challenges in robotics. LabVIEW supports concurrent visualization for
intuitive signal processing of the sensors installed on our robots and for further control
algorithm design during experiments. Lastly, NI products are expandable and compatible, so we
may be able to use smart devices as user interfaces (UIs) in the future.
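Handling many sensor streams at once, as described above, is essentially a producer-consumer problem. The following is a minimal, hypothetical sketch using Python threads and a queue, standing in for the dataflow parallelism LabVIEW provides natively; the sensor values here are synthetic:

```python
import queue
import threading


def sensor_producer(sensor_id, samples, out_queue):
    # Each simulated sensor pushes sequenced readings, then a
    # sentinel (seq=None) to signal end of stream.
    for seq in range(samples):
        out_queue.put((sensor_id, seq, 0.0))
    out_queue.put((sensor_id, None, None))


def collect(num_sensors, samples_per_sensor):
    # Gather readings from all sensors concurrently into one list.
    q = queue.Queue()
    threads = [
        threading.Thread(target=sensor_producer, args=(s, samples_per_sensor, q))
        for s in range(num_sensors)
    ]
    for t in threads:
        t.start()

    readings = []
    finished = 0
    while finished < num_sensors:
        sensor_id, seq, value = q.get()
        if seq is None:
            finished += 1  # one sensor stream ended
        else:
            readings.append((sensor_id, seq, value))

    for t in threads:
        t.join()
    return readings
```

In LabVIEW the same pattern falls out of parallel loops and queues on the block diagram, which is one reason concurrent acquisition from many sensors is straightforward there.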
Figure 3. LabVIEW Front Panel for Robot Control
Following the demonstration of the wearable Life-Caring Exoskeleton for walking assistance for
the elderly at NIWeek 2015, we unveiled a wearable Medical Robot for people with paraplegia,
which was also designed using LabVIEW and CompactRIO. In a joint clinical demonstration
with the Korea Spinal Cord Injury Association in January 2016, a paraplegic patient equipped
with this Medical Robot succeeded in sitting down, standing up, and walking on flat ground. The
patient who participated in this clinical trial is paralyzed in the lower half of the body (injury at
the 2nd and 3rd lumbar vertebrae) with both motor and sensory paralysis, but walked
successfully with the assistance of the wearable Medical Robot after a short training period.
Building on this achievement and our current development progress, we expect to manufacture
a lighter, improved product with added functions by 2018 and to begin mass production in 2020.
We plan to integrate smart devices into the UI to address future challenges. Currently, robots for
people with lower-body disabilities use crutches as wireless UIs for changing configuration,
such as switching among walking, sitting, stair-climbing, stair-descending, and normal modes.
Embedding smart devices into this kind of UI would let users tune additional parameters,
including stride length, time per step, and depth/width for sitting on a chair. Data on walking
patterns and normal activity range is also useful for treatment and rehabilitation: rehabilitation
experts or doctors can configure more advanced parameters, such as forced walking time or
joint movement adjustments, for ongoing treatment.
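The user-tunable parameters mentioned above (stride, time per step, sitting depth and width) could be represented as a validated configuration object that a smart-device UI writes and the controller reads. This is a hypothetical sketch; the field names, defaults, and safety bounds are invented for illustration:

```python
from dataclasses import dataclass


@dataclass
class GaitParameters:
    stride_m: float = 0.40     # stride length in meters (hypothetical default)
    step_time_s: float = 2.0   # time to take one step, in seconds
    sit_depth_m: float = 0.45  # how far back to sit on a chair
    sit_width_m: float = 0.30  # lateral stance width while sitting

    def validate(self):
        # Reject values outside conservative, made-up safety bounds
        # before the controller ever acts on them.
        if not 0.1 <= self.stride_m <= 0.8:
            raise ValueError("stride out of range")
        if not 0.5 <= self.step_time_s <= 10.0:
            raise ValueError("step time out of range")
        if self.sit_depth_m <= 0 or self.sit_width_m <= 0:
            raise ValueError("sitting geometry must be positive")
        return self
```

Validating at the UI boundary keeps unsafe settings from reaching the real-time loop, which matters when clinicians and patients can both adjust parameters remotely.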
Author Information:
DongJin Hyun, PhD
Hyundai Motor Company
37, Cheoldobangmulgwan-ro
Uiwang-si, Gyeonggi-do 437-815
South Korea
Tel: +82 (031) 596 0920
mecjin@hyundai.com