connected. The bionic leg, on the other hand, was constructed out of more than one piece: the knee, the foot and the leg were assembled to create a fully functional bionic limb [7]. This limb has a brain of its own and can sense what surrounds it through processors that analyze the inputs. According to the head of this group of engineers, the leg costs more than a few sedan cars.
Prosthetic limbs need to be measured and fitted to the patient according to his needs [8]. To fit a prosthetic limb on a patient, intense medical observation and a training course for the patient are needed so that he can use the limb comfortably. Several techniques are used to control robotic arms; the top three methods are highlighted hereafter. The first method uses an electroencephalogram (EEG) device [9], which records the person's brain waves when he is thinking of a certain action or making a facial expression. These readings are then converted into commands for the arm. The author in [10] states that the mind regulates its activities by means of electric waves registered in the brain, which emits electrochemical impulses of different frequencies that can be registered by an electroencephalogram. For instance, beta waves are emitted when a person feels nervous or afraid, with frequencies ranging from 13 to 60 Hz. Alpha waves are emitted when a person feels mentally and physically relaxed, with frequencies from 7 to 13 Hz. Delta waves, on the other hand, are emitted when a person is in a state of unconsciousness. Advances in technology have made it possible to process these EEG frequencies and data directly in real time through a brain-computer interface, which is a combination of hardware and software.

The second method is surgical implantation. The arm is surgically connected to the person's torso, and connections are also made to the nerves so that electrical signals can be read, filtered and converted into commands. The last control method consists in using sensors connected to the robotic arm to take specific readings. Some of the most common sensors used in this case are EMG sensors, gyroscopes and accelerometers. They allow the user to be aware of the position of the arm as well as to extend and close it. The arm types are summarized and compared with each other in Table 1 in terms of usage, flexibility and potential.

Table 1. Arm types comparison

Arm Type | Usage | Flexibility | Potential
Robotic | Using mounts and motors | Depends on the material and design | Big potential with the rise of 3D printers
Surgical | Surgically implanted | Depends on the training and the ability to analyze the surroundings | Very high, especially for war victims
Prosthetic | Similar to the surgical limbs, but with less medical attention (custom made for each person using exact measurements) | Very high flexibility | Overshadowed by the surgical and the robotic limbs
Static | No mechanical use | Not flexible | No potential

The control techniques are summarized and compared with each other in Table 2 in terms of cost, installation, degree of control and accuracy.

Table 2. Control techniques comparison

Type | Approximate Cost (U.S. Dollars) | Installation | Degree of Control | Accuracy
EEG | 100 – 400 | Detachable | Complete control | Accurate
Surgical | 10,000 – 120,000 | Permanent | Complete control | Very accurate
Sensors | Below 100 | Detachable | Limited | Accurate

The EEG method is not only cost effective, but also accurate, and it gives the patient complete control of the arm. It also gives the user the luxury of taking the headset off when feeling discomfort. EEG is a noninvasive method of monitoring brain activity that has been used in medical applications for a very long time. Typically, it uses electrodes placed on the outside of the head and measures the voltage oscillations caused by ionic currents in the neurons of the brain. The Emotiv EPOC is an example of an EEG headset; it has 14 sensors and an internal sampling rate of 2048 Hz. After filtering the signals, it sends the data to the computer at approximately 128 Hz. The signals are transferred from the headset to the computer wirelessly, which offers much greater mobility, and instead of requiring a special gel, the electrodes of the EPOC simply need to be dampened with a saline solution, which is common and disinfectant.
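To illustrate how the frequency bands described above can be separated in software, the following Python sketch estimates delta, alpha and beta band power for a single EEG channel sampled at 128 Hz, the approximate output rate of the EPOC mentioned above. The signal source, the 2-second analysis window and the delta band edges are illustrative assumptions, not details taken from the system described here.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # Hz, approximate EPOC output rate quoted above

# Alpha and beta edges follow the ranges quoted in the text; delta edges are assumed.
BANDS = {"delta": (0.5, 4.0), "alpha": (7.0, 13.0), "beta": (13.0, 60.0)}

def band_powers(eeg_channel, fs=FS):
    """Estimate average power in each band for one EEG channel (1-D array, microvolts)."""
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=fs * 2)  # 2-second analysis windows
    df = freqs[1] - freqs[0]
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(np.sum(psd[mask]) * df)  # integrate PSD over the band
    return powers

if __name__ == "__main__":
    # Synthetic 10-second test signal: a strong 10 Hz (alpha) component plus noise.
    t = np.arange(0, 10, 1 / FS)
    signal = 20 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
    print(band_powers(signal))  # the alpha power should dominate
```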
The project presented in this paper aims to develop a low-cost and versatile human-like prosthetic arm controllable via brain activity using EEG neuro-feedback technology. The arm is equipped with a network of smart sensors and actuators that give the patient intelligent feedback about the surrounding environment and the object in contact. It also allows the arm to react and execute pre-programmed series of actions in critical cases (extremely hot or fragile objects, etc.). A first prototype has been developed to test the prosthetic arm with the embedded sensors; the arm is entirely 3D printed for it to be cost efficient, and all parts are printed separately and then assembled together. This prototype focuses on the arm-environment interaction. A second prototype based on the EEG
www.astesj.com 892
T. Beyrouthy et al. / Advances in Science, Technology and Engineering Systems Journal Vol. 2, No. 3, 891-899 (2017)
control has also been developed and is still under test. Preliminary experimental results show that the EEG technique is a promising alternative to other existing techniques.
2. System Architecture

Figure 1. Mind-controlled smart prosthetic arm architecture

2.1. Input Unit

In this unit, brain signals were captured by an array of advanced EEG sensors communicating with a Signal Processing Unit via low-power and secure Bluetooth connectivity. The device has an internal sampling rate of 2048 Hz and 14 sensors arranged according to the international 10-20 system, as shown in Figure 2, in order to cover the most relevant areas over the brain. This allows a maximum and efficient coverage of the brain activity. EEG signals are acquired using the Emotiv EPOC wireless recording headset bearing 14 channels (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4), referenced to the common mode sense [12].

2.2. Processing Unit

The EEG signals provided by the input unit were sampled and processed on a lightweight wearable device, the Processing Unit. The processing activity consists of two main parts: a pattern-recognition part that identifies the different brain behaviors captured by the input unit, and a command part that generates the series of commands to be sent to the mechatronic system of the arm.

This unit was programmed to distinguish between several states of the mind representing different levels of "meditation" and "focus". Every mind state was captured and encoded to represent a set of desired tasks to be performed by the arm. Due to the diversity and the complexity of brain wave activities among different humans, machine-learning techniques were required to train patients to trigger specific arm movements according to a set of mind states.
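A minimal sketch of this two-stage split (pattern recognition followed by command generation) is given below. The score names, thresholds and command strings are illustrative assumptions; the actual unit relies on machine-learning models trained per patient, as stated above.

```python
from dataclasses import dataclass

@dataclass
class MindState:
    meditation: float  # 0.0 - 1.0 score reported by the headset (assumed scale)
    focus: float       # 0.0 - 1.0 score reported by the headset (assumed scale)

# Pattern-recognition part: quantize the two scores into a named brain state.
def classify(state: MindState, threshold: float = 0.6) -> str:
    if state.focus >= threshold and state.meditation < threshold:
        return "FOCUSED"
    if state.meditation >= threshold and state.focus < threshold:
        return "RELAXED"
    return "NEUTRAL"

# Command part: each recognized state is encoded as a series of arm commands.
COMMANDS = {
    "FOCUSED": ["CLOSE_HAND"],
    "RELAXED": ["OPEN_HAND"],
    "NEUTRAL": ["HOLD_POSITION"],
}

def commands_for(state: MindState) -> list[str]:
    return COMMANDS[classify(state)]

if __name__ == "__main__":
    print(commands_for(MindState(meditation=0.2, focus=0.8)))  # ['CLOSE_HAND']
```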
2.3. Electro-mechanical Unit

This unit was designed and built from various lightweight, high-strength materials that can handle high impacts as well as fragile elements. It integrated servos capable of handling 800 oz-in of stall torque. These servos were strategically placed to minimize hardware and facilitate complex moves. A microcontroller was also integrated into this setup to provide the interface between the Mechanical Unit and the Processing Unit. It can also be programmed to perform a series of predefined movements, allowing the arm to exhibit a sophisticated and realistic hand behavior.

2.4. Interface Unit – Smart Sensor Network

This unit is composed of a network of smart sensors, including temperature, skin pressure and ultrasonic proximity sensors, accelerometers, potentiometers, strain gauges and gyroscopes. The main features of this unit allow the arm to interact with and adapt to the surrounding environment. Moreover, a bi-directional communication was required to give commands to the arm and provide feedback to the patient. This integrated network of sensors and actuators required custom communication protocols and networking techniques that allow seamless interaction and control between the arm and the patient.
By default, controlling the arm was handled by the brain (patient); however, control can be transferred to the arm so that it proactively protects itself against damage.

Due to its unique features, the proposed Mind-controlled Smart Prosthetic Arm should be able to improve the quality of life of millions of patients and their families around the world. Its low-cost design makes it accessible to a wide range of beneficiaries, especially those with limited or no access to advanced health care.
3. Technical Specifications

3.1. Hardware Overview

The Smart Prosthetic Arm architecture was based on an advanced electro-mechanical system controlled by EEG neuro-feedback technology. This architecture is divided into four main units in terms of hardware (see Figure 3 and Figure 4) [13].

Figure 3. System overview – block diagram (EEG headset, signal processing unit, wireless communication link, and the on-board control unit, mechatronic system and sensor network of the arm)

Figure 4. System overview – photo of the actual components

As shown in Figure 4, component number 1 is the processing unit that receives the EEG waves from the headset and communicates with the control unit; component number 2 is the Emotiv EEG headset with 14 sensors; component number 3 corresponds to the embedded sensors (temperature, proximity and touch); component number 4 is the control unit, which is Arduino based; and component number 5 is the XBee driver (wireless communication with the processing unit).

The Data Sampling Unit includes the 14 EEG sensors implemented in the EEG-based headset. This unit converts the EEG signals to digital signals and sends them to the processing unit through a Bluetooth communication channel.

The Wireless Communication Unit includes a low-power Bluetooth module interfaced with the Raspberry Pi III microcomputer. This part implements a communication protocol between the user (EEG sensors installed on the head) and the embedded microcontroller in the prosthetic arm.

The Computing and Processing Unit: sampled EEG signals are channeled through the wireless communication unit to reach the Computing and Processing Unit, which is embedded in the arm and includes a Raspberry Pi III microcomputer interfaced with an Arduino Mega microcontroller that handles the mechanical servo units installed in the arm. The main function of the processing unit is to treat the digitized EEG signals: it is programmed to compare the headset readings with a set of predefined patterns, and it is also programmed with multiple hand reflexes, triggered by the smart sensor network embedded in the arm. It gives the arm a human-like behavior via smooth movements and smart reflexes.
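The link between the Raspberry Pi and the Arduino Mega can be pictured with the short Python sketch below. It is only an outline under stated assumptions: the serial device name, baud rate and one-byte command codes are placeholders, since the exact protocol is not specified here.

```python
import serial  # pyserial
import time

# Assumed single-byte command codes shared with the Arduino firmware.
CMD = {"OPEN_HAND": b"\x01", "CLOSE_HAND": b"\x02", "HOLD_POSITION": b"\x00"}

def send_command(port: serial.Serial, name: str) -> None:
    """Forward one labeled command variable to the Arduino Mega over UART."""
    port.write(CMD[name])
    port.flush()

if __name__ == "__main__":
    # Typical device name when the Mega is attached to the Pi over USB (assumed).
    with serial.Serial("/dev/ttyACM0", baudrate=9600, timeout=1) as mega:
        time.sleep(2)  # give the Arduino time to reset after the port is opened
        send_command(mega, "CLOSE_HAND")
```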
Finally, the Electro-Mechanical Unit includes nine servo motors installed on the 3D-designed model. The 3D hand model is built from various lightweight, high-strength materials that can handle high impacts as well as fragile elements. This unit integrates all the servos, which are capable of handling 800 oz-in of stall torque. It is embedded in the arm and links the servos to the joints to perform the different motions. The servos are strategically placed to minimize hardware and facilitate complex moves. A microcontroller is also integrated into this setup to provide the interface between the Mechanical Unit and the Processing Unit.

One of the main features of this arm is its affordability, which is achieved mainly because the arm is entirely 3D printed, one of the most affordable yet durable ways of manufacturing such a product today.

For the building process, different materials were used to create the full arm. The use of different materials provided a more practical build.

The hand was created using a material called EcoPLA, which is PLA mixed with certain chemicals to provide a sense of heat recognition: the material changes color when exposed to heat. Together with the added sensors, this provides a visual aid so that the user can estimate the temperature of the held item. EcoPLA is considered a clean material, mainly extruded from starch, corn and sugar cane, which makes it more environmentally friendly.

For the mechanical moving parts that need better endurance and will be exposed to more pressure, PLA is used as the build material because of its strength, flexibility and heat resistance, which make it more reliable; therefore, it is the best material to
execute the job with no flaws. It is also easily sanded and machined.

For the forearm, shoulder and elbow, PLA is used to complete the build. PLA is a printing material that is easily machined under normal conditions, whereas ABS needs a special heating bed during printing. PLA is also an environmentally friendly plastic that can be created from a mixture of different plant substances, such as potatoes and corn, yet it does not biodegrade easily. PLA is also considered a strong and rigid material, but it has a lower melting temperature, which makes it unreliable under pressure.

The project required the use of nine servomotors in different places in the arm. To make sure the hand is able to carry a load varying from 3 to 5 kg, we first had to determine how much torque each servo motor needs. For the static mode, the moment is given by the formula shown in (1):

M = (L₁ · cos θ₁ + L₂ · cos θ₂) × 9.8    (1)

where M is the moment, L₁ is the length between the force and the first angle θ₁, and L₂ is the length between the force and the second angle θ₂, as shown in Figure 5. The resulting torque requirements for different loads are listed in Table 3.

Table 3. Load calculation for a servo motor

Mass (kg) | Force (N) | Wrist servo torque (N·m) | Hand servo torque (N·m) | Biceps servo torque (N·m) | Shoulder servo torque (N·m)
1 | 9.8 | 1.96 | 3.43 | 6.615 | 7.889
2 | 19.6 | 3.92 | 6.86 | 13.23 | 15.778
3 | 29.4 | 5.88 | 10.29 | 19.845 | 23.667
5 | 49 | 9.8 | 17.15 | 33.075 | 39.445
7 | 68.6 | 13.72 | 24.01 | 46.305 | 55.223
10 | 98 | 19.6 | 34.3 | 66.15 | 78.89
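The static moment formula in (1) can be applied directly, as in the short sketch below; the link lengths and angles are placeholder values, not measurements taken from the prototype, and the load mass is factored in explicitly.

```python
import math

def static_moment(load_kg, l1_m, theta1_deg, l2_m, theta2_deg):
    """Holding moment (N·m) for a given load, following (1); the 9.8 factor
    converts the load mass in kg to a weight force in N."""
    lever = l1_m * math.cos(math.radians(theta1_deg)) + \
            l2_m * math.cos(math.radians(theta2_deg))
    return load_kg * 9.8 * lever

if __name__ == "__main__":
    # Placeholder geometry: 0.25 m and 0.10 m segments, both horizontal (worst case).
    m = static_moment(load_kg=3, l1_m=0.25, theta1_deg=0, l2_m=0.10, theta2_deg=0)
    print(f"required holding torque ≈ {m:.2f} N·m")  # ≈ 10.29 N·m
```

With these placeholder lengths the result reproduces the 10.29 entry in the hand-servo column of Table 3 for a 3 kg load, which would correspond to an effective lever arm of about 0.35 m for that joint.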
3.2. Dynamic Modeling of the Prosthetic Arm

In the time domain, the mechanical model of the Prosthetic Arm can be represented as a generalized second-order differential equation for the translational motions, as shown in (3):

f(t) = m · d²x/dt² + C₁ · dx/dt + K₁ · x(t)    (3)

where f(t) represents the tension forces applied on the two sides of the tendons to create a displacement between the fingers in different configurations, m is the mass of the movable part, x(t) is the displacement between the fingers at a given time, C₁ is the friction damping coefficient for the translational motion, and K₁ is the spring constant. The same model can be restated for the torque T(t) at the elbow and shoulder joints, as shown in (4):

T(t) = J · d²θ/dt² + C₂ · dθ/dt + K₂ · θ(t)    (4)

where θ(t) is the joint angle, J is the moment of inertia of the rotating part, C₂ is the rotational friction damping coefficient, and K₂ is the torsional spring constant.
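For illustration, the translational model (3) can be integrated numerically as sketched below; the mass, damping and spring values are placeholders rather than identified parameters of the prototype, and the tendon force is taken as constant.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not from the paper): movable mass, damping, spring constant.
m, c1, k1 = 0.05, 0.8, 40.0      # kg, N·s/m, N/m
f = lambda t: 2.0                # constant 2 N tendon force (assumed)

# State y = [x, dx/dt]; equation (3) rearranged for the acceleration.
def rhs(t, y):
    x, v = y
    return [v, (f(t) - c1 * v - k1 * x) / m]

sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0], max_step=1e-3)
print(f"steady-state finger displacement ≈ {sol.y[0, -1] * 1000:.1f} mm")  # ≈ f/K₁ = 50 mm
```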
3.3. Power Consumption

Power is supplied to all the servomotors embedded in the arm. The wireless communication unit operates with a 50 mA supply at 3.3 V, which corresponds to a power of 0.165 W.

In addition, a network of smart sensors is used, including temperature, skin pressure and ultrasonic proximity sensors, accelerometers, potentiometers, strain gauges and gyroscopes. These sensors consume a total current of 100 mA. The power consumption of the low-power single-board computer (the Computing and Processing Unit) is 0.1 W, corresponding to a current of 30 mA. To power all the units listed above, the system requires a power source with an output current of not less than 2.8 A at an output voltage of 5 V.

Two 10,000 mAh lithium-ion batteries are chosen, with an output current of 2 A each. They include a charging circuit (via USB cable) and a boost converter that provides 5 V DC. The batteries have an end-to-end efficiency of about 80%, and it is not recommended to operate the arm while the battery is being charged. With these two batteries, the arm will be operational for 7 continuous hours, knowing that the average hand movements per person during a day are equivalent to 1 to 3 hours of continuous movement, depending on the daily activity performed. In conclusion, the whole system can be operational for 2 full days.
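The battery sizing above can be checked with a short power-budget calculation. In the sketch below the servo current is an assumed placeholder (only the 2.8 A total is quoted above), so the printed value is a worst-case runtime at the full current budget; the 7-hour figure stated above corresponds to a lower average draw, since the servos do not run at peak current continuously.

```python
# Back-of-the-envelope power budget for the arm, following the figures quoted above.
sensors_mA  = 100           # smart sensor network (from the text)
wireless_mA = 50            # Bluetooth module at 3.3 V (from the text)
computer_mA = 30            # single-board computer (from the text)
servos_mA   = 2620          # assumed so that the total matches the stated 2.8 A budget

total_mA = sensors_mA + wireless_mA + computer_mA + servos_mA
capacity_mAh = 2 * 10_000   # two 10,000 mAh packs
efficiency = 0.8            # end-to-end efficiency assumed from the "80%" figure above

runtime_h = capacity_mAh * efficiency / total_mA
print(f"total draw ≈ {total_mA / 1000:.1f} A, worst-case runtime ≈ {runtime_h:.1f} h")
```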
3.4. Smart Sensors

The sensors included in the proposed solution can be classified into two categories: user-end sensors and environment-end sensors. The first category consists of the 14 EEG sensors presented previously, which are installed on the user's headset.

As for the sensors of the second category, their main purpose is to allow the arm to interact with and adapt to the surrounding environment by providing intelligent feedback about critical conditions, such as high temperature or pressure. When interfaced with the embedded microprocessor installed on the arm, this network gives the prosthetic arm a human-like behavior with smart reflexes and smooth movements. Note that the feedback coming from some of these sensors is not only used to operate some of the servos of the arm, but is also displayed on a small LCD screen mounted on the forearm.
3.5. Control

The proposed system was based on both fully autonomous and semi-autonomous control.

A bi-directional communication channel was implemented between the smart sensor network and the embedded microprocessor in such a way as to autonomously control the electro-mechanical unit and provide feedback to the user by displaying it on the LCD mounted on the arm. This setup gives the arm the ability to perform smart reflexes when it encounters delicate, dangerous or critical situations, such as protecting the arm from contact with a very hot surface or from over-squeezing fragile objects (a glass, a human hand, etc.). This integrated network of sensors and actuators requires custom communication protocols and control mechanisms that allow seamless interaction and control hand-over between the arm and the user.

By default, the brain signals control the arm movements semi-autonomously via a wireless connection. The EEG headset has a proprietary wireless USB dongle that can be connected to the processing unit via a Bluetooth module. It reads the neuro-electrical signals and interprets them as a set of predefined outputs that reflect facial expressions, mood and conscious intentions. These predefined outputs are received by the processing unit, compared with a user-dependent library of patterns and then converted into functions. These functions are then labeled using variables and sent to the Arduino microcontroller through a UART channel. Based on these variables, a certain movement of the arm occurs according to the mapping between the variables and the readings.
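As a rough illustration of the control hand-over described above, the sketch below (an outline under stated assumptions, not the authors' implementation) shows a simple arbiter that lets a reflex command raised by the sensor network override the brain-derived command before anything is sent to the servos; the command names are hypothetical.

```python
from typing import Optional

def arbitrate(brain_cmd: str, reflex_cmd: Optional[str]) -> str:
    """Semi-autonomous hand-over: the brain command drives the arm by default,
    but a reflex raised by the smart sensor network takes priority."""
    return reflex_cmd if reflex_cmd is not None else brain_cmd

# Example: a hot-surface reflex overrides the user's intention to close the hand.
print(arbitrate("CLOSE_HAND", None))              # -> CLOSE_HAND (normal operation)
print(arbitrate("CLOSE_HAND", "RELEASE_OBJECT"))  # -> RELEASE_OBJECT (reflex wins)
```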
4. Project Cost

The cost of the project was divided into component and equipment costs (servo motors, sensors, Bluetooth module, Raspberry Pi II and EEG-based headset, two USB battery packs), labor costs and the 3D printing cost of the prosthetic arm. All the required components are relatively cheap off-the-shelf products that can be purchased from multiple providers. The total cost of the system is estimated to be around 1140 USD, as shown in Table 4.

Table 4. Cost of project components

Components needed | Approximate Cost (U.S. Dollars)
EEG-based headset (including battery and software) | 400
Servo motors | 135
Bluetooth module | 95
Smart network sensors | 150
Raspberry Pi II & Arduino Mega | 60
Batteries | 100
3D model arm | 200
Total cost | 1140

The total cost of the proposed solution is relatively affordable and more economical when compared to the existing solutions (surgical, bionic and static prosthetic arms) manufactured by different companies around the world.

5. Testing and Results

During the training sessions performed, the signals corresponding to valence, engagement, frustration, meditation, and short- and long-term excitement detected by the Emotiv EPOC headset for 30 and 300 seconds are shown in Figure 6.
Figure 7. Training session with the 3D cube

Thinking of moving the cube to the left stimulated the hand of the prosthetic arm to close (see Figure 8), whereas thinking of moving the cube to the right stimulated the hand to open (see Figure 9).

The scenarios simply define the smart reflexes of the arm, which can protect the user from the surrounding environment or the people who are communicating with the user. Initially, the arm automatically checks the three sensors (pressure, IR temperature and proximity). If all of the sensors are in a safe state, the normal interaction scenarios can be initiated. On the other hand, if one of the three sensors is not in a safe state, protective scenarios such as the following are taken:

Scenario 4: If the hand holds an object, the temperature sensor embedded within the hand measures its temperature. If the temperature is too hot for human skin and reaches the pain threshold, "Action 3" is automatically initiated. Once Action 3 is initiated, a red LED lights up.

Figure 10. Flowchart of the algorithm
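The safety check summarized in the Figure 10 flowchart can be sketched in a few lines. This is only an illustrative outline: the threshold value, the sensor-reading arguments and the action labels are hypothetical placeholders, not the authors' firmware.

```python
PAIN_THRESHOLD_C = 45.0  # assumed skin-pain threshold, degrees Celsius

def check_reflexes(pressure_ok: bool, proximity_ok: bool, object_temp_c: float):
    """Return the reflex action to run, or None if every sensor is in a safe state."""
    if not (pressure_ok and proximity_ok):
        return "OTHER_PROTECTIVE_SCENARIO"   # pressure/proximity scenarios, not detailed here
    if object_temp_c >= PAIN_THRESHOLD_C:    # Scenario 4: object too hot for human skin
        return "ACTION_3"                    # e.g. release the object and light the red LED
    return None                              # safe state: normal scenarios may proceed

print(check_reflexes(True, True, 60.0))  # -> ACTION_3
print(check_reflexes(True, True, 25.0))  # -> None
```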
Conflict of Interest
References
[1] S. W. Hawking, “World report on disability,” World Health Organization,
Geneva, Switzerland, 2011.
[2] NBC News. (2010, March 20). Limb loss a grim, growing global crisis
[Online].Available:
http://haitiamputees.nbcnews.com/_news/2010/03/19/4040341-limb-loss-a-
grim-growing-global-crisis
[3] M. LeBlanc. (2011, January 14). Give Hope – Give a Hand [Online].
Available: https://web.stanford.edu/class/engr110/Newsletter/lecture03a-
2011.html
[4] C. Moreton. (2012, August 4). London 2012 Olympics: Oscar Pistorius
finally runs in Games after five year battle [Online]. Available:
http://www.telegraph.co.uk/sport/olympics/athletics/9452280/London-
2012-Olympics-Oscar-Pistorius-finally-runs-in-Games-after-five-year-
battle.html
[5] Y. Jeong, D. Lee, K. Kim and J. Park, “A wearable robotic arm with high
force-reflection capability,” in 9th IEEE International Workshop on Robot
and Human Interactive Communication, Osaka, 2000, pp. 411-416.
[6] A. Bennett Wilson Jr. (n.d.). Retrieved October 17, 2015, from oandplibrary.org: http://www.oandplibrary.org/alp/chap01-01.asp
[7] E. Sofge (2012, May 28). Smart Bionic Limbs are Reengineering the
Human. [Online]. Available:
http://www.popularmechanics.com/science/health/a7764/smart-bionic-
limbs-are-reengineering-the-human-9160299
[8] R. M. Coupland, War Wounds of Limbs: Surgical Management. Geneva, Switzerland: ICRC, 1993.
[9] Jerkey. Brain-Controlled Wheelchair [Online]. Available:
http://www.instructables.com/id/Brain-Controlled-Wheelchair
[10] H. Heyrman. Brainwaves [Online]. Available:
http://www.doctorhugo.org/brainwaves/brainwaves.html
[11] S. Sequeira, C. Diogo and F.J.T.E. Ferreira, “EEG-signals based control
strategy for prosthetic drive systems,” in IEEE 3rd Portuguese Meeting in
Bioengineering, Braga, 2013, pp. 1-4.
[12] V. Charisis, S. Hadjidimitriou, L. Hadjileontiadis, D. Ugurca and E. Yilmaz,
“EmoActivity – An EEG-based gamified emotion HCI for augmented
artistic expression: The i-Treasures paradigm," Springer-Verlag Berlin Heidelberg, Berlin, 2011.
[13] T. Beyrouthy, S. K. AlKork and J. A. Korbane, “EEG mind controlled smart
prosthetic arm,” in IEEE International Conference on Emerging
Technologies and Innovative Business Practices for the Transformation of
Societies, August 2016.
[14] S. K. Al Kork, "Development of 3D finite element model of human elbow to
study elbow dislocation and instability," in ASME 2009 Summer
Bioengineering Conference, American Society of Mechanical Engineers,
2009.
[15] T. Beyrouthy and L. Fesquet, “An event-driven FIR filter: design and
implementation,” in 22nd IEEE International Symposium on Rapid System
Prototyping (RSP), May 2011.
[16] C. Heckathorne, “Upper-limb prosthetics: Components for adult externally
powered systems,” in Atlas of Limb Prosthetics, H. K. Bowker and J. W.
Michael, Eds. Rosemont, IL: Am. Acad. Orthoped. Surg., 2002