
© 2023 JETIR June 2023, Volume 10, Issue 6 www.jetir.org (ISSN-2349-5162)

HUMAN FOLLOWING ROBOT USING RASPBERRY PI
Sakshi Sabale, Department of E&TC Engineering, AISSMS Institute of Information Technology, Pune, India
Rupali Shelake, Department of E&TC Engineering, AISSMS Institute of Information Technology, Pune, India
Neha Sirsat, Department of E&TC Engineering, AISSMS Institute of Information Technology, Pune, India
Dr. D. K. Shedge, Department of E&TC Engineering, AISSMS Institute of Information Technology, Pune, India

Abstract— This paper presents a human-following robot that follows its user based on face and body detection. The robot is designed to operate in an automatic mode and to help humans with various tasks. It uses its camera to capture images and feeds them to an object-detection machine-learning model, which returns a list of detected objects. This list is traversed to check whether a 'person' is present; if the specific person whose data was given to the model is detected, the robot follows only that person. Face and body detection are performed using image processing, which gives a unique, customized mapping between the user and the robot. The proposed system uses a Raspberry Pi and computer-vision techniques for human tracking and following. The robot can distinguish between different users' faces and track the human with the matching face and body. Such a human-following robot can serve as a following assistant, maid, goods carrier, or guide.

Keywords—Human following, Object detection, Virtual Control, Raspberry Pi, Bluetooth module, Image Processing.

I. RELATED WORK

"Pi Robot Project" by Robotics Masters: this project focuses on creating a robot that can follow a person using a Raspberry Pi and a camera module. It provides detailed instructions, code samples, and a step-by-step guide for building the robot and implementing the human-following behaviour.
"Human-Following Robot with Obstacle Avoidance" on Hackster.io: this project combines human following with obstacle-avoidance capabilities. It demonstrates how to build a robot using a Raspberry Pi, a camera module, ultrasonic sensors, and servo motors, and provides a comprehensive guide covering the hardware setup, software configuration, and the code required to implement the robot's behaviour.

II. LITERATURE SURVEY

A literature survey on human-following robots built with a Raspberry Pi and machine-learning models reveals that researchers and developers have explored various approaches and techniques to build robots that can follow a specific person, for example to help carry that person's luggage, using human-tracking and machine-learning techniques.
K. Morioka et al. (Feb. 2004) describe the design of a robot capable of locating and following a human target moving in a domestic environment as well as an industrial area. The person-following robot is autonomous: it follows a colour target using an 8 MP Android camera that continuously captures images at 2448×3264 resolution and processes them with image-processing technology. The Android device takes decisions based on the captured image and sends commands to an AVR ATmega32 controller through an HC-05 Bluetooth module; an HC-SR04 ultrasonic sensor is also used. The ultrasonic sensor emits sound waves to measure the distance between the person and the robot, and the robot follows the person at a fixed distance of about 100 cm. If any obstacle comes between the person and the robot, the robot stops immediately [1].
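The ultrasonic ranging principle described above can be sketched as a simple calculation. This is a minimal illustration under our own assumptions (speed of sound at room temperature, the 100 cm following distance from [1]); on real hardware the echo time would come from timing the HC-SR04's echo pin.

```python
SPEED_OF_SOUND_CM_S = 34300  # approximate speed of sound in air, cm/s


def echo_to_distance_cm(echo_time_s: float) -> float:
    """Convert an HC-SR04 echo pulse duration to a distance in cm.

    The echo pulse covers the round trip to the target and back,
    so the one-way distance is half of (time * speed of sound).
    """
    return (echo_time_s * SPEED_OF_SOUND_CM_S) / 2


def should_stop(distance_cm: float, follow_distance_cm: float = 100.0) -> bool:
    """Stop when the person (or an obstacle) is closer than the
    fixed following distance used in [1]."""
    return distance_cm < follow_distance_cm
```

For example, an echo of about 5.83 ms corresponds to roughly 100 cm, right at the stopping threshold.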
JETIR2306627 Journal of Emerging Technologies and Innovative Research (JETIR) www.jetir.org g237
S. Shaker (2008): in this work, the researchers added a laser range finder to their robot and detected and followed a person's movement using a leg-tracking algorithm. The laser range finder provides the data from which the algorithm detects the targeted person's legs and calculates the velocity of the leg movement with respect to the robot. The information generated by the algorithm is then passed to a fuzzy controller, which regulates the robot's speed while it follows the targeted person. A fuzzy inference system is used because it deals well with humanistic, complex situations where precise mathematical modelling is a poor fit for the imprecision of the real world; it serves to control and smooth the robot's motion while following the target [2].
K. S. Nair (2014): in this research, an ultrasonic sensor is the key element in detecting the target. The ultrasonic sensor module senses the presence of the target human, and the robot moves according to the target person's direction. For obstacle detection, an infrared sensor is used: the robot avoids a static obstacle by changing direction, or stops and waits for a moving obstacle to pass. The system uses an AVR ATmega32 as the processor controlling the human-following robot [3].
B. Ilias (2014): this research uses a Kinect high-speed sensor to detect and track the movement of the target person. In the initialization state, the target person raises his hands in front of the Kinect sensor so that the human skeleton can be calculated. The information is then passed, via a laptop, to the Processing.org software. After the software traces out the human skeleton, commands are sent to a BASIC Stamp 2 board to set the robot's direction of movement, while ultrasonic sensors check for obstacles [4].
These surveys provide an overview of the current state of human-following robots, including their applications, challenges, and future directions. They also review the latest research and advances in AI and ML being used to develop robots that follow a specific person.

III. METHODOLOGY

In this project, the Raspberry Pi is chosen as the main processing unit that runs the entire human-following robot system, including safe-distance tracking and obstacle avoidance.
Our system consists of a four-wheel robotic vehicle carrying a microprocessor and control unit along with various sensors and modules: a camera and infrared sensors connected to the Raspberry Pi, which help the robot move with respect to the people and objects in its surroundings. These sensors work in unison and help the robot navigate its path, avoiding obstacles while maintaining a specific distance from the target.

A. Proposed System:

Our proposed system aims to develop a human-following robot using the Raspberry Pi as the main control unit. The robot will be capable of autonomously tracking and following a human target while maintaining a safe distance. A Raspberry Pi board, such as the Raspberry Pi 3 or Raspberry Pi 4, serves as the central processing unit of the robot. A suitable motor controller board compatible with the Raspberry Pi, such as the L293D motor driver, controls the robot's motors. A Linux-based operating system (Raspbian) is installed on the Raspberry Pi to provide the necessary software environment. Motor-control software interfaces with the motor controller board through the Raspberry Pi's GPIO pins, enabling precise control of the robot's movement. Sensor-integration code reads data from the sensors connected to the Raspberry Pi. A human-tracking algorithm determines the human's position relative to the robot from the sensor data and computes the adjustments required for the robot's movement. Optionally, a web-based user interface lets users control the robot's behaviour remotely.
This human-following robot is designed to perform various tasks, such as following a specific person or carrying weight. Safety is prioritized throughout, considering the robot's movements and its potential interactions with humans and objects in its surroundings.
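The motor-control software described above can be sketched as a mapping from high-level movement commands to the logic levels of the L293D's input pins. This is a hedged sketch: the input-pin roles (IN1/IN2 for the left motor, IN3/IN4 for the right) are our own illustrative assumption, and on real hardware these values would be written to GPIO pins, for example with the RPi.GPIO library.

```python
# Each command maps to logic levels for the L293D's four inputs.
# Assumption for illustration: (IN1, IN2) drive the left motor and
# (IN3, IN4) drive the right motor; 1/0 pick the rotation direction.
COMMANDS = {
    "forward":  {"IN1": 1, "IN2": 0, "IN3": 1, "IN4": 0},
    "backward": {"IN1": 0, "IN2": 1, "IN3": 0, "IN4": 1},
    "left":     {"IN1": 0, "IN2": 0, "IN3": 1, "IN4": 0},  # only right motor runs
    "right":    {"IN1": 1, "IN2": 0, "IN3": 0, "IN4": 0},  # only left motor runs
    "stop":     {"IN1": 0, "IN2": 0, "IN3": 0, "IN4": 0},
}


def pin_states(command: str) -> dict:
    """Return the L293D input-pin states for a movement command."""
    if command not in COMMANDS:
        raise ValueError(f"unknown command: {command}")
    return COMMANDS[command]
```

On the robot itself, the tracking algorithm would call `pin_states` each control cycle and write the resulting levels to the GPIO pins wired to the driver board.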

B. Evaluation:

We evaluate the proposed system by testing its performance in different scenarios and comparing it with other popular human-following robots on the market. We conduct several tests to evaluate the accuracy and efficiency of the system in performing various tasks, such as tracking a specific person, differentiating the target human from others, and changing the robot's direction to match the target's direction.
The results show that the proposed system provides accurate and useful responses and that the robot can follow a specific person even in a crowd, which is helpful in many practical cases. The system was also able to adapt its responses and behaviour based on user interactions, providing a more personalized experience for users.

C. Block Diagram

Fig. 1: Block diagram — power supply, Raspberry Pi, L293D motor driver, robot, camera, Bluetooth module, and ultrasonic sensor.
In this block diagram, the Raspberry Pi is the central controller board that runs the software and controls the robot's actions. The camera captures a video feed of the surroundings. Image-processing software processes the captured video frames to extract relevant information, and object-detection algorithms analyse the processed images to identify and locate humans in the frame. The motion controller determines the required motion from the detected human's position and generates commands for the motors, the actuators (wheels or legs) that move the robot.

The Raspberry Pi processes the video feed from the image sensor using image-processing techniques and identifies humans in the frame using object-detection algorithms. Once a human is detected, the motion controller calculates the movements necessary to follow the human and sends commands to the motors to control the robot accordingly. The Bluetooth module allows the robot's motion to be controlled virtually; its commands reach the motor driver through the Raspberry Pi.
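The motion controller's decision can be sketched as pure logic: compare the detected person's horizontal position with the frame centre and pick a steering command. This is a minimal sketch; the 640-pixel frame width in the example and the dead-band tolerance are illustrative assumptions, not values from the paper.

```python
def steering_command(person_center_x: float, frame_width: int,
                     tolerance: float = 0.1) -> str:
    """Choose a movement command from the detected person's position.

    If the bounding-box centre lies within a dead band around the
    frame centre, drive forward; otherwise turn toward the person.
    `tolerance` is the dead-band half-width as a fraction of frame width.
    """
    center = frame_width / 2
    dead_band = frame_width * tolerance
    if person_center_x < center - dead_band:
        return "left"
    if person_center_x > center + dead_band:
        return "right"
    return "forward"
```

With a 640-pixel-wide frame, a person centred at x = 100 yields "left", at x = 320 "forward", and at x = 500 "right".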

IV. TEST AND RESULTS

Fig.2


Hardware setup: assemble the robot chassis, motors, wheels, and other required components, and connect the Raspberry Pi to the motor controller board and the other necessary peripherals (camera, ultrasonic sensor).

Software setup: install the Raspbian operating system on the Raspberry Pi and set up the necessary software libraries and dependencies (e.g., GPIO libraries).

Image processing: capture video frames using the camera connected to the Raspberry Pi and apply image-processing techniques to detect and track humans in the frames; OpenCV can be used for this purpose. Implement algorithms for human detection and tracking; there are various approaches, such as Haar cascades, HOG, or deep-learning-based methods.

Robot control: based on the detected human's position, calculate the appropriate motor commands to make the robot follow the human, and send them to the motor controller board connected to the Raspberry Pi's GPIO pins.

Testing and evaluation: set up a test environment where the robot can follow a human, and evaluate the robot's performance by measuring metrics such as tracking accuracy, response time, and smoothness of movement. Conduct tests in different scenarios (e.g., different walking speeds) to assess the robot's robustness.

Iterative improvements: analyse the test results and identify areas for improvement; modify the algorithms, tuning parameters, or hardware setup as needed, then retest the robot to assess the impact on performance.
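The tracking-accuracy metric mentioned above can be computed, for example, as the fraction of frames in which the detected person's centre stays within a pixel tolerance of the ground-truth position. The metric definition and the 30-pixel tolerance here are our own illustrative assumptions, not the paper's.

```python
def tracking_accuracy(detected, ground_truth, tolerance_px=30.0):
    """Fraction of frames where the detected (x, y) centre lies within
    `tolerance_px` (Euclidean distance) of the ground-truth centre.

    `detected` and `ground_truth` are equal-length lists of (x, y) tuples,
    one pair per video frame.
    """
    if len(detected) != len(ground_truth):
        raise ValueError("detected and ground_truth must be the same length")
    hits = 0
    for (dx, dy), (gx, gy) in zip(detected, ground_truth):
        if ((dx - gx) ** 2 + (dy - gy) ** 2) ** 0.5 <= tolerance_px:
            hits += 1
    return hits / len(detected)
```

For instance, if the detector is within tolerance in two out of three frames, the accuracy is 2/3; comparing this score across walking speeds gives a simple robustness measure.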

V. CONCLUSION

In conclusion, the robot's performance was evaluated through testing in different scenarios. Metrics such as tracking accuracy, response
time, and smoothness of movement were measured to assess the robot's effectiveness in following a human. The results indicated the
robot's ability to track a human successfully and adjust its movement accordingly.
Despite the challenges faced, the human-following robot using Raspberry Pi can be considered a success. It demonstrated the ability to track and follow humans, opening up possibilities for applications such as navigation assistance, surveillance, or even interactive robotics.
In summary, the human-following robot project using Raspberry Pi showcased the potential of combining hardware and software to create an autonomous robot capable of tracking and following humans. The project highlighted areas of success, identified limitations, and proposed avenues for future improvements, contributing to the field of robotics and intelligent systems.

VI. REFERENCES

[1] K. Morioka, J.-H. Lee, and H. Hashimoto, "Human-following mobile robot in a distributed intelligent sensor network," IEEE Trans. Ind. Electron., vol. 51, no. 1, pp. 229–237, Feb. 2004.
[2] Y. Matsumoto and A. Zelinsky, "Real-time face tracking system for human-robot interaction," in Proc. 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC '99), 1999, vol. 2, pp. 830–835.
[3] T. Yoshimi, M. Nishiyama, T. Sonoura, H. Nakamoto, S. Tokura, H. Sato, F. Ozaki, N. Matsuhira, and H. Mizoguchi, "Development of a person following robot with vision based target detection," in Proc. 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006, pp. 5286–5291.
[4] N. Bellotto and H. Hu, "Multisensor integration for human-robot interaction," IEEE J. Intell. Cybern. Syst., vol. 1, no. 1, p. 1, 2005.
