
Major Project

Synopsis

School of Computer Science


&
School of Advanced Engineering

Under the guidance of

Dr. Poonam Kainthura, Associate Professor (S.G), Data Science Cluster, School of Computer Science
Dr. Ajay Kumar Srivastava, Professor, Mechanical Cluster, School of Advanced Engineering
Submitted By
Name SAP ID Branch
Harman Singh Malhotra 500091033 B. Tech AI/ML (H)

Saksham Shahu 500096027 B. Tech AI/ML (H)

Sidhant Tomar 500090961 B. Tech AI/ML (H)


Debankan Mukherjee 500091518 B. Tech ADE
Faith Minz 500098304 B. Tech ADE
Navasheen Roy Chowdhury 500091744 B. Tech ADE
Synopsis Report

1. Project Title
U-Tour: An Automated Tour Robot Guide

2. Abstract
This project focuses on the development of a fully autonomous robot designed to navigate a
college campus, providing guided tours to visitors while avoiding obstacles in real-time. The
robot uses a Raspberry Pi as its central processing unit, with onboard sensors including a
LIDAR for obstacle detection and a GPS module for location tracking. Navigation is driven by
a set of high-density waypoints, spaced approximately one meter apart, stored in a precomputed
GIS dataset. This allows the robot to follow a predefined, human-friendly path that adheres to
intuitive routes rather than highly optimized paths that are difficult for visitors to follow.
The robot's path planning is enhanced by a potential field algorithm
where waypoints act as attractive forces, guiding the robot towards landmarks, while obstacles
detected by the LIDAR create repulsive forces to prevent collisions. The use of dense
waypoints ensures precision in movement and adherence to designated paths, while the
dynamic collision avoidance system allows the robot to operate safely in populated and ever-
changing environments.

The integration of ROS (Robot Operating System) enables real-time processing and sensor
fusion, combining GPS, odometry, and LIDAR data for accurate localization and smooth
navigation. By maintaining a predefined route across key campus locations, the robot provides
an engaging and interactive experience for visitors, while demonstrating robust performance
in autonomous navigation and collision avoidance.
3. Introduction
Colleges and universities frequently face logistical challenges when organizing campus tours
for visitors, including prospective students, alumni, and guest lecturers. One major issue is the
reliance on professors or specialized staff to conduct these tours, which is not only a time-
consuming task but often takes these valuable personnel away from their primary academic
responsibilities. For instance, when a visiting professor from a specific field, like physics,
requires a tour, it becomes necessary to find another expert in that area to ensure meaningful
interactions—further straining resources. This process can lead to inefficiencies and even
missed opportunities for deeper engagement.

Furthermore, the growing demand for customized, insightful tours that cater to the specific
needs of different visitors adds another layer of complexity. Current manual methods are labor-
intensive, and often do not fully address the personalized requirements of all attendees. U-Tour,
an automated tour robot guide, offers an innovative solution to these challenges. By leveraging
advanced navigation technologies such as LIDAR-based collision avoidance and high-density
waypoints, U-Tour ensures that campuses can provide engaging, informative tours without
burdening faculty and staff. This not only frees up academic personnel but also elevates the
institution's ability to showcase its offerings efficiently and intelligently.
Through this project, the college will introduce a cutting-edge, fully autonomous system
capable of providing real-time, obstacle-avoiding tours that enhance the visitor experience, all
while representing a forward-thinking commitment to technological innovation.
4. Literature Review

1. RoboX – A Fully Autonomous Tour Guide Robot
Objective: To combine industrial high-quality production for mobile platforms with the best
available academic research techniques for mobile robot navigation and interaction.
Approach:
 Software architecture: a distributed system using a PowerPC (running the XO/2 real-time
OS) for navigation and an Intel Pentium for interaction. Safety-critical tasks run on the
PowerPC with watchdog systems and a redundant security processor.
 Navigation system: obstacle avoidance using an adapted dynamic window method, and
feature-based multi-hypothesis localization using Kalman filters and constraint-based search.
Limitations:
 Scalability: although 11 robots were produced, the paper does not address how the system
manages multiple robots operating simultaneously in the same space.
 Human intervention: the paper mentions that human intervention is sometimes required for
error recovery, which is a limitation for fully autonomous operation.
2. Tour Guide Robot 'Jinny'
Objective: To introduce a new tour-guide robot named 'Jinny', developed for permanent
installation at the National Science Museum of Korea, providing:
 Adaptive navigation in dynamic and unmodified environments
 Interaction that attracts and engages people's interest
 Manageability of the robot's knowledge base
 Reliability and safety in operation
Approach:
 Adaptive navigation system: four types of motion algorithms that the robot selects based on
environmental conditions; range sensors for map construction, path planning, and localization
without artificial landmarks; and a probabilistic map-matching scheme based on Monte Carlo
localization.
 Hardware design: Jinny is equipped with various sensors (laser range finders, infrared
scanners, gyroscope) for navigation and safety, along with multiple interaction elements
(touch screen, LED buttons, speech recognition/synthesis, gesture capabilities).
Limitations:
 Crowded-environment challenges: although the robot was tested in a crowded environment,
the paper does not deeply explore how it handles extremely busy or unpredictable situations.
 Scalability: the study does not address how easily the system could be scaled up to larger
environments or to multiple robots working together.
 Comparison with human guides: the study lacks a comparative analysis between the robot's
performance and that of human tour guides.
3. Safe and efficient motion planning of multiple mobile robots based on artificial potential
for human behavior and robot congestion
Objective: To develop a motion-planning technique for multiple mobile robots that balances
human safety and robot efficiency in dynamic environments. The challenge is to avoid collisions
while maintaining efficiency, especially in environments where robots interact with moving
humans.
Approach: The authors propose an approach using artificial potential fields (APF), mathematical
functions that generate forces guiding the robots. Two key potentials are introduced:
 Behavior potential: accounts for human behavior, including position, velocity, and direction.
The von Mises distribution is used to model the likelihood of a human's movement direction,
enabling robots to avoid collisions by predicting human behavior.
 Congestion potential: designed to manage robot traffic and avoid congestion by distributing
robots efficiently. Kernel density estimation (KDE) is applied to model the global distribution
of robots and avoid collisions among them.
Limitations:
 Local minima: the artificial potential field method is prone to local minima, where a robot
may get trapped and unable to proceed toward its goal. In such cases, robots can only stop
until the humans or obstacles move away, impacting efficiency.
 Dynamic environments: the proposed technique assumes that global information about the
environment is available in real time; in rapidly changing environments, the algorithm might
struggle to update the potential fields efficiently.
4. Autonomous tour guide robot using embedded system control
Objective: An autonomous tour guide robot with self-localizing abilities, designed to guide
visitors around the Asia Pacific University engineering labs.
Approach:
 Uses a QR-code recognition system in place of RFID, as RFID is costly, combined with a
wall-following navigation technique.
 Most localization and mapping techniques involve running complex algorithms; instead,
ZigBee modules are placed at known locations to provide reference information that lets the
robot locate itself.
 A Raspberry Pi minicomputer acts as the brain of the robot, and the motors are connected to
an Arduino Mega microcontroller that uses I2C communication to exchange data with the
Raspberry Pi.

5. Problem Statement
Traditional campus tour methods are inadequate due to:
 Limited availability
 Inconsistency in delivery
 Accessibility issues
These limitations prevent prospective students and visitors from gaining a comprehensive
understanding of the campus environment. Additionally, staffing constraints often hinder the
institution's ability to offer personalized tours.
There is a need for an innovative, technology-driven solution that can:
 Be available at all times
 Provide consistent and informative campus tours
 Improve accessibility
 Showcase the institution's commitment to advanced technology
 Address staffing challenges in delivering personalized tour experiences

6. Objective
To create a proof-of-concept (POC) prototype of the tour robot: a smaller, mobile version of the
final design that forms a baseline for any planned production models.

The POC prototype will implement the following:

 Localised map
 Basic collision avoidance
 Data logging for error identification and improvement in further iterations (a minimal
logging sketch is given below)
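The data-logging item above can be prototyped with very little code. The sketch below is a minimal illustration, not the project's actual logger: it assumes a hypothetical TourLogger class that appends timestamped pose and sensor readings to a CSV file on the Raspberry Pi, so a failed run can be replayed and inspected afterwards.

```python
import csv
import time
from pathlib import Path

class TourLogger:
    """Minimal CSV logger for prototype runs (illustrative only)."""

    FIELDS = ["timestamp", "lat", "lon", "heading_deg", "min_lidar_m", "event"]

    def __init__(self, log_path="tour_log.csv"):  # hypothetical file name
        self.path = Path(log_path)
        is_new = not self.path.exists()
        self.file = self.path.open("a", newline="")
        self.writer = csv.DictWriter(self.file, fieldnames=self.FIELDS)
        if is_new:
            self.writer.writeheader()

    def log(self, lat, lon, heading_deg, min_lidar_m, event=""):
        """Append one timestamped row of pose/sensor data."""
        self.writer.writerow({
            "timestamp": time.time(),
            "lat": lat,
            "lon": lon,
            "heading_deg": heading_deg,
            "min_lidar_m": min_lidar_m,
            "event": event,
        })
        self.file.flush()  # keep the data even if the run crashes

# Example usage during a test drive (placeholder values):
# logger = TourLogger()
# logger.log(0.0, 0.0, 90.0, 1.8, event="waypoint_reached")
```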

7. Methodology
1. Design the Hardware Model
 Frame Design: Start with designing the body of the robot. Consider a
wheeled robot for ease of movement over flat surfaces, ensuring it is
sturdy but lightweight for campus navigation.
o Use CAD software to design the frame.
o Choose appropriate materials like aluminum or durable plastic to
balance strength and weight.
o Ensure the frame accommodates all sensors, the Raspberry Pi, and
other necessary components securely.
 Motors and Wheels:
o Choose motors (likely stepper or DC motors) based on the robot's
weight and expected speed.
o Ensure the wheels provide adequate traction for outdoor and indoor
surfaces.
 Power Supply:
o Select a battery system sized for extended tours (e.g., 4-6 hours of
continuous operation); a rough sizing calculation is sketched after this list.
o Include a power management system for efficient energy usage and
safe recharging.
 Sensor Placement:
o Attach a LIDAR sensor for 360-degree obstacle detection,
ensuring it is placed at an optimal height to detect obstacles.
o Place a GPS module on top of the robot for unobstructed satellite
connection.
o Include odometry sensors on the wheels for precise movement
tracking.
o Consider adding a camera for visual processing or feedback to
users.
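As a rough aid for the battery selection above, the following back-of-envelope calculation estimates the capacity needed for a 4-6 hour tour. Every power figure here is an assumption for illustration; the real values must be measured from the chosen motors and electronics.

```python
# Rough battery sizing for the prototype. Every figure below is an
# assumption to be replaced with measured values from the actual hardware.
avg_motor_power_w = 30.0    # assumed average draw of both drive motors
electronics_power_w = 10.0  # Raspberry Pi + LIDAR + GPS + misc (assumed)
tour_duration_h = 6.0       # upper end of the 4-6 hour target
battery_voltage_v = 12.0    # assumed nominal pack voltage
usable_fraction = 0.8       # avoid fully discharging the pack

energy_wh = (avg_motor_power_w + electronics_power_w) * tour_duration_h
required_capacity_ah = energy_wh / (battery_voltage_v * usable_fraction)

print(f"Energy needed: {energy_wh:.0f} Wh")
print(f"Suggested pack: ~{required_capacity_ah:.1f} Ah at {battery_voltage_v:.0f} V")
# With these assumptions: 240 Wh, i.e. roughly a 25 Ah 12 V pack.
```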
2. Select and Integrate Electronic Components
 Raspberry Pi: As the central processing unit, install a Raspberry Pi (e.g.,
Pi 4) with sufficient processing power and connectivity options.
 Microcontroller: Consider integrating an additional microcontroller
(e.g., Arduino) to handle real-time control of motors, sensors, and other
components.
 Motor Controller: Choose an H-bridge or motor driver board to interface the
motors with the Raspberry Pi for direction and speed control (a minimal
PWM control sketch follows this list).
 Sensors: Ensure the LIDAR, GPS, and odometry sensors are compatible
with the Raspberry Pi and ROS.
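To make the motor-controller integration concrete, here is a minimal sketch of driving one channel of an L298N-style H-bridge from the Raspberry Pi using the RPi.GPIO library. The BCM pin numbers are placeholders that depend on the actual wiring, and in the final system this logic would sit behind ROS motor-control nodes rather than a standalone script.

```python
import RPi.GPIO as GPIO  # standard GPIO library on Raspberry Pi OS

# Placeholder BCM pin numbers -- adjust to the actual H-bridge wiring.
IN1, IN2, EN = 23, 24, 18

GPIO.setmode(GPIO.BCM)
GPIO.setup([IN1, IN2, EN], GPIO.OUT)

pwm = GPIO.PWM(EN, 1000)  # 1 kHz PWM on the enable pin controls speed
pwm.start(0)

def drive(speed_percent):
    """Drive one motor: positive = forward, negative = reverse, 0 = stop."""
    forward = speed_percent >= 0
    GPIO.output(IN1, forward)       # IN1/IN2 set the H-bridge direction
    GPIO.output(IN2, not forward)
    pwm.ChangeDutyCycle(min(abs(speed_percent), 100))

try:
    drive(60)  # cruise forward at 60% duty cycle
    # ... the navigation loop would adjust speed and direction here ...
finally:
    pwm.stop()
    GPIO.cleanup()
```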
3. Set Up the Software and Framework
 Install ROS (Robot Operating System):
o Set up ROS on the Raspberry Pi for real-time sensor fusion and
processing. ROS will handle the integration of sensor data
(LIDAR, GPS, odometry) and control the robot’s motion.
 Map Campus with GIS:
o Create a high-density GIS dataset with waypoints approximately 1
meter apart. This data will guide the robot on a human-friendly
path across campus.
o Use mapping software (e.g., QGIS) to plot these waypoints across
key landmarks.
 Collision Avoidance:
o Implement a potential field algorithm for obstacle avoidance: waypoints
act as attractive forces guiding the robot, while obstacles detected by the
LIDAR generate repulsive forces (a simplified sketch of this idea follows
this list).
 Localization and Navigation:
o Use GPS for outdoor localization, with odometry refining the pose
estimate indoors or wherever the GPS signal is weak.
o Ensure ROS handles smooth transitions between GPS data, LIDAR
inputs, and odometry for real-time navigation.
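The potential-field behaviour referenced in the collision-avoidance item above can be prototyped in a few lines. The sketch below is a simplified, illustrative version of the idea rather than the final ROS implementation: the active waypoint pulls the robot with an attractive vector, each nearby LIDAR return pushes it away, and the resulting vector gives a steering heading. The gains and influence radius are assumed values that would be tuned during testing.

```python
import math

# Illustrative gains and thresholds (assumed values, to be tuned in testing).
K_ATTRACT = 1.0        # pull toward the active waypoint
K_REPULSE = 0.5        # push away from LIDAR-detected obstacles
OBSTACLE_RADIUS = 1.5  # metres within which an obstacle exerts a force

def potential_field_heading(robot_xy, waypoint_xy, obstacles_xy):
    """Return a desired heading (radians) from attractive + repulsive forces.

    robot_xy, waypoint_xy: (x, y) positions in a local metric frame.
    obstacles_xy: list of (x, y) obstacle points from the LIDAR scan.
    """
    # Attractive force: unit vector toward the waypoint, scaled by K_ATTRACT.
    dx, dy = waypoint_xy[0] - robot_xy[0], waypoint_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy) or 1e-6
    fx, fy = K_ATTRACT * dx / dist, K_ATTRACT * dy / dist

    # Repulsive forces: each obstacle inside OBSTACLE_RADIUS pushes the
    # robot away, more strongly the closer it is.
    for ox, oy in obstacles_xy:
        rx, ry = robot_xy[0] - ox, robot_xy[1] - oy
        d = math.hypot(rx, ry)
        if 1e-6 < d < OBSTACLE_RADIUS:
            push = K_REPULSE * (1.0 / d - 1.0 / OBSTACLE_RADIUS) / d ** 2
            fx += push * rx / d
            fy += push * ry / d

    return math.atan2(fy, fx)  # heading for the motor controller to steer toward
```

As noted in the literature review, pure potential fields can get stuck in local minima; the dense one-metre waypoint spacing mitigates this because the next attractive goal is never far away.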
4. Develop Tour Logic
 Predefine Tour Routes:
o Using the GIS data, define several tour routes across the campus,
ensuring that the waypoints lead to major landmarks and visitor
destinations (a small waypoint-loading sketch follows this step).
 User Interface: (tentative)
o Develop an intuitive user interface (on a tablet or smartphone) to
allow visitors or staff to start, pause, or end tours. The interface can
include route selection and progress tracking.
 Tour Narration: (tentative)
o Implement audio narration or visual information systems to
provide guided explanations at each landmark.
o Add a speaker or screen for visitors to interact with, enabling real-
time feedback during the tour.
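To make the route definition in this step concrete, the small sketch below (the file name and GeoJSON format are assumptions) loads an ordered list of waypoints exported from QGIS and reports the route length and largest gap, a quick sanity check that the plotted waypoints really are about one metre apart.

```python
import json
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def load_route(geojson_path):
    """Read an ordered list of (lat, lon) waypoints from a GeoJSON export."""
    with open(geojson_path) as f:
        data = json.load(f)
    # Assumes Point features stored in tour order; GeoJSON uses [lon, lat].
    return [(feat["geometry"]["coordinates"][1], feat["geometry"]["coordinates"][0])
            for feat in data["features"]]

if __name__ == "__main__":
    route = load_route("campus_tour.geojson")  # hypothetical QGIS export
    gaps = [haversine_m(a, b) for a, b in zip(route, route[1:])]
    print(f"{len(route)} waypoints, total length {sum(gaps):.0f} m, "
          f"max gap {max(gaps):.2f} m")
```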
5. Test and Refine the Robot
 Field Test:
o Test the robot in real-world conditions across the campus.
o Ensure that the LIDAR detects obstacles, that the robot follows the
waypoint-based paths, and that GPS accuracy is sufficient.
o Test indoor navigation where GPS signals are weak, relying on
odometry for precise localization.
 Refinement:
o Calibrate the sensor inputs, improve the potential field algorithm,
and optimize waypoint spacing based on test results.
o Refine tour timing and path-following precision, ensuring smooth
and natural movement.
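One simple way to quantify path-following precision during refinement is the cross-track error between the positions recorded by the data logger and the nearest predefined waypoint. The sketch below assumes the CSV log and waypoint list from the earlier illustrative sketches and reports how far the robot drifted during a run.

```python
import csv
import math

def dist_m(p, q):
    """Approximate distance in metres between two nearby (lat, lon) points."""
    k = 111_320.0  # metres per degree of latitude (equirectangular approximation)
    dlat = (p[0] - q[0]) * k
    dlon = (p[1] - q[1]) * k * math.cos(math.radians(p[0]))
    return math.hypot(dlat, dlon)

def cross_track_errors(log_csv, waypoints):
    """For each logged GPS fix, distance to the nearest waypoint on the route."""
    errors = []
    with open(log_csv) as f:
        for row in csv.DictReader(f):
            fix = (float(row["lat"]), float(row["lon"]))
            errors.append(min(dist_m(fix, wp) for wp in waypoints))
    return errors

# Example: report how far the prototype strayed during a test run.
# errors = cross_track_errors("tour_log.csv", route)  # route from the GIS sketch
# print(f"mean error {sum(errors)/len(errors):.2f} m, worst {max(errors):.2f} m")
```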
6. Deployment
 Pilot Testing:
o Run several live tours with real visitors to ensure the robot
performs as expected.
o Gather feedback to adjust the tour logic, timing, and visitor
interactions.
 Full Deployment:
o Once refined, deploy the robot as a permanent feature for campus
tours, ensuring it integrates with the campus visitor management
system.

8. References
 Hoshino, S., & Maki, K. (2015). Safe and efficient motion planning of multiple
mobile robots based on artificial potential for human behavior and robot congestion.
Advanced Robotics, 29(17), 1095-1109.
https://doi.org/10.1080/01691864.2015.1033461
 Kim, G., Chung, W., Kim, K.-R., Kim, M., Han, S., & Shim, R. H. (2004). The
Autonomous Tour-Guide Robot Jinny. In Proceedings of the 2004 IEEE/RSJ
International Conference on Intelligent Robots and Systems. Sendai, Japan.
 Tomatis, N., Philippsen, R., Jensen, B., Arras, K. O., Terrien, G., Piguet, R., &
Siegwart, R. (2002). Building a Fully Autonomous Tour Guide Robot: Where
Academic Research Meets Industry. Conference paper.
https://doi.org/10.3929/ethz-a-010098369
