RobotSports Team Description Paper 2023


Ton Peijnenburg, Jürge van Eijck [0000-0002-2774-9009], Noah van der Meer [0000-0001-5729-2671], Charel van Hoof, and Rob Burgers

VDL ETG, De Schakel 22, 5651 GH Eindhoven, The Netherlands
ton.peijnenburg@vdletg.com
http://www.robotsports.nl

Abstract. RobotSports is an open industrial team, meaning that its participants are all employed by or have retired from various high-tech companies in the Dutch Eindhoven region, or are active students. This year, the team reports on initiatives that aim to accelerate innovation within RoboCup MSL: standardisation, to increase the re-use of components by existing teams and to allow new teams to participate in the MSL competition more quickly; and a Mixed Team Protocol, providing the ability to combine teams with only a few robots into a complete team and to make steps towards matches where humans can play with (or against) robots.

Keywords: robotics · machine vision · machine learning · artificial intelligence · motion control · RoboCup · MSL.

1 Introduction

The RobotSports team is an open industrial team with VDL as its main sponsor, an international industrial family business with 105 operating companies, headquartered in Eindhoven, the Netherlands. The team shares a dedicated facility with the ASML Falcons team in the city of Veldhoven, near Eindhoven. The team has developed new robots, which have evolved from the previous generation. The previous-generation robots of the RobotSports team were developed as a mix of the Philips robot design used in the MSL competition [1] and the Tech United TURTLE robot design from 2012 [2]. The new generation of robots incorporates our latest insights and improvements.

2 Robot hardware

The revisions to our robots' hardware are aimed at making them faster, more reliable, easier to service, safer, and more efficient to transport. The robots will have four omni-directional wheels for better stability and traction, an improved ball-handler mechanism with better placement of the passive and active wheels, and a new camera tower that can be separated for transport and provides easier access for camera adjustments. In addition, the control electronics will be modified to include off-the-shelf motion controllers as well as custom-designed, microcontroller-based I/O, control, and safety modules. The new design also facilitates the housing of a stereo depth-sensor camera.

Fig. 1. Our old (left) and new (right) robot.

The robot frame is designed entirely in sheet aluminum, which keeps both weight and cost down while still providing the required sturdiness. A four-wheel configuration was chosen, combined with individual suspension for all wheels, to avoid an over-constrained design and to ensure proper traction.

Fig. 2. New wheel configuration (left) and detail of wheel unit with suspension (right).

Robot control is hosted on an AAeon 8251AI system. The 8251AI brings high-performance AI capabilities to the edge in an extremely compact form factor. In addition, the unit has a small mass, possesses excellent I/O facilities, and has low power consumption (15 W in 6-core power mode). On the 8251AI we run an Ubuntu 20.04 64-bit OS in combination with custom control software. Motion control tasks for the drive and ball-handler wheels are hosted on three dual-axis Roboclaw motion controllers [3], which are coordinated by a Teensy 4.0 microcontroller [4]. Interfacing to the main PC is via the LAN Ethernet bus. General I/O control is centralized on a custom board based on a microcontroller, which includes PLC functionality for (main) power control and safety circuits. During demonstrations and experiments, a 900 MHz RF module is connected to the robot's safety controller to provide remote kill-switch functionality.

We use an electromagnetic kicking mechanism. Automotive solenoids are used to actuate a lever, and one of two "feet" can be selected to kick the ball: one foot kicks low over the floor, the other kicks a lob shot. A new charging circuit has been developed to charge a capacitor stack. Discharge is done through a novel custom IGBT-based switch that can be pulse-modulated to control shooting power and duration. Control is implemented on a microcontroller that interfaces via LAN Ethernet with the AAeon 8251AI.
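As a purely illustrative sketch of how the main PC could command the kick controller over Ethernet, the snippet below sends a small UDP datagram carrying foot selection, power, and pulse duration; the address, port, and packet layout are assumptions, since the actual on-wire protocol is not described here.

```python
import socket
import struct

# Hypothetical kick command: the real RobotSports wire format is not published here.
# Assumed layout: uint8 foot (0 = flat, 1 = lob), uint8 power (0-100 %),
# uint16 pulse duration in microseconds, little-endian.
KICKER_ADDR = ("192.168.1.42", 5005)  # placeholder address of the kick microcontroller

def send_kick(foot: int, power_pct: int, duration_us: int) -> None:
    """Send a single kick command datagram (illustrative only)."""
    payload = struct.pack("<BBH", foot, power_pct, duration_us)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, KICKER_ADDR)

if __name__ == "__main__":
    send_kick(foot=1, power_pct=60, duration_us=1500)  # lob shot at 60 % power
```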

3 Visual sensing

Our robots have a GigE camera from Point Grey with a 1280 x 1024 pixel image sensor combined with an omni-mirror. To resolve the north-south playing-field ambiguity, an electronic compass unit is used. With this camera, a ball-sized object can be detected up to 7 meters away. Discrimination between the ball and the environment is based on color segmentation in the YUV domain, using (semi-)automatically calibrated segmentation parameters for the field and ball colors.
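This color segmentation can be illustrated with a minimal OpenCV sketch; the threshold values below are placeholders, not the team's calibrated parameters.

```python
import cv2
import numpy as np

# Illustrative YUV color segmentation for a ball-colored blob.
# The threshold values are placeholders; the real parameters are
# (semi-)automatically calibrated per venue.
BALL_LOWER = np.array([60, 100, 140], dtype=np.uint8)   # Y, U, V lower bounds (assumed)
BALL_UPPER = np.array([255, 140, 200], dtype=np.uint8)  # Y, U, V upper bounds (assumed)

def segment_ball(bgr_frame: np.ndarray) -> np.ndarray:
    """Return a binary mask of pixels whose YUV values fall in the ball color range."""
    yuv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YUV)
    mask = cv2.inRange(yuv, BALL_LOWER, BALL_UPPER)
    # Small morphological opening to suppress isolated noise pixels.
    kernel = np.ones((3, 3), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```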
An additional stereo vision system has been installed on the robot to supplement the omni-directional vision system. Specifically, we have worked with the ZED2 2K stereo depth sensor developed by StereoLabs [5]. The ZED2 contains two synchronized high-resolution RGB cameras which can deliver frames at up to 100 fps. Through specific calibration and triangulation, these two cameras can be used to estimate the depth of objects present in the image. The ZED2 device can be used to determine distances between 0.5 m and 20 m with a very tolerable error [6]. The main advantages of using the ZED2 are that the estimation of object positions is typically more accurate than what can be achieved with the omni-directional vision system, and that it allows for the detection of airborne objects such as balls passing through the air. In the past, we have used similar devices such as the Microsoft Kinect to supplement the omni-directional camera with great success. In indoor environments with artificial lighting, these devices, which essentially employ active IR projection and detection to estimate depth, perform quite well. On the other hand, their performance in outdoor conditions under direct illumination from the sun is typically severely limited. The ZED2 does not suffer from this constraint due to the passive nature of its depth estimation. As the RoboCup community moves closer to its 2050 goal of challenging human opponents, this is highly relevant, as it would allow for outdoor matches.

In order to detect objects such as the ball and other robots in the frames delivered by the ZED2 sensor up to large distances, we have worked with deep neural network frameworks such as TensorFlow [7] and PyTorch [8]. Recently, we achieved very promising results with the YOLO neural network using the latter framework [9]. We are currently also investigating the use of the MobileNetV3 network developed by Google, which is designed to be particularly suitable for resource-constrained systems [10].
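To illustrate the PyTorch-based detection step, the following minimal sketch runs a publicly available YOLOv5 model from the PyTorch hub on a single frame; this model is a stand-in, since the network, weights, and class set actually used on the robot are the team's own.

```python
import torch

# Illustrative PyTorch-based detection on a single camera frame.
# The public YOLOv5 hub model is used as a stand-in for the team's trained network.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

def detect_objects(frame):
    """Run the detector on an RGB frame (numpy array); return box/confidence/class rows."""
    results = model(frame)
    # Each row: x1, y1, x2, y2, confidence, class index.
    return results.xyxy[0].cpu().numpy()
```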

4 Behavior and reasoning

We believe that the reasoning required for soccer should be responsive: our robots must react quickly, making a non-optimized but appropriate decision. This is a trade-off between timing and quality. The robot behavior is implemented as a set of executable skills. These skills have dedicated responsibilities and effectively run in parallel. A finite state machine (FSM) controls the highest-level states of the robot and decides when and which transition is made. When a transition is made, the set of skills relevant for the new state is activated. We use a heuristic-based team planner, which calculates a path to an objective for every available player until no players are available. The team planner combines dynamic role assignment and strategic positioning. The dynamic role assignment is made more robust by taking previous assignments into account and allowing some hysteresis. The RobotSports team uses RTDB [11] to exchange and synchronize data between team players, which results in a fast and accurate shared world model.
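A minimal sketch of such a greedy role assignment with hysteresis is given below; the cost function, hysteresis bonus, and role names are illustrative assumptions, not the team planner's actual heuristics, and the per-player path computation is omitted.

```python
import math

# Hypothetical greedy role assignment with hysteresis. Roles, costs and the
# hysteresis bonus are illustrative; the real planner uses richer heuristics
# and also computes a target path for each player.
HYSTERESIS_BONUS = 0.5  # cost reduction for keeping last cycle's assignment (assumed)

def assign_roles(players, role_targets, previous):
    """players: {name: (x, y)}, role_targets: {role: (x, y)},
    previous: {name: role} from the last cycle. Returns {name: role}."""
    assignment = {}
    free_players = dict(players)
    for role, target in role_targets.items():
        best_player, best_cost = None, math.inf
        for name, pos in free_players.items():
            cost = math.dist(pos, target)
            if previous.get(name) == role:
                cost -= HYSTERESIS_BONUS  # prefer sticking with the previous assignment
            if cost < best_cost:
                best_player, best_cost = name, cost
        if best_player is not None:
            assignment[best_player] = role
            del free_players[best_player]
    return assignment

# Example with made-up positions: r2 keeps the attacker role it held last cycle.
players = {"r1": (0.0, 0.0), "r2": (3.0, 1.0), "r3": (-2.0, 4.0)}
roles = {"attacker": (4.0, 0.0), "defender": (-3.0, 3.0), "midfielder": (0.0, 1.0)}
print(assign_roles(players, roles, previous={"r2": "attacker"}))
```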

5 RobotSports Open API

RobotSports has opened up its robots for students to design, implement, and test robot control software. In collaboration with Fontys University of Applied Sciences Eindhoven, RobotSports has created an API that offers students the necessary tools to use the RobotSports soccer robots in their practical studies.

The software architecture of a soccer robot can be divided into four main sections (see Figure 3): sensing, sensor fusion (world model), action/command selection logic, and command execution. For students to work on the action/command selection logic, the other three sections need to be in place. This is the case with the RobotSports soccer robots.

The next hurdle for a student is to become familiar with the RobotSports software in order to replace the action/command selection logic with one of their own. RobotSports now facilitates this step with an API called rsopenapi [13]. It offers an RTDB-based API providing status information and robot controls to move and kick. It also comes with a dockerized simulator.
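The following sketch illustrates how a student-written action/command selection loop could sit on top of such an API; the class and method names are hypothetical placeholders, not the actual rsopenapi interface, which is documented in the repository [13].

```python
import time

class RobotApiStub:
    """Stand-in for the real API: exposes world-model status and move/kick commands."""
    def get_status(self):
        # Would return robot pose, ball position, teammates, etc. from RTDB.
        return {"robot_xy": (0.0, 0.0), "ball_xy": (2.0, 1.0), "has_ball": False}
    def move_to(self, x, y):
        print(f"move_to({x:.1f}, {y:.1f})")
    def kick(self, power):
        print(f"kick(power={power})")

def selection_loop(api, cycles=3):
    """Minimal action/command selection: chase the ball, kick when in possession."""
    for _ in range(cycles):
        status = api.get_status()
        if status["has_ball"]:
            api.kick(power=80)
        else:
            api.move_to(*status["ball_xy"])
        time.sleep(0.1)

selection_loop(RobotApiStub())
```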

Fig. 3. Soccer Robot Architecture (Source: Eric Dortmans, Fontys)

6 Mixed Team Protocol

The team considers the development of a mixed team protocol an important element for accelerating innovation, by allowing more teams to participate in RoboCup MSL, even with fewer than five robots, and by making steps towards matches where humans can play with (or against) robots. Members of RobotSports and Falcons together defined a first version of a mixed team protocol, as a follow-up to the 2020 MSL workshop. A combined team with robots from RobotSports and Falcons demonstrated a free kick during the Technical Challenge of RoboCup 2021. RobotSports has an active role in the further development of the mixed team protocol, which will be carried out together with other teams; progress and developments will be aligned with the MSL league. The ambition is to demonstrate a full game with a mixed team together with the co-developers at RoboCup 2023.
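As a purely illustrative sketch of the kind of information such a protocol could exchange between robots of different teams, the snippet below serializes a hypothetical status message; the fields, units, and encoding are assumptions, not the protocol agreed between the teams.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical mixed-team message: fields and units are illustrative assumptions,
# not the protocol defined by RobotSports and Falcons.
@dataclass
class MixedTeamStatus:
    team: str             # originating team, e.g. "RobotSports" or "Falcons"
    robot_id: int
    pose_xy_theta: tuple  # field coordinates in meters and heading in radians
    ball_xy: tuple        # last observed ball position
    intention: str        # e.g. "attack", "defend", "goalkeeper"

def encode(status: MixedTeamStatus) -> bytes:
    """Serialize one status message for broadcast to the mixed team."""
    return json.dumps(asdict(status)).encode("utf-8")

msg = MixedTeamStatus("RobotSports", 3, (1.0, -2.5, 0.3), (0.5, 0.0), "attack")
print(encode(msg))
```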

7 Standardisation

Standardisation will be a key element for accelerating innovation within RoboCup MSL. We believe that ROS2 is the generic platform that will allow MSL teams to share developments. Standardisation will lead to more re-use of components by existing teams and will allow new teams to participate in the MSL competition more quickly. RobotSports has defined a roadmap for the transition of its current software into a new design. The key elements of the new design are that software components can easily be replaced by other implementations, and that the design is intended for sharing with other teams. A new team should be able to use our new implementation as a basis and adjust it with limited effort to make it run on their robots. This will significantly reduce the time between the start of a new MSL team and its first match, in which the team can already play according to the MSL rules.

The transition roadmap ensures that throughout development the software can always run on the robot.
– Phase 1: identify the explicit and implicit interactions within the existing software, whose development started at the end of 1999.
– Phase 2: two activities are carried out in parallel. First, we learn ROS2 in a practical way by wrapping the existing software in two ROS2 components: one component for controlling the platform and one component containing the rest of the software (a sketch of such a wrapper is given after this list). This helps to build up knowledge about ROS2. In parallel, we evaluate the software architectures of the active MSL teams based on the MSL MES data of 2022; from this evaluation a software component decomposition will be extracted.
– Phase 3: design generic concepts for the ROS2 components, such as inter-component communication and configuration.
– Phase 4: transition the two ROS2 components from phase 2 to the new component structure with the existing interfaces. The new ROS2 components will use the generic concepts for interaction.
– Phase 5: refactor towards more generic interfaces between the components and apply design patterns to make the software easier to share.
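A minimal sketch of what the phase-2 platform wrapper could look like as a ROS2 node is given below; the topic name and message type are assumptions chosen for illustration, not the RobotSports interfaces.

```python
# Minimal ROS2 (rclpy) sketch of a node wrapping the existing platform control
# software. The topic name and geometry_msgs/Twist message type are illustrative
# assumptions, not the team's actual interfaces.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class PlatformWrapper(Node):
    def __init__(self):
        super().__init__("platform_wrapper")
        # Velocity setpoints coming from the rest of the (wrapped) software.
        self.subscription = self.create_subscription(
            Twist, "cmd_vel", self.on_cmd_vel, 10)

    def on_cmd_vel(self, msg: Twist):
        # Here the existing platform control code would be invoked.
        self.get_logger().info(
            f"forwarding setpoint vx={msg.linear.x:.2f}, vy={msg.linear.y:.2f}, "
            f"wz={msg.angular.z:.2f} to the platform")

def main():
    rclpy.init()
    node = PlatformWrapper()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```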

8 Wheelchair kicker

Fig. 4. (Source: www.specialheroescampus.nl)

Inspired by the MSL electromagnetic kicker, Tech2Play developed a kicker (AmiGo) for a wheelchair [14][15]. The wheelchair kicker allows kids in wheelchairs to fully participate in sports. RobotSports members act as consultants for the project. The team actively co-developed the electronics for this project and the mechanical redesign to reduce the cost and size of the wheelchair kicker.

9 Outlook

For a future generation of robots, we are considering two-wheeled robots. We aim for a cost-effective platform based on hoverboard technology, e.g., an Oxboard [16]. Key advantages include a much higher wheelbase than typical MSL robots, creating compatibility with natural sports environments, including artificial and natural turf, and the ability to create mixed settings with human players. After finalizing our current platform revision, we plan to continue our work on the design of this two-wheeled robot platform.

10 Conclusion

We have continued with the revision of our robots. The team contributes to initiatives that aim to accelerate innovation within RoboCup MSL. RobotSports takes the initiative to continue the development of the Mixed Team Protocol. We have started refactoring our software to provide a reference implementation for a standardized MSL software design on ROS2. RobotSports expects that the introduction of a standard MSL software platform will allow the team to specialize in selected areas.

We benchmark our performance against European teams: specifically against the ASML Falcons during our regular practice matches in our shared facility, and during the European RoboCup 2022. This has brought us to the level where we are now: we can play a basic level of robot soccer. In order to close the gap to the top teams, we need to make our robots more robust and, at the same time, more advanced. Making the hardware more robust prevents downtime during tournaments, and automating calibrations reduces the time we need between unboxing our robots and being ready for a first match. This challenge is not unlike the installation and calibration of high-tech equipment in its production environment. More robust also includes more robust sensing for different or changing environments. More advanced in our case implies faster motion, better ball control, and faster responses. Especially the latter is a performance characteristic with system-wide impact when improved. When improvements in these aspects have been made, more advanced robot and team behavior will become more relevant.

References
1. Peijnenburg, A.T.A., Warmerdam, T.P.H., et al.: Philips CFT RoboCup Team Description. In: Preliminary Proceedings of the 2002 RoboCup Conference, July 2002.
2. TURTLE robot description on Robotic Open Platform, http://www.roboticopenplatform.org/wiki/TURTLE, last accessed 2020/01/30.
3. Basicmicro RoboClaw Dual 34VDC, https://www.basicmicro.com/motor-controller2, last accessed 2021/03/20.
4. PJRC Teensy 4.0 Development Board, https://www.pjrc.com/store/teensy40.html, last accessed 2021/03/20.
5. Stereolabs website, https://www.stereolabs.com/, last accessed 2021/03/18.
6. Ortiz, L.E., Cabrera, E.V., Gonçalves, L.M.: Depth data error modeling of the ZED 3D vision sensor from Stereolabs. ELCVIA: Electronic Letters on Computer Vision and Image Analysis 17(1), 0001-15 (2018).
7. Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., ..., Zheng, X.: TensorFlow: a system for large-scale machine learning. In: 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), pp. 265-283 (2016).
8. Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., ..., Chintala, S.: PyTorch: an imperative style, high-performance deep learning library. arXiv preprint arXiv:1912.01703 (2019).
9. Bochkovskiy, A., Wang, C.Y., Liao, H.Y.M.: YOLOv4: optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934 (2020).
10. Howard, A., Sandler, M., Chu, G., Chen, L.C., Chen, B., Tan, M., ..., Adam, H.: Searching for MobileNetV3. In: Proceedings of the IEEE/CVF International Conference on Computer Vision, pp. 1314-1324 (2019).
11. Santos, F., Almeida, L., Pedreiras, P., Lopes, L.S.: A real-time distributed software infrastructure for cooperating mobile autonomous robots. In: Proceedings of the 14th IEEE International Conference on Advanced Robotics, Munich, Germany (2009).
12. MacAlpine, P., Stone, P.: Prioritized role assignment for marking. In: Behnke, S., Sheh, R., Sarıel, S., Lee, D. (eds.) RoboCup 2016: Robot World Cup XX, Lecture Notes in Computer Science, vol. 9776. Springer, Cham (2017).
13. rsopenapi repository, https://github.com/RobBurgers/rsopenapi
14. Tech2Play, https://www.tech2play.nl
15. https://www.sportknowhowxl.nl/achtergronden/sport-knowhow-xl-news–in-english-/item/145138/
16. Oxboard homepage, http://www.oxboard.eu, last accessed 2020/01/30.
