

An Inexpensive Yet “Intelligent” Prototype System Using Vision-Guided
Robotics for Depalletizing and Emptying Polyethylene Sacks

M. KAVOUSSANOS and A. POULIEZOS

In the production line of many industries there exists the procedure of palletizing/depalletizing sacks of raw or finished materials. The plastics industry, for example, uses polyethylene for the production of a series of products, including polyethylene pipes for irrigation, plastic sheets for greenhouse coverings, and other agricultural uses. The usual production procedure is as follows. Pallets are kept in storage areas and are transported by forklifts to depalletization stations when depalletizing is needed. Depalletizing is a labor-intensive job for workers. There are some purely mechanical systems on the market that assist the process in one way or another, but they tend to be expensive and they do not provide a complete solution to the problem.

In the majority of palletizing or depalletizing jobs, the environment is either completely structured or may be structured in such a way that the procedure is automated with dedicated machines. This probably explains the low degree of penetration of robots in this area—at least until lately. Changes in this area should be expected once the costs of the required robotic systems decrease further [1]. A requirement for a class of palletization-depalletization problems is that the automated system should have the ability to recognize the shape and the position in space of the pallet or of the pallet elements. The use of a robotic system with vision capabilities seems to be the most appropriate way to tackle this problem.

The Mitsubishi Company presented in 1997 a prototype depalletizing robotic system equipped with stereo vision [2]. The vision system was implemented with the aid of a PC and uses structured-lighting techniques to detect the exact three-dimensional (3-D) position of the cargo in the pallet. If the height of the objects in the pallet and the number of the pallet layers are known, then the vision problem is essentially a two-dimensional (2-D) recognition problem. Such a problem—a carton depalletizing problem—has also been solved with a robotic system equipped with a single camera [3]. An ultrasound sensor attached to the gripper provides depth information, and the system is reported to be able to handle cartons randomly placed in the pallet layers. A robotic material transport system designed for the automated supply of packaging machines with paper bobbins has also been reported in [4]. A single camera is used to identify the locations of the bobbins, which come in pallets. The manipulator grasps each bobbin in turn and moves it to the requesting machine.

In this article a novel structure is presented that is designed and built to minimize costs and increase the reliability of the whole system, while at the same time fully automating the depalletizing and emptying of polyethylene sacks. The system is built around a conventional PC that carries out the tasks of control, vision, gripping, and fault monitoring. Simple PID loops are used for the control of all the robot axes, while an inexpensive vision system is used for the determination of the orientation of the pallets. A pneumatic gripper is used for the lifting of the sacks. The most probable faults are monitored for, and appropriate actions are taken in the event of fault occurrence. In this way, a form of "intelligence" is built into the system. The system in its present form is ready for use in an appropriate industrial environment.

Solving the Problem with a Prototype Robotic System
The procedure that we are trying to automate is as follows: polyethylene pellets are kept in 25 kg sacks that come in pallets of 40 sacks, arranged as shown in Fig. 8 (eight layers of five sacks each). A forklift carries the pallet to the depalletizing station, where a worker lifts the sacks one by one, cuts them open, and empties their contents into a silo for temporary storage (Fig. 1). The problem is to replace the human "depalletizer" with a robotic system.



For cost-reduction purposes, it was decided that the simplest possible robotic system capable of satisfying the requirements of the application would be designed; in other words, a system designed specifically to suit the requirements of the application in terms of degrees of freedom, work envelope, configuration, speed, payload capacity, accuracy, etc. [5], [6].

The implemented solution uses a 4-degree-of-freedom (DOF) Cartesian manipulator to depalletize and empty the sacks of polyethylene. The end-effector attached to the manipulator grasps the sack and passes it along a rotating cutting disk. The pellets of polyethylene fall due to gravity into a silo placed below the cutting disk and are transported to big silos by a pneumatic conveyance system. The system is equipped with a vision subsystem in order to:
◆ Identify the position of the pallet. In fact, since the pallet is very heavy, it cannot be placed easily by the forklift in a specific place, so the robotic system does not know its position and orientation beforehand but has to calculate it.
◆ Identify the arrangement of the sacks in the layers of the pallet.

The mechanical part has a Cartesian configuration with three translations (X, Y, Z) plus a rotation (R) of the end-effector (Fig. 2). Four degrees of freedom are enough for the end-effector to be able to move the sack from any place in the pallet and orient it along the cutting disk. For the three translations, standard industrial linear bearings are used. Synchronous belts and pulleys convert the rotation of the drive motors to linear motion. The system's maximum speed is 1 m/s.

The end-effector is a construction that utilizes vacuum cups to grasp the sack [7], [8]. It is pivoted in such a way that it can rotate during cutting, with the aid of a pneumatic cylinder, by about 45° to allow the polyethylene pellets to fall (see Fig. 3).

Figure 1. The depalletizing procedure. (Labels: a worker cuts the sack and places it in the funnel; silo of temporary storage; blowpipe; measuring vane.)
Figure 2. The prototype robotic system.
Figure 3. The gripper subsystem.
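The vision subsystem reports only the pallet's position and orientation in the horizontal plane; combining that pose with the known layout of a layer yields a gripper target for each sack. The fragment below is a minimal sketch of that mapping, assuming the pose is given as an (x, y) origin plus a rotation angle in robot coordinates and that nominal sack offsets within a layer are known; the structure names and the numeric values are illustrative, not taken from the article.

```c
/* Sketch: map a detected pallet pose plus a nominal sack offset to a
 * gripper target (X, Y, R).  The data layout is an assumption; Z is not
 * computed, because the robot simply lowers the Z-axis until the
 * proximity detector in the gripper triggers.                            */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, theta; } pallet_pose_t;   /* metres, radians   */
typedef struct { double dx, dy, yaw; } sack_offset_t;   /* w.r.t. the pallet */
typedef struct { double X, Y, R; } gripper_target_t;

gripper_target_t sack_target(pallet_pose_t p, sack_offset_t s)
{
    gripper_target_t t;
    double c = cos(p.theta), sn = sin(p.theta);
    t.X = p.x + c * s.dx - sn * s.dy;   /* rotate offset into robot frame */
    t.Y = p.y + sn * s.dx + c * s.dy;
    t.R = p.theta + s.yaw;              /* end-effector rotation axis     */
    return t;
}

int main(void)
{
    pallet_pose_t pose = { 1.25, 0.80, 0.06 };      /* from the vision system */
    sack_offset_t first = { -0.30, 0.20, 1.5708 };  /* one hypothetical sack  */
    gripper_target_t g = sack_target(pose, first);
    printf("X = %.3f m, Y = %.3f m, R = %.3f rad\n", g.X, g.Y, g.R);
    return 0;
}
```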



Permanent-magnet dc geared motors are used, powered by 4-Q thyristor drives. This is a cheap solution for building servo systems, but it proved to be adequate for the requirements of this application. The vision system incorporates a camera to provide the image of the pallet as well as a frame grabber for image capturing.

Control of the System

Main Control Functions
There are four main functions of the control system:
◆ The control of the four robot axes.
◆ The acquisition and analysis of the image of the camera to obtain information on the position and orientation of the pallet and on the arrangement of the sacks.
◆ The control of ON-OFF air electrovalves.
◆ The monitoring of limit switches and critical control signals for fault detection.

The control system is designed around a PC-486. Appropriate plug-in cards are used for the control and the vision subsystems (Fig. 4). The above approach has been motivated by several reasons:
◆ The environment is user friendly.
◆ High-level languages can be used for software development; C is used here.
◆ Low hardware costs are involved.
◆ The high computing power of the PC-486.
◆ The ability to integrate all the functions of the control system on the same PC platform.

Figure 4. The overall control scheme. (Block diagram: the 486 processor and PC bus; a motion control card built around an ADSP-2105 DSP with memory, encoder inputs, and D/A outputs to the four amplifier/motor pairs; an interface to ON-OFF devices such as electrovalves and limit switches; and a frame grabber with a video memory controller, on-board memory, an A/D converter, and the CCD camera.)

Control of the Robot Axes
A special-purpose motion control card is used for the control of the four axes (Motion Engineering Inc.'s LC/DSP). As can be seen in Fig. 4, the card utilizes a digital signal processor (DSP), the Analog Devices ADSP-2105 chip, to perform the low-level servo control of all the axes, leaving the main processor (486) free for other tasks. The DSP receives information on the actual position of the axes via rotary shaft encoders attached to the shafts of the driving motors. If the position needs to be changed, correction signals are sent to the servo amplifiers via D/A converters. The DSP also receives commands from the main processor, via the PC bus, regarding the desired final position as well as the desired traveling speed and acceleration of each axis.

Figure 5 shows the response of the robot's X-axis to a command of 0.6 m displacement with a velocity of 0.5 m/s. It is clearly seen that velocity feedforward has satisfactorily eliminated the steady-state position error. For details on the control method used, the reader is referred to [9].

Figure 5. X-axis trajectory (displacement 0.6 m, velocity 0.5 m/s): desired and real position (m) versus time (s).
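Each axis is closed with a simple PID position loop, and velocity feedforward removes the steady-state error visible during the constant-velocity portion of a move (Fig. 5). The fragment below is a generic sketch of such a loop as it might execute at a fixed sample period on the motion-control DSP; the gains, sample time, and function names are assumptions rather than values from the article.

```c
/* Sketch of one axis' position loop: PID on the position error plus a
 * velocity feedforward term.  Gains and sample period are assumed.       */
typedef struct {
    double kp, ki, kd;     /* PID gains                       */
    double kvff;           /* velocity feedforward gain       */
    double integ, prev_e;  /* controller state                */
    double dt;             /* sample period [s]               */
} axis_ctrl_t;

/* pos_ref, vel_ref: commanded position/velocity from the trajectory
 * generator; pos_fb: encoder position feedback.  The return value is
 * what would be written to the D/A converter driving the amplifier.     */
double axis_update(axis_ctrl_t *c, double pos_ref, double vel_ref, double pos_fb)
{
    double e = pos_ref - pos_fb;                 /* position error        */
    c->integ += e * c->dt;
    double de = (e - c->prev_e) / c->dt;
    c->prev_e = e;
    return c->kp * e + c->ki * c->integ + c->kd * de   /* PID term        */
         + c->kvff * vel_ref;                          /* feedforward     */
}
```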

The Vision System

Functions and Hardware Used
As mentioned earlier, the vision system provides the necessary information so that the robot is able to find the correct position of the pallet as well as to recognize the pattern of the arrangement of the sacks in the pallet's layers. The pallet is obviously a 3-D structure, but the exact information on the elevation of each pallet layer from the base is not necessarily needed. For the robot to be able to "find" the sack, it only needs to bring the gripper to the correct X-Y position and move the Z-axis towards the pallet until a proximity detector is activated. So the vision problem is actually a 2-D problem if the camera is placed above the expected center of the pallet, looking toward the X-Y plane.

The vision system incorporates a CCD camera and an inexpensive frame grabber (Screen Machine II, FAST Electronics) as a plug-in card for the PC bus. The A/D converter samples the video signal (RGB) of the camera and converts it to digital form. The signal is then filtered and routed either to the PC screen as a "live image" or captured to the 1 MB of memory that the card has on-board. The captured image is then transferred to PC memory for analysis. The whole process is controlled by the video memory controller according to instructions sent by the PC processor via the bus.

Color information is not needed; therefore, gray-scale images are manipulated. Image resolution has been set at 368 × 280 pixels, as recommended by the frame-grabber manufacturer. With this resolution, a system resolution of about 4 mm is achieved, since the "inspection" area is about 1.4 × 1.2 m (the pallet size is 1.2 × 1.0 m), which is satisfactory for the application.
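The quoted 4 mm figure follows directly from the field of view and the image size: 1400 mm / 368 pixels is about 3.8 mm per pixel and 1200 mm / 280 pixels about 4.3 mm per pixel. A trivial check, using only the numbers given in the text:

```c
/* Rough pixel resolution: inspection area divided by image size. */
#include <stdio.h>

int main(void)
{
    double field_x_mm = 1400.0, field_y_mm = 1200.0;  /* ~1.4 x 1.2 m view  */
    int    pix_x = 368,         pix_y = 280;          /* frame-grabber size */
    printf("%.1f mm/pixel horizontally, %.1f mm/pixel vertically\n",
           field_x_mm / pix_x, field_y_mm / pix_y);   /* ~3.8 and ~4.3 mm   */
    return 0;
}
```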

Image Analysis: Detection of the Pallet's Layer Boundary [10], [11]
In Fig. 6(a) an image of a rather distorted pallet layer is shown. As can be seen, the boundary of the pallet layer is easily discriminated from the black background, but the same cannot be said for the inner sides of the sacks. So the problem of finding directly the boundary of each sack, and hence its position, is not an easy task.

Figure 6. (a) A pallet layer; (b) the detected edges.



Since the detection of the boundary of the whole pallet layer seems to be easier, this boundary is found first and then the arrangement of the sacks is identified.

For the detection of the layer's boundary, local edges are first detected by applying an edge-detection operator to the image. A local edge is a small area in the image where the luminance changes significantly. Obviously, the boundary of the layer is composed of local edges.

In Fig. 6(b) the image of the detected edges for the arrangement of the pallet layer shown in Fig. 6(a) is shown. As a result of using a 5 × 5 mask, the detected edges are rather "thick." A process of line thinning follows that aims to generate a new image that has edges one pixel thick. In Fig. 7(a), the image of the "thinned edges" of the pallet can be seen. The edges have become much clearer, and the boundary of the pallet is now well discriminated from the background.

Figure 7. (a) Thinned edges; (b) the detected boundary.
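The article specifies only that a 5 × 5 edge operator is applied and that the result is thinned to one-pixel-wide edges; the exact mask is not given. The sketch below uses a 5 × 5 averaged-gradient operator with a threshold as a stand-in, producing the kind of "thick" binary edge map described above; thinning would follow as a separate pass.

```c
/* Sketch: 5x5 edge detection on an 8-bit gray image.  The operator
 * (central differences averaged over a 5x5 window) and the threshold are
 * assumed stand-ins; the paper does not name the exact mask.  Thinning of
 * the resulting "thick" edges is a separate step, not shown here.        */
#include <stdlib.h>
#include <string.h>

#define IMG_W 368
#define IMG_H 280

void detect_edges(unsigned char img[IMG_H][IMG_W],
                  unsigned char edges[IMG_H][IMG_W], long threshold)
{
    memset(edges, 0, (size_t)IMG_H * IMG_W);
    for (int y = 2; y < IMG_H - 2; y++)
        for (int x = 2; x < IMG_W - 2; x++) {
            long gx = 0, gy = 0;
            for (int dy = -2; dy <= 2; dy++)
                for (int dx = -2; dx <= 2; dx++) {
                    gx += dx * img[y + dy][x + dx];   /* horizontal gradient */
                    gy += dy * img[y + dy][x + dx];   /* vertical gradient   */
                }
            if (labs(gx) + labs(gy) > threshold)      /* gradient magnitude  */
                edges[y][x] = 255;
        }
}
```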
To identify the boundary of the pallet's layer, use is made of the fact that it is a rectangle; thus, we are actually looking for four straight line segments. To identify those segments, their points need to be located. Points on the pallet boundary are found by scanning the image of the thinned edges horizontally and vertically. In fact, with two horizontal scans, from left to right and vice versa, and two vertical scans, it is easy to identify points on the boundary: they are the first edge points found by these scans. Least-median-of-squares robust regression is then used to find the lines that best fit the identified sets of points. In Fig. 7(b), the four lines that best fit the pallet's boundary using the least-median-of-squares method are shown.
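The two steps just described can be sketched as follows: boundary candidates are taken as the first edge pixel met by each scan (only the left-to-right scan is shown; the other three directions are analogous), and a boundary line is then fitted by least median of squares, keeping, among many randomly chosen two-point candidate lines, the one whose median squared residual is smallest. The random-sampling loop and its trial count are a common way of realizing LMedS, not details given in the article.

```c
/* Sketch: collect left-to-right boundary points from a thinned edge image
 * and fit a line  a*x + b*y + c = 0  by least median of squares (LMedS).  */
#include <math.h>
#include <stdlib.h>

#define IMG_W 368
#define IMG_H 280

typedef struct { double x, y; } point_t;

/* First edge pixel on each row, scanning left to right. */
int scan_left_to_right(unsigned char edges[IMG_H][IMG_W], point_t pts[IMG_H])
{
    int n = 0;
    for (int y = 0; y < IMG_H; y++)
        for (int x = 0; x < IMG_W; x++)
            if (edges[y][x]) { pts[n].x = x; pts[n].y = y; n++; break; }
    return n;
}

/* Keep the two-point candidate line whose median squared residual over all
 * points is smallest; 500 random trials is an assumed, typical count.     */
void fit_line_lmeds(const point_t *pts, int n, double *a, double *b, double *c)
{
    double best = HUGE_VAL;
    double *r2 = malloc((size_t)n * sizeof *r2);
    if (n < 2 || r2 == NULL) { free(r2); return; }

    for (int trial = 0; trial < 500; trial++) {
        const point_t *p = &pts[rand() % n], *q = &pts[rand() % n];
        double ca = q->y - p->y, cb = p->x - q->x;      /* normal of line pq */
        double norm = sqrt(ca * ca + cb * cb);
        if (norm < 1e-9) continue;                      /* degenerate pair   */
        ca /= norm; cb /= norm;
        double cc = -(ca * p->x + cb * p->y);

        for (int i = 0; i < n; i++) {                   /* squared residuals */
            double d = ca * pts[i].x + cb * pts[i].y + cc;
            r2[i] = d * d;
        }
        for (int i = 1; i < n; i++) {                   /* insertion sort    */
            double v = r2[i]; int j = i - 1;
            while (j >= 0 && r2[j] > v) { r2[j + 1] = r2[j]; j--; }
            r2[j + 1] = v;
        }
        if (r2[n / 2] < best) {                         /* new best median   */
            best = r2[n / 2]; *a = ca; *b = cb; *c = cc;
        }
    }
    free(r2);
}
```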

Image Analysis: Recognition of the Sacks' Arrangement
Once the outline of the pallet's layer is found, there are only two possible arrangements of the sacks, as shown in Fig. 8. Indeed, there are always five sacks in the layer, and the length of the rectangle is not equal to its width. The problem is therefore how to determine which of the two arrangements is the actual one in the detected outline.

Figure 8. Possible sack arrangements.

To get a robust solution to the problem, we have designed two masks that are superimposed on the detected outline, as shown in Fig. 9. The idea is to hide as much irrelevant information as possible—letters, illustrations—but to keep the areas where the sack sides probably are; indeed, this will happen only in one of the two cases.

Figure 9. Masks to hide irrelevant information. (a) Inner boundaries revealed. (b) Inner boundaries almost hidden.

Figure 10 shows how the image of the thinned edges appears after the imposition of the masks. In Fig. 10(a), the mask has revealed the sack sides, while in Fig. 10(b) it has almost hidden them. In the first case, one can observe that there are longer straight lines "almost" parallel to the sides of the mask's rectangles than in the second case. Therefore, the two images are next searched for straight lines. This is done using a Hough transform.

Figure 10. Image after the imposition of the masks. (a) Mask reveals the arrangement. (b) Mask reveals nothing.

For each image the index

T_L = Σ_i l_i,

where l_i is the length of the ith "almost" parallel line to the corresponding rectangle of the mask, is evaluated. It is obvious that the index is much greater when the correct arrangement is detected; thus, it is considered the recognition feature.
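A minimal sketch of the index computation follows. It restricts the Hough accumulator to angles within a few degrees of a mask side's orientation and uses the vote count of each strong accumulator cell as a stand-in for the corresponding line length l_i; both simplifications are assumptions, not the article's exact procedure.

```c
/* Sketch: approximate T_L = sum of lengths of lines "almost" parallel to a
 * mask side of orientation theta0.  The angle window, bin sizes, and the
 * use of vote counts as length estimates are assumptions.                 */
#include <math.h>
#include <string.h>

#define IMG_W 368
#define IMG_H 280
#define N_RHO 928                /* rho bins, offset by N_RHO/2 for negatives */
#define N_TH  11                 /* theta0 +/- 5 degrees, 1-degree steps      */
#define DEG   (3.14159265358979 / 180.0)

long parallel_line_index(unsigned char edges[IMG_H][IMG_W],
                         double theta0, long min_len)
{
    static long acc[N_TH][N_RHO];
    memset(acc, 0, sizeof acc);

    for (int y = 0; y < IMG_H; y++)
        for (int x = 0; x < IMG_W; x++) {
            if (!edges[y][x]) continue;
            for (int t = 0; t < N_TH; t++) {           /* vote near theta0  */
                double th = theta0 + (t - N_TH / 2) * DEG;
                int rho = (int)floor(x * cos(th) + y * sin(th)) + N_RHO / 2;
                if (rho >= 0 && rho < N_RHO) acc[t][rho]++;
            }
        }

    long total = 0;                                    /* sum strong cells  */
    for (int t = 0; t < N_TH; t++)
        for (int r = 0; r < N_RHO; r++)
            if (acc[t][r] >= min_len) total += acc[t][r];
    return total;
}
```

In use, the index would be evaluated for each of the two candidate masks, and the arrangement whose masked image yields the larger value would be taken as the detected one, mirroring the decision rule described above.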



Reliability: Fault Detection
The use of the PC as a platform offers computing power that permits the incorporation of fault-detection functions, which increases the reliability of the system. In fact, during motion execution, the main processor has enough time to monitor several critical operation signals.

Monitoring of the travel limits for each axis of motion—a common practice for such systems—is performed by the motion control processor itself. Both hardware limits (limit switches) and software limits are monitored, and an immediate halt of motion is commanded if they are exceeded.

Apart from travel limits, the main processor monitors the error signal of all position control loops. This signal is found to be sensitive to a number of malfunctions and progressively developing faults. The signal is used to detect:
◆ "Loss" of actual position information in the axis servo loops. This could be due to either encoder failure or lead cutting.
◆ "Loss" of power to the servo loops. This could be due to either power amplifier failure or lead cutting.
◆ Certain failures of the motion control card itself that do not influence the actual position registration.
◆ Wear or malfunction of mechanical parts.

The error signal of the vertical axis is also used as redundant information to verify that the sack is grasped—very critical information for the operation of the system. For the same task, an appropriate proximity detector is used in the gripper assembly.
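A minimal sketch of this kind of supervision is given below, assuming the main processor can read back each loop's position error from the motion-control card during a move; read_axis_error(), halt_all_axes(), and report_fault() are hypothetical placeholders, and the tolerance and sample counts are illustrative.

```c
/* Sketch: supervise the position (following) error of each axis during
 * motion and command a halt when it stays outside a tolerance band for
 * several consecutive samples.  The extern functions stand for motion-card
 * accesses whose real API is not described in the article.                */
#define N_AXES         4
#define ERR_TOL        0.005   /* [m] acceptable following error            */
#define MAX_VIOLATIONS 10      /* consecutive samples before flagging fault */

extern double read_axis_error(int axis);   /* hypothetical card read-back */
extern void   halt_all_axes(void);         /* hypothetical emergency stop */
extern void   report_fault(int axis);      /* hypothetical operator alarm */

void monitor_following_error(void)
{
    static int violations[N_AXES];
    for (int axis = 0; axis < N_AXES; axis++) {
        double e = read_axis_error(axis);
        if (e > ERR_TOL || e < -ERR_TOL) {
            if (++violations[axis] >= MAX_VIOLATIONS) {
                halt_all_axes();           /* lost feedback, lost power,    */
                report_fault(axis);        /* amplifier or mechanical fault */
            }
        } else {
            violations[axis] = 0;          /* error back inside the band    */
        }
    }
}
```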
Conclusions/Results
A vision-guided robotic system for the depalletizing and emptying of polyethylene sacks has been presented. At the moment, a prototype system is working in the laboratory and has passed several tests. Its industrial version is designed with minor changes, necessary for its adaptation to the harsh industrial environment: the PC will be an industrial one, the slides will be protected from dust, and standard industrial cable rails will be used. In the authors' opinion, this project demonstrates how a "robotic" approach to system design can lead to better as well as less-expensive solutions for a class of industrial problems.

Keywords
Real-time control, robotics, robotic vision, edge detection, Hough transform.

References
[1] G.A. Weimer, "New robotics technology adds value to material handling, palletizing," Material Handling Engineering, vol. 53, no. 7, p. 61, 1998.
[2] Mitsubishi Electric Corp., "Depalletizing robot using random-dot projection stereo system," The Japan Industrial & Technological Bulletin, vol. 25, no. 4, p. 20, 1997.
[3] M. Hashimoto, K. Sumi, and S.-I. Kuroda, "Vision system for depalletizing robot using genetic labeling," IEICE Trans. Informat. and Syst., vol. 78, no. 12, p. 1552, 1995.
[4] H.K. Toenshoff, C. Soehner, and G. Isensee, "Vision guided tripod material transport system for the packaging industry," Robotics & Computer-Integrated Manufacturing, vol. 13, no. 1, pp. 1-7, 1997.
[5] J.F. Canny and K.Y. Goldberg, "A RISC approach to sensing and manipulation," J. Robotic Systems, vol. 12, no. 6, 1995.
[6] J. Trevelyan, "Simplifying robotics - A challenge for research," Robotics and Autonomous Systems, vol. 21, pp. 207-220, 1997.
[7] N.C. Tsourveloudis, R. Kolluru, K.P. Valavanis, and D. Gracanin, "Suction control of a robotic gripper: A neuro-fuzzy approach," J. Intell. and Robotic Syst., vol. 27, pp. 215-235, 2000.
[8] N.C. Tsourveloudis, K.P. Valavanis, R. Kolluru, and I.K. Nikolos, "Position and suction control of a reconfigurable robotic gripper machine," Intelligence & Robotic Control, vol. 1, no. 2, pp. 53-62, 1999.
[9] E. Kavoussanos, "Development of a prototype robotic system for the depalletizing and emptying of polyethylene sacks," Ph.D. dissertation, Technical University of Crete, Greece, 2000.
[10] B. Klaus and P. Horn, Robot Vision. Cambridge, MA: MIT Press, 1993.
[11] R. Jain, R. Kasturi, and B.G. Schunk, Machine Vision. New York: McGraw-Hill, 1995.

Manolis M. Kavoussanos received the B.S. degree in mechanical engineering from the National Technical University of Athens in 1983 and the M.S. degree in robotics and automation from Imperial College, UK, in 1988. He presented his Ph.D. thesis in autumn 2000 at the Technical University of Crete. He has worked in industry for six years and has participated in R&D projects concerning mainly automation. Since 1993, he has been an assistant professor in the Mechanical Engineering Department, Technological Education Institute of Crete, Greece. He teaches industrial automation, CAD/CAM, and robotics. In addition to teaching, he participates in joint R&D projects with local industry in the above areas.

A. Pouliezos is an associate professor in the Department of Production and Management Engineering at the Technical University of Crete, Greece. He received a B.Sc. in computing and mathematics from the Polytechnic of North London, London, in 1975; an M.Sc. in control systems from Imperial College, London, in 1976; and a Ph.D. from Brunel University in 1980. His research interests include fault diagnostics, biomodeling, intelligent systems and control, and control education.

Address for Correspondence: A. Pouliezos, Dept. of Production and Management Engineering, Technical University of Crete, Counoupidiana, 73 100 Hania, Greece. E-mail: pouliezo@dssl.tuc.gr.
