Gabriel Zachmann
Fraunhofer Institute for Computer Graphics, Rundeturmstraße 6, 64283
Darmstadt, Federal Republic of Germany, email: zach@igd.fhg.de
Abstract
1 Introduction
chip design.
So, immersive virtual prototyping is one of many techniques for implementing
DMU.
2 Related Work
Researchers at Caterpillar Inc. use VR to improve the design process for heavy equipment. Their system [11] allows them to quickly prototype wheel loader and backhoe loader designs and to perform visibility assessments of the new design in a collaborative virtual environment. Furthermore, the engineers can simulate the operation of the equipment and evaluate visual obstructions.
CAD models, and for the subsequent use of the prototype in immersive VR
[6].
A vision of virtual prototyping was developed within the ESPRIT project AIT (Advanced Information Technologies in Design and Manufacturing). Project partners included many European automotive and aerospace companies, IT suppliers, and academic institutes [5]. Many visionary prototypes have also been presented by [1].
Today’s computer-aided tools (CAx) for automotive and other industries can simulate many of the functions and operating conditions of a new product. In some cases, software simulations are as good as or even better than physical mock-ups (PMUs). However, they still do not meet all the requirements needed to avoid PMUs completely. Certain functions of a new product cannot be simulated at all by current CAx tools, while others do not deliver results in an acceptable time.
Therefore, many PMUs are built during the development process to achieve a 100% verification of the geometry, the functions, and the processes of a new car project. Additionally, today’s CAx tools do not provide a natural and intuitive man-machine interface that allows the user to feel the virtual product and experience its spatial presence.
In order to “fill” these gaps, many automotive and other companies have
established projects to investigate the use of VR technologies for verification
of designs and processes [7].
Fig. 1. Today’s approach: the development process from series start to series end, spanning concept, design/CAD, detailed engineering, manufacturing, and maintenance/recycling/further developments, with CA prototypes, semi-prototypes, prototypes, and pre-series vehicles, and with the styling freeze (SF) and design freeze (DF) deadlines marked.
• Fast CA loops. CAx tools are used to quickly verify different design concepts and the assembly/disassembly of those concepts. These verifications take place in-between the design and the CA prototype process (see Figure 1). At the beginning of the business process chain the freedom to change concepts and the number of variations of a component are higher; consequently, the number of CA verifications decreases over the course of the development process.
• PMU loops. For detail verification of design concepts and assembly processes
in some sections of a product, various PMUs are built. This sub-process
can be identified in Figure 1 between the design and the physical mock-up
process (see dashed line).
• PMU verification. Some complete PMUs of the final product (e.g., a car)
are built to verify if all the designed components fulfil all the requirements
related to ergonomics, functions and processes. Before these full prototypes
are built, a freeze of the styling and design processes occurs. In Figure 1
these phases are marked by the deadlines SF and DF.
In the traditional process chain several problems arise due to the fact that
verifications of the processes are made using PMUs and CAx tools:
• Parallel verification processes. Verifications are made with CAx tools and with PMUs (in this case obtained by, e.g., rapid prototyping techniques and/or hand-built prototypes) concurrently. The correlation between these two verification processes is very hard to obtain.
• Not enough co-ordination. The handling, synchronisation, correlation, and management of these processes is very difficult and in some cases impossible. In order to build a PMU, a design stage needs to be frozen. At this time, the building of the PMU starts and can take 6 to 12 weeks. Due to concurrent engineering, further changes of CAD parts (sometimes even significant ones) can be made during this build time. Therefore, by the time results are obtained from the PMU verification, they no longer have a direct correlation to the current design. Even if there have been no changes to the design, the “transfer” of the results of the PMU verification to the DMU is, in some cases, very difficult.
Vision
Strategic objectives are global and involve the complete business process. The
most important ones are: reduction of development costs, development time,
and time-to-market; increase of product innovation, product quality, flexibility,
Fig. 2. Evolution of the verification process between series start and series end. Today, the PMU is the leader and runs as a parallel process with no correlation to the DMU. Tomorrow, the DMU will be the leader, with the PMU as a correlated parallel process (correlation of hardware and software). The goal is a DMU-led process with only one final verification in hardware and a final hardware/software correlation.
Operative objectives are more local, related to only one or a few key-processes.
The most important objectives which need to be fulfilled for assembly and
maintenance are [12]:
Fig. 3. Data flow between CAD and VR system.
The electronic report contains information related to simulation and investiga-
tion results, proposals for changes of CAD components, assembly/disassembly
paths, collision areas, sweeping envelopes, and the status of all verification
processes.
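Purely as an illustration (this is not an actual file or database format used by the system), such a report record might be represented as follows:

```cpp
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };

// Hypothetical record mirroring the report contents listed above.
struct ElectronicReport {
    std::string simulationResults;                 // simulation and investigation results
    std::vector<std::string> changeProposals;      // proposals for changes of CAD components
    std::vector<std::vector<Vec3>> assemblyPaths;  // recorded assembly/disassembly paths
    std::vector<std::string> collisionAreas;       // parts/areas where collisions occurred
    std::vector<std::string> sweepEnvelopes;       // references to traced sweeping envelopes
    std::string verificationStatus;                // status of all verification processes
};
```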
4 From CAD to VR
The complete data pipeline from the CAD system to the VR system has
various modules. CAD systems are the source of most of the data. This data
is stored in a PDM system, which also maintains administrative data together
with CAD data, such as ID, version, name, project code, etc. Via a retrieval
and conversion tool these data can be converted, reduced, and prepared for
use in a VR system (see Figure 3).
Common problems, especially with carry-over data (i.e., CAD data designed
for predecessor products, but re-used in new ones), are the orientation of nor-
mals, missing geometry, and deletion of interior or other “unwanted” geometry.
To our knowledge, there are no commercial tools available yet which can solve
these problems automatically. So the process for preparing data for VR needs
to access the CAD system interactively. We have tried to depict that in Fig-
ure 3 by the arrow between CAD and preparation tool. Furthermore, the VR
representation of CAD data and the configuration of the virtual environment
need to be managed within the PDM systems (dashed arrow in the figure).
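As an illustration of just one of the typical repair steps mentioned above (making normal orientations consistent), the following sketch propagates a winding order across shared edges of a triangle mesh. It assumes a manifold, connected mesh, which real carry-over data often is not, so it is only a starting point:

```cpp
#include <algorithm>
#include <array>
#include <map>
#include <queue>
#include <utility>
#include <vector>

using Tri = std::array<int, 3>;  // vertex indices of one triangle

// True if triangles a and b traverse a shared edge in the same direction,
// i.e. their windings are inconsistent and one of them must be flipped.
static bool sameEdgeDirection(const Tri& a, const Tri& b) {
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            if (a[i] == b[j] && a[(i + 1) % 3] == b[(j + 1) % 3])
                return true;
    return false;
}

void orientConsistently(std::vector<Tri>& tris) {
    if (tris.empty()) return;
    // Map each undirected edge to the triangles that use it.
    std::map<std::pair<int, int>, std::vector<int>> edgeToTris;
    for (int t = 0; t < (int)tris.size(); ++t)
        for (int i = 0; i < 3; ++i) {
            int u = tris[t][i], v = tris[t][(i + 1) % 3];
            edgeToTris[{std::min(u, v), std::max(u, v)}].push_back(t);
        }
    // Propagate the winding of triangle 0 breadth-first over shared edges.
    std::vector<bool> visited(tris.size(), false);
    std::queue<int> pending;
    pending.push(0);
    visited[0] = true;
    while (!pending.empty()) {
        int t = pending.front(); pending.pop();
        for (int i = 0; i < 3; ++i) {
            int u = tris[t][i], v = tris[t][(i + 1) % 3];
            for (int n : edgeToTris[{std::min(u, v), std::max(u, v)}]) {
                if (visited[n]) continue;
                if (sameEdgeDirection(tris[t], tris[n]))
                    std::swap(tris[n][1], tris[n][2]);  // flip the neighbour's winding
                visited[n] = true;
                pending.push(n);
            }
        }
    }
}
```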
Design data available today in the manufacturing industries and elsewhere does not meet the geometric and non-geometric requirements that would allow it to be used as-is for a VR simulation. There are two ways to tackle the problems described in the previous section: new data must be designed with virtual prototyping in mind; old data must be dealt with, either by redesigning it (least preferred) or by semi-automatic conversion to representations suitable for VR.
So that designers do not have to become familiar with several different software tools, the number of interfaces must be kept low. To achieve that, the two worlds need to be integrated, at least to a higher degree than is the case today. Ideally, a designer can create the CAD components and also perform assembly feasibility studies himself. Also, with better integration it will be easier to exchange data between CAD/PDM systems and VR systems.
5 Immersive Verification
In this section, we will briefly explain the process of authoring VEs, present
the two scenarios which have been chosen for our studies and developments,
and finally describe the functionality needed for assembly investigations.
5.1 Authoring
Fig. 4. We have identified three layers of increasing abstraction and specialization, on which authoring of a virtual environment takes place.
Fig. 5. The AEO framework. Note that actions are not “tied-in” with graphical objects.
At the next level we have implemented the event-based scripting approach for building VEs [16]. It is a general framework based on the concept of objects, actions, and events, each of which provides higher-level, yet general, “story-board driven” functionality.
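As an illustration only (the actual scripting language is described in [16]; all names here are hypothetical), the following C++ sketch shows the basic idea of the framework: actions are bound to events and address scene objects by name, so they are not tied to the graphical objects themselves.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Object { std::string name; bool visible = true; };

// An action operates on whatever object it is handed at run time;
// it is not a member of any graphical object.
using Action = std::function<void(Object&)>;

class EventManager {
    std::map<std::string, std::vector<std::pair<std::string, Action>>> bindings_;
    std::map<std::string, Object>& scene_;
public:
    explicit EventManager(std::map<std::string, Object>& scene) : scene_(scene) {}
    // Bind an action to an event; the target object is addressed by name.
    void bind(const std::string& event, const std::string& object, Action action) {
        bindings_[event].push_back({object, std::move(action)});
    }
    // Fire an event: run every action bound to it on its target object.
    void fire(const std::string& event) {
        for (auto& binding : bindings_[event])
            binding.second(scene_.at(binding.first));
    }
};

int main() {
    std::map<std::string, Object> scene;
    scene["tail_light"].name = "tail_light";
    EventManager events(scene);
    // "Story-board driven": when this (hypothetical) voice event fires, hide the part.
    events.bind("voice:remove_tail_light", "tail_light",
                [](Object& o) { o.visible = false; std::cout << o.name << " hidden\n"; });
    events.fire("voice:remove_tail_light");
}
```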
Scenario templates
Fig. 7. Overview of the tail-light scenario. The tail-light is to be removed.
Fig. 8. The door scenario. Two hands and several tools are necessary to perform the assembly.
would only have to modify one of those templates. To some extent this idea
can be combined with “manual” authoring described above.
However, it is not clear to us yet, whether designers will ever design all the VR-
relevant attributes. Some of them are geometric, like visible material, thickness
of metal sheets, and the like. So far, a lot of authoring time is spent basi-
cally on specifying the non-geometric (semantic) attributes of parts, such as
the function of objects (screw, tool, etc.), non-geometric materials (flexibility,
smoothness), the order of tasks in the (dis-)assembly process, etc.
5.2 Scenarios
5.2.1 The tail-light

The first scenario is the disassembly of the tail-light of the BMW 5 series
(Figure 7). First, the covering in the car trunk must be turned down, in order
to get access to the fastening of the lights (Figure 10). To reach the screws
fixing the tail-light, the fastening needs to be pulled out.
Then the tail-light itself can be unscrewed by a standard tool. After all screws
are taken out, the tail-light cap can be disassembled by pulling it out from
the outside.
11
5.2.2 The door
This scenario is much more complex and more difficult in that both hands and
various tools must be utilized (Figure 8).
The first task is to put the lock in its place in the door. This is quite difficult in
the real world, because it is very cramped inside the door and the lock cannot
be seen very well during assembly. Screws have to be fastened while the lock
is held in its place (Figure 12).
Next, the window-regulator is to be installed (Figure 13). This task needs both
hands, because the window-regulator consists of two parts connected to each
other by flexible wires. After placing the bottom fixtures into slots, they must
be turned upright, then the regulator screws can be fixed.
Finally, several wires must be laid out on the inner metal sheet, clipped into
place, and connected to various parts. However, this part of the assembly was
not performed in VR.
While grasping two objects with both hands, a user must still be able to give
commands to the computer. This can be achieved most intuitively by voice
recognition. Also, multi-sensory feedback (see below) plays an important role
in multi-modal interaction.
Getting help from the system. When the number of functions in the VR system becomes large, occasional users sometimes cannot remember a certain command. Similar to 2D applications, we additionally provide hierarchical 3D menus. They are invoked by a speech command. In our experience,
Fig. 9. Administrative data stored in the PDM about parts can be displayed during the VR session.
Fig. 10. Inverse kinematics is needed for “door-like” behavior of parts.
Fig. 11. With the virtual yard-stick, distances can be measured in the VE.
3D menus are to be considered only as an auxiliary interaction technique, since
it is more difficult to select menu entries in VR than it is in 2D.
When the trainee is ready to learn by doing, he will perform the task step by
step. After each step is completed the system will point him to the part or tool
he will need for the next step and tell him what to do with it. For instance,
after all screws for the door lock have been fastened, the system highlights
the window regulator (by blinking) and instructs him how to assemble it. The
instructions have been pre-recorded and are played back as sound files.
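As a rough illustration of such a hand-crafted training script (the function names are placeholders, not the actual API of our VR system), each step names the part or tool to highlight and the sound file to play, and the session advances once the step is completed:

```cpp
#include <iostream>
#include <string>
#include <vector>

struct TrainingStep {
    std::string part;       // part or tool to highlight next
    std::string audioFile;  // pre-recorded spoken instruction
};

// Stand-ins for the VR system's hooks; in the real system these would talk to
// the scene graph, the sound server, and the assembly-simulation state.
void highlightBlinking(const std::string& part) { std::cout << "blink " << part << "\n"; }
void playSound(const std::string& file)         { std::cout << "play " << file << "\n"; }
bool stepCompleted(const std::string&)          { return true; }  // stub

void runTraining(const std::vector<TrainingStep>& steps) {
    for (const auto& step : steps) {
        highlightBlinking(step.part);        // point the trainee to the next part/tool
        playSound(step.audioFile);           // tell him what to do with it
        while (!stepCompleted(step.part)) {} // in the real system this is event-driven
    }
}

int main() {
    runTraining({{"door_lock", "fasten_lock_screws.wav"},
                 {"window_regulator", "install_regulator.wav"}});
}
```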
So far, the virtual service manual and the interactive training session are hand-
crafted via manual scripting. However, it should be straight-forward to extract
them from a PDM system, if the process data are there in a standardized form.
Administrative data about parts, as stored in the PDM system, can be displayed in a heads-up fashion by pointing at objects with a ray (see
Figure 9). Of course, any other selection paradigm can be used as well.
A tool which has been requested by designers is the clipping plane. It can help
to inspect “problem areas” more closely. When activated, the user “wears” a
plane on his hand; all geometry in front of that plane will be clipped away
in real-time. Optionally, the part clipped away can be rendered transparently.
Sometimes it can be necessary to restrict the motion of the plane so that it is
always perpendicular to one of the world coordinate axes.
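A minimal sketch of how such a hand-mounted clipping plane could be realized with a legacy-OpenGL user clip plane is given below; the hand-pose inputs are assumptions about what the tracking layer provides, not part of our system's actual API.

```cpp
#include <GL/gl.h>

// handNormal / handPoint: plane normal and a point on the plane, derived from
// the current hand pose (placeholder names for the tracking layer's output).
void applyHandClipPlane(const double handNormal[3], const double handPoint[3])
{
    // OpenGL keeps the half-space a*x + b*y + c*z + d >= 0, so negating the
    // hand normal clips away everything "in front of" the hand-mounted plane.
    GLdouble eq[4] = {
        -handNormal[0], -handNormal[1], -handNormal[2],
         handNormal[0] * handPoint[0] +
         handNormal[1] * handPoint[1] +
         handNormal[2] * handPoint[2]
    };
    glClipPlane(GL_CLIP_PLANE0, eq);  // interpreted in the current modelview frame
    glEnable(GL_CLIP_PLANE0);
    // The clipped-away half could optionally be re-rendered transparently in a
    // second pass with the plane equation negated (not shown).
}
```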
Another tool for inspecting assembly situations and the mechanical design is the user’s size. This parameter can be controlled by simple speech commands, which
in turn affect all parameters by which a virtual human is represented, in
particular navigation speed and scale of position tracking. This way, a user
can comfortably “stick his head” inside some narrow space.
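A minimal sketch of how one scale parameter could drive these session parameters (names and the voice-command wiring are illustrative, not taken from the actual system):

```cpp
// A single "user size" parameter, set via a speech command, scales both the
// navigation speed and the tracked position offsets of the virtual human.
struct UserScale {
    float size = 1.0f;           // 1.0 = natural size
    float baseNavSpeed = 1.0f;   // navigation speed at natural size (m/s)

    float navigationSpeed() const { return baseNavSpeed * size; }

    // Tracked head/hand positions are scaled about a reference point, so a
    // "shrunk" user can comfortably stick his head into a narrow space.
    void scalePosition(const float in[3], const float ref[3], float out[3]) const {
        for (int i = 0; i < 3; ++i)
            out[i] = ref[i] + (in[i] - ref[i]) * size;
    }
};
```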
A lot of the parts in a vehicle are flexible: wires, hoses, plastic tanks, etc. It
is still a major challenge to simulate all these different types of flexible parts
with reasonable precision and at interactive rates. In particular, simulation of
the interaction of flexible objects with the surrounding environment and the
user’s hands by a general framework is, to our knowledge, still unsolved.
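Purely as an illustration of one common real-time approximation (not necessarily the method used in our system), a wire fixed at both ends can be modeled as a chain of points that is iteratively relaxed towards a constant segment length; gravity and obstacle response are omitted here.

```cpp
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<float, 3>;

// Relax a polyline towards equal segment lengths; the first and last points
// stay fixed because they are attached to rigid parts.
void relaxWire(std::vector<Vec3>& pts, float segmentLength, int iterations)
{
    for (int it = 0; it < iterations; ++it) {
        for (size_t i = 0; i + 1 < pts.size(); ++i) {
            Vec3 d = {pts[i+1][0] - pts[i][0],
                      pts[i+1][1] - pts[i][1],
                      pts[i+1][2] - pts[i][2]};
            float len = std::sqrt(d[0]*d[0] + d[1]*d[1] + d[2]*d[2]);
            if (len < 1e-6f) continue;
            float corr = 0.5f * (len - segmentLength) / len;
            bool fixFirst = (i == 0);               // attachment to rigid part
            bool fixLast  = (i + 2 == pts.size());  // attachment to rigid part
            for (int k = 0; k < 3; ++k) {
                if (!fixFirst) pts[i][k]   += d[k] * corr;
                if (!fixLast)  pts[i+1][k] -= d[k] * corr;
            }
        }
    }
}
```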
We have implemented hoses and wires in our VR system; the wires or hoses
are attached at both ends to other, non-flexible parts, and they can be pushed
Fig. 12. Tools snap onto screws and are constrained. Also, they are placed automatically at an ergonomic position within the hand by the system.
Fig. 13. The window regulator has to be installed with two hands; the “ghost” paradigm signals collisions.
Fig. 14. The object-on-the-lead paradigm allows assembly to be verified. The object is not linked rigidly to the hand.
In order to help the user place parts, we have developed two kinds of snap-
ping paradigms: the first one makes objects snap in place when they are re-
leased by the user and when they are sufficiently close to their final position.
The second snapping paradigm makes tools snap onto screws when sufficiently
close and while they are being utilized (see Figure 12). The second paradigm
is implemented by a 1-DOF rotational constraint which can be triggered by
events.
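A sketch of the first paradigm follows (thresholds and types are illustrative only): when the part is released sufficiently close to its final pose, it snaps there. The second paradigm would additionally replace the tool's full coupling to the hand by a single rotational DOF about the screw axis while the tool is in use.

```cpp
#include <cmath>

struct Pose { float pos[3]; /* orientation omitted for brevity */ };

// Snap a released part to its target pose if it is within snapDistance.
bool snapOnRelease(Pose& part, const Pose& target, float snapDistance)
{
    float dx = part.pos[0] - target.pos[0];
    float dy = part.pos[1] - target.pos[1];
    float dz = part.pos[2] - target.pos[2];
    if (std::sqrt(dx*dx + dy*dy + dz*dz) > snapDistance)
        return false;   // too far away: leave the part where it was released
    part = target;      // sufficiently close: snap into the final position
    return true;
}
```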
The major problem is: how can we verify that a part can be assembled by a
human worker? A simple solution is to turn a part being grasped into what
we call a ghost when it collides with other parts: the solid part itself stays
at the last valid, i.e., collision-free, position while the object attached to the
user’s hand turns wireframe (see Figure 13).
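A sketch of this ghost behavior in the per-frame update is given below; the collision test is a stub standing in for the collision-detection module (cf. [18]), and the type names are illustrative.

```cpp
struct Matrix4 { float m[16]; };

struct GraspedPart {
    Matrix4 solidPose{};          // last collision-free pose, rendered solid
    Matrix4 ghostPose{};          // pose attached to the hand, rendered wireframe
    bool    ghostVisible = false;
};

// Stub; the real test checks the part's geometry at candidatePose against the scene.
bool collide(const GraspedPart&, const Matrix4& /*candidatePose*/) { return false; }

void updateGhost(GraspedPart& part, const Matrix4& handPose)
{
    part.ghostPose = handPose;        // the ghost always follows the hand
    if (collide(part, handPose)) {
        part.ghostVisible = true;     // show the wireframe ghost at the hand
        // solidPose stays at the last valid, i.e. collision-free, position
    } else {
        part.solidPose   = handPose;  // the part follows the hand normally
        part.ghostVisible = false;
    }
}
```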
Fig. 15. During assembly, the path of any part can be recorded, edited, and stored in the PDM system.
Fig. 16. Annotations can be put into the scene by voice commands.
Fig. 17. Violations of the security distance are highlighted in yellow; collisions in red.
stuck in tight environments. So, at any time it can assume only valid positions.
Of course, exact and fast collision detection is a prerequisite [18].
This is only a first step. A completely reliable verification will check the virtual
hand for collisions as well. Also, the hand and/or part should slide along
smooth rigid objects to make assembly easier for the user.
During assembly/disassembly the path of any part can be recorded and edited
in VR (see Figure 15). Saved paths can then be stored in the PDM system.
While parts are being moved, the sweeping envelope can be traced out. It
does not matter whether the part is moved interactively by the user or on an
assembly path.
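An illustrative sketch of such a path recorder (the PDM interface is only hinted at; types and names are assumptions, not the system's actual API):

```cpp
#include <vector>

struct Pose6 { float pos[3]; float quat[4]; };  // position + orientation sample

class PathRecorder {
    std::vector<Pose6> samples_;
    bool recording_ = false;
public:
    void start() { samples_.clear(); recording_ = true; }
    void stop()  { recording_ = false; }
    // Called once per frame with the part's current pose, regardless of
    // whether the motion comes from the user or from a stored assembly path.
    void onFrame(const Pose6& partPose) {
        if (recording_) samples_.push_back(partPose);
    }
    const std::vector<Pose6>& path() const { return samples_; }
    // The sweeping envelope can be traced out by instancing the part's
    // geometry at each sample; storing the path in the PDM system is then a
    // matter of serializing `samples_` alongside the part's ID and version.
};
```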
Eventually, we will utilize speech synthesis.
6 User-Survey
In order to evaluate the acceptance and the potential of VR for VP, a survey
of prospective users has been performed at BMW.
We chose representatives from each of five groups involved with assembly and
maintenance investigations.
• CA specialist (CA). These are engineers that have a good CAx expertise
and also some specific assembly/maintenance processes knowledge.
• Skilled worker (SW). These are skilled mechanics who actually perform the
physical prototype verifications. This group has no CAx knowledge.
• Interface specialist (IS). This group comprises mechanical technicians. They
have mostly specific assembly/maintenance process knowledge, but they are
starting to get familiar with CAx tools. This group mediates between group
1 and group 2.
• Managers (MA). They are the ones that co-ordinate all three groups in the
vehicle prototype group.
• IT specialists (IT). In this group are very highly skilled engineers that do
development and evaluation of new CAx methods and IT tools. They pro-
vide new technologies to the key-user’s departments (the above four groups
are from the key-user department vehicle prototype).
Notice that all subjects have never before been exposed to any VR experiences.
This was because the base of subject with VR experiences is insufficiently small
for a survey.
The scenario used for the survey was the installation of the door lock, which
is a very difficult task, because the space inside the door is very tight. Only
subjects from group SW were to completely install the lock with their “virtual
hand”, because they are the only ones who really know how to do it. For all
other groups, subjects were to focus on the “feel” of the technology, the I/O
devices, and interaction techniques and capabilities.
While one user was performing the verification tasks, all other users in the
group were watching on a large-screen stereo projection with shutter glasses.
The following hardware and set-up was used during the survey: SGI ONYX with 6 processors, 2 IR graphics pipes, 1 GB RAM, FS5 head-mounted display, CyberTouch™ data glove with tactile feedback, stereo projection, Ascension electromagnetic tracking system. The frame-rate was about 18 and 14 Hz for the tail-light and the door scenario, respectively. Latency was reduced by a predictive filter. Tracking errors (which are inherent in electro-magnetic systems) were corrected by our method presented in [17].
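The correction method actually used is described in [17]; purely as an illustration of the general idea, a common family of approaches measures the distortion once at the nodes of a regular grid and then corrects every tracker reading by interpolating the stored offset vectors, e.g. trilinearly:

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

// Regular grid of measured correction vectors; resolution must be >= 2 per axis.
struct CorrectionGrid {
    int nx, ny, nz;                             // grid resolution
    float origin[3], spacing;                   // grid placement in tracker space
    std::vector<std::array<float, 3>> offsets;  // nx*ny*nz correction vectors

    std::array<float, 3> at(int i, int j, int k) const {
        return offsets[(k * ny + j) * nx + i];
    }
    // Trilinearly interpolate the correction at position p and apply it.
    std::array<float, 3> correct(const float p[3]) const {
        float g[3], f[3]; int c[3], dims[3] = {nx, ny, nz};
        for (int a = 0; a < 3; ++a) {
            g[a] = (p[a] - origin[a]) / spacing;
            c[a] = std::clamp((int)std::floor(g[a]), 0, dims[a] - 2);
            f[a] = std::clamp(g[a] - c[a], 0.0f, 1.0f);
        }
        std::array<float, 3> out = {p[0], p[1], p[2]};
        for (int dz = 0; dz < 2; ++dz)
        for (int dy = 0; dy < 2; ++dy)
        for (int dx = 0; dx < 2; ++dx) {
            float w = (dx ? f[0] : 1 - f[0]) * (dy ? f[1] : 1 - f[1]) * (dz ? f[2] : 1 - f[2]);
            auto o = at(c[0] + dx, c[1] + dy, c[2] + dz);
            for (int a = 0; a < 3; ++a) out[a] += w * o[a];
        }
        return out;
    }
};
```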
6.1 Evaluation
The results that are presented below were obtained with a questionnaire that each subject filled out after the VR experience. Each question was to be answered by a multiple choice with five levels: very good (≙ 100%), good, satisfactory, bad, and very bad (≙ 0%).
6.1.1 Interaction
Fig. 18. User satisfaction with object selection.
Fig. 19. Navigation in the VE with a data glove.
From Figure 18 it is clear that almost all users prefer the combination of voice input and data glove over the combination of 3D menu and glove, or selection by someone else. However, there are some interesting differences among the groups: users who had no experience with CAx technologies or IT technologies in general (SW group) had no pronounced preferences. On the other hand, users with long experience with computers (IT group) clearly preferred voice input.
6.1.2 Navigation
Fig. 20. Although reliability and ease of memorization got only a medium rating, users significantly preferred voice input for giving commands to the VR system.
Fig. 21. Evaluation of several collision feedbacks. Tactile feedback was given through the CyberTouch’s vibrators.
Most users missed precise movement of the viewpoint and exact positioning of parts in the VR system. Of course, we expected this; yet, we chose not to burden the session with too many tasks, although parts can be positioned exactly with the VR system by voice commands. In this survey, however, users were only shown the point-and-fly navigation and how to position parts by using the glove.
As Figure 20 indicates, almost all subjects preferred voice input for giving
commands to the computer. We believe that this is due to the very natural
way in which commands can be given, e.g., a user just says “selection on”
or “switch selection on”. Another advantage is that voice commands can be
chosen in a very “descriptive” manner. Some tasks can be performed much
more precisely with a binary trigger than by a virtual hand, for instance exact
positioning.
This is why the user satisfaction with voice recognition reliability and com-
mand memorization is significantly less than the overall satisfaction.
Fig. 22. Ergonomics of VR devices.
The result of the survey can be found in Figure 21. We believe that the visual
feedback was not completely satisfactory to the users, because it highlighted
the whole object instead of the area of collision only, which is what engineers
are interested in.
Acoustic feedback plays an important role in two different ways: first, it provides feedback to the user about what is happening at the moment; second, it provides information about the material (e.g., metal, wood, plastic) that the colliding components consist of.
Tactile feedback was rated significantly less helpful than the other two. Although our subjects found it an exciting experience, it nevertheless felt unnatural to them. After some discussion with them, we realized that what they really would have liked is force feedback. They reported that without force feedback some assembly tasks are almost impossible to do.
6.1.5 Ergonomics
Our conclusion is that HMDs and data gloves are not yet fit for daily work. Users reported the following shortcomings, which we had expected anyway:
• The HMD is too heavy, and when the user looks down or up it can fall off. If the user makes quick movements, he has to hold the HMD with his hand; otherwise it shifts and the images get blurred or obscured.
• Unnatural sensation when the data glove starts to vibrate.
• Too many cables for I/O devices, which tether the user too much. Users don’t have the freedom necessary to perform complicated assembly tasks.
7 Conclusion
In this paper we have discussed the benefits of virtual reality for virtual pro-
totyping in assembly and maintenance verification. Also, the integration of
VR with a company’s existing IT infrastructure has been discussed. Several
problems have been addressed and solutions have been proposed.
Finally, we have reported on a user survey which has been performed at BMW
with a representative group of key users 3 . The result of our survey indicates
that the use of VR for VP will play an important role in the near future
in automotive (and probably other) industries. In particular, the response of
the surveyed users has been very encouraging and optimistic that VR/VP
does have the potential to reduce the number of PMUs and improve overall
product quality, especially in those processes of the business process chain
where humans play an important role.
However, VR is not yet integrated in the daily productive work environment of our business process.
Future directions
Based on our user survey, we feel that current VR I/O devices are still far too cumbersome and not robust enough to be used in CAD working environments
or in workshops. Gloves must become easier to put on; HMDs must become
much more light-weight while still getting a better hold on a person’s head 5 .
The lack of wireless tracking devices, gloves, and HMDs is among the most
annoying deficiencies of VR devices 6 .
Another problem is the way engineers are designing today. It is still quite diffi-
cult to prepare CAD data such that an interactive frame rate will be achieved
by the VR system both in rendering and in simulation speed. This problem
might be solved in a few years by faster hardware. However, a lot of the semantical (non-geometric) data needed in a virtual environment just do not exist. This can be solved only by a shift in the design process: design guidelines with virtual prototyping in mind, which are partially established already, need to be implemented and integrated in the product development process, and some data just have to be modeled for VP either by CAD engineers or by “VP engineers”.

4 For instance, in order to create a complete VE for immersive assembly simulation we need 70% of the time to find and prepare the CAD data and only 30% for authoring the VE.

5 We believe that HMDs and Gloves are necessary for assembly simulation in VR, expensive.
8 Acknowledgements
We would like to thank the following persons for their support, ideas, and
collaboration. At BMW: Hasmukh Jina, Peter Reindl, Peter Baacke, Rainer
Klier, Christoph Rieder, Hans-Jürgen Penzkofer. At IGD: Prof. Encarnação,
Dr. Müller, Dr. Rix, Dirk Reiners, Christian Knöpfle, Andreas Flick, Axel
Feix.
References
Applications, Computer Graphics: Systems and Applications, chapter 9, pages
151–158. Springer, Berlin, Heidelberg, 1998.
[15] D. L. Spooner and M. Hardwick. Using views for product data exchange. IEEE
Computer Graphics & Applications, 17(5), Sept.–Oct. 1997. ISSN 0272-1716.
[16] G. Zachmann. A language for describing behavior of and interaction with virtual
worlds. In Proc. ACM Conf. VRST ’96, Hongkong, July 1996.
[17] G. Zachmann. Distortion correction of magnetic fields for position tracking.
In Proc. Computer Graphics International (CGI ’97), Hasselt/Diepenbeek,
Belgium, June 1997. IEEE Computer Society Press.