
A NEURAL FUZZY SYSTEM TO EVALUATE SOFTWARE DEVELOPMENT PRODUCTIVITY
Ekkehard Baisch, Thomas Bleile, Ralf Belschner
Institute for Automation and Software Engineering
University of Stuttgart
Pfaffenwaldring 47, D-70550 Stuttgart, Germany. FAX: (+49)-711-685-7302
e-mail: [baisch, belschner]@ias.uni-stuttgart.de, bleile@fli.sh.bosch.de
Keywords: rule-based fuzzy classification, neural
networks, rules generation, neural fuzzy systems, quality
models, software metrics.
ABSTRACT
Managing software development and maintenance projects
requires early knowledge about quality and effort needed
for achieving this quality level. Quality-based productivity
management is introduced as one approach for achieving
and using such process knowledge. Fuzzy rules are used as
a basis for constructing quality models that can identify
outlying software components that might cause potential
quality problems. A special fuzzy neural network is
introduced to obtain the fuzzy rules, combining the metrics
as premises and quality factors as conclusions. Using
De Morgan's law, this net structure is able to learn
premises just by changing the weights; we change neither
the number of neurons nor the number of connections.
This new type of net allows for the extraction of
knowledge acquired by training on past process data
directly in the form of fuzzy rules. Beyond that, it is
possible to transfer all the known rules to the Neural
Fuzzy System in advance. The suggested approach and its
advantages over common simulation and decision
techniques are illustrated with experimental results. Its
application area is in maintenance productivity. A module
quality model - with respect to changes - provides both
quality of fit (according to past data) and predictive
accuracy (according to ongoing projects).
1. INTRODUCTION
Although striving for high quality standards, only a few
organizations apply true quality control. Quality control
consists of comparing observed quality with expected
quality. This minimizes the effort expended on correcting
the sources of defects, thereby improving life-cycle
productivity.
In order to achieve an indication of software quality, the
software must be subjected to measurement. This is
accomplished through the use of metrics and statistical
evaluation techniques that relate specific quantified
product requirements to some attributes of quality.
Multivariate analysis techniques provide feedback about
relationships between components. Classification
techniques help determine outliers (e.g. error-prone
components).
In this presentation we focus on fuzzy
classification techniques and the generation of the
underlying fuzzy rules based on a neural fuzzy system.
The resulting fuzzy expert system yields a quality-based
productivity model. Both development and maintenance
are governed by the simple rule that more changes require
more effort. Hence unnecessary effort needs to be
minimized in order to improve productivity. Our approach
is to identify software components that require
unnecessary effort early in the development process and
thus to provide design feedback before maintenance costs
show up.
The presentation is organized as follows. The second
section presents a brief overview of the background and
the problems associated with metric-based decision
models. We discuss fuzzy sets, their application
in fuzzy classification and the construction of a fuzzy
classification system for software quality control. The next
section describes the new fuzzy neural network, its
structure and learning algorithm. Finally, experimental
results are provided to demonstrate the effectiveness of
our approach.
2. DATA ANALYSIS FOR SOFTWARE METRICS
The software development process is a creative process
with a more or less immaterial product as its output, which
nevertheless has value and can be measured by metrics. The
nature of software data presents a number of problems. It
is often heavily positively skewed with a large number of
outliers [5]. Daily life in software quality control shows
that it is often inappropriate to expect hard boundaries
between what is a good component and what is not. It is
for example difficult to explain to design staff
that a specific module with 15 decisions should be
redesigned while another one with only 14 just passed the
check. However, this is basically the approach of common
decision support systems in software quality control.
Fuzzy Logic provides a natural conceptual framework for
representation of knowledge and inference processes
based on knowledge that is imprecise, incomplete or
inconsistent. For example, cyclomatic complexity (number
of independent paths in a program) is a linguistic
variable when its values are assumed to be: high, small,
medium, very high, rather small, etc., where each value
can be interpreted as a possibility distribution over all
integers. Assigning a numeric value to its linguistic value
seems to be highly dependent on experience and
environment (e.g. a cyclomatic complexity of twenty in
one module may be assigned to "medium" or "small" by
the same person at different times and in different
contexts). Therefore, the universe of a fuzzy value is never
a crisp point but rather a region where the term gradually
moves from being applicable to being not applicable.
Rules for classification and decision support thus should
be fuzzy in nature, because it is difficult for software
engineers to provide a complete set of mutually exclusive
heuristic classification rules. On the other hand, it is
unsatisfactory and practically unrealistic to deal with
automatically generated decision trees or rule sets with
crisp thresholds and results.
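To make the notion of fuzzy, overlapping classification thresholds concrete, the following short Python sketch defines triangular membership functions for a cyclomatic-complexity linguistic variable; the break points are illustrative assumptions and not values taken from this paper.

    def triangular(x, left, mode, right):
        # Triangular membership: 0 outside [left, right], 1 at the mode.
        if x <= left or x >= right:
            return 0.0
        if x <= mode:
            return (x - left) / (mode - left)
        return (right - x) / (right - mode)

    # Illustrative fuzzy sets for the linguistic variable "cyclomatic complexity".
    CYCLOMATIC_SETS = {
        "small":  lambda v: triangular(v, -1, 5, 15),
        "medium": lambda v: triangular(v, 5, 20, 35),
        "high":   lambda v: 1.0 if v >= 40 else triangular(v, 25, 40, 55),  # plateau above 40
    }

    # A module with 30 decisions is partly "medium" and partly "high": the term
    # gradually moves from being applicable to being not applicable.
    print({name: round(mu(30), 2) for name, mu in CYCLOMATIC_SETS.items()})
    # {'small': 0.0, 'medium': 0.33, 'high': 0.33}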
There are numerous introductions to the fundamentals,
methods and algorithms of fuzzy logic and approximate
reasoning techniques [12], which allow us to skip the
theory here and proceed to the application of interest.
2.1 SOFTWARE METRICS AND QUALITY
MODELS
Quality or productivity factors to be predicted during the
development of a software system are affected by many
product and process attributes (e.g. software design
characteristics or the underlying development process and
its environment). Quality models are based upon former
project experiences and combine the quantification of
aspects of software components with a framework of rules
(e.g. limits for metrics, appropriate ranges etc.). They are
generated by the combination and statistical analysis of
product metrics (e.g. complexity measures) and product or
process attributes (e.g. quality characteristics, effort, etc.)
[3,6,8]. These models are evaluated by applying and
comparing exactly those invariant figures they are intended
to predict, the process metrics (e.g. effort, error rate,
number of changes since the project started, etc.). Iterative
repetition of this process can refine the quality models,
allowing their use as predictors for similar
environments and projects. For assessing overall quality or
productivity, it is suitable to break it down into its
component factors (e.g. maintainability), thus arriving at
several aspects of software that can be analyzed
quantitatively. Typical problems connected to data
collection, analysis, and quality modelling are addressed
and discussed comprehensively in [3,6,8,9].
In existing software classification systems, the fuzziness of
the knowledge base is ignored because neither predicate
logic nor probability-based methods provide a systematic
basis for dealing with it [5]. As a consequence, fuzzy facts
and rules are generally manipulated as if they were
non-fuzzy, leading to conclusions whose validity is open to
question. As a simple illustration of this point, consider the
fact: "If data bindings are between 6 and 10 and
cyclomatic complexity is greater than 18, the software
component is likely to have errors of a distinct type".
Obviously the meaning of this - automatically generated -
fact is less generally valid than stated and might be
provided by a maintenance expert as a fuzzy fact: "If data
bindings are medium and cyclomatic complexity is large,
then the software component is likely to have errors of a
distinct type." Of course, the latter fact requires the
determination of the fuzzy attributes "medium" or "large"
in the context of the linguistic variables they are associated
with (i.e. data bindings and cyclomatic complexity).
Although human experts are rather unsuccessful in
quantitative predictions (e.g. predicting the error number
or length of a given component), they may be relatively
effective at qualitative forecasting (e.g. maintainability,
error-proneness).
Any development project starts with defining project
specific quality goals that depend on customer
requirements, technical requirements and the company's
business goals. If these goals are contradictory they can be
analyzed and prioritized with the help of techniques like
Quality Function Deployment (QFD) [1] or the Analytic
Hierarchy Process [10]. After the goals have been defined, the
desired quality factors are determined. Product quality
relies on a defined quality model and the evaluation is
done with a fuzzy rule base.
The rule-base can be built by human experts. Rule
predicates are based on the fuzzified metric values. The
inference process of all active rules generates a fuzzy set
for the defined quality factor. The principle is shown in fig. 1.
Fig. 1: Fuzzy inference with Min-Prod operator (premises: input data for fuzzy subsets; conclusion: crisp output data value, e.g. maintainability index 60%)
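As a sketch of the inference principle in Fig. 1 (premises combined with min, conclusions scaled with the product, centre-of-area defuzzification), the following Python fragment computes a crisp maintainability index. The rule set, the fuzzified metric inputs and the shapes of the output sets are illustrative assumptions, not values from the paper.

    def tri(x, l, m, r):
        # triangular membership function defined by l, m, r
        if x <= l or x >= r:
            return 0.0
        return (x - l) / (m - l) if x <= m else (r - x) / (r - m)

    # Fuzzified metric values for one module (assumed memberships).
    inputs = {"complexity": {"low": 0.2, "high": 0.8},
              "coupling":   {"low": 0.6, "high": 0.4}}

    # Output fuzzy sets over the maintainability index in percent (assumed shapes).
    out_sets = {"poor": lambda y: tri(y, 0, 25, 50),
                "good": lambda y: tri(y, 50, 75, 100)}

    # Two illustrative rules: (premises, conclusion set).
    rules = [([("complexity", "high"), ("coupling", "high")], "poor"),
             ([("complexity", "low"),  ("coupling", "low")],  "good")]

    axis = range(0, 101)                 # discretized output universe
    aggregated = [0.0] * len(axis)
    for premises, conclusion in rules:
        activation = min(inputs[var][term] for var, term in premises)   # Min
        for i, y in enumerate(axis):
            scaled = activation * out_sets[conclusion](y)               # Prod
            aggregated[i] = max(aggregated[i], scaled)

    # Centre of area gives the crisp maintainability index.
    area = sum(aggregated)
    crisp = sum(y * mu for y, mu in zip(axis, aggregated)) / area if area else 0.0
    print(f"Maintainability index: {crisp:.0f}%")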
2.2 QUALITY-BASED PRODUCTIVITY
MEASUREMENT
Productivity of any development or maintenance process is
defined as the amount of output produced by the process
divided by the amount of input consumed by the process.
Common software productivity metrics take into account
only the product's size and the effort for its development.
Size can be derived from specification (Function Points or
requirements), design (Bang [4] or design objects) or
implementation (LOC). Effort estimation is most often
based on the specification counting weighted functions
(function points [7]) or data. The size of the
implementation is commonly measured in lines-of-code
(LOC), despite their well-known disadvantages. The input
is split up into personnel and resources. Both can be
measured by consumed time and
related costs.
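As a simple illustration of the conventional size-over-effort view discussed here, the snippet below computes development productivity from assumed size and effort figures; the numbers are invented for illustration.

    # Conventional development productivity: product size divided by consumed input.
    size_kloc = 32.0              # implementation size in KLOC (assumed)
    effort_person_months = 48.0   # personnel input in person-months (assumed)

    productivity = size_kloc / effort_person_months
    print(f"{productivity:.2f} KLOC per person-month")   # 0.67 KLOC per person-month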
Boehm [2] distinguishes between software development
productivity and software life-cycle productivity. It is
acknowledged in the software engineering community that
maintenance can be the most costly stage in the life-cycle
of a software-product. High productivity might be
achieved when measured as KLOC, function points or Bang
divided by the development effort, while at the same time
severe problems in maintaining the product show up.
Depending on the project, trouble with other quality
factors like reliability may have similar consequences.
Such observations underline the importance of considering
all relevant quality factors requested for the project as
productivity factors, but this seldom happens.
We introduce a fuzzy environment to predict
life-cycle productivity based on quality metrics.
Productivity predictions are typically used to control and
manage productivity as a means of technical controlling.
It is relatively easy to construct metric-based quality
models that classify data of past projects well, because all
such models can be calibrated according to quality of fit.
The difficulty lies in improving and stabilizing models
based on historic data so that they remain valid for
anticipating future outcomes.
We propose to use a neural fuzzy system to assist human
experts in defining the large number of unknown parameters in a
fuzzy rule-base.
3. THE STRUCTURE OF THE NEURAL FUZZY
SYSTEM
The principal structure of the net is a feed-forward
multilayer perceptron with 1 input layer, 1 output
layer and 3 hidden layers. Each layer is equivalent to one
basic function of a fuzzy system.
The output of a neuron in the first layer is equal to the
crisp value x of the input variable:

    net^{(1)} = x,   y^{(1)} = net^{(1)} = x    (1)
In the second layer each neuron represents a fuzzy set of
one input variable. The output of one of these neurons
equals the inverted membership of the crisp input x to
the appropriate set s:

    net^{(2)} = \mu_s( W(n^{(1)}, n^{(2)}) \cdot y^{(1)} ),   y^{(2)} = 1 - net^{(2)}    (2)

where W(n^{(1)}, n^{(2)}) is the weight between a neuron
of the 1st and the 2nd layer. Any neuron of the 2nd layer is
connected to one input neuron by the weight 1.
In the third layer there is one neuron per rule. The output
of one of these neurons indicates how much the current
input values comply with the specific premises.
Concerning (2) we calculate the transfer function:

    net^{(3)} = W_1 y_1^{(2)} \vee W_2 y_2^{(2)} \vee \dots \vee W_r y_r^{(2)},   y^{(3)} = 1 - net^{(3)}    (3)

Using De Morgan's law we can easily deduce from
equation (3):

    y^{(3)} = \overline{W_1 y_1^{(2)}} \wedge \overline{W_2 y_2^{(2)}} \wedge \dots \wedge \overline{W_r y_r^{(2)}}

Hence we obtain:

    y^{(3)} = (1 - W_1 y_1^{(2)}) \wedge (1 - W_2 y_2^{(2)}) \wedge \dots \wedge (1 - W_r y_r^{(2)})

Supposing that the weights can either be 0 or 1, and using (2),
the output of a rule's neuron can be written as:

    y^{(3)} = \bigwedge_{k: W_k = 1} \mu_{s_k}    (4)

We see that the output of a neuron in the third layer is the
conjunction of those sets connected by the weight W_k = 1.
That is why we call those neurons 'rule neurons'.
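A small Python sketch of the rule neuron defined by equations (2)-(4), assuming max/min for the fuzzy or/and operators; with 0/1 weights a premise is switched on or off purely through its weight, exactly as the net structure requires.

    # Rule neuron: layer 2 delivers complemented memberships y2 = 1 - mu; the rule
    # neuron OR-combines the weighted inputs and complements the result, which by
    # De Morgan's law is the AND of the memberships whose weight is 1.
    def rule_neuron(memberships, weights, f_or=max):
        y2 = [1.0 - mu for mu in memberships]          # layer-2 outputs, eq. (2)
        net3 = 0.0
        for w, y in zip(weights, y2):                  # net3 = OR_k (w_k * y_k)
            net3 = f_or(net3, w * y)
        return 1.0 - net3                              # y3 = 1 - net3, eq. (3)

    mu = [0.7, 0.9, 0.2]        # memberships of the current input to three sets
    w = [1, 1, 0]               # only the first two sets are premises of this rule
    print(round(rule_neuron(mu, w), 2))   # 0.7 == min(0.7, 0.9), eq. (4)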
Fig. 2: A Neural Fuzzy System with 2 inputs, 1 output, 5 rules and 3 sets per crisp variable (crisp/fuzzy interface, rule layer, fuzzy/crisp interface)
The fourth layer contains as many neurons as there are
fuzzy sets of the output variable. These neurons are
connected to those rules' neurons having the appropriate
set as conclusion:

    net^{(4)} = W(n_1^{(3)}, n^{(4)}) \, y_1^{(3)} \vee \dots \vee W(n_j^{(3)}, n^{(4)}) \, y_j^{(3)},   y^{(4)} = area_{set} \cdot net^{(4)}    (5)
The totality of these output values is the fuzzy answer of
the neural fuzzy system to the current input pattern.
The neuron in the output layer realizes the defuzzification
using the 'center-of-area' strategy to obtain the crisp output
value.
To obtain 100% compatibility between this type of neural
network and a fuzzy system, the weights must comply with
some restrictions:
1.) The connections at the input of a rule's neuron must
have either the weight 0 or 1.
2.) One connection at the output of a rule's neuron must
have the weight 1. All the others have to be 0.
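To tie the five layers together, here is a sketch of one forward pass through a net like the one in figure 2 (2 inputs, 3 sets per crisp variable). The membership parameters, the three illustrative rules, the output-set areas and centres, and the use of max/min for the fuzzy operators are all assumptions made for the example.

    def tri(x, l, m, r):
        # triangular membership defined by the parameters l, m, r
        if x <= l or x >= r:
            return 0.0
        return (x - l) / (m - l) if x <= m else (r - x) / (r - m)

    # Layers 1/2: one fuzzification neuron per (input variable, fuzzy set).
    in_sets = [[(-0.5, 0.0, 0.5), (0.0, 0.5, 1.0), (0.5, 1.0, 1.5)],  # input 1
               [(-0.5, 0.0, 0.5), (0.0, 0.5, 1.0), (0.5, 1.0, 1.5)]]  # input 2

    def fuzzify(x, params):
        return [1.0 - tri(x, *p) for p in params]      # complemented, eq. (2)

    # Layer 3: 0/1 input weights of three illustrative rule neurons; the six
    # columns are (in1-small, in1-middle, in1-big, in2-small, in2-middle, in2-big).
    w_in = [[0, 0, 1, 0, 0, 0],    # if in1 is big ...
            [0, 0, 0, 0, 0, 1],    # if in2 is big ...
            [1, 0, 0, 1, 0, 0]]    # if in1 is small and in2 is small ...

    # Layer 4: each rule neuron feeds exactly one output-set neuron (restriction 2).
    w_out = ["small", "middle", "small"]
    out_sets = {"small":  {"area": 0.5, "centre": 0.1},
                "middle": {"area": 0.5, "centre": 0.5},
                "big":    {"area": 0.5, "centre": 0.9}}

    def forward(x1, x2):
        y2 = fuzzify(x1, in_sets[0]) + fuzzify(x2, in_sets[1])          # layer 2
        y3 = [1.0 - max(w * y for w, y in zip(ws, y2)) for ws in w_in]  # eqs (3)-(4)
        y4 = {}
        for name, props in out_sets.items():                            # layer 4
            net4 = max([y3[r] for r, c in enumerate(w_out) if c == name] + [0.0])
            y4[name] = props["area"] * net4                             # eq. (5)
        total = sum(y4.values())
        if total == 0.0:
            return 0.0
        # Layer 5: centre-of-area defuzzification over the output sets.
        return sum(out_sets[n]["centre"] * v for n, v in y4.items()) / total

    print(round(forward(0.9, 0.2), 2))   # close to the centre of the "small" output set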
It is remarkable that the rules implemented in such a
Neural Fuzzy System are invisible on the level of the net
structure. Only the weights, chosen or learned, determine
the actual functionality.
4. THE LEARNING ALGORITHM
Adapting the free parameters to fit a set of learning data is
an example of the application of a supervised learning
algorithm. We slightly adjusted the error-back-propagation
algorithm to this new type of neural network. By changing
the input and output connections of the rules' neurons a
rulebase can be learned from example data. To
additionally learn the position of the fuzzy sets of the input
variables, the pertinent parameters can be adapted at the
same time.
In this context 'learning' means minimizing the energy
function:

    E = \frac{1}{2} \sum_{l \in L} \sum_i ( t_i^{(4)} - y_i^{(4)} )^2    (6)

where: L: the entirety of all the learning patterns
t_i^{(4)}: target of the i-th neuron in the 4th layer
y_i^{(4)}: output of the i-th neuron in the 4th layer
Although this neural network has 5 layers, the target is
defined in the 4th layer. Defining the target in the 5th layer
led to an insufficient learning result (see fig. 3).
Using (10) and (11) we can easily calculate the
modification \Delta W(n^{(i)}, n^{(j)}) of the weights between
the layers (i,j) \in \{(2,3), (3,4)\}.
All the other weights are not touched by the learning
algorithm and stay at w = 1.
Assuming we have triangular membership functions
defined by the parameters l, m, r (see fig. 4) we obtain the
following learning rules:

    \Delta l = -\eta \, \delta^{(2)} \, \partial y^{(2)} / \partial l   (and analogously for m and r)

Fig. 4: The 3 parameters l, m, r defining a triangular membership function (fuzzy sets 'small', 'middle', 'big' over the range 0 to 1)
Fig. 3: Fuzzy result with crisp target = 0.5 (a: target defined in the fifth layer; b: target defined in the fourth layer)
In both cases, the crisp output value equals the target of
the learning pattern. Nevertheless, result (b) is clearer
and therefore preferable.
Since the weights between the 4th and the 5th layer are
invariable anyhow, this kind of target definition is not a
limitation to the learning capability.
As the error-back-propagation algorithm needs differentiable
functions, the 'fuzzy-and' and 'fuzzy-or' operators are
chosen as:

    \mu_{A \wedge B} = \mu_A \cdot \mu_B,   \mu_{A \vee B} = \mu_A + \mu_B - \mu_A \cdot \mu_B    (7)
Corresponding to the basic idea of the error-back-
propagation algorithm, the modification of every
parameter p is calculated as:

    \Delta p = -\eta \, \partial E / \partial p    (8)

To obtain a recursive calculating scheme, we introduce the
error of a neuron:

    \delta^{(j)} = -\partial E / \partial net^{(j)}    (9)

According to (9), the errors of the neurons in the 4th and
3rd layer can be calculated recursively (equations (10) and (11)).
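Because equation (7) replaces min/max by the differentiable product and probabilistic sum, a gradient step on the energy function is straightforward. The sketch below uses finite differences instead of the hand-derived formulas (8)-(11); the toy patterns, learning rate and weight clipping are assumptions for illustration only.

    def f_or(a, b):
        return a + b - a * b                  # probabilistic sum, eq. (7)

    def rule_output(weights, memberships):
        # y3 = 1 - OR_k( w_k * (1 - mu_k) ), cf. equations (2)-(4)
        net3 = 0.0
        for w, mu in zip(weights, memberships):
            net3 = f_or(net3, w * (1.0 - mu))
        return 1.0 - net3

    def energy(weights, patterns):
        # E = 1/2 * sum over patterns of (target - output)^2, cf. eq. (6)
        return 0.5 * sum((t - rule_output(weights, mu)) ** 2 for mu, t in patterns)

    patterns = [([0.9, 0.8], 0.9), ([0.2, 0.9], 0.2)]   # (memberships, target), assumed
    w = [0.5, 0.5]                                      # free premise weights
    eta, eps = 0.2, 1e-6
    for _ in range(2000):                               # gradient descent on E
        grad = []
        for k in range(len(w)):
            w_plus = list(w)
            w_plus[k] += eps
            grad.append((energy(w_plus, patterns) - energy(w, patterns)) / eps)
        # keep the weights in [0, 1]; the 0/1 restriction is enforced later by idealization
        w = [min(1.0, max(0.0, wk - eta * g)) for wk, g in zip(w, grad)]

    print([round(wk, 2) for wk in w], round(energy(w, patterns), 4))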
5. IDEALIZATION OF THE WEIGHTS
Using the foregoing equations, the weights and the
parameters of the membership functions can be modified
to minimize the energy function. But as mentioned before,
the weights have to comply with some restrictions to
enable the transformation of the trained neural network
into a fuzzy system.
For that purpose, the neural network is trained until the
energy function reaches a minimum and stagnates. Then
the input and output weights of one rule's neuron are
idealized in order to comply with the restrictions. Because
the energy function has reached a minimum before this
idealization, changing the weights means a perturbation to
the learning algorithm and increases the value of the
energy function. To keep this perturbation as small as
possible, we idealize that rule's neuron of which the input
and output weights comply best with the restrictions.
Afterwards, the training is continued, but the weights once
idealized are kept at their value.
In this procedure, we make use of the capability of neural
networks to compensate for the perturbation of one neuron
by changing the weights of the others.
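A sketch of this idealization step under simplifying assumptions: for every rule neuron that has not been frozen yet, we measure how far its input and output weights are from the nearest admissible 0/1 configuration, idealize the best candidate and keep it fixed afterwards. The cost measure itself is an assumption; the paper does not give one explicitly.

    def idealization_cost(w_in_r, w_out_r):
        # distance of the input weights to the nearest of {0, 1} (restriction 1) ...
        cost = sum(min(w, 1.0 - w) for w in w_in_r)
        # ... plus the cost of keeping only the strongest output connection
        # at weight 1 and setting all others to 0 (restriction 2)
        strongest = max(range(len(w_out_r)), key=lambda i: w_out_r[i])
        cost += 1.0 - w_out_r[strongest]
        cost += sum(w for i, w in enumerate(w_out_r) if i != strongest)
        return cost, strongest

    def idealize_one(w_in, w_out, frozen):
        candidates = [r for r in range(len(w_in)) if r not in frozen]
        best = min(candidates, key=lambda r: idealization_cost(w_in[r], w_out[r])[0])
        _, strongest = idealization_cost(w_in[best], w_out[best])
        w_in[best] = [1.0 if w >= 0.5 else 0.0 for w in w_in[best]]
        w_out[best] = [1.0 if i == strongest else 0.0 for i in range(len(w_out[best]))]
        frozen.add(best)      # idealized weights stay fixed during further training
        return best

    w_in = [[0.9, 0.1, 0.8], [0.6, 0.4, 0.5]]    # input weights of two rule neurons (assumed)
    w_out = [[0.05, 0.9], [0.5, 0.6]]            # output weights towards two output sets
    frozen = set()
    print("idealized rule neuron:", idealize_one(w_in, w_out, frozen))
    print(w_in, w_out)                           # neuron 0 is now a pure 0/1 rule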
6. TEST RESULTS
In order to study the capability of this new type of Neural
Fuzzy System, we built up a testing environment described
in figure 5.
Fig. 5: The testing environment (a reference fuzzy system generates example data, i.e. input/output patterns, used to train the Neural Fuzzy System)
We take some rules to build up a fuzzy system and apply
randomly generated input data to it. Combining the input
pattern with the output value calculated by the fuzzy
system yields the example data.
The Neural Fuzzy System is trained using these data and
learns a set of rules. After that, all these rules are
compared to those chosen to build up the fuzzy system.
This comparison makes it possible to judge the learning
algorithm.
Subsequently we consider a fuzzy system with 2
inputs and 1 output variable, each split into 3 triangular
fuzzy sets. The rulebase (Tab. 1) consists of 2 rules with one
premise and 3 rules with 2 premises. Using this fuzzy
system we generated 70 input/output patterns.
If in1 is big then out is small
If in2 is big then out is middle
If in1 is small and in2 is small then out is small
If in1 is small and in2 is middle then out is small
If in1 is middle and in2 is small then out is small
Tab. 1: The rulebase of the example fuzzy system
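For such a reproduction test, the reference rulebase of Tab. 1 can be written down directly as 0/1 weights of the net in figure 2. The sketch below shows this encoding and a small decoder that turns (learned or idealized) weights back into readable rules; the ordering of the fuzzy sets per variable is an assumption.

    # Tab. 1 encoded as net weights: the columns of w_in correspond to the six
    # layer-2 neurons (in1-small, in1-middle, in1-big, in2-small, in2-middle,
    # in2-big); a 1 marks a premise of the rule.  w_out names the conclusion set.
    w_in = [
        [0, 0, 1, 0, 0, 0],   # If in1 is big                      then out is small
        [0, 0, 0, 0, 0, 1],   # If in2 is big                      then out is middle
        [1, 0, 0, 1, 0, 0],   # If in1 is small  and in2 is small  then out is small
        [1, 0, 0, 0, 1, 0],   # If in1 is small  and in2 is middle then out is small
        [0, 1, 0, 1, 0, 0],   # If in1 is middle and in2 is small  then out is small
    ]
    w_out = ["small", "middle", "small", "small", "small"]

    def decode(w_in, w_out, names=("in1-small", "in1-middle", "in1-big",
                                   "in2-small", "in2-middle", "in2-big")):
        # Turn 0/1 weights back into readable rules, which is how a learned
        # rulebase can be compared with the original one.
        for row, conclusion in zip(w_in, w_out):
            premises = " and ".join(n.replace("-", " is ") for n, w in zip(names, row) if w)
            print(f"If {premises} then out is {conclusion}")

    decode(w_in, w_out)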
The structure of the Neural Fuzzy System is chosen
exactly like the one shown in figure 2.
For a first test the membership functions in the Neural
Fuzzy System are chosen equal to those of the example
fuzzy system. After presenting the set of example data 350
times, all weights were idealized. Examining the resulting
weights to/from the rules' neurons showed that the Neural
Fuzzy System had learned the original rulebase.
For a second test we perturbed all input and output data by
adding a white noise signal:

    x_n(k) = x(k) + 5\% \cdot x_{max} \cdot r(k)    (17)

where x(k) is an input or output value (0 \le x(k) \le x_{max})
and r(k) is a random number (0 \le r(k) \le 1).
Even with these noisy example data the Neural Fuzzy
System was able to find the original rules.
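The perturbation of equation (17) is easy to reproduce; a minimal sketch, with invented example values:

    import random

    def perturb(x, x_max, level=0.05):
        # x_n(k) = x(k) + 5% * x_max * r(k), with r(k) uniform in [0, 1], eq. (17)
        return x + level * x_max * random.random()

    clean = [0.2, 0.5, 0.9]                  # assumed example values, x_max = 1.0
    print([round(perturb(x, x_max=1.0), 3) for x in clean])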
For the third test the positions of the membership functions
in the Neural Fuzzy System were different from those in
the example fuzzy system. The learning algorithm now
modified both at the same time, the weights and the
positions of the fuzzy sets. After 2500 cycles the Neural
Fuzzy System had learned all the original rules and adapted
the positions of the fuzzy sets sufficiently well.
7. APPLICATION TO REAL QUALITY DATA
To demonstrate the effectiveness of our approach we
investigated the quality results of 451 modules of real-time
software based on 10 complexity metrics. Instead of
predicting number of errors or number of changes (i.e.
algorithmic relationships) we are considering assignments
to groups (e.g. "change-prone"). While the first goal has
been achieved more or less with regression models or
neural networks predominantly for finished projects, the
latter goal seems to be adequate for predicting potential
outliers in running projects, where precision is too
expensive and unnecessary for decision support.
We defined the quality factor change-proneness based on the
results of the system test. Set1 modules had 0 changes,
set2 modules had 1-2 changes and set3 modules had 3
or more changes. Based on 10 complexity metrics we tried
to find a fuzzy rule-base to explain the quality behaviour
of the different modules. First we decided to build a rather
simple set of linguistic rules based on experiences from
literature and former projects in the same environment
[3],[11]. We optimized the positions of the different fuzzy
sets manually and obtained the result shown in fig. 6.
Fig. 6: Quality prediction with manually generated rulebase (chi-square result: 131.71)
The rows indicate reality, the columns indicate the
prognosis. The prognosis is correct on the main diagonal.
299 modules were classified correctly.
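Figures 6-8 are confusion matrices over the three change classes. As a sketch of how such a matrix, the number of correctly classified modules and a chi-square statistic can be computed, consider the fragment below; the sample labels are invented, and the chi-square shown is the usual Pearson test of the matrix against row/column independence, which may differ from the statistic used in the paper.

    CLASSES = ["set1", "set2", "set3"]

    def confusion(actual, predicted):
        # rows: reality, columns: prognosis
        m = {a: {p: 0 for p in CLASSES} for a in CLASSES}
        for a, p in zip(actual, predicted):
            m[a][p] += 1
        return m

    def chi_square(m):
        # Pearson chi-square of the matrix against independence of rows and columns
        total = sum(sum(row.values()) for row in m.values())
        chi = 0.0
        for a in CLASSES:
            for p in CLASSES:
                expected = sum(m[a].values()) * sum(m[r][p] for r in CLASSES) / total
                if expected:
                    chi += (m[a][p] - expected) ** 2 / expected
        return chi

    actual    = ["set1"] * 6 + ["set2"] * 5 + ["set3"] * 4          # invented sample
    predicted = ["set1"] * 5 + ["set2"] * 5 + ["set3"] * 4 + ["set1"]
    matrix = confusion(actual, predicted)
    correct = sum(matrix[c][c] for c in CLASSES)                    # main diagonal
    print(correct, round(chi_square(matrix), 2))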
Next we applied our neural fuzzy system. We obtained 12
new rules and slightly different positions for most of the
fuzzy sets. The new rules had up to 4 premises. The result
is shown in fig. 7. The number of correctly classified
modules improved to 312. More importantly,
classification errors of type I ('change-prone' modules
classified as 'free of changes') have been reduced. The
fuzzy rules found indicate what has to be changed in order
to avoid expensive and unnecessary changes or
maintenance after delivery of the product.
Fig. 7: Quality prediction with neural fuzzy simulation (chi-square result: 182.88)
Finally we weighted the energy function of the neural net
differently to identify the most change-prone modules.
The result is shown in fig. 8.
Fig. 8: Quality prediction with weighted energy function
Type I errors have been further reduced. This is
especially interesting for safety-critical applications, where
we can afford the increase in the number of Type II errors
('free-of-changes' modules classified as 'change-prone').
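One way to read this weighting of the energy function is as a per-class error cost: misclassifying a change-prone module (a Type I error) is made more expensive than a false alarm. The sketch below shows such a class-weighted squared error; the concrete weights are assumptions, since the paper does not give its weighting scheme.

    # Class-weighted squared-error energy: errors on change-prone modules (set3)
    # count more, trading additional Type II errors for fewer Type I errors.
    CLASS_WEIGHT = {"set1": 1.0, "set2": 1.0, "set3": 4.0}   # assumed weights

    def weighted_energy(samples):
        # samples: (actual class, target value, predicted value)
        return 0.5 * sum(CLASS_WEIGHT[c] * (t - y) ** 2 for c, t, y in samples)

    samples = [("set1", 0.0, 0.1), ("set3", 1.0, 0.4), ("set2", 0.5, 0.6)]
    print(round(weighted_energy(samples), 3))   # the set3 error dominates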
8. CONCLUSIONS
Our target is a quality-based improvement of software
life-cycle productivity. The evaluation of software quality
early in the life-cycle can be used to make the necessary
corrections as cheaply as possible. Changes after delivery of
the product are usually several times more expensive.
We demonstrated that it is possible to predict
module changes based on a fuzzy expert system. The fuzzy
expert system can be built with a neural fuzzy system
trained with past project data from the same development
environment. We avoid the black-box behaviour of
common neural networks. The rules found should be
interpreted by experts, and rules with accidental
combinations of premises should be removed.
There are other even more promising applications of our
approach. It would be very easy to predict the quality
factor maintainability based on metrics and the size of the
different modules if one had the maintenance effort data
per module. It is well known that up to 80 percent of the
software development effort has to be spent on
maintenance. Our approach could lead to tremendous
improvements.
ACKNOWLEDGEMENTS
We would like to thank the Institute for Microelectronics,
Stuttgart, especially Stefan Neusser, for the NNSIM-tool
support.
REFERENCES
[1] Akao, Yoji, ed.: Quality Function Deployment:
Integrating Customer Requirements into Product Design.
Translated by Glenn Mazur. Productivity Press,
Cambridge, MA, USA, 1990.
[2] Boehm, B.W.: Software Engineering Economics.
Prentice Hall, Englewood Cliffs, N.J., USA, 1981.
[3] Card, D. N. and R. L. Glass: Measuring Software
Design Quality. Prentice Hall, Englewood Cliffs, N.J.,
USA, 1990.
[4] DeMarco, T.: Controlling Software Projects.
Prentice Hall, Englewood Cliffs, N.J., USA, 1982.
[5] Ebert, C.: Rule-based fuzzy classification for
software quality control. Fuzzy Sets and Systems, Vol. 63,
pp. 349-358, 1994.
[6] Fenton, N. E.: Software Metrics: A Rigorous
Approach. Chapman & Hall, London, UK, 1991.
[7] Jones, C.: Applied Software Measurement -
Assuring Productivity and Quality. McGraw-Hill, New
York, NY, USA, 1991.
[8] Kitchenham, B. et al.: Towards a constructive
quality model. Software Engineering Journal, July 1987.
[9] Munson, J. C. and T. M. Khoshgoftaar: The
Detection of Fault-Prone Programs. IEEE Transactions
on Software Engineering, Vol. 18, No. 5, pp. 423-433,
1992.
[10] Saaty, T.L.: The Analytic Hierarchy Process, 2nd
ed. RWS Publications, Pittsburgh, PA, USA, 1991.
[11] Shepperd, M.: Early life-cycle Metrics and
Software Quality Models. Information and Software
Technology, Vol. 32, No. 4, pp. 311-316, 1990.
[12] Zimmermann, H.-J .: Fuzzy Set Theory and its
Applications. Kluwer, Boston, 2nd edition, 1991.
