SPM Unit 3 Notes


Unit-3: Effort Estimation

Software Effort Estimation:

A successful project is one in which the system is delivered on time, within budget and with the required quality. This implies that targets are set which the project manager then tries to meet. Deadlines are sometimes missed because the initial estimates were wrong, so realistic estimates are crucial.

Difficulties in Software estimation

• Subjective nature of estimating

• Political implications

• Changing technology

• Lack of homogeneity of project experience

Where are estimates done?

Estimates are carried out at various stages of a software project:

• Strategic planning: deciding the priority to give to each project.

• Feasibility study: assessing the benefits of the potential system.

• System specification: detailed requirement analysis at the design stage.

• Evaluation of suppliers' proposals: tender management.

• Project planning: detailed estimates of smaller work components during implementation.

Sample Project Data:


Problems with over- and under-estimates

• Parkinson's Law: 'Work expands to fill the time available', which implies that given an easy target staff will work less hard.

• Brooks' Law: the effort required to implement a project will go up disproportionately with the number of staff assigned to it. As the project team grows in size, so will the effort that has to go into management, co-ordination and communication. This has given rise, in extreme cases, to the notion of Brooks' Law: 'putting more people on a late job makes it later'. If there is an over-estimate of the effort required, this might lead to more staff being allocated than are needed, and managerial overheads will be increased. This is more likely to be of significance with large projects.
Basis for software estimating:
• The need for historical data
Nearly all estimating methods need information about how projects have been
implemented in the past. However, care needs to be taken in judging the applicability
of data to the estimator's own circumstances because of possible differences in
environmental factors such as the programming languages used, the software tools
available, the standards enforced and the experience of the staff.
• Measure of work
It is normally not possible to calculate directly the actual cost or time required to
implement a project. The time taken to write a program might vary according to the
competence or experience of the programmer. Implementation times might also vary
because of environmental factors such as the software tools available. The usual
practice is therefore to express the work content of the system to be implemented
independently of effort, using a measure such as source lines of code (SLOC). The
reader might also come across the abbreviation KLOC which refers to thousands of
lines of code.
Software Effort Estimation Techniques:

Barry Boehm, in his classic work on software effort models, identified the main ways of
deriving estimates of software development effort as:
• algorithmic models - which use 'effort drivers' representing characteristics of the target
system and the implementation environment to predict effort;
• expert judgement - where the advice of knowledgeable staff is solicited;
• analogy - where a similar, completed, project is identified and its actual effort is used as a
basis for the new project;
• Parkinson - which identifies the staff effort available to do a project and uses that as the 'estimate';
• price to win - where the 'estimate' is a figure that appears to be sufficiently low to win a
contract;
• top-down - where an overall estimate is formulated for the whole project and is then broken
down into the effort required for component tasks;
• bottom-up - where component tasks are identified and sized and these individual estimates
are aggregated.
Bottom-up estimating

Estimating methods can be generally divided into bottom-up and top-down approaches. With
the bottom-up approach, the estimator breaks the project into its component tasks and then
estimates how much effort will be required to carry out each task. With a large project, the
process of breaking down into tasks would be a repetitive one: each task would be analysed
into its component sub-tasks and these in turn would be further analysed. This is repeated
until you get to components that can be executed by a single person in about a week or two.
This approach is used when a project is completely novel or when no historical data is available.

A Procedural code-oriented approach:

• Envisage the number and type of software modules in the final system.

• Estimate the SLOC of each identified module.

• Estimate the work content, taking into account complexity and technical difficulty.

• Calculate the work-days effort (a minimal sketch of this calculation follows below).
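As a concrete illustration of the steps above, here is a minimal sketch in Python. The module names, SLOC estimates, complexity multipliers and productivity rate are invented assumptions, not figures from these notes.

```python
# Bottom-up, code-oriented estimate: sum the effort of individually sized modules.
# All numbers below are illustrative assumptions.

modules = {
    # name: (estimated SLOC, complexity multiplier)
    "order_entry":  (2000, 1.0),   # straightforward module
    "stock_update": (1500, 1.3),   # technically more difficult
    "reporting":    (1000, 0.8),   # simple, well-understood
}

PRODUCTIVITY = 25  # assumed SLOC produced per work-day

def work_days(sloc: int, complexity: float, productivity: float = PRODUCTIVITY) -> float:
    """Convert an SLOC estimate into work-days, adjusted for complexity."""
    return (sloc / productivity) * complexity

total = sum(work_days(sloc, cx) for sloc, cx in modules.values())
for name, (sloc, cx) in modules.items():
    print(f"{name:13s}: {work_days(sloc, cx):6.1f} work-days")
print(f"{'total':13s}: {total:6.1f} work-days")
```

The bottom-up character of the approach shows in the last step: the project total is simply the aggregation of the individual module estimates.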

Top-down Approach and Parametric Models:

The top-down approach is normally associated with parametric (or algorithmic) models. The
effort needed to implement a project will be related mainly to variables associated with
characteristics of the final system. The form of the parametric model will normally be one or
more formulae in the form:
effort = (system size) x (productivity rate)
For example, Amanda at IOE might estimate that the first software module to be constructed
is 2 KLOC. She might then judge that if Kate undertook the development of the code, with
her expertise she could work at a rate of 40 days per KLOC and complete the work in 2 x 40
days, that is, 80 days, while Ken, who is less experienced, would need 55 days per KLOC and
take 2 x 55 that is, 110 days to complete the task.
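The arithmetic in Amanda's example can be written out as a minimal sketch using the figures given above (the code itself is only illustrative):

```python
# effort = system size x productivity rate, using the figures from the example.
size_kloc = 2                     # estimated size of the module in KLOC
rates = {"Kate": 40, "Ken": 55}   # days needed per KLOC for each developer

for developer, days_per_kloc in rates.items():
    effort = size_kloc * days_per_kloc
    print(f"{developer}: {effort} days")   # Kate: 80 days, Ken: 110 days
```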

Some parametric models, such as that implied by function points, are focused on system or task size, while others, such as COCOMO, are more concerned with productivity factors.

Expert judgment:
This is asking someone who is knowledgeable about either the application area or the
development environment to give an estimate of the effort needed to carry out a task. This
method will most likely be used when estimating the effort needed to change an existing
piece of software. The estimator would have to carry out some kind of impact analysis in
order to judge the proportion of code that would be affected and from that derive an estimate.
Someone already familiar with the software would be in the best position to do this.
Some have suggested that expert judgement is simply a matter of guessing, but our own
research has shown that experts tend to use a combination of an informal analogy approach
where similar projects from the past are identified (see below), and bottom-up estimating.
Estimating By Analogy:
Analogous estimating is the act of using former projects to estimate how long or how
much a current project will take or cost. In other words, it is a technique that centres on
comparison. This means that the more data that is available, the better the estimate will
be. Thus, collecting data for each project will build up a database that can be used in
future projects for comparisons of cost and time.
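As a minimal sketch of the idea, one could pick the most similar past project from such a database and use its actual effort as the basis for the new estimate. The project data and the single "size" attribute used for comparison below are invented for illustration.

```python
# Analogy-based estimate: find the past project closest in size to the new one
# and use its recorded effort as the starting estimate. The data are invented.
past_projects = [
    {"name": "payroll",   "size_fp": 250, "effort_days": 320},
    {"name": "invoicing", "size_fp": 400, "effort_days": 510},
    {"name": "stock",     "size_fp": 150, "effort_days": 180},
]

def estimate_by_analogy(new_size_fp: int) -> dict:
    """Return the past project whose size is closest to the new project's size."""
    return min(past_projects, key=lambda p: abs(p["size_fp"] - new_size_fp))

analogue = estimate_by_analogy(300)
print(f"Closest analogue: {analogue['name']}, base estimate {analogue['effort_days']} days")
```

In practice more than one attribute would be compared, and the analogue's effort would be adjusted for known differences, but the principle is the same: the more past project data collected, the better the comparison.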

Albrecht Function Point Analysis:

This is a top-down method that was devised by Allan Albrecht when he worked for IBM. Albrecht was investigating programming productivity and needed some way to measure the work content of systems independently of the effort involved. He developed the idea of function points (FPs).
The basis of function point analysis is that computer-based information systems comprise
five major components:
• External input types are input transactions that update internal computer files.
• External output types are transactions where data is output to the user. Typically
these would be printed reports, since screen displays would come under external
inquiry types .
• Logical internal file types are the standing files used by the system. The term 'file'
does not sit easily with modern information systems. It refers to a group of data that is
usually accessed together. It might be made up of one or more record types.
• External interface file types allow for output and input that might pass to and from
other computer applications.
• External inquiry types - are transactions initiated by the user that provide
information but do not update the internal files. The user inputs some information that
directs the system to the details required.
The analyst has to identify each instance of each external user type in the projected system.
Each component is then classified as having either high, average or low complexity. The counts of each external user type in each complexity band are multiplied by specified weights to get FP scores, which are summed to obtain an overall FP count indicating the information processing size.
Example: a logical internal file contains purchase orders organized into two separate record types: the main purchase order details (purchase order number, supplier reference, purchase order date) and the purchase order item details (product code, unit price, number ordered).

Solution: from the question we can extract the following details:

• Number of record types = 2

• Number of data element types = 6

• The file type would therefore be rated as low complexity.

Therefore the FP count is 7.
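The notes do not reproduce Albrecht's table of weights, so the sketch below assumes the commonly published weights for logical internal files (low = 7, average = 10, high = 15), which is consistent with the count of 7 obtained above.

```python
# FP contribution of a single logical internal file, using commonly published
# Albrecht/IFPUG weights for this component type (assumed, since the notes'
# weights table is not reproduced).
LIF_WEIGHTS = {"low": 7, "average": 10, "high": 15}

record_types = 2        # main purchase order details + purchase order item details
data_element_types = 6  # PO number, supplier ref, PO date, product code, unit price, number ordered

# With only 2 record types and 6 data element types the file is rated 'low',
# as stated in the example above.
complexity = "low"

fp_count = LIF_WEIGHTS[complexity]
print(f"FP contribution of this logical internal file: {fp_count}")   # 7
```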

Function Points Mark II:

Mark II function points were sponsored by the CCTA (Central Computer and Telecommunications Agency) and are intended as an improvement on, and replacement for, the Albrecht method.

As in the Albrecht method:

• information processing size is first measured in unadjusted function points (UFPs);

• a technical complexity adjustment (TCA) is then applied.

For each transaction, the UFPs are calculated as:

UFPs = Wi x (number of input data element types) + We x (number of entity types referenced) + Wo x (number of output data element types)

• Wi, We and Wo are weightings derived by asking developers the proportions of effort spent on each of these elements.

• In practice, FP counters use industry averages:
  • Wi = 0.58
  • We = 1.66
  • Wo = 0.26
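A minimal sketch of the Mark II calculation for a single transaction, using the industry-average weightings quoted above; the transaction's counts are invented for illustration.

```python
# Mark II function points for one transaction:
# UFPs = Wi*inputs + We*entities referenced + Wo*outputs
W_I, W_E, W_O = 0.58, 1.66, 0.26   # industry-average weightings from the notes

def mark2_ufp(input_types: int, entity_types: int, output_types: int) -> float:
    return W_I * input_types + W_E * entity_types + W_O * output_types

# Hypothetical transaction: 10 input data element types, 3 entity types
# referenced, 8 output data element types.
print(round(mark2_ufp(10, 3, 8), 2))   # 0.58*10 + 1.66*3 + 0.26*8 = 12.86
```

The UFPs for all transactions would then be summed to give the information processing size for the whole system.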

COCOMO: a parametric model

Boehm's COCOMO (Constructive COst MOdel) is often referred to in the literature on software project management, particularly in connection with software estimating. The term COCOMO really refers to a group of models.

Boehm originally based his models in the late 1970s on a study of 63 projects. Of these, only seven were business systems, so the models could be used with applications other than information systems. The basic model was built around the equation

effort = c × (size)^k

where

• effort is measured in pm, the number of 'person-months';

• size is measured in kdsi, thousands of delivered source code instructions;

• c and k are constants.

The first step was to derive an estimate of the system size in terms of kdsi. The constants c and k depended on whether the system could be classified, in Boehm's terms, as 'organic', 'semi-detached' or 'embedded'. These related to the technical nature of the system and the development environment.

• Organic mode: this would typically be the case when relatively small teams developed software in a highly familiar in-house environment and when the system being developed was small and the interface requirements were flexible.

• Embedded mode: this meant the product being developed had to operate within very tight constraints and changes to the system were very costly.

• Semi-detached mode: this combined elements of the organic and the embedded modes or had characteristics that came between the two.
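The table of constants is not reproduced in these notes; the sketch below uses the values commonly quoted for Boehm's basic model (organic c = 2.4, k = 1.05; semi-detached c = 3.0, k = 1.12; embedded c = 3.6, k = 1.20) purely for illustration.

```python
# Basic COCOMO: effort (person-months) = c * size^k, with size in kdsi.
# The constants below are the commonly quoted basic-model values, assumed here
# because the notes' table is not reproduced.
MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(size_kdsi: float, mode: str) -> float:
    c, k = MODES[mode]
    return c * size_kdsi ** k

for mode in MODES:
    print(f"{mode:13s}: {cocomo_effort(20, mode):6.1f} person-months for 20 kdsi")
```

Note how, for the same size, the embedded mode predicts substantially more effort than the organic mode, reflecting the tighter constraints on such systems.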
USECASE Based Estimation:

Use Case Points (UCP) is a software estimation technique used to forecast the software size
for software development projects. UCP is used when the Unified Modeling
Language (UML) and Rational Unified Process (RUP) methodologies are being used for the
software design and development. The concept of UCP is based on the requirements for the
system being written using use cases, which is part of the UML set of modeling techniques.
The software size (UCP) is calculated based on elements of the system use cases with
factoring to account for technical and environmental considerations. The UCP for a project
can then be used to calculate the estimated effort for a project.

The method for determining the size estimate to develop a system is based on a calculation
with the following elements:

• Unadjusted Use Case Weight (UUCW): the point size of the software that accounts for the number and complexity of use cases.

• Unadjusted Actor Weight (UAW): the point size of the software that accounts for the number and complexity of actors.

• Technical Complexity Factor (TCF): a factor used to adjust the size based on technical considerations.

• Environmental Complexity Factor (ECF): a factor used to adjust the size based on environmental considerations.
Once the previous four elements have been calculated, the final size estimate can be
calculated. This final number is known as the Use Case Points or UCP for a software
development project.
Unadjusted Use Case Weight (UUCW)
The UUCW is one of the factors that contribute to the size of the software being developed. It
is calculated based on the number and complexity of the use cases for the system. To find the
UUCW for a system, each of the use cases must be identified and classified as Simple,
Average or Complex based on the number of transactions the use case contains. Each
classification has a predefined weight assigned. Once all use cases have been classified as
simple, average or complex, the total weight (UUCW) is determined by summing the
corresponding weights for each use case, using the weight values shown in the formula below.

UUCW = (Total No. of Simple Use Cases x 5) + (Total No. Average Use Cases x 10)
+ (Total No. Complex Use Cases x 15)

Unadjusted Actor Weight (UAW)


The UAW is another factor that contributes to the size of the software being developed. It is calculated, in a similar way to the UUCW, based on the number and complexity of the actors for the system.
Each actor is classified as simple, average or complex, and the weight values for each classification are shown in the formula below.

UAW = (Total No. of Simple actors x 1) + (Total No. Average actors x 2) + (Total
No. Complex actors x 3)
Technical Complexity Factor (TCF)
The TCF is one of the factors applied to the estimated size of the software in order to account
for technical considerations of the system. It is determined by assigning a score between 0
(factor is irrelevant) and 5 (factor is essential) to each of the 13 technical factors listed in the
table below. This score is then multiplied by the defined weighted value for each factor. The
total of all calculated values is the technical factor (TF). The TF is then used to compute the
TCF with the following formula:
TCF = 0.6 + (TF/100)

Environmental Complexity Factor (ECF)


The ECF is another factor applied to the estimated size of the software in order to account for
environmental considerations of the system. It is determined by assigning a score between 0
(no experience) and 5 (expert) to each of the 8 environmental factors listed in the table below.
This score is then multiplied by the defined weighted value for each factor. The total of all
calculated values is the environment factor (EF). The EF is then used to compute the ECF
with the following formula:
ECF = 1.4 - (0.03 x EF)
Use Case Points (UCP)
Finally the UCP can be calculated once the unadjusted project size (UUCW and UAW),
technical factor (TCF) and environmental factor (ECF) have been determined. The UCP is
calculated based on the following formula:
UCP = (UUCW + UAW) x TCF x ECF
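Pulling the four elements together, here is a minimal sketch of the whole UCP calculation using the formulae above; the use case, actor and factor counts are invented for illustration.

```python
# Use Case Points: UCP = (UUCW + UAW) x TCF x ECF
def uucw(simple: int, average: int, complex_: int) -> int:
    return simple * 5 + average * 10 + complex_ * 15

def uaw(simple: int, average: int, complex_: int) -> int:
    return simple * 1 + average * 2 + complex_ * 3

def tcf(tf: float) -> float:   # tf = weighted sum of the 13 technical factors
    return 0.6 + tf / 100

def ecf(ef: float) -> float:   # ef = weighted sum of the 8 environmental factors
    return 1.4 - 0.03 * ef

# Hypothetical project: 5 simple, 10 average, 3 complex use cases;
# 2 simple, 2 average, 3 complex actors; TF = 30, EF = 20.
size = (uucw(5, 10, 3) + uaw(2, 2, 3)) * tcf(30) * ecf(20)
print(round(size, 1))   # (170 + 15) * 0.9 * 0.8 = 133.2
```

The resulting UCP figure would then be multiplied by a productivity rate (hours per UCP) to convert size into an effort estimate.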
Activity Planning
Introduction: A detailed plan for the project must also include a schedule indicating the start and completion times for each activity. This will enable us to:

• ensure that the appropriate resources will be available precisely when required;

• avoid different activities competing for the same resources at the same time;

• produce a detailed schedule showing which staff carry out each activity;

• produce a detailed plan against which actual achievement may be measured;

• produce a timed cash flow forecast;

• replan the project during its life to correct drift from the target.

Objectives of Activity Planning:

• Feasibility assessment: is the project possible within the required timescales and resource constraints?

• Resource allocation: what are the most effective ways of allocating resources to the project, and when should they be available?

• Motivation: providing targets, and being seen to monitor achievement against those targets, is an effective way of motivating staff.

• Co-ordination: when do staff in different departments need to be available to work on a particular project, and when do staff need to be transferred between projects?

Project Schedule

A project plan is a detailed plan which shows the dates when each activity should start and finish, and when and how much of each resource will be required. Once the plan has been refined to this level of detail we call it a project schedule.

Creating a project schedule comprises four main stages:

• constructing an ideal activity plan;

• risk analysis;

• resource allocation;

• schedule production.

Identifying Activities: Essentially there are three approaches to identifying the activities or tasks that make up a project:

• the activity-based approach;

• the product-based approach;

• the hybrid approach.

Activity Based Approach:

The activity-based approach consists of creating a list of all the activities that the project is thought to involve. This might involve a brainstorming session involving the whole project team, or it might stem from an analysis of similar past projects.

When listing activities, particularly for a large project, it might be helpful to subdivide the project into the main life-cycle stages and consider each of these separately.
Rather than doing this in an ad hoc manner, with the obvious risks of omitting or double-counting tasks, a much favoured way
of generating a task list is to create a Work Breakdown Structure (WBS). This involves identifying the main (or high-level)
tasks required to complete a project and then breaking each of these down into a set of lower-level tasks.
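A WBS can be pictured as a simple nested structure. The sketch below, with invented task names, shows how high-level tasks are decomposed into lower-level tasks and then flattened into the task list to be estimated.

```python
# A tiny WBS as a nested dict: high-level tasks map to lists of lower-level tasks.
# Task names are invented for illustration.
wbs = {
    "Analysis": ["Gather requirements", "Produce specification"],
    "Design":   ["Design database", "Design user interface"],
    "Build":    ["Code modules", "Unit test modules"],
}

def task_list(structure: dict) -> list[str]:
    """Flatten the WBS into the list of lowest-level tasks to be estimated."""
    return [task for subtasks in structure.values() for task in subtasks]

for task in task_list(wbs):
    print(task)
```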
Product Based Approach:

• It consists of producing a Product Breakdown Structure (PBS) and a Product Flow Diagram (PFD).

• The PFD indicates, for each product, which other products are required as inputs.

• The PFD can therefore be easily transformed into an ordered list of activities by identifying the transformations that turn some products into others (see the sketch below).
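One way to picture how a PFD yields an ordered list of activities is to treat each product's input products as prerequisites and order the products topologically; each transformation of inputs into a product is then an activity. The product names below are invented for illustration.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Product flow diagram as a mapping: product -> products required as inputs.
# Product names are invented for illustration.
pfd = {
    "user requirements":    set(),
    "system specification": {"user requirements"},
    "module design":        {"system specification"},
    "tested module":        {"module design"},
}

# Ordering the products gives an order for the activities that produce them.
for product in TopologicalSorter(pfd).static_order():
    print(f"activity: produce '{product}'")
```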

The Hybrid Approach:

• A WBS may be based entirely on a structuring of activities, as in the activity-based approach above.

• Alternatively, a hybrid WBS may be based on a simple list of final deliverables and, for each deliverable, a set of activities required to produce that product.

• In a project of any size it would be beneficial to introduce additional levels, structuring both products and activities.
