Iteration Development
5th Edition
BBC
Without going through a formal tender process, the BBC awarded the contract to develop DMI to its existing technology provider, Siemens, in 2008.
Aftermath
Operational Problems
Failure Factors
Tracking and analysis of the errors identified after launch revealed more than 500 distinct defects in the system, of which 44 were deemed very serious. When the system went live in August 2012, reports indicate that only 147 defects were known, meaning that Quality Assurance testing had failed to identify several hundred problems in the system. Many of those problems were traced back to errors in the original project requirements and in the design of the new system, which allowed incorrect data to be entered, leading to incorrect payroll payments and related problems.
Aftermath
Affecting daily life
Those problems continued to grow, and the issues became headline news in New Zealand as affected employees struggled to maintain their personal finances in the face of the cash flow problems the system's failures were causing.
A March 2013 review performed by Deloitte raised serious questions about the stability of the system and outlined a one-year remedial plan that needed to be followed to ensure the system's operational stability and the realization of the originally planned business benefits.
Reference:
Why Projects Fail: New Zealand Ministry of Education, May 28, 2013
http://calleam.com/WTPF/?p=5835
IT Plan
The system would take 8 years to reach full deployment and would cost $3B. Work was to start in 2004 and be completed by 2012.
Problems
By 2010, signs of major problems had surfaced. Between 2010 and early 2012, the project went through no fewer than three project resets, consuming more time and money without producing results.
By 2012, the Air Force had determined that the $1B spent to date had yielded negligible benefits, and that proceeding would require a further $1.1B to deploy just 25% of the original scope. Even with such a scaled-back proposal, deployment would be pushed back to 2020, meaning that significant additional work and risk remained. Recognizing that a partial solution negated the overall vision of an integrated system, the Air Force scrapped the project in November 2012.
Deciding to scrap it
Failure Factors
- Failure to baseline existing practices and to establish effective measures for the desired outcomes
- Failure to establish an effective governance structure
- Selecting an Oracle-based Commercial-Off-the-Shelf (COTS) product that was a poor fit for the project requirements
- Lack of experience in large-scale, complex integrated systems development and deployment
- Organizational silos
- Failure to effectively engage all affected stakeholders
- Lack of collaboration and a lack of understanding of change management
Aftermath
For the $1B spent, nothing could be saved.
Reference:
"$1 billion wasted on Air Force computer system," NBC Nightly News, February 8, 2013
http://www.nbcnews.com/video/nightlynews/50749586#50749586
Why Projects Fail: US Department of Defense, US Air Force
http://www.youtube.com/watch?v=M_x_E4UkgV0
Project type: Field Data Collection
Project name: Field Data Collection Automation (FDCA)
Date: Apr 2008
Cost: $595M
Issues:
Due to concerns about escalating costs and questions about the accuracy of the data being collected, in 2001 the US Census Bureau decided to undertake a major modernization program in preparation for the 2010 census.
IT Development
Having first attempted to do the project in-house, the Bureau found through field testing in 2004 that the project was more complex than anticipated. As a result, the Bureau changed direction and engaged an external provider to complete the project. It took a further year to get the Request for Proposal published, and the time remaining before the dress rehearsals in 2006 and 2008 was running short.
Failure Factors
- Underestimation of complexity: the project's problems continued even after engaging an outside supplier to complete the work.
- Lack of due diligence on behalf of the Bureau and failure to establish effective communications with the supplier, resulting in a significant number of missing requirements.
- Failure to establish and stabilize requirements, resulting in significant requirements volatility (at one point more than 400 change orders had been raised).
- Despite warnings from external auditors, the problems were allowed to persist, and ultimately time ran out.
Aftermath
The Bureau was left with no choice but to revert to using pen and paper. The failure of the project resulted in the Bureau having to request an additional $3B in funding to complete the work using the existing manual procedures.
Reference:
Why Projects Fail: US Census Bureau Field Data Collection Automation (FDCA) Case Study
http://calleam.com/WTPF/?p=1894
eCourier
eCourier is a same-day, 24/7 courier service based in London, UK. The company was formed in 2003 with the aim of providing a courier service focused on delivery transparency and automated customer interaction. Parcels are collected from any London address and taken to any final destination requested by the client. Although the company originally delivered parcels only to London addresses, it now delivers parcels to any global location requested by corporate clients.
System
Advanced Information Based Allocation (AIBA), an intelligent dispatch and fleet management system, is used to allocate each booking to the most appropriate vehicle (including bicycles, motorbikes, and vans of varying sizes) and sends a message to the courier's handheld terminal to alert them to the job. Couriers are rewarded and motivated with high pay for the service levels they provide, supported by the technology, resulting in a courier service that is reliable, trustworthy, and transparent. In addition to systems for employees, customers are provided with real-time tracking of their parcels on a map using GPS technology.
eCourier.co.uk Demo
http://www.youtube.com/watch?v=92pxg91HYbE
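The kind of allocation the AIBA system performs can be sketched in a few lines. This is a hypothetical illustration only: the vehicle types, load limits, and the nearest-eligible-courier scoring rule are invented assumptions, not eCourier's actual algorithm.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Courier:
    name: str
    vehicle: str          # e.g. "bicycle", "motorbike", "van" (illustrative)
    x: float              # current position, simplified to a flat plane
    y: float
    max_load_kg: float    # assumed capacity limit per vehicle

def allocate(couriers, pickup_x, pickup_y, parcel_kg):
    """Pick the nearest courier whose vehicle can carry the parcel."""
    eligible = [c for c in couriers if c.max_load_kg >= parcel_kg]
    if not eligible:
        return None  # no vehicle in the fleet can take this booking
    return min(eligible, key=lambda c: hypot(c.x - pickup_x, c.y - pickup_y))

fleet = [
    Courier("A", "bicycle", 0.0, 0.0, 5.0),
    Courier("B", "van", 3.0, 4.0, 500.0),
]
# A 20 kg parcel is too heavy for the bicycle, so the van is chosen
best = allocate(fleet, 1.0, 1.0, 20.0)
```

A production dispatcher would weigh far more factors (traffic, courier workload, delivery deadlines), but the structure — filter to eligible vehicles, then rank by a cost function — is the same.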
- Understanding and managing technical issues
- Managing the project through the proposed control cycle
Reference:
Case Study of Successful Complex IT Projects, BCS
Reference:
What Makes Projects Succeed? Synergy Professional Services
http://www.spspro.com/brochures/9What%20Makes%20Projects%20Succeed%20070213%20long.pdf
- Customer involvement
- Agreement on the goals of the project
- Frequent progress checks and course corrections
- A plan that shows the overall path and responsibilities
- Constant and effective communication to everyone
- Controlled scope
- Management support
- Customers don't (really) know what they want
- Requirements change during the course of the project
- Customers have unreasonable timelines
- Communication gaps exist between customers, engineers, and project managers
- The development team doesn't understand the politics of the customer's organization
Reference:
Five Common Errors in Requirements Analysis (and How to Avoid Them)
http://www.techrepublic.com/article/five-common-errors-in-requirements-analysis-and-how-to-avoid-them/
Define a process for analyzing and incorporating change requests, and make your customer aware of their entry point into this process. Set milestones for each development phase beyond which certain changes are not permissible -- for example, disallowing major changes once a module reaches 75 percent completion. Ensure that change requests (and approvals) are clearly communicated to all stakeholders, together with their rationale, and that the master project plan is updated accordingly.
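The milestone-based change gate described above can be expressed as a tiny policy check. This is a minimal sketch under assumed rules: the 75 percent cutoff and the major/minor distinction come from the example in the text, while the function name and return strings are invented for illustration.

```python
# Completion fraction beyond which major changes are rejected
# (the 75% figure is the example threshold cited in the text).
MAJOR_CHANGE_CUTOFF = 0.75

def review_change_request(module_completion: float, is_major: bool) -> str:
    """Return the disposition of a change request under the gate policy."""
    if is_major and module_completion >= MAJOR_CHANGE_CUTOFF:
        # Past the milestone: major changes are no longer permissible
        return "rejected: module past major-change milestone"
    # Otherwise the request enters the normal change-control flow:
    # communicate to stakeholders and update the master project plan
    return "accepted: route to stakeholders and update master plan"
```

For example, a major change against a module that is 80 percent complete would be rejected, while the same change at 50 percent completion would enter the normal review flow.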
Solving Issues When Unfamiliar with the Politics of the Customer's Site
Review your existing network and identify both the information you need and who is likely to have it. Cultivate allies, build relationships, and think systematically about your social capital in the organization. Persuade opponents within your customer's organization by framing issues in a way that is relevant to their own experience. Use initial points of access/leverage to move your agenda forward.
Chapter 4 Outline
Use Cases
- Elements of a use case
- Alternative use case formats
- Use cases and functional requirements
- Use cases and testing
- Building use cases
Copyright 2011 John Wiley & Sons, Inc.
INTRODUCTION
Use cases are a means of expressing user requirements. Use cases are used extensively in the analysis phase. A use case represents how a system interacts with its environment by illustrating the activities that are performed by the users and the system's responses. The text-based use case is easy for users to understand and also flows easily into the creation of process models and the data model.
USE CASES
A use case depicts a set of activities that produce some output result. Each use case describes how an external user triggers an event to which the system must respond. With this type of event-driven modeling, everything in the system can be thought of as a response to some triggering event. Creation of use cases is often done as part of interview sessions with users or as part of JAD sessions.
Basic Information
Example
Preconditions
Example: Preconditions
Normal Course
Alternative Courses
Postconditions
Example: Postconditions
Exceptions
Example: Exceptions
SUMMARY