Report To The Chairman, Subcommittee On Transportation, Committee On Appropriations, House of Representatives
March 1997
AIR TRAFFIC
CONTROL
Immature Software
Acquisition Processes
Increase FAA System
Acquisition Risks
GAO/AIMD-97-47
United States
GAO General Accounting Office
Washington, D.C. 20548
B-271531
This report responds to your request that we review the Federal Aviation Administration’s (FAA)
air traffic control modernization software acquisition maturity and improvement activities. FAA
plans to spend billions of dollars replacing its existing air traffic control systems, but has a
history of performing poorly when acquiring these software-intensive systems. We found that
FAA’s software acquisition processes are immature, and are making recommendations to the
Secretary of Transportation for strengthening them.
We are sending copies of this report to the Secretary of Transportation, the Director of the
Office of Management and Budget, the Administrator of the Federal Aviation Administration,
and other congressional committees. We will also make copies available to other interested
parties upon request. If you have questions or wish to discuss the issues in this report, please
contact me at (202) 512-6412. Major contributors to this report are listed in appendix II.
Sincerely yours,
1. High-Risk Series: An Overview (GAO/HR-95-1, Feb. 1995); High-Risk Series: Information Management and Technology (GAO/HR-97-9, Feb. 1997).
2. We used a draft version of the model for our evaluation (version 00.03, dated April 1996). The first published version of the model was released in October 1996, after we performed our evaluation. According to the model’s authors, the published version differed only editorially from the draft we used.
3. The seven KPAs relating to the repeatable level are software acquisition planning, solicitation, requirements development and management, project office management, contract tracking and oversight, evaluation, and transition and support.
4. GAO asked FAA to choose five projects that are: (1) major efforts with large software acquisition components, (2) managed by different FAA product teams, (3) at different life cycle stages, and (4) among FAA’s best managed.
5. According to the SA-CMM, solicitation is the process of ensuring that award is made to the contractor most capable of satisfying the specified requirements, and evaluation is the process of determining that acquired software products and services satisfy contract requirements prior to acceptance.
Principal Findings
ATC Modernization Software Acquisition Processes Are Immature
To attain a given SEI-defined maturity level, an organization must satisfy the key practices for the KPAs associated with that level. FAA’s ATC modernization organization had too many weaknesses to satisfy any of the “repeatable” KPAs (i.e., software acquisition planning, solicitation, requirements development and management, project office management, contract tracking and oversight, evaluation, and transition and support) or the level 3 acquisition risk management KPA.
6. According to the SA-CMM, software acquisition planning is the process for ensuring that reasonable planning for all elements of the software acquisition occurs; requirements development and management is the process for establishing an unambiguous and agreed-upon set of software requirements; project office management is the process for effective and efficient management of project office activities; contract tracking and oversight is the process of ensuring that contractor activities, products, and services satisfy contract requirements; transition and support is the process of transferring acquired software products to the eventual support organization; and acquisition risk management is the process of identifying software risks early and adjusting the acquisition strategy to mitigate those risks.
For FAA to satisfy any of these eight KPAs, it must eliminate the key practice
weaknesses identified in this report.7 Each practice that is performed
effectively constitutes a strength, and each practice not performed or
performed poorly constitutes a weakness. While FAA’s ATC modernization
has some strengths, it has more weaknesses. Table 1 tallies these strengths and weaknesses on the five projects that GAO evaluated. In summary, of the total number of
KPA practices rated, 38 percent constituted strengths, 50 percent were
weaknesses, and 12 percent were observations. An observation indicates
that the evidence was inconclusive and did not clearly support a
determination of either strength or weakness.
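The percentage breakdown above is simple tally arithmetic. A minimal sketch follows; the list of ratings is hypothetical, chosen only to reproduce the reported 38/50/12 percent split, since the underlying practice counts are not restated here.

```python
from collections import Counter

def rating_percentages(ratings):
    """Percentage of rated KPA practices falling into each outcome.

    `ratings` holds one entry per rated practice: "strength",
    "weakness", or "observation". Practices marked "not rated"
    should be excluded before tallying.
    """
    counts = Counter(ratings)
    total = sum(counts.values())
    return {outcome: round(100 * n / total) for outcome, n in counts.items()}

# Hypothetical tally: 100 rated practices chosen only so the split
# matches the report's figures (38 percent strengths, 50 percent
# weaknesses, 12 percent observations).
sample = ["strength"] * 38 + ["weakness"] * 50 + ["observation"] * 12
print(rating_percentages(sample))  # {'strength': 38, 'weakness': 50, 'observation': 12}
```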
Additionally, GAO found that while the five projects varied as to practice
strengths, weaknesses, and observations under three of the five “common
features” or practice groupings (commitment to perform, ability to
perform, and activities performed), the projects were consistently weak in
all practices under the remaining two groupings (measurement and
analysis and verifying implementation). As a result, software project teams
and FAA management consistently lack reliable information on project
team performance.
7. SEI groups each of its KPA practices into one of five “common features” or practice categories: commitment to perform, ability to perform, activities performed, measurement and analysis, and verifying implementation.
FAA’s Approach for Improving ATC Modernization Software Acquisition Processes Is Not Effective
To be effective, the FAA organization responsible for software acquisition process improvement must have (1) organizational and/or budgetary authority over the ATC modernization units acquiring the software; and (2) an effective plan to guide improvement efforts and measure progress on each. The FAA organizational entity currently responsible for ATC modernization software acquisition process improvement, the SEPG, has neither. As a result, little progress has been made over the last 4 years in instituting definition and discipline into ATC modernization software acquisition processes.
The SEPG and its predecessors have advocated and initiated a collection of
efforts intended to strengthen ATC modernization software-related
processes, including software acquisition processes. However, there is no
analytical basis for the focus, content, timing, and interrelationships of
these efforts. Specifically, the efforts (1) are not based on any assessment
of current software acquisition process strengths and weaknesses; and
(2) are not detailed in a formal plan that specifies measurable goals,
objectives, milestones, and needed resources, prioritizes efforts, fixes
responsibility and accountability, and defines metrics for measuring
progress. Instead, these efforts were undertaken with no sound analytical
basis and, rather than being part of a comprehensive plan, are discussed in
general terms without detail and specificity in briefing documents, minutes
of meetings, and working group recommendations. While the SEPG is now
taking steps to establish the analytical basis needed to formulate a
comprehensive software process improvement plan, that plan does not yet
exist, and no schedule has been established for completing it.
8. Air Traffic Control: Complete and Enforced Architecture Needed for FAA Systems Modernization (GAO/AIMD-97-30, February 3, 1997).
None of these positions are valid. First, the SA-CMM, like the SW-CMM
(another SEI software-specific capability maturity model), focuses on
software because it is widely recognized as the most difficult and costly
Third, the issue is not whether FAA’s SEPG is organized as the Department
“understands” other SEPGs to be organized, but whether the SEPG, or any
FAA organizational entity responsible for implementing and enforcing
software acquisition process change, has the authority needed to
accomplish the task. Currently, no organizational entity in FAA has the
requisite authority.
provides no basis for drawing conclusions about the projects’ overall cost
or schedule performance.
Contents

Executive Summary

Chapter 1: Introduction
  Overview of ATC
  The ATC Modernization Program Is Complex, Costly, and Historically Problematic
  ATC Modernization Now Proceeding Under a New Acquisition Management System
  FAA Organizations Responsible for ATC Systems Acquisition and Maintenance
  Objectives, Scope, and Methodology

Chapter 2: Software Acquisition Planning
  FAA’s Software Acquisition Planning Process Is Not Effective
  Conclusions

Chapter 3: Solicitation
  Product Teams Performing Many Solicitation Practices
  Conclusions

Chapter 4: Requirements Development and Management
  Requirements Development and Management Process Is Not Effective
  Conclusions

Chapter 5: Project Office Management
  FAA’s Project Office Management Process Area Is Not Effective
  Conclusions

Chapter 6: Contract Tracking and Oversight
  FAA Lacks an Effective Contract Tracking and Oversight Process
  Conclusions

Chapter 7: Evaluation
  FAA Is Strong in Most but Not All Evaluation KPA Practices
  Conclusions

Chapter 8: Transition and Support
  Transition and Support Not Being Performed Effectively
  Conclusions

Chapter 9: Acquisition Risk Management
  ATC Modernization Software Risk Management Is Ineffective
  Conclusions

Chapter 10: FAA Lacks an Effective Approach for Improving ATC Modernization Software Acquisition Processes
  FAA Organization Responsible for ATC Software Acquisition Process Improvement Lacks Authority to Effect Change
  FAA Planning for Software Process Improvement Has Not Been Effective
  Improvement Initiatives Have Thus Far Not Instilled Software Process Discipline
  Conclusions

Chapter 11: Overall Conclusions and Recommendations
  Recommendations
  Agency Comments and Our Evaluation

Appendixes
  Appendix I: Comments From the Department of Transportation
  Appendix II: Major Contributors to This Report

Figures
  Figure 1.1: Summary of ATC Over the Continental United States and Oceans
  Figure 1.2: ARA Organization Chart
  Figure 1.3: SA-CMM Levels and Descriptions
  Figure 2.1: Software Acquisition Planning
  Figure 3.1: Solicitation Summary
  Figure 4.1: Requirements Development and Management Summary
  Figure 5.1: Project Office Management Summary
  Figure 6.1: Contract Tracking and Oversight Summary
  Figure 7.1: Evaluation Summary
  Figure 8.1: Transition and Support Summary
  Figure 9.1: Acquisition Risk Management Summary
Introduction
• Airport towers control aircraft on the ground and, before landing and after take-off, when they are within about 5 nautical miles of the airport and up to 3,000 feet above the airport. Air traffic controllers rely on a combination
1. The ATC system is a major component of the National Airspace System (NAS).
2. Air Traffic Control: Status of FAA’s Modernization Program (GAO/RCED-95-175FS, May 26, 1995); Air Traffic Control: Status of FAA’s Modernization Program (GAO/RCED-94-167FS, Apr. 15, 1994); Air Traffic Control: Status of FAA’s Modernization Program (GAO/RCED-93-121FS, Apr. 16, 1993).
3. High-Risk Series: An Overview (GAO/HR-95-1, Feb. 1995); High-Risk Series: Information Management and Technology (GAO/HR-97-9, Feb. 1997).
See figure 1.1 for a visual summary of air traffic control over the
continental United States and oceans.
Figure 1.1: Summary of ATC Over the Continental United States and Oceans
[Figure: diagram of ATC segments showing an oceanic en route center and ARINC, en route centers, TRACONs (departure control and approach control), and airport towers (local and ground control) over the continental United States and oceans.]
The ATC Modernization Program Is Complex, Costly, and Historically Problematic
FAA faced two problems in continuing to fulfill its mission to ensure safe, orderly, and efficient air travel in the national airspace. First, the ATC system of the late 1970s was a blend of several generations of automated and manual equipment, much of it labor-intensive and obsolete. Second, air traffic was projected to increase dramatically as a result of airline deregulation of the late 1970s. FAA recognized that it could increase ATC operating efficiency by increasing automation. It also anticipated that meeting the demand safely and efficiently would require improved and expanded services, additional facilities and equipment, improved work force productivity, and the orderly replacement of aging equipment. Accordingly, in December 1981, FAA initiated its plan to modernize, automate, and consolidate its enormous ATC system infrastructure by the year 2000. In doing so, it chose to acquire new ATC systems by contracting for systems development services from vendors rather than building new ATC systems in-house.
4. The three projects and their respective percentage increases in unit costs are the Voice Switching and Control System (511 percent), the Integrated Terminal Weather System (129 percent), and the Aviation Weather Observing System (51 percent).
almost 4 years. For example, the per-unit cost estimate for the Voice
Switching and Control System increased 511 percent, and the first site
implementation was delayed 6 years from the original estimate.
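The unit-cost growth figures cited here are straightforward percentage increases over the original estimates. A minimal sketch follows; the dollar values are hypothetical, chosen only so the result matches the 511 percent figure, since the report does not restate the underlying estimates here.

```python
def percent_increase(original_estimate, revised_estimate):
    """Growth of a cost estimate, expressed as a percentage of the original."""
    return 100 * (revised_estimate - original_estimate) / original_estimate

# Hypothetical per-unit estimates; only the ratio matters, and it is
# chosen to reproduce the 511 percent increase cited for the Voice
# Switching and Control System.
print(round(percent_increase(1.00, 6.11)))  # 511
```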
The new system is intended to reduce the time and cost to field new
products and services by introducing a new investment management
system that spans the investments’ entire life cycles, a new procurement
system that provides flexibility in selecting and managing contractors, and
organizational learning and cultural reform that supports the new
investment management and procurement systems.
5. Department of Transportation and Related Agencies Appropriations Act, 1996, P.L. No. 104-50, sec. 348, 109 Stat. 436, 460 (1995).
FAA Organizations Responsible for ATC Systems Acquisition and Maintenance
Two major FAA organizations play key roles in the modernization of ATC systems: the Office of the Associate Administrator for Research and Acquisitions (ARA) and the Office of the Associate Administrator for Air Traffic Services (ATS). Briefly, ARA is responsible for acquiring ATC systems, while ATS is responsible for operating and maintaining ATC systems. Cross-functional integrated product teams (IPT) residing in ARA are responsible for specific ATC system acquisition projects.
Five IPTs reside in AUA and are organized by ATC business areas (i.e., en
route, terminal, weather and flight service, air traffic management,
oceanic), and five IPTs reside in AND and are organized by ATC functional
areas (i.e., infrastructure, communications, surveillance, Global
Positioning System/navigation, aircraft/avionics). IPTs are responsible for
research, development, and acquisition as well as for ensuring that new
equipment is delivered, installed, and working properly. For example, the
en route IPT comprises product teams for the Display Channel Complex
Rehost, the Display System Replacement, the Voice Switching and Control
System, and several other en route systems. Each IPT includes systems and
specialty engineers, logistics personnel, testing personnel, contract
personnel, and lawyers as well as representatives from the organizations
responsible for operating and maintaining the ATC equipment. The
organization chart below shows the structure of the ARA organization.
Figure 1.2: ARA Organization Chart
[Figure: organization chart. The Associate Administrator for Research and Acquisitions (ARA) oversees the Office of Air Traffic Systems Development (AUA), the Office of Aviation Research (AAR), the Office of Communications, Navigation, and Surveillance Systems (AND), the Office of System Architecture and Investment Analysis (ASD), the Office of Acquisitions (ASU), the Office of Information Technology (AIT), and the William J. Hughes Technical Center (ACT).]
6. We used a draft version of the model for our evaluation (version 00.03, dated April 1996). The first published version of the model was released in October 1996, after we performed our evaluation. According to the model’s authors, the published version differed only editorially from the draft we used.
Figure 1.3: SA-CMM Levels and Descriptions

Level 5 - Optimizing (continuously improving process): Continuous process improvement is empowered by quantitative feedback from the process and from piloting innovative ideas and technologies.

Level 4 - Managed (predictable process): Detailed measures of the quality of the software acquisition processes, products, and services are collected. The software processes, products, and services are quantitatively understood and controlled.

Level 3 - Defined (standard, consistent process): The acquisition organization’s software acquisition process is documented, standardized, and established as the standard software acquisition process. All projects use an approved, tailored version of the organization’s standard software acquisition process for acquiring their software products and services.

Level 2 - Repeatable (disciplined process): Basic project management processes are established to track performance, cost, and schedule. The necessary process discipline is in place to repeat earlier successes on projects in similar domains.

Level 1 - Initial: The software acquisition process is characterized as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort.
In accordance with SEI’s SCE method, for each KPA in level 2 and the one KPA
in level 3 (risk management), we evaluated institutional FAA policies and
practices and compared project-specific guidance and practices against
the five common features. This project-specific comparison can result in
one of four possible outcomes: (1) project strength—an effective
implementation of the key practice; (2) project weakness—ineffective
implementation of a key practice or failure to implement a key practice;
(3) project observation—key practice evaluated but evidence inconclusive
and cannot be characterized as either strength or weakness; and (4) not
rated—key practice not currently relevant to project, therefore not
evaluated.
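The four-way rating rule described above can be sketched as a simple decision function. This is an illustrative restatement, not SEI’s SCE tooling; the function name and outcome strings are our own.

```python
def rate_practice(relevant, evidence_conclusive, effective):
    """Classify one key practice using the four outcomes described above."""
    if not relevant:
        return "not rated"      # practice not currently relevant to the project
    if not evidence_conclusive:
        return "observation"    # evidence cannot support a determination
    return "strength" if effective else "weakness"

# A relevant practice, conclusive evidence, implemented effectively:
print(rate_practice(True, True, True))  # strength
```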
initiative; (3) the relative priority of each; (4) progress made on each
initiative; and (5) obstacles, if any, impeding progress. We also analyzed
process improvement plans, meeting minutes, and related documentation
to further address these areas. Finally, we interviewed representatives
from the ATC modernization product teams and the SEPG to obtain their
perspectives in assessing process improvement support, activities,
progress, and obstacles.
FAA’s Software Acquisition Planning Process Is Not Effective
All five projects have some ability and/or activity strengths in this KPA. For example, every project addresses software life cycle support in planning documents, and software life cycle cost estimates were prepared for four of the projects. However, we found many more process weaknesses than strengths. For example, FAA has a systems acquisition policy, but the policy does not specifically address or provide guidance on software acquisition. Therefore, FAA management has not formally recognized the importance and uniqueness of software acquisition issues in the system acquisition process, and has not formally committed to managing software acquisition in a disciplined manner. Also, the product teams do not measure or report on the status of software acquisition planning activities. As a result, management is not always aware of problems in project performance, and cannot always take corrective action expeditiously. Additionally, none of the five projects has specific guidance on software training or experience requirements for project participation. As a result, software training is ad hoc, and decisions about project personnel assignments are subjective.
Key practice ratings (projects 1 through 5):
Commitment 1 - The acquisition organization has a written policy for planning the software acquisition: Weakness, Weakness, Weakness, Weakness, Weakness
Ability 2 - The acquisition organization has experienced software acquisition management personnel: Observation, Observation, Observation, Observation, Observation
Ability 3 - Software acquisition management personnel are experienced in the domain of the project: Weakness, Weakness, Weakness, Weakness, Weakness
Activity 1 - Software acquisition planning personnel are involved in system acquisition planning: Weakness, Strength, Strength, Strength, Weakness
Activity 3 - The software acquisition strategy for the project is developed: Weakness, Weakness, Strength, Weakness, Weakness
Activity 5 - A life cycle cost estimate for the software activity is prepared and independently verified: Weakness, Strength, Strength, Strength, Strength
Solicitation
Product Teams Performing Many Solicitation Practices
All five projects have strengths in many of the practices required by this KPA. For example, most projects have written solicitation plans, assign responsibility for coordinating and conducting the solicitation activities, and prepare and review contract-related software cost and schedule estimates.
However, the projects are weak in several areas. For example, even
though most projects had a written solicitation plan, not all projects
followed their plans. Also, none of the projects adequately identified the
resources needed to conduct solicitation activities. While FAA personnel
stated that they had adequate solicitation resources, they provided no
evidence of either a mechanism for identifying required resources or for
ensuring that required resources are provided. These weaknesses increase
the risk of FAA not adequately evaluating the offerors’ proposals, and
making a suboptimal selection. Additionally, none of the five projects measured or
reported on the status of product team solicitation activities. As a result,
management cannot identify solicitation problems early and resolve them
expeditiously.
Key practice ratings (projects 1 through 5):
Commitment 2 - Responsibility for the software portion of the solicitation is designated: Weakness, Observation, Strength, Weakness, Strength
Ability 1 - A group that is responsible for coordinating and conducting the solicitation activities exists: Strength, Strength, Strength, Not rated, Strength
Ability 2 - Adequate resources are provided for the solicitation activities: Weakness, Weakness, Weakness, Weakness, Weakness
Ability 3 - Individuals performing the solicitation activities have experience or receive training: Observation, Observation, Observation, Observation, Observation
Activity 1 - The project team documents its plans for solicitation activities: Strength, Not rated, Strength, Observation, Strength
Activity 2 - The project's solicitation activities are performed in accordance with its plans: Observation, Not rated, Strength, Weakness, Strength
Activity 3 - The project team documents its plans for proposal evaluation activities: Weakness, Strength, Strength, Observation, Strength
Activity 4 - The project team's proposal evaluation activities are performed in accordance with its plans: Weakness, Not rated, Strength, Weakness, Strength
Activity 5 - A cost estimate and schedule for the software activity being sought are prepared: Strength, Strength, Strength, Strength, Strength
Measurement 1 - Measurements are made and used to determine the status of the solicitation activities: Weakness, Weakness, Weakness, Weakness, Weakness

Not rated = key practice not currently relevant to the project, therefore not evaluated.
Conclusions
While FAA has many strengths in this KPA, systemic weaknesses in areas including measurement and analysis and management verification of practices, along with other project-specific weaknesses, render this KPA non-repeatable and dependent upon the capabilities and commitment of individual employees.
1. Advanced Automation System: Implications of Problems and Recent Changes (GAO/T-RCED-94-188, Apr. 13, 1994).
requirements are correct and complete, and does not know when
management action is warranted.
We also found some practice strengths. For example, most projects (1) are
assessing the impact on software requirements of system-level
requirements changes and (2) have a mechanism to ensure that
contractor-delivered work products and services satisfy specified
software requirements.
Key practice ratings (projects 1 through 5):
Ability 1 - A group that is responsible for performing requirements development and management activities exists: Strength, Strength, Strength, Strength, Strength
Ability 2 - Adequate resources are provided for requirements development and management activities: Weakness, Strength, Weakness, Weakness, Weakness

Not rated = key practice not currently relevant to the project, therefore not evaluated.
Key practice ratings (projects 1 through 5):
Commitment 1 - The acquisition organization has a written policy for execution of the software project: Weakness, Weakness, Strength, Weakness, Weakness
Commitment 2 - Performance, cost, and schedule baselines are supported: Strength, Strength, Strength, Weakness, Strength
Activity 1 - The project team documents its plans for software acquisition management activities: Weakness, Weakness, Weakness, Strength, Strength
Activity 2 - The project team is organized to accomplish the project's objectives: Strength, Strength, Strength, Strength, Strength
Activity 3 - The software acquisition activities of the project team are directed to accomplish the project's objectives: Strength, Strength, Not rated, Weakness, Strength
Activity 4 - The software acquisition activities of the project team are controlled: Strength, Strength, Not rated, Strength, Weakness
Activity 5 - Measurements are used to track project status, execution, and funding expenditures: Strength, Strength, Not rated, Strength, Strength
Measurement 1 - Measurements are made and used to determine the status of the project office management activities: Weakness, Weakness, Weakness, Weakness, Weakness

Not rated = key practice not currently relevant to the project, therefore not evaluated.
The purpose of contract tracking and oversight is to ensure that (1) the
software development contractor performs according to the terms of the
contract; (2) needed contract changes are identified, negotiated, and
incorporated into the contract; and (3) contractor performance issues are
identified early, when they are easier to address. According to the SA-CMM,
a repeatable contract tracking and oversight process, among other things,
includes (1) having a written organizational policy for contract tracking
and oversight, (2) having a documented plan for contract tracking and
oversight, (3) conducting tracking and oversight activities in accordance
with the plan, and (4) ensuring that individuals performing contract
tracking and oversight are suitably experienced or trained.
FAA Lacks an Effective Contract Tracking and Oversight Process
Our past work on ATC modernization projects has raised concerns about contract tracking and oversight. For example, in 1994 we reported that FAA did not provide adequate oversight of its contractor during the initial development of the ISSS.1 As a result, development problems and lack of progress were not always recognized in a timely manner. The results of this software capability evaluation indicate that these problems persist and pinpoint the underlying contract tracking and oversight weaknesses. For example, FAA does not have a written organizational policy for contract tracking and oversight, and most teams have no documented plan for contract tracking and oversight activities. Furthermore, the team that has a plan does not always follow the plan, and none of the teams ensure that persons responsible for managing software contracts have suitable experience or training. As a result, the product teams cannot formulate an independent assessment of contract progress and are forced to rely on data provided by the contractor. Since contractor reports do not always identify problems expeditiously, FAA is not always positioned to correct them promptly.
1. Advanced Automation System: Implications of Problems and Recent Changes (GAO/T-RCED-94-188, Apr. 13, 1994).
Key practice ratings (projects 1 through 5):
Commitment 2 - Responsibility for the contract tracking and oversight activities is designated: Strength, Weakness, Not rated, Strength, Strength
Activity 2 - The project's contract tracking and oversight activities are performed in accordance with its plans: Weakness, Weakness, Not rated, Weakness, Weakness
Evaluation
FAA Is Strong in Most but Not All Evaluation KPA Practices
All of the projects were strong in many evaluation practice areas. For example, all rated projects have documented test and evaluation plans and conduct test and evaluation activities in accordance with the plans. In addition, most teams develop evaluation requirements for contractor-conducted software tests concurrent with developing software technical requirements, and all teams incorporate evaluation requirements into the solicitation and resulting contract. Also, most teams track contractor performance of test activities for compliance with the contract.
Key practice ratings (projects 1 through 5):
Commitment 2 - Responsibility for evaluation activities is clearly defined: Strength, Strength, Observation, Strength, Strength
Ability 2 - Adequate resources are provided for evaluation activities: Weakness, Strength, Weakness, Weakness, Weakness
Ability 3 - Individuals performing evaluation activities have experience or receive training: Observation, Observation, Observation, Observation, Observation
Activity 1 - The project team documents its plans for evaluation activities: Strength, Strength, Strength, Strength, Strength
Activity 2 - The project's evaluation activities are performed in accordance with its plans: Strength, Strength, Not rated, Strength, Not rated
Activity 4 - The evaluation requirements are incorporated into the solicitation and resulting contract: Strength, Strength, Strength, Strength, Strength
Measurement 1 - Measurements are made and used to determine the status of the evaluation activities: Weakness, Weakness, Weakness, Weakness, Weakness

Not rated = key practice not currently relevant to the project, therefore not evaluated.
Conclusions
FAA performs most but not all evaluation KPA practices. To satisfy all evaluation practices and thereby have reasonable assurance that its software acquisition projects will be effectively evaluated and tested on a repeatable basis, FAA must ensure that its product teams identify evaluation resource, training, and experience needs, and that they measure and brief management on the status of all evaluation activities.
The purpose of transition and support is to provide for the effective and
efficient “hand-off” of the acquired software products to the support
organization responsible for software maintenance. According to the
SA-CMM, repeatable transition and support processes, among other things,
include (1) having a written policy for transitioning software products to
the support organization, (2) designating a group that is responsible for
coordinating transition and support activities, (3) having a complete
inventory of all software and related items that are to be transitioned,
(4) including members of the support organization in transition and
support planning, (5) requiring the support organization to demonstrate its
capability to modify and support the software, and (6) measuring and
reporting to management on the status of transition and support activities.
Transition and Support Not Being Performed Effectively
All of the projects were strong in several transition and support practice areas. For example, FAA has a written policy for transitioning software products to the support organization. Additionally, all five projects have designated a group responsible for transition and support. However, various weaknesses in other practices prevented FAA from satisfying this KPA. In particular, some of the teams lack a complete inventory of all the software and related products to be transitioned, thus jeopardizing the efforts of the support organization to effectively maintain the full software configuration. Additionally, one team did not include the software support organization in planning for transition and support, and some teams do not have plans to require the support organization to demonstrate its capability to maintain the software after transition. As a result, support problems, such as the inability to perform required maintenance, may arise. Further, none of the projects are measuring and reporting to management on the status of transition and support activities, precluding management from addressing transition and support problems expeditiously.
Evaluation results for transition and support key practices across the five projects (in order):

Ability 1: A group that is responsible for coordinating the transition and support activities exists. (Strength, Strength, Weakness, Strength, Strength)
Ability 2: Responsibility for transition and support activities is designated. (Strength, Strength, Strength, Strength, Strength)
Ability 3: Adequate resources are provided for transition and support activities. (Weakness, Weakness, Not rated, Weakness, Weakness)
Ability 6: Individuals performing transition and support activities have experience or receive training. (Observation, Observation, Not rated, Observation, Observation)
Activity 1: The project team documents its plans for transition and support activities. (Strength, Strength, Not rated, Weakness, Strength)
Activity 2: The project team’s transition and support activities are performed in accordance with its plans. (Strength, Weakness, Not rated, Observation, Not rated)
Activity 4: The project team oversees the configuration control of the software products throughout the transition. (Strength, Strength, Not rated, Weakness, Not rated)
Measurement 1: Measurements are made and used to determine the status of the transition and support activities. (Weakness, Weakness, Weakness, Weakness, Weakness)

Not rated = key practice not currently relevant to the project, therefore not evaluated.
Conclusions

FAA’s transition and support process for its ATC modernization suffers from weaknesses that render it undefined and undisciplined. In light of FAA’s enormous investment in ATC-related software and the fact that over 65 percent of the life cycle cost of software is incurred during maintenance, it is essential that these weaknesses be corrected.
Evaluation results for acquisition risk management key practices across the five projects (in order):

Commitment 1: The acquisition organization has a written policy for the management of acquisition risk. (Strength, Strength, Weakness, Strength, Strength)
Ability 1: A group that is responsible for coordinating acquisition risk management activities exists. (Weakness, Strength, Strength, Strength, Strength)
Ability 2: Adequate resources are provided for acquisition risk management activities. (Weakness, Weakness, Weakness, Weakness, Weakness)
Activity 1: Risk identification, analysis, and mitigation activities are integrated into software acquisition planning. (Weakness, Strength, Strength, Weakness, Strength)
Activity 6: Software risk mitigation actions are tracked to completion. (Weakness, Weakness, Observation, Weakness, Strength)
Measurement 1: Measurements are made and used to determine the status of the acquisition risk management activities. (Weakness, Weakness, Weakness, Weakness, Weakness)
Measurement 2: Resources expended for acquisition risk management activities are recorded and tracked. (Weakness, Weakness, Weakness, Weakness, Weakness)
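By way of illustration only (the SA-CMM does not prescribe any particular mechanism, and all names here are hypothetical), the practices of tracking mitigation actions to completion and measuring their status could be served by something as simple as a running log from which a status summary is periodically reported to management:

```python
# Hypothetical sketch: track each risk mitigation action to completion
# (cf. Activity 6) and summarize status for management (cf. Measurement 1).
from dataclasses import dataclass, field

@dataclass
class MitigationAction:
    risk: str
    action: str
    completed: bool = False

@dataclass
class RiskLog:
    actions: list = field(default_factory=list)

    def add(self, risk: str, action: str) -> None:
        self.actions.append(MitigationAction(risk, action))

    def complete(self, action: str) -> None:
        # Mark a mitigation action as tracked to completion.
        for a in self.actions:
            if a.action == action:
                a.completed = True

    def status(self) -> str:
        """The measurement reported to management: completed vs. total actions."""
        done = sum(a.completed for a in self.actions)
        return f"{done}/{len(self.actions)} mitigation actions completed"

log = RiskLog()
log.add("schedule slip", "add integration test staff")
log.add("COTS immaturity", "prototype vendor release")
log.complete("add integration test staff")
print(log.status())  # 1/2 mitigation actions completed
```

GAO found that none of the five projects performed even this kind of basic status measurement, which is why both measurement practices were rated weaknesses across the board.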
In 1995, FAA established the Office of the Chief Scientist for Software
Engineering under FAA’s CIO to lead FAA’s software-related process
improvement efforts, including software acquisition. According to FAA
officials, this change was made to strengthen software process improvement.
Another of the Chief Scientist’s dozen activities was to form the SEPG as
FAA’s “focal point for initiating, planning, motivating, and facilitating the
improvement of ’acquisition life cycle processes’ for software intensive
systems.”6 Formed in October 1995, the SEPG includes representatives from
ATC modernization product teams and their parent organizations as well as
other FAA organizations, and it is chaired by the Chief Scientist for
Software Engineering. The SEPG is directed by the Software Engineering
Executive Committee (SEEC), which is chaired by the Associate
Administrator for Research and Acquisition (i.e., the head of the ATC
modernization), and is composed of senior FAA executives. The SEEC is
responsible for recommending and providing guidance on software
engineering issues. Additionally, some of the ATC modernization product
teams’ parent organizations have SEPGs.
1 “Strategies and Tactics for FAA to Improve Software, CIP Steering Committee Meeting,” May 16, 1995, page 8.
2 See footnote 1.
3 COTS is commercial, off-the-shelf; NDI is non-development item.
4 See footnote 1.
5 See footnote 1.
6 In a document entitled “SEPG Purpose,” Nov. 18, 1996, FAA defines a software-intensive system as “any system that is entirely software or whose principle (sic) functionality depends on the correct functioning of software.”
Improvement Initiatives Have Thus Far Not Instilled Software Process Discipline

Since 1995, the Chief Scientist has planned or initiated a dozen activities. Some have never been started, some are behind schedule, and several have proceeded according to plan. For example, while the Chief Scientist planned to complete an assessment of ATC modernization software acquisition capabilities using SEI level 2 and level 3 requirements by December 1996 and June 1997, respectively, these efforts were never performed. Other efforts are behind schedule. For example, software engineering policies, guidance, and standards that were to be issued by September 1996 are now scheduled for issuance in the third quarter of calendar year 1997, and a software life cycle tool that was to be developed by October 1996 has been postponed indefinitely.
Several efforts have met their milestones and begun to build a foundation
for undertaking process improvements. For example, a software
engineering training plan was developed in May 1996, 1 month ahead of
schedule; product teams have been trained to use the SEI software
development CMM and the associated capability evaluation methodology to
evaluate contractors’ capabilities to develop software; one product team
used the results of a CMM evaluation as part of its source selection criteria;
and, according to the Chief Scientist, hundreds of members of various ATC
modernization product teams have been trained in software management
techniques, such as defining software processes, using software
management indicators, estimating software costs, and using standards
such as open systems standards.
Conclusions

FAA has neither an effective management structure nor a plan of action for improving its software acquisition processes. As a result, software acquisition processes will remain immature and will not support effective, efficient, and economical acquisition of mission-critical software costing billions of dollars. Until responsibility and accountability for software acquisition process improvement is assigned to an FAA organizational entity with the requisite authority to effect process change, and until this organizational entity pursues a plan for change based on a complete and objective assessment of current process strengths and weaknesses, it is unlikely that significant ATC modernization software acquisition process improvements will be made, and ATC software acquisition processes will remain immature.
FAA recognizes the importance of software process maturity and the need
to improve its software acquisition processes. However, it lacks an
effective management structure for accomplishing this because the FAA
organization responsible for process improvement, the SEPG, lacks the
authority to implement management controls and instill process discipline
within the ATC modernization software acquiring organizations.
Additionally, despite 4 years of FAA process improvement activity, no
well-targeted, comprehensive, and coordinated plan of action for software
acquisition process improvement exists.
1 Air Traffic Control: Complete and Enforced Architecture Needed for FAA Systems Modernization (GAO/AIMD-97-30, Feb. 3, 1997).
• require the CIO to develop and implement a comprehensive plan for ATC
modernization software acquisition process improvement that is based on
the software capability evaluation results contained in this report and
specifies measurable goals and time frames, prioritizes initiatives,
estimates resource requirements, and assigns roles and responsibilities;
• allocate adequate resources to ensure that these improvement efforts are
implemented and enforced; and
• require that, before being approved, every ATC modernization acquisition
project have software acquisition processes that satisfy at least SA-CMM
level 2 requirements.
First, the Department stated that FAA does not separately procure software
for its ATC systems. Rather, it procures systems that use software as a
major component. Therefore, its policies and procedures (i.e., processes)
are “geared” to system acquisitions, and evaluating only the
software-related aspects of its acquisition processes “is not an adequate
approach.”
GAO Rebuttal: All major system modernizations, like the ATC modernization,
involve the acquisition of hardware, software, and firmware operating
interdependently. However, as FAA’s own experience with the Advanced
Automation System clearly proves, the software component is the source
of most system risk, and the component most frequently associated with
late deliveries, cost increases, and performance shortfalls. Moreover, there
is widespread recognition throughout the computer industry that the
billions of dollars being invested in complex, real-time, fault tolerant
systems, like FAA ATC systems, are jeopardized by inadequate management
attention to software in general, and undisciplined, ill-defined software
acquisition and development processes in particular. This is precisely why
SEI developed its software-related CMMs, why the CMMs have been endorsed
and accepted throughout industry and government, and why the scope of
SEI’s maturity models has been extended beyond software development to
software acquisition.
Second, the Department commented that the SA-CMM “is not widely used,
adopted, or validated” and, while it has “significant merit, it is certainly not
to be taken as the same authoritative source for process improvement
guidance as the SW-CMM,2 which has been in use worldwide by thousands of
organizations for several years.”
Third, the Department claimed that “GAO may have misapplied the model”
by (1) giving inadequate consideration to equivalent alternative practices
when determining whether SA-CMM specified practices were performed
(e.g., DSR system acquisition planning being judged as an insufficient proxy
for software acquisition planning specified in the SA-CMM), (2) not
effectively tailoring the SA-CMM to focus only on project activities that
occurred after the cancellation of the Advanced Automation System, and
(3) reporting evaluation results in a way that “does not create an
environment conducive to process improvement” (i.e., reporting the
results for each project, rather than either aggregating the results or
disguising the identity of the projects).

2 SW-CMM is SEI’s software development capability maturity model.
GAO Rebuttal: We applied the model properly and correctly, and SEI has
attested to this. Every member of our evaluation team was trained by SEI;
the team leader was an SEI-designated “lead evaluator” and has been
authorized by SEI to submit results for inclusion in SEI studies; three senior
SEI professionals, two of whom are authors of the SA-CMM, participated in
the evaluation to ensure that the model was properly used; and SEI
concurred with each practice determination in the report (e.g., strength,
weakness). With respect to each of the Department’s subsidiary points
regarding our application of the model:
(2) As agreed with SEI, those practices that were deemed inapplicable were
not rated, and those performed years ago were so designated. Moreover,
even if all practices predating the Advanced Automation System’s
cancellation were ignored, none of our conclusions and recommendations
would change.
(3) The model and evaluation method do not specify any reporting format.
In particular, they do not address whether results should be reported for
each project, or whether the identity of the projects should be disguised or
results reported only in the aggregate. Given the mission-critical
importance and billion dollar cost of these projects, full disclosure of all
relevant facts to the Congress and the public is both warranted and
appropriate.
Fifth, while the report states that the SEPG has neither the organizational
nor the budgetary authority to effectively implement process change, the
Department stated that its “understanding . . . is that organizations do not
normally give their SEPG authority over product teams.” In FAA’s case, the
SEPG provides advice and counsel to the Software Engineering Executive
Committee, which consists of senior managers who have authority and
responsibility to direct process improvement actions. The SEPG is the
committee’s agent for implementing these improvements.
GAO Rebuttal: The issue is not whether FAA’s SEPG is organized as the
Department “understands” other SEPGs to be organized, but whether the
SEPG, or any FAA organizational entity responsible for implementing and
enforcing software process change, has the authority needed to
accomplish this task. Currently, no organizational entity in FAA has the
requisite authority. Accordingly, we have recommended that a CIO
organizational structure similar to the department-level CIOs prescribed in
the Clinger-Cohen Act of 1996 be established for FAA, and that it be
assigned the responsibility and resources needed to affect and enforce
software acquisition process improvement.
Sixth, the Department contends that the report “leads the reader to
erroneously conclude that the five programs reviewed are in trouble”
relative to their cost and schedule goals.