Case Study 3: Monitoring and Controlling Project Deliverables
Read the following articles:
• “Agencies Need to Improve the Implementation and Use of Earned Value Techniques to Help Manage Major System Acquisitions”.
• “Earned Value Management at NASA: An Integrated, Lightweight Solution”.
• “Integrating functional metrics, COCOMO II and earned value analysis for software projects using PMBoK”. SAC ’08: Proceedings of the 2008 ACM symposium on Applied computing, pages 820-825.
Write a two to three (2-3) page paper in which you:
1. Based on the articles provided, list the types of investments that may be at risk. Evaluate the importance of providing effective measures, as described in Table 7.1 in Chapter 7 of the text, for monitoring and controlling project deliverables.
2. Explain the importance of having accurate data for performing earned value management, based on the articles provided. Provide examples of how accurate and consistent data are related to risk management and decision making within project management.
3. Suggest at least two (2) best practices mentioned in the articles that should be observed for monitoring and controlling projects effectively. Determine the potential risks that are mitigated by implementing these practices.
4. Describe how a template for reporting defects, shown in Table 7.19 in Chapter 7 of the text, may help project managers determine the root causes of defects in project deliverables.
5. Summarize the lessons learned that can be derived from earned value management reporting and describe how these could be applied in future projects.
6. Determine the factors that contribute to implementing effective earned value management practices. Choose the single most important factor and provide an example to justify its importance in implementing earned value management.
Your assignment must follow these formatting requirements:
• Be typed, double spaced, using Times New Roman font (size 12), with one-inch margins on all sides; citations and references must follow APA or school-specific format. Check with your professor for any additional instructions.
• Include a cover page containing the title of the assignment, the student’s name, the professor’s name, the course title, and the date. The cover page and the reference page are not included in the required assignment page length.
The specific course learning outcomes associated with this assignment are:
• Explain the methods of creating, measuring, and controlling work products and work processes.
• Explain and analyze earned value reporting techniques in software projects.
• Use technology and information resources to research issues in IT Project Leadership Strategies.
• Write clearly and concisely about topics related to IT Project Leadership Strategies using proper writing mechanics and technical style conventions.
GAO
United States Government Accountability Office

Report to the Chairman, Subcommittee on Federal Financial Management, Government Information, Federal Services, and International Security, Committee on Homeland Security and Governmental Affairs, U.S. Senate

INFORMATION TECHNOLOGY
Agencies Need to Improve the Implementation and Use of Earned Value Techniques to Help Manage Major System Acquisitions

October 2009
GAO-10-2

Highlights of GAO-10-2, a report to the Chairman, Subcommittee on Federal Financial Management, Government Information, Federal Services, and International Security, Committee on Homeland Security and Governmental Affairs, U.S. Senate

Why GAO Did This Study
In fiscal year 2009, the federal
government planned to spend
about $71 billion on information
technology (IT) investments. To
more effectively manage such
investments, in 2005 the Office of
Management and Budget (OMB)
directed agencies to implement
earned value management (EVM).
EVM is a project management
approach that, if implemented
appropriately, provides objective
reports of project status, produces
early warning signs of impending
schedule delays and cost overruns,
and provides unbiased estimates of
anticipated costs at completion.
GAO was asked to assess selected
agencies’ EVM policies, determine
whether they are adequately using
earned value techniques to manage
key system acquisitions, and evaluate selected investments' earned
value data to determine their cost
and schedule performances. To do
so, GAO compared agency policies
with best practices, performed case
studies, and reviewed documentation from eight agencies and 16
major investments with the highest
levels of IT development-related
spending in fiscal year 2009.
What GAO Recommends
GAO is recommending that the
selected agencies modify EVM
policies to be consistent with best
practices, implement EVM
practices that address identified
weaknesses, and manage negative
earned value trends. Seven
agencies that commented on a
draft of this report generally agreed
with GAO’s results and
recommendations.
What GAO Found

While all eight agencies have established policies requiring the use of EVM on
major IT investments, these policies are not fully consistent with best
practices. In particular, most lack training requirements for all relevant
personnel responsible for investment oversight. Most policies also do not have
adequately defined criteria for revising program cost and schedule baselines.
Until agencies expand and enforce their EVM policies, it will be difficult for
them to gain the full benefits of EVM.
GAO’s analysis of 16 investments shows that agencies are using EVM to
manage their system acquisitions; however, the extent of implementation
varies. Specifically, for 13 of the 16 investments, key practices necessary for
sound EVM execution had not been implemented. For example, the project
schedules for these investments contained issues—such as the improper
sequencing of key activities—that undermine the quality of their performance
baselines. This inconsistent application of EVM exists in part because of the
weaknesses contained in agencies’ policies, combined with a lack of
enforcement of policies already in place. Until key EVM practices are fully
implemented, these investments face an increased risk that managers cannot
effectively optimize EVM as a management tool.
Furthermore, earned value data trends of these investments indicate that most
are currently experiencing shortfalls against cost and schedule targets. The
total life-cycle costs of these programs have increased by about $2 billion.
Based on GAO’s analysis of current performance trends, 11 programs will
likely incur cost overruns that will total about $1 billion at contract
completion—in particular, 2 of these programs account for about 80 percent
of this projection. As such, GAO estimates the total cost overrun to be about
$3 billion at program completion (see figure). However, with timely and
effective management action, it is possible to reverse negative trends so that
the projected cost overruns may be reduced.
[Figure: Cost Overruns Incurred and Projected Overruns of 16 Programs. The chart shows, in billions of dollars, life-cycle cost overruns already incurred (about $2.0 billion), GAO-estimated most likely cost overruns at contract completion (about $1.0 billion), and a GAO-estimated total cost overrun at completion of about $3.0 billion. Source: GAO analysis of program data.]
View GAO-10-2 or key components at http://www.gao.gov/products/GAO-10-2. For more information, contact David A. Powner at (202) 512-9286 or pownerd@gao.gov.
Contents
Letter 1
Background 2
Agencies' EVM Policies Are Not Comprehensive 7
Agencies’ Key Acquisition Programs Are Using EVM, but Are Not
Consistently Implementing Key Practices 12
Earned Value Data Show Trends of Cost Overruns and Schedule
Slippages on Most Programs 18
22
23
23
Appendix I Objectives, Scope, and Methodology 27
Appendix II Case Studies of Selected Programs’
Implementation of Earned Value Management 30
Appendix III Comments from the Department of Commerce 65
Appendix IV Comments from the Department of Defense 67
Appendix V Comments from the Department of Justice 70
Appendix VI Comments from the National Aeronautics and
Space Administration 72
Appendix VII Comments from the Department of Veterans Affairs 75
Appendix VIII GAO Contact and Staff Acknowledgments 77

Related GAO Products 78
Tables
Table 1: Key Components of an Effective EVM Policy 8
Table 2: Assessment of Key Agencies’ EVM Policies 9
Table 3: Eleven Key EVM Practices for System Acquisition
Programs 12
Table 4: Assessment of EVM Practices for Case Study Programs 13
Table 5: Program Life-cycle Cost Estimate Changes 18
Table 6: Contractor Cumulative Cost and Schedule Performances 20
Table 7: Sixteen Case Study Programs 30
Table 8: GAO EVM Practice Assessment of Agriculture’s MIDAS
Program 33
Table 9: GAO EVM Practice Assessment of Commerce’s DRIS
Program 35
Table 10: GAO EVM Practice Assessment of Commerce’s FDCA
Program 37
Table 11: GAO EVM Practice Assessment of Defense’s AOC
Program 39
Table 12: GAO EVM Practice Assessment of Defense’s JTRS-HMS
Program 41
Table 13: GAO EVM Practice Assessment of Defense’s WIN-T
Program 43
Table 14: GAO EVM Practice Assessment of Homeland Security’s
ACE Program 45
Table 15: GAO EVM Practice Assessment of Homeland Security’s
Deepwater COP Program 47
Table 16: GAO EVM Practice Assessment of Homeland Security’s
WHTI Program 49
Table 17: GAO EVM Practice Assessment of Justice’s NGI Program 51
Table 18: GAO EVM Practice Assessment of NASA’s JWST Project 53
Table 19: GAO EVM Practice Assessment of NASA’s
Project 55
Table 20: GAO EVM Practice Assessment of NASA’s MSL Project 57
Table 21: GAO EVM Practice Assessment of Transportation’s
ERAM Program 59
Table 22: GAO EVM Practice Assessment of Transportation’s SBS
Program 61
Table 23: GAO EVM Practice Assessment of Veterans Affairs’
VistA-FM Program 63
Figures
Figure 1: GAO EV Data Analysis of Agriculture’s MIDAS Program 34
Figure 2: GAO EV Data Analysis of Commerce’s DRIS Program 36
Figure 3: GAO EV Data Analysis of Commerce’s FDCA Program 38
Figure 4: GAO EV Data Analysis of Defense’s AOC Program 40
Figure 5: GAO EV Data Analysis of Defense’s JTRS-HMS Program 42
Figure 6: GAO EV Data Analysis of Defense’s WIN-T Program 44
Figure 7: GAO EV Data Analysis of Homeland Security’s ACE
Program 46
Figure 8: GAO EV Data Analysis of Homeland Security’s Deepwater
COP Program 48
Figure 9: GAO EV Data Analysis of Homeland Security’s WHTI
Program 50
Figure 10: GAO EV Data Analysis of Justice’s NGI Program 52
Figure 11: GAO EV Data Analysis of NASA’s JWST Project 54
Figure 12: GAO EV Data Analysis of NASA’s Juno Project 56
Figure 13: GAO EV Data Analysis of NASA’s MSL Project 58
Figure 14: GAO EV Data Analysis of Transportation’s ERAM
Program 60
Figure 15: GAO EV Data Analysis of Transportation’s SBS Program 62
Figure 16: GAO EV Data Analysis of Veterans Affairs’ VistA-FM
Program 64
Abbreviations

ACE        Automated Commercial Environment
ANSI       American National Standards Institute
AOC        Air and Space Operations Center—Weapon System
COP        Integrated Deepwater System—Common Operational Picture
DOD        Department of Defense
DRIS       Decennial Response Integration System
EIA        Electronic Industries Alliance
ERAM       En Route Automation Modernization
EV         earned value
EVM        earned value management
FDCA       Field Data Collection Automation
IT         information technology
JTRS-HMS   Joint Tactical Radio System—Handheld, Manpack, Small Form Fit
JWST       James Webb Space Telescope
MIDAS      Farm Program Modernization
MSL        Mars Science Laboratory
NASA       National Aeronautics and Space Administration
NGI        Next Generation Identification
OMB        Office of Management and Budget
SBS        Surveillance and Broadcast System
VistA-FM   Veterans Health Information Systems and Technology Architecture—Foundations Modernization
WHTI       Western Hemisphere Travel Initiative
WIN-T      Warfighter Information Network—Tactical
This is a work of the U.S. government and is not subject to copyright protection in the
United States. The published product may be reproduced and distributed in its entirety
without further permission from GAO. However, because this work may contain
copyrighted images or other material, permission from the copyright holder may be
necessary if you wish to reproduce this material separately.
United States Government Accountability Office
Washington, DC 20548
October 8, 2009
The Honorable Thomas R. Carper
Chairman
Subcommittee on Federal Financial Management,
Government Information, Federal Services,
and International Security
Committee on Homeland Security and
Governmental Affairs
United States Senate
Dear Mr. Chairman:
In fiscal year 2009, the federal government planned to spend over $70
billion on information technology (IT) investments, many of which involve
systems and technologies to modernize legacy systems, increase
communication and networking capabilities, and transition to new
systems designed to significantly improve the government’s ability to carry
out critical mission functions into the 21st century. To more effectively
manage such investments, the Office of Management and Budget (OMB)
has a number of key initiatives under way—one of which was established
in 2005 and directs agencies to implement earned value management
(EVM).1 EVM is a project management approach that, if implemented
appropriately, provides objective reports of project status, produces early
warning signs of impending schedule slippages and cost overruns, and
provides unbiased estimates of anticipated costs at completion.

1 OMB Memorandum, M-05-23 (Aug. 4, 2005).
This report responds to your request that we review the federal
government’s use of EVM. Specifically, our objectives were to (1) assess
whether key departments and agencies have appropriately established
EVM policies, (2) determine whether these agencies are adequately using
earned value techniques to manage key system acquisitions, and
(3) evaluate the earned value data of these selected investments to
determine their cost and schedule performances.
To address our objectives, we reviewed agency EVM policies and
individual programs’ EVM-related documentation, including cost
performance reports and project schedules, from eight agencies and 16
major investments from those agencies, respectively.2 The eight agencies
account for about 75 percent of the planned IT spending for fiscal year
2009. The 16 programs selected for case study represent investments with
about $3.5 billion in total planned spending for system development work
in fiscal year 2009. We compared the agencies’ policies and practices with
federal standards and best practices of leading organizations to determine
the effectiveness of their use of earned value data in managing IT
investments. We also analyzed the earned value data from the programs to
determine whether they are projected to finish within planned cost and
schedule targets. In addition, we interviewed relevant agency officials,
including key personnel on programs that we selected for case study and
officials responsible for implementing EVM.

2 The eight agencies were the Departments of Agriculture, Commerce, Defense, Homeland Security, Justice, Transportation, and Veterans Affairs, and the National Aeronautics and Space Administration.
We conducted this performance audit from February to October 2009, in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain sufficient,
appropriate evidence to provide a reasonable basis for our findings and
conclusions based on our audit objective. We believe that the evidence
obtained provides a reasonable basis for our findings and conclusions
based on our audit objective. Appendix I contains further details about our
objectives, scope, and methodology. See also the page of related products
at the end of this report for previous work that we have done on certain
programs in our case studies.
Background

Each year, OMB and federal agencies work together to determine how
much the government plans to spend on IT projects and how these funds
are to be allocated. Planned federal IT spending in fiscal year 2009 totaled
about $71 billion—of which $22 billion was planned for IT
system
development work, and the remainder was planned for operations and
maintenance of existing systems. OMB plays a key role in overseeing
federal agencies’ IT investments and how they are managed, stemming
from its functions of assisting the President in overseeing the preparation
of the federal budget and supervising budget preparation in executive
branch agencies. In helping to formulate the President’s spending plans,
OMB is responsible for evaluating the effectiveness of agency programs,
policies, and procedures; assessing competing funding demands among
agencies; and setting funding priorities. To carry out these responsibilities,
OMB depends on agencies to collect and report accurate and complete
information; these activities depend, in turn, on agencies having effective
IT management practices.
To drive improvement in the implementation and management of IT
projects, Congress enacted the Clinger-Cohen Act in 1996, expanding the
responsibilities delegated to OMB and agencies under the Paperwork
Reduction Act.3 The Clinger-Cohen Act requires agencies to engage in
performance- and results-based management, and to implement and
enforce IT management policies and guidelines. The act also requires OMB
to establish processes to analyze, track, and evaluate the risks and results
of major capital investments in information systems made by executive
agencies.
Over the past several years, we have reported and testified on OMB’s
initiatives to highlight troubled projects,4 justify IT investments,5 and use
project management tools.6 We have made multiple recommendations to
OMB and federal agencies to improve these initiatives to further enhance
the oversight and transparency of federal IT projects. As a result, OMB
recently used this body of work to develop and implement improved
processes to oversee and increase transparency of IT investments.
Specifically, in June 2009, OMB publicly deployed a Web site that displays
dashboards of all major federal IT investments to provide OMB and others
with the ability to track the progress of these investments over time.
3 44 U.S.C. §§ 3504(h), 3506(h).
4GAO, Information Technology: Management and Oversight of Projects Totaling Billions
of Dollars Need Attention, GAO-09-624T (Washington, D.C.: Apr. 28, 2009); Information
Technology: Treasury Needs to Better Define and Implement Its Earned Value
Management Policy, GAO-08-951 (Washington, D.C.: Sept. 22, 2008); Information
Technology: Further Improvements Needed to Identify and Oversee Poorly Planned and
Performing Projects, GAO-07-1211T (Washington, D.C.: Sept. 20, 2007); Information
Technology: Improvements Needed to More Accurately Identify and Better Oversee Risky
Projects Totaling Billions of Dollars, GAO-06-1099T (Washington, D.C.: Sept. 7, 2006); and
Information Technology: Agencies and OMB Should Strengthen Processes for Identifying
and Overseeing High Risk Projects, GAO-06-647 (Washington, D.C.: June 15, 2006).
5GAO, Information Technology: OMB Can Make More Effective Use of Its Investment
Reviews, GAO-05-276 (Washington, D.C.: Apr. 15, 2005).
6GAO-08-951 and GAO, Air Traffic Control: FAA Uses Earned Value Techniques to Help
Manage Information Technology Acquisitions, but Needs to Clarify Policy and
Strengthen Oversight, GAO-08-756 (Washington, D.C.: July 18, 2008).
EVM Provides Insight on Program Cost and Schedule
Given the size and significance of the government’s investment in IT, it is
important that projects be managed effectively to ensure that public
resources are wisely invested. Effectively managing projects entails,
among other things, pulling together essential cost, schedule, and
technical information in a meaningful, coherent fashion so that managers
have an accurate view of the program’s development status. Without
meaningful and coherent cost and schedule information, program
managers can have a distorted view of a program’s status and risks. To
address this issue, in the 1960s, the Department of Defense (DOD)
developed the EVM technique, which goes beyond simply comparing
budgeted costs with actual costs. This technique measures the value of
work accomplished in a given period and compares it with the planned
value of work scheduled for that period and with the actual cost of work
accomplished.
Differences in these values are measured in both cost and schedule
variances. Cost variances compare the value of the completed work (i.e.,
the earned value) with the actual cost of the work performed. For
example, if a contractor completed $5 million worth of work and the work
actually cost $6.7 million, there would be a negative $1.7 million cost
variance. Schedule variances are also measured in dollars, but they
compare the earned value of the completed work with the value of the
work that was expected to be completed. For example, if a contractor
completed $5 million worth of work at the end of the month but was
budgeted to complete $10 million worth of work, there would be a
negative $5 million schedule variance. Positive variances indicate that
activities are costing less or are completed ahead of schedule. Negative
variances indicate activities are costing more or are falling behind
schedule. These cost and schedule variances can then be used in
estimating the cost and time needed to complete the program.
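Expressed as formulas, these quantities are simple arithmetic on the planned value (PV), earned value (EV), and actual cost (AC) tracked by an EVM system. The following Python sketch is illustrative only; it reuses the report's example figures, and the function itself and the assumed $20 million budget at completion are hypothetical, not part of any GAO or agency tool.

```python
# Illustrative EVM arithmetic (hypothetical helper, not a GAO or agency tool).
# PV = planned value of scheduled work, EV = earned value of completed work,
# AC = actual cost of work performed, BAC = budget at completion.

def evm_metrics(pv, ev, ac, bac):
    cv = ev - ac                   # cost variance (negative = over cost)
    sv = ev - pv                   # schedule variance (negative = behind schedule)
    cpi = ev / ac if ac else None  # cost performance index
    spi = ev / pv if pv else None  # schedule performance index
    # One common estimate at completion: remaining work at current cost efficiency.
    eac = ac + (bac - ev) / cpi if cpi else None
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac}

# Report's example: $5.0M of work earned, $6.7M actually spent, $10.0M planned;
# the $20.0M budget at completion is assumed purely for illustration.
print(evm_metrics(pv=10.0, ev=5.0, ac=6.7, bac=20.0))
# CV = -1.7 (over cost), SV = -5.0 (behind schedule), CPI ≈ 0.75, SPI = 0.5
```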
Without knowing the planned cost of completed work and work in
progress (i.e., the earned value), it is difficult to determine a program’s
true status. Earned value supplies this key information, which provides
an objective view of program status and is necessary for understanding the
health of a program. As a result, EVM can alert program managers to
potential problems sooner than using expenditures alone, thereby
reducing the chance and magnitude of cost overruns and schedule
slippages. Moreover, EVM directly supports the institutionalization of key
processes for acquiring and developing systems and the ability to
effectively manage investments—areas that are often found to be
inadequate on the basis of our assessments of major IT investments.
Federal Guidance Calls for Using EVM to Improve IT Management

In August 2005, OMB issued guidance outlining steps that agencies must
take for all major and high-risk development projects to better ensure
improved execution and performance and to promote more effective
oversight through the implementation of EVM.7 Specifically, this guidance
directs agencies to (1) develop comprehensive policies to ensure that their
major IT investments are using EVM to plan and manage development;
(2) include a provision and clause in major acquisition contracts or agency
in-house project charters directing the use of an EVM system that is
compliant with the American National Standards Institute (ANSI)
standard;8 (3) provide documentation demonstrating that the contractor’s
or agency’s in-house EVM system complies with the national standard;
(4) conduct periodic surveillance reviews; and (5) conduct integrated
baseline reviews9 on individual programs to finalize their cost, schedule,
and performance goals.
Building on OMB’s requirements, in March 2009, we issued a guide on best
practices for estimating and managing program costs.10 This guide
highlights the policies and practices adopted by leading organizations to
implement an effective EVM program. Specifically, in the guide, we
identify the need for organizational policies that establish clear criteria for
which programs are required to use EVM, specify compliance with the
ANSI standard, require a standard product-oriented structure for defining
work products, require integrated baseline reviews, provide for specialized
training, establish criteria and conditions for rebaselining programs, and
require an ongoing surveillance function. In addition, we identify key
practices that individual programs can use to ensure that they establish a
sound EVM system, that the earned value data are reliable, and that the
data are used to support decision making.
7OMB Memorandum, M-05-23 (Aug. 4, 2005).
8Recognizing the importance of ensuring quality earned value data, ANSI and the Electronic
Industries Alliance (EIA) jointly established a national standard for EVM systems in May
1998 (ANSI/EIA-748-A-1998). This standard, commonly called the ANSI standard, is
comprised of guidelines to instruct programs on how to establish a sound EVM system.
This document was updated in July 2007 and is referred to as ANSI/EIA-748-B.
9An integrated baseline review is an evaluation of a program’s baseline plan to determine
whether all program requirements have been addressed, risks have been identified,
mitigation plans are in place, and available and planned resources are sufficient to
complete the work.
10GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and
Managing Capital Program Costs, GAO-09-3SP (Washington, D.C.: March 2009).
Prior Reviews on Agency Use of EVM to Acquire and Manage IT Systems Have Identified Weaknesses

We have previously reported on the weaknesses associated with the
implementation of sound EVM programs at various agencies, as well as on
the lack of aggressive management action to correct poor cost and
schedule performance trends based on earned value data for major system
acquisition programs:
• In July 2008, we reported that the Federal Aviation Administration’s EVM
policy was not fully consistent with best practices.11 For example, the
agency required its program managers to obtain EVM training, but did not
enforce completion of this training or require other relevant personnel to
obtain this training. In addition, although the agency was using EVM to
manage IT acquisitions, not all programs were ensuring that their earned
value data were reliable. Specifically, of the three programs collecting
EVM data, only one program adequately ensured that its earned value data
were reliable. As a result, the agency faced an increased risk that
managers were not getting the information they needed to effectively
manage the programs. In response to our findings and recommendations,
the Federal Aviation Administration reported that it had initiatives under
way to improve its EVM oversight processes.
• In September 2008, we reported that the Department of the Treasury’s
EVM policy was not fully consistent with best practices.12 For example,
while the department’s policy addressed some practices, such as
establishing clear criteria for which programs are to use EVM, it did not
address others, such as requiring and enforcing EVM training. In addition,
six programs at Treasury and its bureaus were not consistently
implementing practices needed for establishing a comprehensive EVM
system. For example, when executing work plans and recording actual
costs, a key practice for ensuring that the data resulting from the EVM
system are reliable, only two of the six investments that we reviewed
incorporated government costs with contractor costs. As a result, we
reported that Treasury may not be able to effectively manage its critical
programs. In response to our findings and recommendations, Treasury
reported that it would release a revised EVM policy and further noted that
initiatives to improve EVM-related training were under way.
• In a series of reports and testimonies from September 2004 to June 2009, we
reported that the National Oceanic and Atmospheric Administration’s
National Polar-orbiting Operational Environmental Satellite System program
11GAO-08-756.
12GAO-08-951.
was likely to overrun its contract at completion on the basis of our analysis of
contractor EVM data.13 Specifically, the program had delayed key milestones
and experienced technical issues in the development of key sensors, which
we stated would affect cost and schedule estimates. As predicted, in June
2006 the program was restructured, decreasing its complexity, delaying the
availability of the first satellite by 3 to 5 years, and increasing its
cost estimate
from $6.9 billion to $12.5 billion. However, the program has continued to face
significant technical and management issues. As of June 2009, launch of the
first satellite was delayed by 14 months, and our current projected total cost
estimate is approximately $15 billion. We made multiple recommendations to
improve this program, including establishing a realistic time frame for
revising the cost and schedule baselines, developing plans to mitigate the risk
of gaps in satellite continuity, and tracking the program executive
committee’s action items from inception to closure.
Agencies' EVM Policies Are Not Comprehensive

While the eight agencies we reviewed have established policies requiring
the use of EVM on their major IT investments, none of these policies are
fully consistent with best practices, such as standardizing the way work
products are defined. We recently reported14 that leading organizations
establish EVM policies that
Agencies’ EVM
Policies Are Not
Comprehensive
• establish clear criteria for which programs are to use EVM;
• require programs to comply with the ANSI standard;
• require programs to use a product-oriented structure for defining work
products;
• require programs to conduct detailed reviews of expected costs,
schedules, and deliverables (called an integrated baseline review);
13GAO, Polar-Orbiting Environmental Satellites: With Costs Increasing and Data
Continuity at Risk, Improvements Needed in Tri-agency Decision Making, GAO-09-564
(Washington, D.C.: June 17, 2009); Polar-Orbiting Operational Environmental Satellites:
Restructuring Is Under Way, but Technical Challenges and Risks Remain, GAO-07-498
(Washington, D.C.: Apr. 27, 2007); Polar-Orbiting Operational Environmental Satellites:
Cost Increases Trigger Review and Place Program’s Direction on Hold, GAO-06-573T
(Washington, D.C.: Mar. 30, 2006); Polar-Orbiting Operational Environmental Satellites:
Technical Problems, Cost Increases, and Schedule Delays Trigger Need for Difficult
Trade-off Decisions, GAO-06-249T (Washington, D.C.: Nov. 16, 2005); and Polar-Orbiting
Environmental Satellites: Information on Program Cost and Schedule Changes,
GAO-04-1054 (Washington, D.C.: Sept. 30, 2004).
14GAO-09-3SP.
• require and enforce EVM training;
• define when programs may revise cost and schedule baselines (called
rebaselining); and
• require system surveillance—that is, routine validation checks to ensure
that major acquisitions are continuing to comply with agency policies and
standards.
Table 1 describes the key components of an effective EVM policy.
Table 1: Key Components of an Effective EVM Policy

Clear criteria for implementing EVM on all major IT investments: OMB requires agencies to implement EVM on all major IT investments and ensure that the corresponding contracts include provisions for using EVM systems. However, each agency is responsible for establishing its own definition of a "major" IT investment. As a result, agencies should clearly define the conditions under which a new or ongoing acquisition program is required to implement EVM.

Compliance with the ANSI standard: OMB requires agencies to use EVM systems that are compliant with a national standard developed by ANSI and EIA (ANSI/EIA-748-B). This standard consists of 32 guidelines that an organization can use to establish a sound EVM system, ensure that the data resulting from the EVM system are reliable, and use earned value data for decision-making purposes.

Standard structure for defining the work products: The work breakdown structure defines the work necessary to accomplish a program's objectives. It is the first criterion stated in the ANSI standard and the basis for planning the program baseline and assigning responsibility for the work. It is a best practice to establish a product-oriented work breakdown structure because it allows a program to track cost and schedule by defined deliverables, such as a hardware or software component. This allows a program manager to more precisely identify which components are causing cost or schedule overruns and to more effectively mitigate the root cause of the overruns. Standardizing the work breakdown structure is also considered a best practice because it enables an organization to collect and share data among programs.

Integrated baseline review: An integrated baseline review is an evaluation of the performance measurement baseline—the foundation for an EVM system—to determine whether all program requirements have been addressed, risks have been identified, mitigation plans are in place, and available and planned resources are sufficient to complete the work. The main goal of an integrated baseline review is to identify potential program risks, including risks associated with costs, management processes, resources, schedules, and technical issues.

Training requirements: EVM training should be provided and enforced for all personnel with investment oversight and program management responsibilities. Executive personnel with oversight responsibilities need to understand EVM terms and analysis products to make sound investment decisions. Program managers and staff need to be able to interpret and validate earned value data to effectively manage deliverables, costs, and schedules.

Rebaselining criteria: At times, management may conclude that the remaining budget and schedule targets for completing a program (including the contract) are significantly insufficient, and that the current baseline is no longer valid for realistic performance measurement. Management may decide that a revised baseline for the program is needed to restore its control of the remaining work effort. An agency's rebaselining criteria should define acceptable reasons for rebaselining and require programs to (1) explain why the current plan is no longer feasible and what measures will be implemented to prevent recurrence and (2) develop a realistic cost and schedule estimate for remaining work that has been validated and spread over time to the new plan.

System surveillance: Surveillance is the process of reviewing a program's (including contractors') EVM system as it is applied to one or more programs. The purpose of surveillance is to focus on how well a program is using its EVM system to manage cost, schedule, and technical performances. The following two goals are associated with EVM system surveillance: (1) ensure that the program is following corporate processes and procedures and (2) confirm that the program's processes and procedures continue to satisfy ANSI guidelines.

Source: GAO-09-3SP.
The eight agencies we reviewed do not have comprehensive EVM policies.
Specifically, none of the agencies’ policies are fully consistent with all
seven key components of an effective EVM policy. Table 2 provides a
detailed assessment, by agency, and a discussion of the agencies’ policies
follows the table.
Table 2: Assessment of Key Agencies’ EVM Policies
Agency
Clear criteria
for
implementing
EVM on all
major IT
investments
Compliance
with the
ANSI
standard
Standard
structure for
defining the
work
products
Integrated
baseline
review
Training
requirements
Rebaselining
criteria
System
surveillance
Agriculture ● ● ◌ ● ◐ ◐
●
Commerce ● ● ◌ ● ● ● ●
Defense ● ● ● ● ◐ ● ●
Homeland Security
● ● ◐ ● ◐ ◐ ●
Justice ● ● ◐ ● ◐ ● ●
National
Aeronautics and
Space
Administration
● ● ◐ ● ◐ ◐ ●
Transportation ● ◐ ◌ ● ◐ ◐ ●
Veterans Affairs ◐ ● ◌ ● ◐ ◐ ●
Key
●=The agency addressed all EVM practices in this policy area.
◐=The agency addressed some EVM practices in this policy area.
◌=The agency did not address any EVM practices in this policy area.
Source: GAO analysis of agency data.
• Criteria for implementing EVM on all major IT investments: Seven of
the eight agencies fully defined criteria for implementing EVM on major IT
investments. The agencies with sound policies typically defined “major”
investments as those exceeding a certain cost threshold, and, in some
cases, agencies defined lower tiers of investments requiring reduced levels
of EVM compliance. Veterans Affairs only partially met this key practice
because its policy did not clearly state whether programs or major
subcomponents of programs (projects and subprojects) had to comply
with EVM requirements. According to agency officials, this lack of clarity
may cause EVM to be inconsistently applied across the investments.
Without an established policy that clearly defines the conditions under
which new or ongoing acquisition programs are required to implement
EVM, these agencies cannot ensure that EVM is being appropriately
applied on their major investments.
• Compliance with the ANSI standard: Seven of the eight agencies required
that all work activities performed on major investments be managed by an
EVM system that complies with industry standards. One agency,
Transportation, partially met this key practice because its policy contained
inconsistent criteria for when investments must comply with standards.
Specifically, in one section, the policy requires a certain class of
investments to adhere to a subset of the ANSI standard; however, in
another section, the policy merely states that the investments must comply
with general EVM principles. This latter section is vague and could be
interpreted in multiple ways, either more broadly or narrowly than the
specified subset of the ANSI standard. Without consistent criteria on
investment compliance, Transportation may be unable to ensure that the
work activities for some of its major investments are establishing sound
EVM systems that produce reliable earned value data and provide the
basis for informed decision making.
• Standard structure for defining the work products: DOD was the only
agency to fully meet this key practice by developing and requiring the use
of standard product-oriented work breakdown structures. Four agencies
did not meet this key practice, while the other three only partially
complied. Of those agencies that partially complied, National Aeronautics
and Space Administration (NASA) policy requires mission (or space flight)
projects to use a standardized product-oriented work breakdown
structure; however, IT projects do not have such a requirement. NASA
officials reported that they are working to develop a standard structure for
their IT projects; however, they were unable to provide a time frame for
completion. Homeland Security and Justice have yet to standardize their
product structures.
Among the agencies that did not implement this key practice, reasons
included, among other things, the difficulty in establishing a standard
structure for component agencies that conduct different types of work
with varying complexity. While this presents a challenge, agencies could
adopt an approach similar to DOD’s and develop various standard work
structures based on the kinds of work being performed by the various
component agencies (e.g., automated information system, IT
infrastructure, and IT services). Without fully implementing a standard
product-oriented structure (or structures), agencies will be unable to
collect and share data among programs and may not have the information
they need to make decisions on specific program components.
• Integrated baseline review: All eight agencies required major IT
investments to conduct an integrated baseline review to ensure that
program baselines fully reflect the scope of work to be performed, key
risks, and available resources. For example, DOD required that these
reviews occur within 6 months of contract award and after major
modifications have taken place, among other things.
• Training requirements: Commerce was the only agency to fully meet this
key practice by requiring and enforcing EVM training for all personnel
with investment oversight and program management responsibilities.
Several of the partially compliant agencies required EVM training for
project managers—but did not extend this requirement to other program
management personnel or executives with investment oversight
responsibilities. Many agencies told us that it would be a significant
challenge to require and enforce EVM training for all relevant personnel,
especially at the executive level. Instead, most agencies have made
voluntary EVM training courses available agencywide. However, without
comprehensive EVM training requirements and enforcement, agencies
cannot effectively ensure that programs have the appropriate skills to
validate and interpret EVM data, and that their executives will be able to
make fully informed decisions based on the EVM analysis.
• Rebaselining criteria: Three of the eight agencies fully met this key
practice. For example, the Justice policy outlines acceptable reasons for
rebaselining, such as when the baseline no longer reflects the current
scope of work being performed, and requires investments to explain why
their current plans are no longer feasible and to develop realistic cost and
schedule estimates for remaining work. Among the five partially compliant
agencies, Agriculture and Veterans Affairs provided policies, but in draft
form; NASA was in the process of updating its policy to include more
detailed criteria for rebaselining; and Homeland Security did not define
acceptable reasons but did require an explanation of the root causes for
cost and schedule variances and the development of new cost and
schedule estimates. In several cases, agencies were unaware of the
detailed rebaselining criteria to be included in their EVM policies. Until
their policies fully meet this key practice, agencies face an increased risk
that their executive managers will make decisions about programs with
incomplete information, and that these programs will continue to overrun
costs and schedules because their underlying problems have not been
identified or addressed.
• System surveillance: All eight agencies required ongoing EVM system
surveillance of all programs (and contracts with EVM requirements) to
ensure their continued compliance with industry standards. For example,
Agriculture required its surveillance teams to submit reports—to the
programs and the Chief Information Officer—with documented findings
and recommendations regarding compliance. Furthermore, the agency
also established a schedule to show when EVM surveillance is expected to
take place on each of its programs.
Agencies' Key Acquisition Programs Are Using EVM, but Are Not Consistently Implementing Key Practices

Our studies of 16 major system acquisition programs showed that all
agencies are using EVM; however, the extent of that implementation varies
among the programs. Our work on best practices in EVM identified 11 key
practices that are implemented on acquisition programs of leading
organizations. These practices can be organized into three management
areas: establishing a sound EVM system, ensuring reliable data, and using
earned value data to make decisions. Table 3 lists these 11 key EVM
practices by management area.
Agencies’ Key
Acquisition Programs
Are Using EVM, but
Are Not Consistently
Implementing Key
Practices
Table 3: Eleven Key EVM Practices for System Acquisition Programs

Establish a comprehensive EVM system:
• Define the scope of effort using a work breakdown structure
• Identify who in the organization will perform the work
• Schedule the work
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve
• Determine objective measure of earned value
• Develop the performance measurement baseline

Ensure that the data resulting from the EVM system are reliable:
• Execute the work plan and record all costs
• Analyze EVM performance data and record variances from the performance measurement baseline plan
• Forecast estimates at completion

Ensure that the program management team is using earned value data for decision-making purposes:
• Take management action to mitigate risks
• Update the performance measurement baseline as changes occur

Source: GAO-09-3SP.
Of the 16 case study programs, 3 demonstrated a full level of maturity in
all three management areas; 3 had full maturity in two areas; and 4 had
reached full maturity in one area. The remaining 6 programs did not
demonstrate full levels of maturity in any of the management areas;
however, in all but 1 case, they were able to demonstrate partial
capabilities in each of the three areas. Table 4 identifies the 16 case study
programs and summarizes our results for these programs. Following the
table is a summary of the programs’ implementation of each key area of
EVM program management responsibility. Additional details on the 16
case studies are provided in appendix II.
Table 4: Assessment of EVM Practices for Case Study Programs

Program management areas: (1) Establishing a comprehensive EVM system; (2) Ensuring that data resulting from the EVM system are reliable; (3) Ensuring that the program management team is using earned value data for decision-making purposes.

Agency | Program | (1) | (2) | (3)
Agriculture | Farm Program Modernization | ◐ | ● | ●
Commerce | Decennial Response Integration System | ● | ● | ●
Commerce | Field Data Collection Automation | ◐ | ◐ | ◐
Defense | Air and Space Operations Center—Weapon System | ◐ | ◐ | ●
Defense | Joint Tactical Radio System—Handheld, Manpack, Small Form Fit | ◐ | ● | ●
Defense | Warfighter Information Network—Tactical | ◐ | ● | ◐
Homeland Security | Automated Commercial Environment | ◐ | ◐ | ●
Homeland Security | Integrated Deepwater System—Common Operational Picture | ◐ | ◐ | ◐
Homeland Security | Western Hemisphere Travel Initiative | ◐ | ◐ | ◐
Justice | Next Generation Identification | ● | ● | ●
National Aeronautics and Space Administration | James Webb Space Telescope | ◐ | ◐ | ◐
National Aeronautics and Space Administration | Juno | ◐ | ● | ●
National Aeronautics and Space Administration | Mars Science Laboratory | ◐ | ◐ | ◐
Transportation | En Route Automation Modernization | ◐ | ◐ | ●
Transportation | Surveillance and Broadcast System | ● | ● | ●
Veterans Affairs | Veterans Health Information Systems and Technology Architecture—Foundations Modernization | ◐ | ◐ | ◌

Key:
● = The program fully implemented all EVM practices in this program management area.
◐ = The program partially implemented the EVM practices in this program management area.
◌ = The program did not implement the EVM practices in this program management area.
Source: GAO analysis of program data.
Most Programs Did Not Fully Establish Comprehensive EVM Systems
Most programs did not fully implement the key practices needed to
establish comprehensive EVM systems. Of the 16 programs, 3 fully
implemented the practices in this program management area, and 13
partially implemented the practices. The Decennial Response Integration
System, Next Generation Identification, and Surveillance and Broadcast
System programs demonstrated that they had fully implemented the six
practices in this area. For example, our analysis of the Decennial
Response Integration System program schedule showed that activities
were properly sequenced, realistic durations were established, and labor
and material resources were assigned. The Surveillance and Broadcast
System program conducted a detailed integrated baseline review to
validate its performance baseline. It was also the only program to fully
institutionalize EVM at the program level—meaning that it collects
performance data on the contractor and government work efforts—in
order to get a complete view into program status.
Thirteen programs demonstrated that they partially implemented the six
key practices in this area. In most cases, programs had work breakdown
structures that defined work products to an appropriate level of detail and
had identified the personnel responsible for delivering these work
products. However, for all 13 programs, the project schedules contained
issues that undermined the quality of their performance baselines.
Weaknesses in these schedules included the improper sequencing of
activities, such as incomplete or missing linkages between tasks; a lack of
resources assigned to all activities; invalid critical paths (the sequence of
activities that, if delayed, will impact the planned completion date of the
project); and the excessive or unjustified use of constraints, which impairs
the program’s ability to forecast the impact of ongoing delays on future
planned work activities. These weaknesses are of concern because the
schedule serves as the performance baseline against which earned value is
measured. As such, poor schedules undermine the overall quality of a
program’s EVM system. Other key weaknesses included the following
examples:
• Nine programs did not adequately determine an objective measure of
earned value and develop the performance baseline—that is, key practices
most appropriately addressed through a comprehensive integrated
baseline review, which none of them fully performed. For example, the Air
and Space Operations Center—Weapon System program conducted an
integrated baseline review in May 2007 to validate one segment of work
contained in the baseline; however, the program had not conducted
subsequent reviews for the remaining work because doing so would
preclude staff from completing their normal work activities. Other reasons
cited by the programs for not performing these reviews included the lack
of a fully defined scope of work or management’s decision to use ongoing
EVM surveillance to satisfy these practices. Without having performed a
comprehensive integrated baseline review, programs have not sufficiently
evaluated the validity of their baseline plan to determine whether all
significant risks contained in the plan have been identified and mitigated,
and that the metrics used to measure the progress made on planned work
elements are appropriate.
• Four programs did not define the scope of effort using a work breakdown
structure. For example, the Veterans Health Information Systems and
Technology Architecture—Foundations Modernization program provided
a list of its subprograms; however, it did not define the scope of the
detailed work elements that comprise each subprogram. Without a work
breakdown structure, programs lack a basis for planning the performance
baseline and assigning responsibility for that work, both of which are
necessary to accomplish a program’s objectives.
Many Programs Did Not Fully Implement Practices to Ensure Data Reliability
Many programs did not fully ensure that their EVM data were reliable. Of
the 16 programs, 7 fully implemented the practices for ensuring the
reliability of the prime contractor and government performance data, and
9 partially implemented the practices. All 7 programs that demonstrated
full implementation conduct monthly reviews of earned value data with
technical engineering staff and other key personnel to ensure that the data
are consistent with actual performance; perform detailed performance
trend analyses to track program progress, cost, and schedule drivers; and
make estimates of cost at completion. Four programs that we had
previously identified as having schedule weaknesses (Farm Program
Modernization; Joint Tactical Radio System—Handheld, Manpack, Small
Form Fit; Juno; and Warfighter Information Network—Tactical) were
aware of these issues and had sufficient controls in place to mitigate them
in order to ensure that the earned value data are reliable.
Nine programs partially implemented the three practices for ensuring that
earned value data are reliable. In all cases, the program had processes in
place to review earned value data (from monthly contractor EVM reports
in all but one case), identify and record cost and schedule variances, and
forecast estimates at completion. However, 5 of these programs did not
adequately analyze EVM performance data and properly record variances
from the performance baseline. For example, 2 programs did not
adequately document justifications for cost and schedule variances,
including root causes, potential impacts, and corrective actions. Other
weaknesses in this area include anomalies in monthly performance
reports, such as negative dollars being spent for work performed, which
impacts the validity of performance data. In addition, 7 of these programs
did not demonstrate that they could adequately execute the work plan and
record costs because, among other things, they were unaware of the
schedule weaknesses we identified and did not have sufficient internal
controls in place to deal with these issues to improve the reliability of the
earned value data. Lastly, 2 of these programs could not adequately
forecast estimates at completion due, in part, to anomalies in the prime
contractor’s EVM reports, in combination with the weaknesses contained
in the project schedule.
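Some of the data-reliability problems described above, such as negative dollars reported for work performed or significant variances with no documented root cause, lend themselves to simple automated checks on monthly performance reports. The following sketch is a hypothetical illustration of such checks; the record fields and the variance threshold are assumptions, not the structure of any actual contractor EVM report.

```python
# Hypothetical monthly EVM report validation (field names are assumptions).
# Flags anomalies the report cites, such as negative actuals for work
# performed, and significant variances that lack a documented explanation.

VARIANCE_THRESHOLD = 0.10  # flag variances larger than 10% of planned value (assumed)

def validate_monthly_report(rows):
    """rows: list of dicts with 'element', 'planned', 'earned', 'actual',
    and an optional 'variance_explanation' string."""
    issues = []
    for r in rows:
        if r["actual"] < 0 or r["earned"] < 0:
            issues.append(f"{r['element']}: negative dollars reported")
        cv = r["earned"] - r["actual"]
        sv = r["earned"] - r["planned"]
        significant = r["planned"] and max(abs(cv), abs(sv)) > VARIANCE_THRESHOLD * r["planned"]
        if significant and not r.get("variance_explanation"):
            issues.append(f"{r['element']}: significant variance with no documented root cause")
    return issues

report = [
    {"element": "Software build 2", "planned": 400.0, "earned": 300.0, "actual": 420.0},
    {"element": "Test environment", "planned": 150.0, "earned": 150.0, "actual": -20.0,
     "variance_explanation": "accounting correction"},
]
for issue in validate_monthly_report(report):
    print(issue)
```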
Most Programs Used Earned Value Data for Decision-making Purposes
Programs were uneven in their use of earned value data to make decisions.
Of the 16 programs, 9 fully implemented the practices for using earned
value data for decision making, 6 partially implemented them, and 1 did
not implement them. Among the 9 programs with full implementation, both the
Automated Commercial Environment and Juno programs integrated their
EVM and risk management processes to support the program manager in
making better decisions. The Automated Commercial Environment
program actively recorded risks associated with major variances from the
EVM reports in the program’s risk register. Juno further used the earned
value data to analyze threats against remaining management reserve and
to estimate the cost impact of these threats.
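As a rough illustration of the kind of EVM and risk-management integration described above, the sketch below compares the expected cost impact of open risks against remaining management reserve. It is a hypothetical simplification, not the Juno or Automated Commercial Environment programs' actual analysis, and the risk data shown are invented for illustration.

```python
# Hypothetical check of open risks against remaining management reserve,
# illustrating EVM/risk-management integration (not any program's actual tool).

def reserve_exposure(management_reserve, risks):
    """risks: list of dicts with 'name', 'probability' (0-1), and 'cost_impact'."""
    expected_exposure = sum(r["probability"] * r["cost_impact"] for r in risks)
    return {
        "expected_exposure": expected_exposure,
        "remaining_reserve": management_reserve,
        "reserve_shortfall": max(0.0, expected_exposure - management_reserve),
    }

risks = [
    {"name": "Sensor integration slip", "probability": 0.5, "cost_impact": 8.0},
    {"name": "Late subcontractor delivery", "probability": 0.3, "cost_impact": 5.0},
]
print(reserve_exposure(management_reserve=4.0, risks=risks))
# expected_exposure = 5.5, reserve_shortfall = 1.5 (in the same cost units)
```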
Six programs demonstrated limited capabilities in using earned value data
for making decisions. In most cases, these programs included earned value
performance trend data in monthly program management review briefings.
However, for most of these programs, the processes for taking management
action to address the cost and schedule drivers causing poor trends were ad
hoc and separate from the programs' risk management processes—and, in
most cases, the risks and issues found in the EVM reports did not
correspond to the risks contained in the program risk registers. In
addition, 4 of these programs were not able to adequately update the
performance baseline as changes occurred because, in many cases, the
original baseline was not appropriately validated. For example, the Mars
Science Laboratory program just recently updated its performance
baseline as part of a recent replan effort. However, without validating the
original and current baselines with a project-level integrated baseline
review, it is unclear whether the changes to the baseline were reasonable,
and whether the risks assumed in the baseline have been identified and
appropriately mitigated.
One program (Veterans Health Information Systems and Technology
Architecture—Foundations Modernization) was not using earned value
data for decision making. Specifically, the program did not actively
manage earned value performance trends, nor were these data
incorporated into programwide management reviews.
Inconsistent Implementation Is Due in Part to Weaknesses in Policy and Lack of Enforcement
The inconsistent application of EVM across the investments exists in part
because of the weaknesses we previously identified in the eight agencies’
policies, as well as a lack of enforcement of the EVM policy components
already in place. For example, deficiencies in all three management areas
can be attributed, in part, to a lack of comprehensive EVM training
requirements—which was a policy component that most agencies did not
fully address. The only 3 programs that had fully implemented all key EVM
practices either had comprehensive training requirements in their agency
EVM policy or enforced rigorous training requirements beyond that for
which the policy called. Most of the remaining programs met the minimum
requirements of their agencies’ policies. However, all programs that had
attained full maturity in two management areas had also implemented
more stringent training requirements, although none could match the
efforts made on those 3 programs. Without making this training a
comprehensive requirement, these agencies are at risk that their major
system acquisition programs will continue to have management and
technical staff who lack the skills to fully implement key EVM practices.
Our case study analysis also highlighted multiple areas in which programs
were not in compliance with their agencies’ established EVM policies. This
is an indication that agencies are not adequately enforcing program
compliance. These policy areas include requiring EVM compliance at the
start of the program, validating the baseline with an integrated baseline
review, and conducting ongoing EVM surveillance.
Until key EVM practices are fully implemented, selected programs face an
increased risk that program managers cannot effectively optimize EVM as
a management tool to mitigate and reverse poor cost and schedule
performance trends.
Earned Value Data Show Trends of Cost Overruns and Schedule Slippages on Most Programs
Earned value data trends of the 16 case study programs indicate that most
are currently experiencing cost overruns and schedule slippages, and,
based on our analysis, it is likely that when these programs are completed,
the total cost overrun will be about $3 billion. To date, these programs,
collectively, have already overrun their original life-cycle cost estimates by
almost $2 billion (see table 5).
Table 5: Program Life-cycle Cost Estimate Changes
Dollars in millions

Agency | Program | Original life-cycle cost estimate | Current life-cycle cost estimate | Cost overruns in excess of original cost estimate
Agriculture | Farm Program Modernization | $451.0 | $451.0 | $0.0
Commerce | Decennial Response Integration System | 574.0a | 946.0a | 372.0
Commerce | Field Data Collection Automation | 595.7 | 801.1 | 205.4
Defense | Air and Space Operations Center—Weapon System | 4,425.0 | 4,425.0 | 0.0
Defense | Joint Tactical Radio System—Handheld, Manpack, Small Form Fit | 19,214.0 | 11,599.0 | n/ab
Defense | Warfighter Information Network—Tactical | 38,157.1 | 38,157.1 | 0.0
Homeland Security | Automated Commercial Environment | 1,500.0c | 2,241.0c | 741.0
Homeland Security | Integrated Deepwater System—Common Operational Picture | 1,353.0c | 1,353.0c | 0.0
Homeland Security | Western Hemisphere Travel Initiative | 1,228.0 | 1,228.0 | 0.0
Justice | Next Generation Identification | 1,075.9 | 1,075.9 | 0.0
National Aeronautics and Space Administration | James Webb Space Telescope | 4,964.0 | 4,964.0 | 0.0
National Aeronautics and Space Administration | Juno | 1,050.0 | 1,050.0 | 0.0
National Aeronautics and Space Administration | Mars Science Laboratory | 1,634.0 | 2,286.0 | 652.0
Transportation | En Route Automation Modernization | 3,649.4 | 3,649.4 | 0.0
Transportation | Surveillance and Broadcast System | 4,313.0 | 4,328.9 | 15.9
Veterans Affairs | Veterans Health Information Systems and Technology Architecture—Foundations Modernization | 1,897.4 | 1,897.4 | 0.0
Total | | | | $1,986.3
Source: GAO analysis of program and contractor data.
aWe removed $37 million from the original estimate, which represented costs associated with the
closeout of the program. We did this because the current estimate does not include costs for these
activities. An estimate for these activities is currently being revised. In addition, the cost increase
associated with the current estimate is due, in part, to an agency-directed expansion of program
scope (related to the system’s ability to process a higher volume of paper forms) in April 2008.
bIt is not appropriate to compare the original and current life-cycle cost estimates for this program
because the scope has significantly changed since inception (such as newly imposed security
requirements). In addition, due to a change in the agency’s migration strategy for replacing legacy
radios with new tactical radios, the planned quantity of radios procured was decreased from 328,514
to 95,551. As a result, the life-cycle cost estimate was reduced and no longer represents the original
scope of the program.
cThe original and current life-cycle costs do not include operations and maintenance costs.
Taking the current earned value performance15 into account, our analysis
of the 16 case study programs indicated that most are experiencing
shortfalls against their currently planned cost and schedule targets.
Specifically, earned value performance data over a 12-month period
showed that the 16 programs combined have exceeded their cost targets
by $275 million. During that period, they also experienced schedule
variances and were unable to accomplish almost $93 million worth of
planned work. In most cases, the negative cost and schedule performance
trends were attributed to ongoing technical issues in the development or
testing of system components.
15In 13 cases, programs limited the use of EVM to system development work on contract.
As such, earned value data will reflect contractor performance only. In the 3 other cases
(the Farm Program Modernization, Surveillance and Broadcast System, and Veterans Health
Information Systems and Technology Architecture—Foundations Modernization programs),
the use of EVM was expanded to the entire program; therefore, the earned value data will
reflect total program performance (contractor and government).
Furthermore, our projections of future estimated costs at completion
based on our analysis of current contractor performance trends indicate
that these programs will most likely continue to experience cost overruns
to completion, totaling almost $1 billion. In contrast, the programs’
contractors estimate the cost overruns at completion will be
approximately $469.7 million. These estimates are based on the
contractors’ assumption that their efficiency in completing the remaining
work will significantly improve over what has been done to date.
Furthermore, it should be noted that in 4 cases, the contractor-estimated
overrun is smaller than the cost variances they have already
accumulated—which is an indication that these estimates are aggressively
optimistic.16
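As a minimal illustration of this point, the Python sketch below flags a contract whose estimated overrun at completion is smaller than the cost overrun it has already accumulated; the flagging rule is an interpretation of the sentence above rather than a formula taken from the report, and the example figures are the Automated Commercial Environment values shown in table 6.

```python
# Illustrative sketch of the optimism check described above; not GAO's method.
# A negative cumulative cost variance means money already overspent; a negative
# estimated overrun at completion means the contractor projects an underrun.
def is_aggressively_optimistic(cumulative_cost_variance, estimated_overrun_at_completion):
    """Both values are in millions of dollars."""
    overrun_to_date = max(-cumulative_cost_variance, 0.0)
    return estimated_overrun_at_completion < overrun_to_date

# Automated Commercial Environment (table 6): $18.8 million already overrun,
# yet the contractor projects a $0.5 million underrun at completion.
print(is_aggressively_optimistic(cumulative_cost_variance=-18.8,
                                 estimated_overrun_at_completion=-0.5))  # True
```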
With the inclusion of the overruns already incurred to date, the total
increase in life-cycle costs will be about $3 billion. Our analysis is
presented in table 6. Additional details on the 16 case studies are provided
in appendix II.
16These programs include the Field Data Collection Automation, Automated Commercial
Environment, Juno, and Veterans Health Information Systems and Technology
Architecture—Foundations Modernization.
Table 6: Contractor Cumulative Cost and Schedule Performances
Dollars in millions

Agency | Program | Contractor budget at completion | Percentage complete | Cumulative cost variance | Cumulative schedule variance | Contractor-estimated cost overrun/underrun at completion | GAO most likely cost overrun/underrun at completion
Agriculture | Farm Program Modernizationa,b | $7.0 | 94% | $<0.1 | ($0.2) | $<0.1 | $<0.1
Commerce | Decennial Response Integration System | 468.6 | 50 | 13.6 | 2.3 | 7.0 underrun | 7.0 underrun
Commerce | Field Data Collection Automation | 555.6 | 75 | (3.5) | 0.4 | 2.9 overrun | 4.6 overrun
Defense | Air and Space Operations Center—Weapon System | 171.3 | 86 | (0.1) | 0.4 | 0.8 overrun | 0.8 overrun
Defense | Joint Tactical Radio System—Handheld, Manpack, Small Form Fit | 530.8 | 74 | (62.4) | (8.8) | 70.1 overrun | 89.1 overrun
Defense | Warfighter Information Network—Tactical | 747.0 | 34 | 0.8 | (12.0) | 3.7 underrun | 15.1 overrun
Homeland Security | Automated Commercial Environment | 382.3 | 83 | (18.8) | (13.2) | 0.5 underrun | 24.1 overrun
Homeland Security | Integrated Deepwater System—Common Operational Picture | 130.2 | 99 | (4.2) | 0.0 | 4.2 overrun | 4.2 overrun
Homeland Security | Western Hemisphere Travel Initiativec | 45.3 | 100 | n/a | n/a | n/a | n/a
Justice | Next Generation Identification | 37.5 | 91 | (1.4) | (0.5) | 1.5 overrun | 1.6 overrun
National Aeronautics and Space Administration | James Webb Space Telescope | 1,271.6 | 64 | (224.7) | (9.4) | 448.5 overrund | 448.5 overrun
National Aeronautics and Space Administration | Juno | 369.0 | 32 | (13.2) | (12.3) | 6.4 overrun | 49.8 overrun
National Aeronautics and Space Administration | Mars Science Laboratorye | 1,223.0 | 77 | 2.2 | (6.2) | 4.1 overrun | n/a
Transportation | En Route Automation Modernization | 1,480.2 | 89 | 36.9 | 15.9 | 15.3 underrun | 15.3 underrun
Transportation | Surveillance and Broadcast Systema | 1,007.9 | 27 | 14.7 | (24.0) | 41.6 underrun | 21.7 overrun
Veterans Affairs | Veterans Health Information Systems and Technology Architecture—Foundations Modernizationa | 1,897.4 | 10 | (14.9) | (24.9) | 0.7 underrun | 350.2 overrun
Total | | $10,324.7 | | $275.0 overrun | $92.5 overrun | $469.7 overrun | $987.4 overrun
Source: GAO analysis of program and contractor data.
aEarned value data reflect performance for the full scope of the program.
bThis program is currently in the initiation phase of its life cycle, and the budget at completion reflects
only work planned to be completed in this phase.
cThe program’s contractor completed development work in June 2009.
dProject officials stated that they have adequate contingency reserves built into their life-cycle cost
estimate to cover this estimated overrun and any additional overruns (should performance continue to
degrade) through contract completion.
eEVM reporting was suspended between November 2008 and February 2009 while the project was
being replanned; therefore, we did not have sufficient data to make a reliable independent estimate at
completion.
Eleven programs are expected to incur a cost overrun at contract
completion. In particular, two programs (i.e., the James Webb Space
Telescope and Veterans Health Information Systems and Technology
Architecture—Foundations Modernization programs) will likely
experience a combined overrun of $798.7 million, which accounts for
about 80 percent of our total projection.
With timely and effective action taken by program and executive
management, it is possible to reverse negative performance trends so that
the projected cost overruns at completion may be reduced. To get such
results, management at all levels could be strengthened, including
contractor management, program office management, and executive-level
management. For example, programs could strengthen program office
controls and contractor oversight by obtaining earned value data weekly
(instead of monthly) so that they can make decisions with immediate and
greater impact. Additionally, key risks could be elevated to the program
level and, if necessary, to the executive level to ensure that appropriate
mitigation plans are in place and that they are tracked to closure.
Conclusions
Key agencies have taken a number of important steps to improve the
management of major acquisitions through the implementation of EVM.
Specifically, the agencies have established EVM policies and require their
major system acquisition programs to use EVM. However, none of the
eight agencies that we reviewed have comprehensive EVM policies. Most
of these policies omit or lack sufficient guidance on the type of work
structure needed to effectively use EVM data and on the training
requirements for all relevant personnel. Without comprehensive policies, it
will be difficult for the agencies to gain the full benefits of EVM.
Few of our 16 case study programs had fully implemented EVM
capabilities, raising concerns that programs cannot efficiently produce
reliable estimates of cost at completion. Many of the weaknesses found
on these programs can be traced back to inadequate agency EVM policies
and raise questions concerning the agencies’ enforcement of the policies
already established, including the completion of integrated baseline
reviews and ongoing system surveillance. Until agencies expand and enforce their
EVM policies, it will be difficult for them to optimize the effectiveness of
this management tool, and they will face an increased risk that managers
are not getting the information they need to effectively manage the
programs.
In addition to concerns about their implementation of EVM, the programs’
earned value data show trends toward cost overruns that are likely to
collectively total about $3 billion. Without timely and aggressive
management action, this projected overrun will be realized, resulting in
the expenditure of over $1 billion more than currently planned.
Recommendations for Executive Action
To address the weaknesses identified in agencies’ policies and practices in
using EVM, we are making recommendations to the eight major agencies
included in this review. Specifically, we recommend that the following
three actions be taken by the Secretaries of the Departments of
Agriculture, Commerce, Defense, Homeland Security, Justice,
Transportation, and Veterans Affairs and the Administrator of the National
Aeronautics and Space Administration:
• modify policies governing EVM to ensure that they address the
weaknesses that we identified, taking into consideration the criteria used
in this report;
• direct key system acquisition programs to implement the EVM practices
that address the detailed weaknesses that we identified in appendix II,
taking into consideration the criteria used in this report; and
• direct key system acquisition programs to take action to reverse current
negative performance trends, as shown in the earned value data, to
mitigate the potential cost and schedule overruns.
Agency Comments and Our Evaluation
We provided the selected eight agencies with a draft of our report for
review and comment. The Department of Homeland Security responded
that it had no comments. The remaining seven agencies generally agreed
with our results and recommendations. Agencies also provided technical
comments, which we incorporated in the report as appropriate.
The comments of the agencies are summarized in the following text:
• In e-mail comments on a draft of the report, officials from the U.S.
Department of Agriculture’s Office of the Chief Information Officer stated
that the department has begun to address the weaknesses in its EVM
policy identified in the report.
• In written comments on a draft of the report, the Secretary of Commerce
stated that, regarding the second and third recommendations, the
Department of Commerce was pleased that the Decennial Response
Integration System was found to have fully implemented all 11 key EVM
practices, and that the Field Data Collection Automation program fully
implemented six key practices. The department added that its recent
actions on the Field Data Collection Automation program should move
this program to full compliance with the key EVM practices. Furthermore,
regarding the first recommendation, the Secretary stated that while the
department understands and appreciates the value of standardized work
breakdown structures, it maintained that the development of these work
structures should take place at the department’s operating units (e.g.,
Census Bureau), given the wide diversity of missions and project
complexity among these units. As noted in our report, we agree that
agencies could develop standard work structures based on the kinds of
work being performed by the various component agencies. Therefore, we
support these efforts described by the department because they are
generally consistent with the intent of our recommendation. Commerce’s
comments are printed in appendix III.
• In written comments on a draft of the report, the Department of Defense’s
Director of Defense Procurement and Acquisition Policy stated that the
department concurred with our recommendations. Among other things,
DOD stated that it is essential to maintain the appropriate oversight of
acquisition programs, including the use of EVM data to understand
program status and anticipate potential problems. DOD’s comments are
printed in appendix IV.
• In written comments on a draft of the report, the Department of Justice’s
Assistant Attorney General for Administration stated that, after discussion
with our office, it was agreed that the second recommendation, related to
implementing EVM practices that address identified weaknesses, was
inadvertently directed to the department, and that no response was
necessary. We agreed because the case study program reviewed fully met
all key EVM practices. The department concurred with the two remaining
recommendations related to modifying EVM policies and reversing
negative performance trends. Furthermore, the Assistant Attorney General
noted that Justice had begun to take steps to improve its use of EVM, such
as modifying its policy to require EVM training for all personnel with
investment oversight and program management responsibilities. Justice’s
comments are printed in appendix V.
• In written comments on a draft of the report, the National Aeronautics and
Space Administration’s Deputy Administrator stated that the agency
concurred with two recommendations and partially concurred with one
recommendation. In particular, the Deputy Administrator agreed that
opportunities exist for improving the implementation of EVM, but stated
that NASA classifies the projects included in the scope of the audit as space
flight projects (not as IT-specific projects), which affects the applicability of
the agency’s EVM policies and guidance that were reviewed. We recognize
that different classifications of IT exist; however, consistent with other
programs included in the audit, the selected NASA projects integrate and
rely on various elements of IT. As such, we reviewed both the agency’s
space flight and IT-specific guidance. Furthermore, the agency partially
concurred with one recommendation because it stated that efforts were
either under way or planned that will address the weaknesses we identified.
We support the efforts that NASA described in its comments because they
are generally consistent with the intent of our recommendation. NASA’s
comments are printed in appendix VI.
• In e-mail comments on a draft of the report, the Department of
Transportation’s Director of Audit Relations stated that the department is
taking immediate steps to modify its policies governing EVM, taking into
consideration the criteria used in the draft report.
• In written comments on a draft of the report, the Secretary of Veterans
Affairs stated that the Department of Veterans Affairs generally agreed
with our conclusions and concurred with our recommendations.
Furthermore, the Secretary stated that Veterans Affairs has initiatives
under way to address the weaknesses identified in the report. Veterans
Affairs’ comments are printed in appendix VII.
As agreed with your office, unless you publicly announce the contents of
this report earlier, we plan no further distribution until 30 days from the
report date. At that time, we will send copies of this report to interested
congressional committees; the Secretaries of the Departments of
Agriculture, Commerce, Defense, Homeland Security, Justice,
Transportation, and Veterans Affairs; the Administrator of the National
Aeronautics and Space Administration; and other interested parties. In
addition, the report will be available at no charge on our Web site at
http://www.gao.gov.
If you or your staff have any questions on the matters discussed in this
report, please contact me at (202) 512-9286 or pownerd@gao.gov. Contact
points for our Offices of Congressional Relations and Public Affairs may
be found on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix VIII.
Sincerely yours,
David A. Powner
Director, Information Technology
Appendix I: Objectives, Scope, and Methodology
Our objectives were to (1) assess whether key departments and agencies
have appropriately established earned value management (EVM) policies,
(2) determine whether these agencies are adequately using earned value
techniques to manage key system acquisitions, and (3) evaluate the earned
value data of these selected investments to determine their cost and
schedule performances.
For this governmentwide review, we assessed eight agencies and 16
investments. We initially identified the 10 agencies with the highest
amount of spending for information technology (IT) development,
modernization, and enhancement work as reported in the Office of
Management and Budget’s (OMB) Fiscal Year 2009 Exhibit 53. These
agencies were the Departments of Agriculture, Commerce, Defense,
Health and Human Services, Homeland Security, Justice, Transportation,
the Treasury, and Veterans Affairs and the National Aeronautics and Space
Administration. We excluded Treasury from our selection because we
recently performed an extensive review of EVM at that agency.1 We also
subsequently removed Health and Human Services from our selection
because the agency did not have investments in system acquisition that
met our dollar threshold (as defined in the following text). The resulting
eight agencies also made up about 75 percent of the government’s planned
IT spending for fiscal year 2009.
To ensure that we examined significant investments, we chose from
investments (related to system acquisition) that were expected to receive
development, modernization, and enhancement funding in fiscal year 2009
in excess of $90 million.2 We limited the number of selected investments to
a maximum of 3 per agency. For agencies with more than 3 investments
that met our threshold, we selected the top 3 investments with the highest
planned spending. For agencies with 3 or fewer such investments, we
chose all of the investments meeting our dollar threshold. Lastly, we
excluded investments with related EVM work already under way at GAO.3
1GAO, Information Technology: Treasury Needs to Better Define and Implement Its
Earned Value Management Policy, GAO-08-951 (Washington, D.C.: Sept. 22, 2008).
2There were 30 investments that met this criterion.
3These investments include the Department of Defense’s Navy Enterprise Resource
Planning, and the Department of Homeland Security’s Secure Border Initiative Network and U.S.
Visitor and Immigration Status Indicator Technology.
To assess whether key agencies have appropriately established EVM
policies, we analyzed agency policies and guidance for EVM. Specifically,
we compared these policies and guidance documents with both OMB’s
requirements and key best practices recognized within the federal
government and industry for the implementation of EVM. These best
practices are contained in the GAO cost guide.4 We also interviewed key
agency officials to obtain information on their ongoing and future EVM
plans.
To determine whether these agencies are adequately using earned value
techniques to manage key system acquisitions, we analyzed program
documentation, including project work breakdown structures, project
schedules, integrated baseline review briefings, risk registers, and monthly
management briefings for the 16 selected investments. Specifically, we
compared program documentation with EVM and scheduling best
practices as identified in the cost guide.5 We determined whether the
program implemented, partially implemented, or did not implement each
of the 11 practices. We also interviewed program officials (and observed
key program status review meetings) to obtain clarification on how EVM
practices are implemented and how the data are used for decision-making
purposes.
To evaluate the earned value data of the selected investments to determine
their cost and schedule performances, we analyzed the earned value data
contained in contractor EVM performance reports obtained from the
programs. To perform this analysis, we compared the cost of work
completed with budgeted costs for scheduled work for a 12-month period
to show trends in cost and schedule performances. We also used data from
these reports to estimate the likely costs at completion through
established earned value formulas. This resulted in three different values,
with the middle value being the most likely. To assess the reliability of the
cost data, we compared it with other available supporting documents
(including OMB and agency financial reports); electronically tested the
data to identify obvious problems with completeness or accuracy; and
interviewed agency and program officials about the data. For the purposes
of this report, we determined that the cost data were sufficiently reliable.
We did not test the adequacy of the agency or contractor cost-accounting
systems. Our evaluation of these cost data was based on what we were
told by the agency and the information they could provide.
4GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and
Managing Capital Program Costs, GAO-09-3SP (Washington, D.C.: March 2009).
5GAO-09-3SP.
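The appendix does not list the specific formulas behind the three estimates at completion mentioned above. The sketch below assumes, purely for illustration, the commonly used index-based formulas (budgeted-rate, CPI-adjusted, and CPI-and-SPI-adjusted); the inputs are hypothetical.

```python
# Hedged illustration: the exact formulas GAO applied are not stated in this
# appendix. These three index-based estimates at completion (EAC) are a common
# convention; the inputs below are hypothetical, in millions of dollars.
def eac_candidates(bac, ev, ac, pv):
    """bac = budget at completion, ev = earned value, ac = actual cost, pv = planned value."""
    cpi = ev / ac                      # cost performance index
    spi = ev / pv                      # schedule performance index
    remaining = bac - ev               # budgeted cost of remaining work
    return sorted([
        ac + remaining,                # remaining work performed at budgeted rates
        ac + remaining / cpi,          # remaining work at the current cost efficiency
        ac + remaining / (cpi * spi),  # remaining work degraded by both indices
    ])

low, middle, high = eac_candidates(bac=100.0, ev=40.0, ac=50.0, pv=45.0)
print(f"EAC range: {low:.1f} to {high:.1f}; middle value {middle:.1f} treated as most likely")
```

In this illustration the middle value happens to be the CPI-adjusted estimate; the report states only that the middle of its three values was taken as the most likely.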
We conducted this performance audit from February to October 2009 at
the agencies’ offices in the Washington, D.C., metropolitan area; Fort
Monmouth, New Jersey; Jet Propulsion Lab, Pasadena, California;
Hanscom Air Force Base, Massachusetts; and Naval Base San Diego,
California. Our work was done in accordance with generally accepted
government auditing standards. Those standards require that we plan and
perform the audit to obtain sufficient, appropriate evidence to provide a
reasonable basis for our findings and conclusions based on our audit
objectives. We believe that the evidence obtained provides a reasonable
basis for our findings and conclusions based on our audit objectives.
Appendix II: Case Studies of Selected Programs’ Implementation of Earned Value Management
We conducted case studies of 16 major system acquisition programs (see
table 7). For each of these programs, the remaining sections of this
appendix provide the following: a brief description of the program,
including a graphic illustration of the investment’s life cycle; an
assessment of the program’s implementation of the 11 key EVM practices;
and an analysis of the program’s recent earned value (EV) data and trends.
These data and trends are often described in terms of cost and schedule
variances. Cost variances compare the earned value of the completed
work with the actual cost of the work performed. Schedule variances are
also measured in dollars, but they compare the earned value of the
completed work with the value of the work that was expected to be
completed. Positive variances are good—they indicate that activities are
costing less than expected or are completed ahead of schedule. Negative
variances are bad—they indicate activities are costing more than expected
or are falling behind schedule.
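These variance definitions reduce to two subtractions, shown in the minimal Python sketch below (the dollar figures are hypothetical, not drawn from any case study).

```python
# Illustrative only; the figures are hypothetical, not taken from the case studies.
def variances(planned_value, earned_value, actual_cost):
    """Return (cost variance, schedule variance) in the same dollar units as the inputs."""
    cost_variance = earned_value - actual_cost        # positive: work cost less than its earned value
    schedule_variance = earned_value - planned_value  # positive: work completed ahead of plan
    return cost_variance, schedule_variance

cv, sv = variances(planned_value=12.0, earned_value=10.5, actual_cost=11.2)
print(f"Cost variance: {cv:+.1f} million; schedule variance: {sv:+.1f} million")
# Both values are negative here, indicating a cost overrun and schedule slippage.
```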
Table 7: Sixteen Case Study Programs

Agency | Program
Agriculture | Farm Program Modernization
Commerce | Decennial Response Integration System
Commerce | Field Data Collection Automation
Defense | Air and Space Operations Center—Weapon System
Defense | Joint Tactical Radio System—Handheld, Manpack, Small Form Fit
Defense | Warfighter Information Network—Tactical
Homeland Security | Automated Commercial Environment
Homeland Security | Integrated Deepwater System—Common Operational Picture
Homeland Security | Western Hemisphere Travel Initiative
Justice | Next Generation Identification
National Aeronautics and Space Administration | James Webb Space Telescope
National Aeronautics and Space Administration | Juno
National Aeronautics and Space Administration | Mars Science Laboratory
Transportation | En Route Automation Modernization
Transportation | Surveillance and Broadcast System
Veterans Affairs | Veterans Health Information Systems and Technology Architecture—Foundations Modernization
Source: GAO analysis of program data.
The following information describes the key that we used in tables 8
through 23 to convey the results of our assessment of the 16 case study
programs’ implementation of the 11 EVM practices.
Key description Key symbol
The program fully implemented all EVM
practices in this program management area.
●
The program partially implemented the EVM
practices in this program management area.
◐
The program did not implement the EVM
practices in this program management area.
◌
The Farm Program Modernization (MIDAS) program is intended to
address the long-term needs in delivering farm benefit programs via
business process reengineering and implementation of a commercial off-
the-shelf enterprise resource planning solution. MIDAS is an initiative of
the Farm Service Agency, which is responsible for administering 35 farm
benefit programs. To support these programs, the agency uses two
primary systems—a distributed network of legacy computers and a
centralized Web farm (to store customer data and host Web-based
applications)—both of which have shortcomings. While MIDAS is to
replace these computers, it is also intended to provide new applications
and redesigned business processes. The Web farm is expected to remain in
operation in a supporting role for the program. Currently, MIDAS is in the
initiation phase of its life cycle and plans to award the system integration
contract in the first quarter of fiscal year 2010.
Farm Program
Modernization
Investment Details
Department of Agriculture
(Farm Service Agency)
Program start date: 2004
Total life-cycle cost:
• Current: $451 million
• Original: $451 million
Program end date:
• Current: 2018
• Original: 2017
Rebaselines: 1 (September 2008)
Major contractor: Prime contract to be awarded in the first quarter of FY 2010
[Investment life-cycle phases: Initiation, Development, Operations and maintenance]
Source: GAO analysis of U.S. Department of Agriculture (Farm Service Agency) data.
Table 8: GAO EVM Practice Assessment of Agriculture’s MIDAS Program
Program management area of responsibility Key practice GAO assessment
Establish a comprehensive EVM system Define the scope of effort using a work breakdown structure ◐
Identify who in the organization will perform the work ●
Schedule the work ◐
Estimate the labor and material required to perform the work and
authorize the budgets, including management reserve ◐
Determine objective measure of earned value ◐
Develop the performance measurement baseline ◐
Ensure that the data resulting from the EVM
system are reliable
Execute the work plan and record all costs ●
Analyze EVM performance data and record variances from the
performance measurement baseline plan ●
Forecast estimates at completion ●
Ensure that the program management team is
using earned value data for decision-making
purposes
Take management action to mitigate risks ●
Update the performance measurement baseline as changes
occur ●
Source: GAO analysis of U.S. Department of Agriculture (Farm Service Agency) data.
MIDAS fully met 6 of the 11 key practices for implementing EVM and
partially met 5 practices. Specifically, a key weakness in the EVM system
is the lack of a comprehensive integrated baseline review. Instead, MIDAS
focused solely on evaluating the program’s compliance with industry
standards and chose not to validate the quality of the baseline. Program
officials stated that they plan to conduct a full review to address the risks
and realism of the baseline after the prime contract has been awarded.
Furthermore, while the MIDAS schedule is generally sound, resources
were not assigned to all activities, and the critical path (the longest
duration path through the sequenced list of key activities) could not be
identified because the current schedule ends in September 2009. Finally,
MIDAS met all key practices associated with data reliability, such as
executing the work plan and recording costs, as well as all key practices
for decision making.
EV Performance Details
Based on performance data from June
2008 to May 2009, MIDAS generally met
its planned cost targets. However, at the
same time the program consistently has
had negative schedule variances,
indicating that work is slightly behind
schedule. Reasons for this slippage
include work being accomplished less
efficiently than planned, with some
activities, such as the acquisition of a
project management information system,
being delayed. We concur with the
program’s estimate that it will meet its
current budget at completion—worth
approximately $7.0 million—for program
initiation activities.
Program percent complete: 94%
Estimates at completion:
• Program: $6.9 million
• GAO: $6.9 million
Figure 1: GAO EV Data Analysis of Agriculture’s MIDAS Program
[Line chart showing cumulative cost variance and cumulative schedule variance, in millions of dollars, June 2008 through May 2009.]
Source: GAO analysis of U.S. Department of Agriculture (Farm Service Agency) data.
Decennial Response
Integration System
The Decennial Response Integration System (DRIS) is to be used during
the 2010 Census for collecting and integrating census responses from all
sources, including forms and telephone interviews. The system is to
improve accuracy and timeliness by standardizing the response data and
providing the data to other Census Bureau systems for analysis and
processing. Among other things, DRIS is expected to process census data
provided by respondents via census forms, telephone agents, and
enumerators; assist the public via telephone; and monitor the quality and
status of data capture operations. The DRIS program’s estimated life-cycle
costs have increased by $372 million, which is mostly due to increases in
both paper and telephone workloads. For example, the paper workload
increased due to an April 2008 redesign of the 2010 Census that reverted
planned automated operations to paper-based processes and requires
DRIS to process an additional estimated 40 million paper forms.
Investment Details
Department of Commerce
(Census Bureau)
Program start date: March 2006
Total life-cycle cost:
• Current: $946 million
• Original: $574 million
Program end date:
• Current: September 2013
• Original: September 2013
Rebaselines: 0
Major contractor: Lockheed Martin
[Investment life-cycle phases: Initiation, Development, Operations and maintenance]
Source: GAO analysis of Department of Commerce (Census Bureau) data.
Table 9: GAO EVM Practice Assessment of Commerce’s DRIS Program
Program management area of
responsibility Key practice GAO assessment
Establish a comprehensive EVM system Define the scope of effort using a work breakdown structure ●
Identify who in the organization will perform the work ●
Schedule the work ●
Estimate the labor and material required to perform the work and
authorize the budgets, including management reserve ●
Determine objective measure of earned value ●
Develop the performance measurement baseline ●
Ensure that the data resulting from the EVM
system are reliable
Execute the work plan and record all costs ●
Analyze EVM performance data and record variances from the
performance measurement baseline plan ●
Forecast estimates at completion ●
Ensure that the program management team is
using earned value data for decision-making
purposes
Take management action to mitigate risks ●
Update the performance measurement baseline as changes
occur ●
Source: GAO analysis of Department of Commerce (Census Bureau) data.
DRIS fully implemented all 11 of the key EVM practices necessary to
manage its system acquisition program. Specifically, the program
implemented all practices for establishing a comprehensive EVM system,
such as defining the scope of work and scheduling the work. The
program’s schedule appropriately captured and sequenced key activities
and assigned realistic resources to all key activities. Furthermore, the
DRIS team ensured that the resulting EVM data were appropriately
verified and validated for reliability by analyzing performance data to
identify the magnitude and effect of problems causing key variances,
tracking related risks in the program’s risk register, and performing
quality checks of the schedule and critical path. Lastly, the DRIS program
management team conducted rigorous reviews of EV performance on a
monthly basis and took the appropriate management actions to mitigate
risks.
EV Performance Details
Based on performance data from June
2008 to May 2009, the DRIS contractor
has outperformed its planned cost targets
by $13.6 million. For this same period, it
has also outperformed its schedule
targets by completing $2.3 million worth
of work ahead of schedule. We concur
with the contractor’s estimate that it will
underrun its current budget—worth
approximately $468.6 million—by $7.0
million.
Contract percent complete: 50%
Estimates at completion:
• Contractor: $461.7 million
• GAO: $461.7 million
Note: The DRIS contractor did not report
EV data in November 2008.
Figure 2: GAO EV Data Analysis of Commerce’s DRIS Program
[Line chart showing cumulative cost variance and cumulative schedule variance, in millions of dollars, June 2008 through May 2009.]
Source: GAO analysis of Department of Commerce (Census Bureau) data.
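The DRIS figures above can be tied together with simple arithmetic: the budget at completion less the projected underrun should roughly equal the reported estimate at completion. The sketch below uses the rounded figures reported by GAO; the subtraction is standard earned value bookkeeping, not a quotation from the report.

```python
# Arithmetic check of the DRIS figures above, using GAO's rounded values
# (millions of dollars). EAC = BAC - projected underrun is standard bookkeeping,
# not a formula quoted from this report.
budget_at_completion = 468.6   # DRIS contractor budget at completion
projected_underrun = 7.0       # underrun at completion estimated by the contractor

estimate_at_completion = budget_at_completion - projected_underrun
print(f"Implied estimate at completion: ${estimate_at_completion:.1f} million")
# Prints $461.6 million, consistent (within rounding) with the $461.7 million reported.
```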
The Field Data Collection Automation (FDCA) program is intended to
provide automation support for the 2010 Census field data collection
operations. The program includes the development of handheld computers
for identifying and correcting addresses for all known living quarters in the
United States (known as address canvassing) and the systems, equipment,
and infrastructure that field staff will use to collect data. FDCA handheld
computers were originally to be used for other census field operations,
such as following up with nonrespondents through personal interviews.
However, in April 2008, due to problems identified during testing and cost
overruns and schedule slippages in the FDCA program, the Secretary of
Commerce announced a redesign of the 2010 Census, and rebaselined
FDCA in October 2008. As a result, FDCA’s life-cycle costs have increased
from an estimated $596 million to $801 million, a $205 million increase.
Furthermore, the responsibility for the design, development, and testing of
IT systems for other key field operations was moved from the FDCA
contractor to the Census Bureau.
Field Data Collection
Automation
Investment Details
Department of Commerce
(Census Bureau)
Program start date: March 2006
Total life-cycle cost:
• Current: $801.1 million
• Original: $595.7 million
Program end date:
• Current: December 2011
• Original: December 2011
Rebaselines: 1 (October 2008)
Major contractor: Harris Corporation
[Investment life-cycle phases: Initiation, Development, Operations and maintenance]
Source: GAO analysis of Department of Commerce (Census Bureau) data.
Table 10: GAO EVM Practice Assessment of Commerce’s FDCA Program
Program management area of
responsibility Key practice GAO assessment
Establish a comprehensive EVM system Define the scope of effort using a work breakdown structure ●
Identify who in the organization will perform the work ●
Schedule the work ◐
Estimate the labor and material required to perform the work and
authorize the budgets, including management reserve ●
Determine objective measure of earned value ●
Develop the performance measurement baseline ●
Ensure that the data resulting from the
EVM system are reliable
Execute the work plan and record all costs ◐
Analyze EVM performance data and record variances from the
performance measurement baseline plan ◐
Forecast estimates at completion ◐
Ensure that the program management
team is using earned value data for
decision-making purposes
Take management action to mitigate risks
◐
Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Commerce (Census Bureau) data.
FDCA fully met 6 of the 11 key practices for implementing EVM and
partially met 5 others. Specifically, the program fully met most practices
for establishing a comprehensive EVM system, such as defining the scope
of the work effort; however, it only partially met the practice for
scheduling the work. Specifically, the program schedule contained
weaknesses, including key milestones with fixed completion dates—which
hampers the program’s ability to see the impact that delays on open tasks
have on successor tasks. As such, the FDCA program cannot use the
schedule as an active management tool. Furthermore, anomalies in the
prime contractor’s EVM reports, combined with weaknesses in the master
schedule, affect FDCA’s ability to execute the work plan, analyze
variances, and make reliable estimates of cost at completion. Lastly, cost
and schedule drivers identified in EVM reports were not fully consistent
with the program’s risk register, which prevents the program from taking
the appropriate management action to mitigate risks and effectively using
EV data for decisions.
EV Performance Details
Due to contractor performance issues, the
FDCA program established a new
program baseline in October 2008. Based
on performance data from October 2008
to May 2009, the contractor has currently
exceeded its revised cost target by $3.5
million. We estimate that the FDCA
contract will overrun its current
budget—worth approximately $555.6
million—by $4.6 million. Our analysis
indicates that the rebaselined contract is
currently on schedule.
Contract percent complete: 75%
Estimates at completion:
• Contractor: $558.5 million
• GAO: $560.2 million
Note: EV data between June 2008 and
September 2008 did not reflect actual
program performance because the
program was rebaselining; therefore,
these data have been omitted.
Figure 3: GAO EV Data Analysis of Commerce’s FDCA Program
[Line chart showing cumulative cost variance and cumulative schedule variance, in millions of dollars, June 2008 through May 2009.]
Source: GAO analysis of Department of Commerce (Census Bureau) data.
Air and Space
Operations Center—
Weapon System
The Air and Space Operations Center—Weapon System (AOC) is the air
and space operations planning, execution, and assessment system for the
Joint Force Air Component Commander. According to the agency, there
are currently 11 AOCs located around the world, each aligned to the
Combatant Commands of the Unified Command Plan, with additional
support units for training, help desk, testing, and contingency manpower
augmentation. Each AOC is designed to enable commanders to exercise
command and control of air, space, information operations, and combat
support forces to achieve the objectives of the joint force commander and
combatant commander in joint and coalition military operations. As such,
the AOC system is intended as the planning and execution engine of any
air campaign.
Investment Details
Department of Defense
(Department of the Air Force)
Program start date: September 2000
Total life-cycle cost:
• Current: $4.425 billion
• Original: $4.425 billion
Program end date:
• Current: September 2023
• Original: September 2023
Rebaselines: 0
Major contractor: Lockheed Martin
[Investment life-cycle phases: Initiation, Development, Operations and maintenance]
Source: GAO analysis of Department of Defense (Department of the Air Force) data.
Table 11: GAO EVM Practice Assessment of Defense’s AOC Program
Program management area of
responsibility Key practice GAO assessment
Establish a comprehensive EVM system Define the scope of effort using a work breakdown structure ●
Identify who in the organization will perform the work ●
Schedule the work ◐
Estimate the labor and material required to perform the work
and authorize the budgets, including management reserve ●
Determine objective measure of earned value ◐
Develop the performance measurement baseline ◐
Ensure that the data resulting from the
EVM system are reliable
Execute the work plan and record all costs ◐
Analyze EVM performance data and record variances from the
performance measurement baseline plan ●
Forecast estimates at completion ●
Ensure that the program management
team is using earned value data for
decision-making purposes
Take management action to mitigate risks ●
Update the performance measurement baseline as changes
occur ●
Source: GAO analysis of Department of Defense (Department of the Air Force) data.
AOC fully met 7 of the 11 key practices and partially met 4 others. AOC
applied EVM at the contract level and has a capable government team that
has made it an integral part of project management. AOC performs
detailed analyses of the EV data and reviews the data with engineering
staff to ensure that the appropriate metrics have been applied for accurate
reporting. AOC has also integrated EVM with its risk management
processes to ensure that resources are applied to watch or mitigate risks
associated with the cost and schedule drivers reported in the EVM reports.
Weaknesses found in AOC’s EVM processes relate to the development and
validation of the contractor baseline. In particular, AOC has not performed
an integrated baseline review for all work that is currently on contract.
The master schedule also contained issues, such as a high number of
converging tasks and out-of-sequence tasks, that hamper AOC’s ability to
determine the start dates of future tasks. Taken together, these issues
undermine the reliability of the schedule as a baseline to measure EV
performance.
Figure 4: GAO EV Data Analysis of Defense’s AOC Program
EV Performance Details
As of April 2009, the AOC contractor has
overrun its planned cost targets by $58,000.
However, for this same period, it has
completed $422,000 worth of work ahead of
schedule. Based on the performance data
from May 2008 to April 2009, we concur
with the contractor’s estimate that it will
overrun its current budget—worth
approximately $171.3 million—by $793,000.
Contract percent complete: 86%
Estimates at completion:
• Contractor: $172.1 million
• GAO: $172.1 million
Joint Tactical Radio
System—Handheld,
Manpack, Small Form
Fit
The Joint Tactical Radio System (JTRS) program is developing software-
defined radios that are expected to interoperate with existing radios and
increase communications and networking capabilities. The JTRS-
Handheld, Manpack, Small Form Fit (HMS) product office, within the
JTRS Ground Domain program office, is developing handheld, manpack,
and small form fit radios. In 2006, the program was restructured to include
two concurrent phases of development. Phase I includes select small form
fit radios, while Phase II includes small form fit radios with enhanced
security as well as handheld and manpack variants. Subsequent to the
program’s restructure, the department updated its migration strategy for
replacing legacy radios with new tactical radios. As such, the total planned
quantity of JTRS-HMS radios was reduced from an original baseline of
328,514—established in May 2004—to 95,551. As a result, the total life-
cycle cost of the JTRS-HMS program was reduced from an estimated $19.2
billion to $11.6 billion, a $7.6 billion decrease.
Investment Details
Department of Defense
(Joint—Department of the Navy Lead)
Program start date: April 2004
Total life-cycle cost:
• Current: $11.599 billion
• Original: $19.214 billion
Program end date:
• Current: 2048
• Original: 2045
Rebaselines: 1 (June 2006)
Major contractor: General Dynamics C4
Systems
[Investment life-cycle phases: Initiation, Development, Operations and maintenance]
Source: GAO analysis of Department of Defense (Joint—Department of the Navy Lead) data.
Table 12: GAO EVM Practice Assessment of Defense’s JTRS-HMS Program
Program management area of
responsibility Key practice GAO assessment
Establish a comprehensive EVM system Define the scope of effort using a work breakdown structure ●
Identify who in the organization will perform the work ●
Schedule the work ◐
Estimate the labor and material required to perform the work and
authorize the budgets, including management reserve ●
Determine objective measure of earned value ●
Develop the performance measurement baseline ●
Ensure that the data resulting from the
EVM system are reliable
Execute the work plan and record all costs ●
Analyze EVM performance data and record variances from the
performance measurement baseline plan ●
Forecast estimates at completion ●
Ensure that the program management
team is using earned value data for
decision-making purposes
Take management action to mitigate risks ●
Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Defense (Joint—Department of the Navy Lead) data.
JTRS-HMS fully met 10 of the 11 key practices and partially met 1 practice.
Specifically, JTRS-HMS implemented most practices for establishing a
comprehensive EVM system, such as performing rigorous reviews to
validate the baseline; however, the current schedule contained some
weaknesses, such as out-of-sequence logic and activities without
resources assigned. Program officials were aware of these issues and
attributed them to weaknesses in subcontractor schedules that are
integrated on a monthly basis. The JTRS-HMS program fully met practices
for ensuring that the resulting EV data were appropriately verified and
validated for reliability and demonstrated that the program management
team was using these data for decision-making purposes.
EV Performance Details
Based on performance data from June 2008
to May 2009, the JTRS-HMS contractor has
experienced negative cost and schedule
variances. Specifically, as of May 2009, the
contractor has exceeded its planned cost
target by $62.4 million. We estimate that the
JTRS-HMS contract will overrun its current
budget—worth approximately $530.8
million—by $89.1 million. Furthermore, as of
May 2009, JTRS-HMS has not completed
$8.8 million in planned work. Both cost and
schedule variances are primarily due to
radio hardware development, including
design issues related to hardware
miniaturization.
Contract percent complete: 74%
Estimates at completion:
• Contractor: $600.9 million
• GAO: $619.9 million
Figure 5: GAO EV Data Analysis of Defense’s JTRS-HMS Program
[Line chart showing cumulative cost variance and cumulative schedule variance, in millions of dollars, June 2008 through May 2009.]
Source: GAO analysis of Department of Defense (Joint—Department of the Navy Lead) data.
Warfighter
Information
Network—Tactical
The Warfighter Information Network—Tactical (WIN-T) program is
designed to be the Army’s high-speed and high-capacity backbone
communications network. The program connects Department of the Army
units with higher levels of command and provides the Army’s tactical
portion of the Global Information Grid—a Department of Defense
initiative aimed at building a secure network and set of information
capabilities modeled after the Internet. WIN-T was restructured in June
2007 following a unit cost increase above the critical cost growth
threshold (known as a Nunn-McCurdy breach). As a result of the
restructuring, it was determined that WIN-T would be fielded in four
increments. The third increment is expected to provide the Army with a
full networking on-the-move capability and fully support the Army’s
Future Combat Systems. In May 2009, the Increment 3 program baseline
was approved, and the life-cycle cost for the program was estimated at
$38.2 billion. Our assessment of EVM practices and EV data was
performed on WIN-T Increment 3.
Investment Details
Department of Defense
(Department of the Army)
Program start date: July 2003
Total life-cycle cost:
• Current: $38.157 billion
• Original: $38.157 billion
Program end date:
• Current: 2025
• Original: 2025
Rebaselines: 1 (June 2007)
Major contractor: General Dynamics C4 Systems
[Investment life-cycle phases: Initiation, Development, Operations and maintenance]
Source: GAO analysis of Department of Defense (Department of the Army) data.
Table 13: GAO EVM Practice Assessment of Defense’s WIN-T Program
Program management area of
responsibility Key practice GAO assessment
Establish a comprehensive EVM system Define the scope of effort using a work breakdown structure ●
Identify who in the organization will perform the work ●
Schedule the work ◐
Estimate the labor and material required to perform the work
and authorize the budgets, including management reserve ●
Determine objective measure of earned value ◌
Develop the performance measurement baseline ◌
Ensure that the data resulting from the
EVM system are reliable
Execute the work plan and record all costs ●
Analyze EVM performance data and record variances from the
performance measurement baseline plan ●
Forecast estimates at completion ●
Ensure that the program management
team is using earned value data for
decision-making purposes
Take management action to mitigate risks ●
Update the performance measurement baseline as changes
occur ◌
Source: GAO analysis of Department of Defense (Department of the Army) data.
WIN-T fully met 7 of the 11 key practices for implementing EVM, partially
met 1 practice, and did not meet 3 practices. Specifically, WIN-T only
partially met the practices for establishing a comprehensive EVM system.
The schedule contained weaknesses, including fixed completion dates—
which prevented the schedule from showing the impact of delays
experienced on open or successor tasks or the expected completion dates
of key activities. Furthermore, WIN-T has not conducted an integrated
baseline review on the current scope of work since rebaselining the prime
contract in December 2007. According to program officials, this review has
not been conducted because they have not yet finalized the contract.
However, as of August 2009, it has been 20 months since work began,
which increases the risk that the program has not been measuring
progress against a reasonable baseline. Without conducting this review to
validate the performance baseline, the baseline cannot be adequately
updated as changes occur, and EV data cannot be used effectively for
decision-making purposes.
EV Performance Details
Based on contractor performance data from
June 2008 to May 2009, the WIN-T contract
has outperformed its planned cost targets
by $880,000. However, for the same period,
it has not completed $12.0 million in
planned work. These schedule variances
are due, in part, to issues found during initial
testing that were addressed in subsequent
software releases, resulting in planned
software development work being delayed
to future releases. Based on these data, we
estimate that the WIN-T contract will
overrun its current budget—worth
approximately $747.0 million—by $15.1
million.
Contract percent complete: 34%
Estimates at completion:
• Contractor: $743.3 million
• GAO: $762.1 million
Figure 6: GAO EV Data Analysis of Defense’s WIN-T Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Defense (Department of the Army) data.
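The cost and schedule variances reported for WIN-T (and for the other case studies in this appendix) follow the standard earned value relationships: cost variance is earned value minus actual cost, and schedule variance is earned value minus planned value. The sketch below is illustrative only; the earned value figure is roughly reconstructed from the reported 34 percent completion and $747.0 million budget, and GAO's own analysis rests on the underlying contractor data rather than these approximations.

```python
# Illustrative earned value calculations using rough approximations of the
# WIN-T figures reported above (34% complete, ~$747.0M budget, +$0.88M cost
# variance, -$12.0M schedule variance). GAO's analysis uses the underlying
# contractor performance reports, not these reconstructed numbers.

bac = 747.0            # budget at completion, dollars in millions
ev = 0.34 * bac        # earned value, approximated as percent complete x BAC
cv = 0.88              # cumulative cost variance (positive = under cost)
sv = -12.0             # cumulative schedule variance (negative = behind schedule)

ac = ev - cv           # actual cost, from CV = EV - AC
pv = ev - sv           # planned value, from SV = EV - PV

cpi = ev / ac          # cost performance index (>1 means under cost)
spi = ev / pv          # schedule performance index (<1 means behind schedule)

print(f"EV={ev:.1f}  AC={ac:.1f}  PV={pv:.1f}")
print(f"CPI={cpi:.3f}  SPI={spi:.3f}")
```

A CPI just above 1 combined with an SPI below 1 matches the pattern described above: a small cost underrun alongside a meaningful amount of planned work not yet completed.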
Automated Commercial Environment
The Automated Commercial Environment (ACE) program is the
commercial trade processing system being developed by the U.S. Customs
and Border Protection to facilitate trade while strengthening border
security. The program is to provide trade compliance and border security
staff with the right information at the right time, while minimizing
administrative burden. Deployed in phases, ACE is expected to be
expanded to provide cargo processing capabilities across all modes of
transportation and intended to replace existing systems with a single,
multimodal manifest system for land, air, rail, and sea cargo. Ultimately,
ACE is expected to become the central data collection system for the
federal agencies that, by law, require international trade data, and should
deliver these capabilities in a secure, paper-free, Web-enabled
environment. As a result of poorly managed requirements, the total life-
cycle development cost of the ACE program increased from an estimated
$1.5 billion to $2.2 billion—a $700 million increase.
Investment Details
Department of Homeland Security
(U.S. Customs and Border Protection)
Program start date: 2001
Total life-cycle development cost:
• Current: $2.2 billion
• Original: $1.5 billion
Program end date:
• Current: 2016
• Original: 2016
Rebaselines: 0
Major contractor: IBM
Source: GAO analysis of Department of Homeland Security (U.S. Customs and Border Protection) data.
Table 14: GAO EVM Practice Assessment of Homeland Security’s ACE Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ●
• Develop the performance measurement baseline ●
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ◐
• Analyze EVM performance data and record variances from the performance measurement baseline plan ●
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ●
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Homeland Security (U.S. Customs and Border Protection) data.
ACE fully met 9 of the 11 key practices for implementing EVM and
partially met the remaining 2 practices. Specifically, ACE fully met 5 of 6
practices for establishing a comprehensive EVM system, such as defining
the scope of the work effort and developing the performance baseline, but
partially met the practice for scheduling the work, in part, because
resources were not assigned to all activities in the master schedule. ACE
fully met 2 practices for ensuring that the data resulting from the EVM
system were reliable, such as adequately analyzing EV performance data,
but could not fully execute the work plan because of the weaknesses
found in the schedule. Lastly, ACE demonstrated that the program
management team was basing decisions on EVM data.
It should be noted that the ACE program is being defined incrementally—
whereby the performance baseline is continuously updated as task orders
for new work are issued. As such, the use of EVM to determine the true
progress made and to project reliable final costs at completion is limited.
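One way to see why an incrementally defined baseline limits EVM projections is to watch how percent complete shifts as task orders enlarge the budget at completion. The figures below are invented for illustration and are not ACE data.

```python
# Illustrative only: under a baseline that grows as new task orders are issued,
# the same earned value reads as a shrinking percent complete, and any
# budget-based projection moves with it. All values below are invented.

ev_to_date = 300.0                              # earned value to date, $ millions
bac_after_task_orders = [350.0, 365.0, 380.0]   # baseline after successive task orders

for order, bac in enumerate(bac_after_task_orders, start=1):
    pct_complete = ev_to_date / bac * 100
    # Percent complete falls with each task order even though the work
    # actually earned has not changed.
    print(f"After task order {order}: BAC={bac:.1f}  percent complete={pct_complete:.0f}%")
```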
Figure 7: GAO EV Data Analysis of Homeland Security’s ACE Program
EV Performance Details
Based on contractor performance data from
June 2008 to May 2009, the ACE program
has experienced negative cost and
schedule variances. Specifically, as of May
2009, the program has exceeded its
planned cost target by $19.0 million. These
variances are due, in part, to additional
development and testing work needed to
meet program milestones. We estimate that
the program will overrun its current
budget—approximately $382.3 million—by
$24.1 million.
Contract percent complete: 83%
Estimates at completion:
• Contractor: $381.8 million
• GAO: $406.4 million
[Figure 7 is a line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Homeland Security (U.S. Customs and Border Protection) data.
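The report does not spell out the formula behind GAO's independent estimates at completion. A common approach assumes that remaining work is performed at the cumulative cost performance index, that is, EAC = BAC / CPI. The sketch below applies that formula to approximations of the ACE figures above; it lands near the reported $406.4 million, but it is an illustration, not a statement of GAO's method.

```python
# Illustrative CPI-based estimate at completion (EAC = BAC / CPI), using rough
# approximations of the ACE figures above (83% complete, ~$382.3M budget,
# $19.0M cost overrun to date). This is not necessarily the method GAO used.

bac = 382.3             # budget at completion, dollars in millions
ev = 0.83 * bac         # earned value, approximated from reported percent complete
ac = ev + 19.0          # actual cost: $19.0M more than the value of work earned

cpi = ev / ac           # cumulative cost performance index
eac = bac / cpi         # assumes remaining work continues at the same CPI
vac = bac - eac         # variance at completion (negative = projected overrun)

print(f"CPI={cpi:.3f}  EAC={eac:.1f}  VAC={vac:.1f}")
# Yields an EAC of roughly $405M, close to the $406.4M GAO estimate shown above.
```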
Integrated Deepwater System—Common Operational Picture
The Integrated Deepwater System is a 25-year, $24 billion major
acquisition program to recapitalize the U.S. Coast Guard’s aging fleet of
boats, airplanes, and helicopters, ensuring that all work together through a
modern, capable communications system. This initiative is designed to
enhance maritime domain awareness and enable the Coast Guard to meet
its post-September 11 mission requirements. The program is composed of
15 major acquisition projects, including the Common Operational Picture
(COP) program.
Investment Details
Department of Homeland Security
(U.S. Coast Guard)
Program start date: August 2002
Total life-cycle development cost:
• Current: $1.4 billion
• Original: $1.4 billion
Program end date:
• Current: 2014
• Original: 2014
Rebaselines: 1 (July 2007)
Major contractors: Lockheed Martin and
Northrop Grumman
Deepwater COP is to provide relevant, real-time operational intelligence
and surveillance data to operational commanders, allowing them to direct
and monitor all assigned forces and first responders. This is expected to
allow commanders to distribute critical information to federal, state, and
local agencies quickly; reduce duplication; enable earlier alerting; and
enhance maritime awareness.
Source: GAO analysis of Department of Homeland Security (U.S. Coast Guard) data.
Table 15: GAO EVM Practice Assessment of Homeland Security’s Deepwater COP Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ●
• Develop the performance measurement baseline ●
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ◐
• Analyze EVM performance data and record variances from the performance measurement baseline plan ◐
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ◐
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Homeland Security (U.S. Coast Guard) data.
Deepwater COP fully met 7 of the 11 key practices and partially met 4
others. Specifically, COP fully met 5 of the 6 practices for establishing a
comprehensive EVM system, such as adequately defining all major
elements of the work breakdown structure and developing the
performance baseline. However, the program’s master schedule contained
weaknesses, such as a large number of concurrent tasks and activities
without resources assigned. Officials were aware of some, but not all, of
the weaknesses in the schedule and had controls in place to mitigate the
weakness they were aware of in order to improve the reliability of the
resulting EV data. Lastly, COP was unable to fully meet 1 of the practices
for using EV data for management decisions because it could not
demonstrate that cost and schedule drivers impacting EV performance
were linked to its risk management processes.
EV Performance Details
Based on performance data from June 2008
to May 2009, the Deepwater COP
contractor has experienced negative cost
and schedule variances. Specifically, as of
May 2009, the contractor has exceeded its
planned cost target by $4.2 million. These
cost variances are due, in part, to design
and development tasks requiring more work
than originally planned. We estimate that
the contract will overrun its current
budget—worth approximately $130.2
million—by $4.2 million. Our analysis
indicates that the contract is currently on
schedule.
Contract percent complete: 99%
Estimates at completion:
• Contractor: $134.3 million
• GAO: $134.3 million
Figure 8: GAO EV Data Analysis of Homeland Security’s Deepwater COP Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Homeland Security (U.S. Coast Guard) data.
Western Hemisphere Travel Initiative
The Western Hemisphere Travel Initiative (WHTI) program made
modifications to vehicle processing lanes at ports of entry on the nation’s
northern and southern borders. WHTI is designed to allow U.S. Customs
and Border Protection to effectively address new requirements imposed
by the Intelligence Reform and Terrorism Prevention Act of 2004
(completing these requirements by June 1, 2009). WHTI development was
completed and its implementation addressed the 39 highest volume ports
of entry, which support 95 percent of land border traffic. The initiative
requires travelers to present a passport or other authorized travel
document that denotes identity and citizenship when entering the United
States.
Investment Details
Department of Homeland Security
(U.S. Customs and Border Protection)
Program start date: January 2007
Total life-cycle cost:
• Current: $1.2 billion
• Original: $1.2 billion
Program end date: June 1, 2009
Rebaselines: 1 (March 2008)
Major contractor: Unisys
Source: GAO analysis of Department of Homeland Security (U.S. Customs and Border Protection) data.
Table 16: GAO EVM Practice Assessment of Homeland Security’s WHTI Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ◐
• Develop the performance measurement baseline ◐
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ●
• Analyze EVM performance data and record variances from the performance measurement baseline plan ◐
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ◐
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Homeland Security (U.S. Customs and Border Protection) data.
WHTI fully met 6 of the 11 key practices for implementing EVM and
partially met the remaining 5 practices. Specifically, weaknesses identified
in validating the performance baseline and scheduling the work limited the
program’s ability to establish a comprehensive EVM system. Although the
program held an integrated baseline review to validate the baseline in
March 2008, the review did not cover many key aspects, such as
identifying corrective actions needed to mitigate program risks.
Furthermore, the master schedule contained deficiencies, such as
activities that were out of sequence or lacking dependencies. While
program officials described their use of processes for ensuring the
reliability of the EVM system’s data, such as capturing significant cost and
schedule drivers in the risk register, the provided documentation did not
corroborate what we were told. When combined, these weaknesses
preclude the program from effectively making decisions about the
program based on EV data.
EV Performance Details
Based on performance data from June 2008
to May 2009, the WHTI contractor
experienced schedule variances. However,
as of June 2009, program officials stated
that the WHTI contract was successfully
completed on time. The contractor did not
report any cost variances because it was a
firm-fixed-price contract. Additionally,
program officials stated that the contract
was completed on budget.
Contract percent complete: 100%
Figure 9: GAO EV Data Analysis of Homeland Security’s WHTI Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Homeland Security (U.S. Customs and Border Protection) data.
Next Generation Identification
The Next Generation Identification (NGI) program is designed to support
the Federal Bureau of Investigation’s mission to reduce terrorist and
criminal activities by providing timely, relevant criminal justice
information to the law enforcement community. Today, the bureau
operates and maintains one of the largest repositories of biometric-
supported criminal history records in the world. The electronic
identification and criminal history services support more than 82,000
criminal justice agencies, authorized civil agencies, and international
organizations. NGI is intended to ensure that the bureau’s biometric
systems are able to seamlessly share data that are complete, accurate,
current, and timely. To accomplish this, the current system will be
replaced or upgraded with new functionalities and state-of-the-art
equipment. NGI is expected to be scalable to accommodate five times the
current workload volume with no increase in support manpower and will
be flexible to respond to changing requirements.
Investment Details
Department of Justice
(Federal Bureau of Investigation)
Program start date: February 2008
Total life-cycle cost:
• Current: $1.076 billion
• Original: $1.076 billion
Program end date:
• Current: June 2018
• Original: April 2018
Rebaselines: 0
Major contractor: Lockheed Martin
Source: GAO analysis of Department of Justice (Federal Bureau of Investigation) data.
Table 17: GAO EVM Practice Assessment of Justice’s NGI Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ●
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ●
• Develop the performance measurement baseline ●
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ●
• Analyze EVM performance data and record variances from the performance measurement baseline plan ●
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ●
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Justice (Federal Bureau of Investigation) data.
NGI fully implemented all 11 key EVM practices. Specifically, the program
implemented all practices for establishing a comprehensive EVM system,
such as defining the scope of work and scheduling the work. For example,
the schedule properly captured key activities, established reasonable
durations, and established a sound critical path, all of which contribute to
establishing a reliable baseline that performance can be measured against.
Furthermore, the NGI team ensured that the resulting EV data were
appropriately verified and validated for reliability by, for example,
integrating the analysis of cost and schedule variances with the program’s
risk register to mitigate emerging and existing risks associated with key
drivers causing major variances. In addition, the program’s risk register
includes cost and schedule impacts for every risk and links to the
management reserve process. Lastly, NGI demonstrated that it is using EV
data to make decisions by performing continuous quality checks of the
schedule, reviewing open risks and opportunities, and reviewing EV data
in weekly management reports.
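The integration NGI describes, in which every risk carries cost and schedule impacts, ties back to the work elements driving variances, and links to management reserve, can be pictured with a simple data structure. The sketch below is hypothetical; the field names and the sample entry are invented and do not describe NGI's actual system.

```python
# Hypothetical sketch of a risk register entry linked to EVM variance drivers,
# illustrating the kind of integration described for NGI. All names, fields,
# and values are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    risk_id: str
    description: str
    wbs_elements: list            # work breakdown structure elements affected
    cost_impact_m: float          # estimated cost impact, dollars in millions
    schedule_impact_days: int     # estimated schedule impact, days
    mitigation: str
    reserve_drawn_m: float = 0.0  # management reserve applied so far

@dataclass
class RiskRegister:
    entries: list = field(default_factory=list)

    def risks_for_driver(self, wbs_id):
        """Return risks tied to a WBS element flagged as a variance driver."""
        return [r for r in self.entries if wbs_id in r.wbs_elements]

register = RiskRegister([
    RiskEntry("R-001", "Additional testing resources required", ["1.2.3"],
              1.4, 30, "Add test staff and draw on management reserve", 0.6),
])
print(register.risks_for_driver("1.2.3")[0].description)
```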
EV Performance Details
Based on contractor performance data from
October 2008 to April 2009, NGI
experienced negative cost and schedule
variances. Specifically, as of April 2009, the
contractor has exceeded its planned cost
targets by $1.4 million. Furthermore, as of
April 2009, the contractor has not completed
$0.5 million in planned work. These
variances were due, in part, to the need for
additional testing resources. We estimate
that the NGI contract will overrun its current
budget—worth approximately $37.5
million—by $1.6 million.
Contract percent complete: 91%
Estimates at completion:
• Contractor: $39.0 million
• GAO: $39.1 million
Note: NGI established its EV reporting
baseline in October 2008.
Figure 10: GAO EV Data Analysis of Justice’s NGI Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Justice (Federal Bureau of Investigation) data.
James Webb Space Telescope
The James Webb Space Telescope (JWST) is designed to be the scientific
successor to the Hubble Space Telescope and expected to be the premier
observatory of the next decade. It is intended to study and answer
fundamental astrophysical questions, ranging from the formation and
structure of the Universe to the origin of planetary systems and the origins
of life. The telescope is an international collaboration of the National
Aeronautics and Space Administration (NASA), the Canadian Space
Agency, and the European Space Agency. JWST required the development
of several new technologies, including a folding segmented primary mirror
that will unfold after launch and a cryocooler for cooling mid-infrared
detectors to 7 kelvin.
Investment Details
National Aeronautics and Space
Administration
Project start date: March 1999
Total life-cycle cost:
• Current: $4.964 billion
• Original: $4.964 billion
Project end date:
• Current: December 2021
• Original: December 2021
Rebaselines: 0
Major contractor: Northrop Grumman
Source: GAO analysis of National Aeronautics and Space Administration data.
Table 18: GAO EVM Practice Assessment of NASA’s JWST Project
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ◐
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ◐
• Develop the performance measurement baseline ◐
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ◐
• Analyze EVM performance data and record variances from the performance measurement baseline plan ●
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ◐
• Update the performance measurement baseline as changes occur ◐
Source: GAO analysis of National Aeronautics and Space Administration data.
JWST fully met 4 of the 11 key practices and partially met 7 practices. The
project only partially met practices for establishing a comprehensive EVM
system because of weaknesses in the work breakdown structure, in which
the prime contractor has not fully defined the scope of each work element.
In addition, the project only partially met the practice for scheduling work
because of weaknesses resulting from manual integration of
approximately 30 schedules, although officials did explain some
mitigations for this risk. We also found deficiencies in the lower-level
schedules, such as missing linkages between tasks, resources not being
assigned, and excessively high durations. Furthermore, JWST only
partially implemented practices to ensure that the data resulting from the
EVM system are reliable, due, in part, to variance analysis reports being
done quarterly (instead of monthly), which limits the project’s ability to
analyze and respond to cost and schedule variances in a timely manner.
When combined, these weaknesses preclude the program from effectively
making decisions about the program based on EV data.
EV Performance Details
EVM for the JWST project is being
performed by the prime contractor and its
major subcontractors. The scope of this
work includes designing and developing the
telescope, the spacecraft, and the
sunshield; integrating and testing the
observatory; and supporting launch
operations.
Based on contractor performance data from
June 2008 to May 2009, the JWST project
has experienced negative cost and
schedule variances. Specifically, as of May
2009, the contractor has exceeded its
planned cost target by $224.7 million. A key
driver in this cost overrun was
greater-than-expected complexity in the
work, which required additional resources.
We concur with the contractor estimate that
it will overrun its budget—worth
approximately $1.3 billion—by $448.5
million. Furthermore, as of May 2009, the
project has not completed $9.4 million in
planned work.
Contract percent complete: 64%
Estimates at completion:
• Contractor: $1.7 billion
• GAO: $1.7 billion
Note: The project suspended earned value
reporting during November 2008 while
undergoing a replan.
Figure 11: GAO EV Data Analysis of NASA’s JWST Project
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of National Aeronautics and Space Administration data.
Juno
Juno is part of the New Frontiers Program. The overarching scientific goal
of the Juno mission is to improve our understanding of the origin and
evolution of Jupiter. As the archetype of giant planets, Jupiter may provide
knowledge that will improve our understanding of both the origin of our
solar system and the planetary systems being discovered around other
stars. The Juno project is expected to use a solar-powered spacecraft to
make global maps of the gravity, magnetic fields, and atmospheric
composition of Jupiter. The spacecraft is to make 33 orbits of Jupiter to
sample the planet’s full range of latitudes and longitudes.
Investment Details
National Aeronautics and Space
Administration
Project start date: June 2005
Total life-cycle cost:
• Current: $1.05 billion
• Original: $1.05 billion
Project end date:
• Current: October 2018
• Original: October 2018
Rebaselines: 0
Major contractor: Lockheed Martin
Source: GAO analysis of National Aeronautics and Space Administration data.
Table 19: GAO EVM Practice Assessment of NASA’s Juno Project
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ◐
• Develop the performance measurement baseline ◐
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ●
• Analyze EVM performance data and record variances from the performance measurement baseline plan ●
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ●
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of National Aeronautics and Space Administration data.
Juno fully met 8 of the 11 key practices for implementing EVM and
partially met 3 practices. Specifically, the project fully met 3 practices for
establishing a comprehensive EVM system, but only partially met the
practices for scheduling the work, determining the objective measure of
earned value, and establishing the performance baseline. Juno was unable
to fully meet these practices because the project’s master schedule
contained issues with the sequencing of work activities and because the
project lacked a comprehensive integrated baseline review.
baseline review was conducted for a major contract in February 2009, the
program did not validate the baseline, scope of work to be performed, or
key risks and mitigation plans for the Juno project as a whole, which
increases the risk that the project is measuring performance against an
unreasonable baseline. Juno fully implemented all 3 practices associated
with data reliability and the 2 practices associated with using EV data for
decision-making purposes.
EV Performance Details
Based on performance data from December
2008 to May 2009, the Juno project has
experienced negative cost and schedule
variances. Specifically, as of May 2009, the
project has exceeded its cost target by
$13.2 million. Based on these data, we
estimate that the Juno project will overrun
its current budget—worth approximately
$369.0 million—by $49.8 million.
Furthermore, as of May 2009, the project
has not completed $12.3 million in planned
work.
Project percent complete: 32%
Estimates at completion:
• Project: $375.5 million
• GAO: $418.8 million
Note: Juno established its EV reporting
baseline in December 2008.
Figure 12: GAO EV Data Analysis of NASA’s Juno Project
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of National Aeronautics and Space Administration data.
Mars Science Laboratory
The Mars Science Laboratory (MSL) is part of the Mars Exploration
Program. The program seeks to understand whether Mars was, is, or can
be a habitable world. To answer this question, the MSL project is expected
to investigate how geologic, climatic, and other processes have worked to
shape Mars and its environment over time, as well as how they interact
today. To accomplish this, the MSL project plans to place a mobile science
laboratory on the surface of Mars to quantitatively assess a local site as a
potential habitat for life, past or present. The project is considered one of
NASA’s flagship projects and designed to be the most advanced rover ever
sent to explore the surface of Mars. Due to technical issues identified
during the development of key components, the MSL launch date has
recently slipped 2 years—from September 2009 to October 2011, and the
project’s life-cycle cost estimate has increased from about $1.63 billion to
$2.29 billion, a $652 million increase.
Investment Details
National Aeronautics and Space
Administration
Project start date: November 2003
Total life-cycle cost:
• Current: $2.286 billion
• Original: $1.634 billion
Project end date:
• Current: September 2015
• Original: September 2013
Rebaselines: 1 (March 2009)
Major contractor: None—in-house
development
Source: GAO analysis of National Aeronautics and Space Administration data.
Table 20: GAO EVM Practice Assessment of NASA’s MSL Project
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ◐
• Develop the performance measurement baseline ◐
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ●
• Analyze EVM performance data and record variances from the performance measurement baseline plan ◐
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ◐
• Update the performance measurement baseline as changes occur ◐
Source: GAO analysis of National Aeronautics and Space Administration data.
MSL fully met 5 of the 11 key practices and partially met 6 others.
Specifically, MSL fully met 3 practices for establishing a comprehensive
EVM system, but only partially met 3 others because of weaknesses in the
sequencing of all activities in the schedule and the lack of an integrated
baseline review to validate the baseline and assess the achievability of the
plan. While the project has taken steps to mitigate the latter weakness by
requiring work agreements that document, among other things, the
objective value of work and related risks for planned work packages, this
is not a comprehensive review of the project’s baseline. Furthermore, MSL
only partially implemented practices associated with data reliability
because its analysis of cost and schedule variances did not include the
root causes for variances and corrective actions, which prevents the
project from tracking and mitigating related risks. Lastly, without an initial
validation of the performance baseline, the baseline cannot be
appropriately updated to reflect program changes, thereby limiting the use
of EV data for management decisions.
EV Performance Details
Due to significant cost and schedule
overruns, the MSL project recently
completed a project replan between
November 2008 and February 2009.
Specifically, as of October 2008, MSL had
exceeded its cost targets by $189.8 million
and had not completed $24.1 million in
planned work, due primarily to technical
issues experienced in the development of
rover’s mechanical gears and avionics
components. As a result of the replan, the
project’s launch date was delayed 2 years,
and the budget was increased from $768.7
million to $1.223 billion. Since the replan,
the project is meeting cost targets but, as of
May 2009, has not completed $6.2 million in
planned work.
Project percent complete: 77%
Estimates at completion:
• Project: $1.227 billion
• GAO: N/A
Note: MSL suspended EVM reporting
between November 2008 and February
2009 while undergoing a project replan.
Therefore, we did not have sufficient data to
make a reliable independent estimate at
completion. The project’s EV baseline does
not include components being provided by
external parties, such as other NASA
centers and the Department
of Energy.
Figure 13: GAO EV Data Analysis of NASA’s MSL Project
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of National Aeronautics and Space Administration data.
En Route Automation Modernization
The En Route Automation Modernization (ERAM) program is to replace
existing software and hardware in the air traffic control automation
computer system and its backup system, the Direct Radar Channel, and
other associated interfaces, communications, and support infrastructure at
en route centers across the country. This is a critical effort because ERAM
is expected to upgrade hardware and software for facilities that control
high-altitude air traffic. ERAM consists of two major components. One
component has been fully deployed and is currently in operation at
facilities across the country. The other component is scheduled for
deployment through fiscal year 2011.
Investment Details
Department of Transportation
(Federal Aviation Administration)
Program start date: August 2002
Total life-cycle cost:
• Current: $3.65 billion
• Original: $3.65 billion
Program end date:
• Current: September 2020
• Original: September 2020
Rebaselines: 0
Major contractor: Lockheed Martin
Source: GAO analysis of Department of Transportation (Federal Aviation Administration) data.
Table 21: GAO EVM Practice Assessment of Transportation’s ERAM Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ◐
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ◐
• Develop the performance measurement baseline ◐
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ◐
• Analyze EVM performance data and record variances from the performance measurement baseline plan ●
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ●
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Transportation (Federal Aviation Administration) data.
ERAM fully met 7 of the 11 key practices and partially met 4 others. ERAM
applies EVM at the contract level and incorporates EV data into its overall
management of the program. However, ERAM did not perform a
comprehensive review of the baseline when the contract was finalized, or
take similar actions to validate the baseline and ensure that the appropriate
EV metrics had been applied. While ERAM does perform limited checks of
the contractor schedule, our analysis showed some issues with the
sequencing of activities and the use of constraints that may undermine the
reliability of the schedule as a baseline to measure performance.
However, it should be noted that the EV data are not a reflection of the
total ERAM program. The government is also responsible for acquisition
work—to which EVM is not being applied. Our analysis of the master
schedule showed that ERAM would be unable to meet four major
upcoming initial operating capability milestones due to issues associated
with government work activities. Program officials noted that these
milestones have since been pushed out. Since EVM is not applied at the
program level, it is unclear whether these delays will impact overall cost.
EV Performance Details
As of April 2009, the ERAM contractor has
outperformed its planned cost targets by
$36.9 million; for this same period, it has
also outperformed its schedule targets by
completing $15.9 million worth of work
ahead of schedule. This strong performance
is attributed to significant cost savings in
hardware production and unplanned
efficiencies in integration and testing at
ERAM deployment sites. This has offset
cost overruns associated with software
development, such as code growth; an
unexpectedly high number of defects
delivered; and the resolution of defects at
lower productivity rates than planned.
Based on performance data from May 2008
to April 2009, we concur with the contractor
estimate that it will underrun the current
budget—worth $1.5 billion—by $15.0 million
at completion.
Contract percent complete: 89%
Estimates at completion:
• Contractor: $1.465 billion
• GAO: $1.465 billion
Figure 14: GAO EV Data Analysis of Transportation’s ERAM Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, May 2008 through April 2009, in millions of dollars.]
Source: GAO analysis of Department of Transportation (Federal Aviation Administration) data.
Surveillance and Broadcast System
The Surveillance and Broadcast System (SBS) is to provide new surveillance solutions that employ technology using avionics and ground stations for improved accuracy and update rates and to provide shared situational awareness (including visual updates of traffic, weather, and flight notices) between pilots and air traffic control. These technologies are considered critical to achieving the Federal Aviation Administration’s strategic goals of decreasing the rate of accidents and incursions, improving the efficiency of air traffic, and reducing congestion.
Investment Details
Department of Transportation
(Federal Aviation Administration)
Program start date: August 2007
Total life-cycle cost:
• Current: $4.33 billion
• Original: $4.31 billion
Program end date:
• Current: September 2035
• Original: September 2035
Rebaselines: 0
Major contractor: ITT Corporation
Source: GAO analysis of Department of Transportation (Federal Aviation Administration) data.
Table 22: GAO EVM Practice Assessment of Transportation’s SBS Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ●
• Identify who in the organization will perform the work ●
• Schedule the work ●
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ●
• Determine objective measure of earned value ●
• Develop the performance measurement baseline ●
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ●
• Analyze EVM performance data and record variances from the performance measurement baseline plan ●
• Forecast estimates at completion ●
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ●
• Update the performance measurement baseline as changes occur ●
Source: GAO analysis of Department of Transportation (Federal Aviation Administration) data.
SBS fully implemented all 11 key EVM practices. Specifically, SBS has
institutionalized EVM at the program level—meaning that it collects and
manages performance data on the contractor and government work
efforts—in order to get a comprehensive view into program status. As part
of this initiative, SBS performed detailed validation reviews of the
contractor and program baselines; issued various process rules on
resource planning, EV metrics, and data analysis; and collected
government timecard data in order to ensure consistent EV application. In
addition, the program management team conducted rigorous reviews of
EV performance with the SBS program manager and the program’s internal
management review board on a monthly basis. Our analysis of the SBS
master schedule showed that it was developed in accordance with
scheduling best practices. For example, the schedule was properly
sequenced, and the resources were assigned. Furthermore, SBS briefed
the program manager monthly on the quality of the schedule to identify,
for example, tasks without predecessors.
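One of the schedule quality checks mentioned above, flagging tasks without predecessors, is straightforward to automate. The sketch below runs the check on an invented task list; a real check would read the integrated master schedule, and normally only the start milestone should lack a predecessor.

```python
# Minimal sketch of a schedule health check: flag tasks that have no
# predecessors and tasks that have no successors. The task network below is
# invented; only a start milestone should legitimately lack predecessors.

predecessors = {
    "Start": [],
    "Design": ["Start"],
    "Build": ["Design"],
    "Test": ["Build"],
    "Orphan task": [],        # no predecessor: needs a link or a justification
}

linked_as_predecessor = {p for preds in predecessors.values() for p in preds}

no_predecessors = [t for t, preds in predecessors.items() if not preds]
no_successors = [t for t in predecessors if t not in linked_as_predecessor]

print("Tasks without predecessors:", no_predecessors)
print("Tasks without successors:", no_successors)
```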
EV Performance Details
As of May 2009, SBS outperformed its
planned cost targets by $14.7 million.
However, for this same period, it has been
unable to complete $24.0 million worth of
work. The strong cost performance is
attributed to the ITT Corporation’s
overestimation of systems engineering
resources needed to complete work and
better-than-expected performance for
activities associated with system safety,
among other things. The negative schedule
variances are due in part to delays caused
by the resolution of radio hardware issues
found during testing.
Based on performance data from June 2008
to May 2009, we estimate that SBS will
most likely exceed the program’s current
budget—which is currently worth about $1
billion—by about $21 million.
Program percent complete: 27%
Estimates at completion:
• Program: $966.3 million
• GAO: $1.015 billion
Figure 15: GAO EV Data Analysis of Transportation’s SBS Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Transportation (Federal Aviation Administration) data.
Veterans Health Information Systems and Technology Architecture—Foundations Modernization
The Veterans Health Information Systems and Technology Architecture—Foundations Modernization (VistA-FM) program addresses the need to transition the Veterans Affairs electronic medical record system to a new architecture. According to the department, the current system is costly and difficult to maintain and does not integrate well with newer software packages. VistA-FM is designed to provide a new architectural framework as well as additional standardization and common services components. This is intended to eliminate redundancies in coding and support interoperability among applications. Ultimately, the new architecture will lay the foundation for a new generation of computer systems in support of caring for America’s veterans. During the course of our review, the department’s Chief Information Officer suspended multiple components of the VistA-FM program until a new development plan can be put in place. This action was taken as part of a new departmentwide initiative to identify troubled IT projects and improve their execution.
Investment Details
Department of Veterans Affairs
Program start date: 2006
Total life-cycle cost:
• Current: $1.897 billion
• Original: $1.897 billion
Program end date:
• Current: 2016
• Original: 2016
Rebaselines: 0
Major contractor: None—in-house
development
Source: GAO analysis of Department of Veterans Affairs data.
Table 23: GAO EVM Practice Assessment of Veterans Affairs’ VistA-FM Program
Key practices and GAO assessments, by program management area of responsibility:
Establish a comprehensive EVM system
• Define the scope of effort using a work breakdown structure ◌
• Identify who in the organization will perform the work ◌
• Schedule the work ◌
• Estimate the labor and material required to perform the work and authorize the budgets, including management reserve ◐
• Determine objective measure of earned value ◌
• Develop the performance measurement baseline ◌
Ensure that the data resulting from the EVM system are reliable
• Execute the work plan and record all costs ◐
• Analyze EVM performance data and record variances from the performance measurement baseline plan ◐
• Forecast estimates at completion ◐
Ensure that the program management team is using earned value data for decision-making purposes
• Take management action to mitigate risks ◌
• Update the performance measurement baseline as changes occur ◌
Source: GAO analysis of Department of Veterans Affairs data.
VistA-FM partially met 4 key practices and did not meet 7 others, despite
reporting compliance with the American National Standards Institute
(ANSI) standard in its 2010 business case submission. Specifically, the
program is still working to establish a comprehensive EVM system to meet
ANSI compliance, among other things. For example, the work breakdown
structure is organized around key program milestones instead of product
deliverables, and does not fully describe the scope of work to be
performed. Although the program’s subprojects maintain their own
schedules, VistA-FM does not currently have an integrated master
schedule at the program level. This is of concern because it is not possible
to establish the program’s critical path and the time-phased budget
baseline, a key component of EVM. The reliability of the data is also a
potential issue because the program’s EVM reports do not offer adequate
detail to provide insight into data reliability issues. Additionally, the
performance baseline has not been appropriately updated; program
officials stated this update is in progress, but they did not have a
completion date.
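The significance of the missing integrated master schedule can be made concrete: a program critical path can only be computed when all tasks and their dependencies sit in one network. The sketch below computes the longest dependency chain for a small, invented task network; task names and durations are hypothetical.

```python
# Minimal critical-path sketch over an invented task network, illustrating why
# an integrated master schedule matters: the critical path (longest dependency
# chain) exists only when all tasks and links are in a single schedule.
from functools import lru_cache

durations = {"A": 5, "B": 3, "C": 8, "D": 2, "E": 4}                 # working days
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Duration of the task plus the latest earliest finish among its predecessors."""
    preds = predecessors[task]
    return durations[task] + (max(earliest_finish(p) for p in preds) if preds else 0)

def critical_path():
    """Walk backward from the latest-finishing task along the driving predecessors."""
    task = max(durations, key=earliest_finish)
    path = [task]
    while predecessors[task]:
        task = max(predecessors[task], key=earliest_finish)
        path.append(task)
    return list(reversed(path))

print("Project duration:", max(earliest_finish(t) for t in durations), "days")
print("Critical path:", " -> ".join(critical_path()))
```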
EV Performance Details
Based on performance data from June 2008
to May 2009, VistA-FM has experienced
continual negative cost and schedule
variances. Specifically, as of May 2009, the
program has exceeded its planned cost
target by $14.9 million, and has not
completed $24.9 million in planned work.
Program officials cited resource availability
and interdependencies among projects as
key drivers of cost and schedule variances.
We estimate that the program will overrun
its current budget—worth approximately
$1.897 billion—by $350.2 million.
Program percent complete: 10%
Estimates at completion:
• Program: $1.897 billion
• GAO: $2.248 billion
Figure 16: GAO EV Data Analysis of Veterans Affairs’ VistA-FM Program
[Line chart of cumulative schedule variance and cumulative cost variance by month, June 2008 through May 2009, in millions of dollars.]
Source: GAO analysis of Department of Veterans Affairs data.
Appendix III: Comments from the Department of Commerce
Appendix IV: Comments from the Department of Defense
Appendix V: Comments from the Department of Justice
Appendix VI: Comments from the National Aeronautics and Space Administration
Appendix VII: Comments from the Department of Veterans Affairs
Appendix VIII: GAO Contact and Staff Acknowledgments
GAO Contact
David A. Powner, (202) 512-9286 or pownerd@gao.gov
Staff Acknowledgments
In addition to the contact named above, individuals making contributions to this report included Carol Cha (Assistant Director), Neil Doherty, Kaelin Kuhn, Jason Lee, Lee McCracken, Colleen Phillips, Karen Richey, Teresa Smith, Matthew Snyder, Jonathan Ticehurst, Kevin Walsh, and China Williams.
Related GAO Products
Defense Acquisitions: Assessments of Selected Weapon Programs.
GAO-09-326SP. Washington, D.C.: March 30, 2009.
Discusses the Department of Defense’s Joint Tactical Radio System—
Handheld, Manpack, Small Form Fit and Warfighter Information
Network—Tactical programs.
Information Technology: Census Bureau Testing of 2010 Decennial
Systems Can Be Strengthened. GAO-09-262. Washington, D.C.: March 5,
2009.
Discusses the Department of Commerce’s Decennial Response Integration
System and Field Data Collection Automation programs.
NASA: Assessments of Selected Large-Scale Projects. GAO-09-306SP.
Washington, D.C.: March 2, 2009.
Discusses the National Aeronautics and Space Administration’s James
Webb Space Telescope and Mars Science Laboratory programs.
Air Traffic Control: FAA Uses Earned Value Techniques to Help Manage
Information Technology Acquisitions, but Needs to Clarify Policy and
Strengthen Oversight. GAO-08-756. Washington, D.C.: July 18, 2008.
Discusses the Department of Transportation’s En Route Automation
Modernization and Surveillance and Broadcast System programs.
Information Technology: Agriculture Needs to Strengthen Management
Practices for Stabilizing and Modernizing Its Farm Program Delivery
Systems. GAO-08-657. Washington, D.C.: May 16, 2008.
Discusses the U.S. Department of Agriculture’s Farm Program
Modernization program.
Information Technology: Improvements for Acquisition of Customs
Trade Processing System Continue, but Further Efforts Needed to Avoid
More Cost and Schedule Shortfalls. GAO-08-46. Washington, D.C.: October
25, 2007.
Discusses the Department of Homeland Security’s Automated Commercial
Environment program.
Defense Acquisitions: The Global Information Grid and Challenges
Facing Its Implementation. GAO-04-858. Washington, D.C.: July 28, 2004.
Discusses the Department of Defense’s Warfighter Information Network—
Tactical program.
GAO’s Mission The Government Accountability Office, the audit, evaluation, and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO’s
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.
The fastest and easiest way to obtain copies of GAO documents at no cost
is through GAO’s Web site (www.gao.gov). Each weekday afternoon, GAO
posts on its Web site newly released reports, testimony, and
correspondence. To have GAO e-mail you a list of newly posted products,
go to www.gao.gov and select “E-mail Updates.”
Obtaining Copies of
GAO Reports and
Testimony
Order by Phone The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site,
http://www.gao.gov/ordering.htm.
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional information.
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470
Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, DC 20548
To Report Fraud,
Waste, and Abuse in
Federal Programs
Congressional
Relations
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, DC 20548
Public Affairs
- In July 2008, we reported that the Federal Aviation Administration’s EVM policy was not fully consistent with best practices. For example, the agency required its program managers to obtain EVM training, but did not enforce completion of this training or require other relevant personnel to obtain this training. In addition, although the agency was using EVM to manage IT acquisitions, not all programs were ensuring that their earned value data were reliable. Specifically, of the three programs collecting EVM data, only one program adequately ensured that its earned value data were reliable. As a result, the agency faced an increased risk that managers were not getting the information they needed to effectively manage the programs. In response to our findings and recommendations, the Federal Aviation Administration reported that it had initiatives under way to improve its EVM oversight processes.
- In September 2008, we reported that the Department of the Treasury’s EVM policy was not fully consistent with best practices. For example, while the department’s policy addressed some practices, such as establishing clear criteria for which programs are to use EVM, it did not address others, such as requiring and enforcing EVM training. In addition, six programs at Treasury and its bureaus were not consistently implementing practices needed for establishing a comprehensive EVM system. For example, when executing work plans and recording actual costs, a key practice for ensuring that the data resulting from the EVM system are reliable, only two of the six investments that we reviewed incorporated government costs with contractor costs. As a result, we reported that Treasury may not be able to effectively manage its critical programs. In response to our findings and recommendations, Treasury reported that it would release a revised EVM policy and further noted that initiatives to improve EVM-related training were under way.
- In a series of reports and testimonies from September 2004 to June 2009, we reported that the National Oceanic and Atmospheric Administration’s National Polar-orbiting Operational Environmental Satellite System program was likely to overrun its contract at completion on the basis of our analysis of contractor EVM data. Specifically, the program had delayed key milestones and experienced technical issues in the development of key sensors, which we stated would affect cost and schedule estimates. As predicted, in June 2006 the program was restructured, decreasing its complexity, delaying the availability of the first satellite by 3 to 5 years, and increasing its cost estimate from $6.9 billion to $12.5 billion. However, the program has continued to face significant technical and management issues. As of June 2009, launch of the first satellite was delayed by 14 months, and our current projected total cost estimate is approximately $15 billion. We made multiple recommendations to improve this program, including establishing a realistic time frame for revising the cost and schedule baselines, developing plans to mitigate the risk of gaps in satellite continuity, and tracking the program executive committee’s action items from inception to closure.
- establish clear criteria for which programs are to use EVM;
- require programs to comply with the ANSI standard;
- require programs to use a product-oriented structure for defining work products;
- require programs to conduct detailed reviews of expected costs, schedules, and deliverables (called an integrated baseline review);
- require and enforce EVM training;
- define when programs may revise cost and schedule baselines (called rebaselining); and
- require system surveillance—that is, routine validation checks to ensure that major acquisitions are continuing to comply with agency policies and standards.
- Criteria for implementing EVM on all IT major investments: Seven of the eight agencies fully defined criteria for implementing EVM on major IT investments. The agencies with sound policies typically defined “major” investments as those exceeding a certain cost threshold, and, in some cases, agencies defined lower tiers of investments requiring reduced levels of EVM compliance. Veterans Affairs only partially met this key practice because its policy did not clearly state whether programs or major subcomponents of programs (projects and subprojects) had to comply with EVM requirements. According to agency officials, this lack of clarity may cause EVM to be inconsistently applied across the investments. Without an established policy that clearly defines the conditions under which new or ongoing acquisition programs are required to implement EVM, these agencies cannot ensure that EVM is being appropriately applied on their major investments.
- Compliance with the ANSI standard: Seven of the eight agencies required that all work activities performed on major investments be managed by an EVM system that complies with industry standards. One agency, Transportation, partially met this key practice because its policy contained inconsistent criteria for when investments must comply with standards. Specifically, in one section, the policy requires a certain class of investments to adhere to a subset of the ANSI standard; however, in another section, the policy merely states that the investments must comply with general EVM principles. This latter section is vague and could be interpreted in multiple ways, either more broadly or narrowly than the specified subset of the ANSI standard. Without consistent criteria on investment compliance, Transportation may be unable to ensure that the work activities for some of its major investments are establishing sound EVM systems that produce reliable earned value data and provide the basis for informed decision making.
- Standard structure for defining the work products: DOD was the only agency to fully meet this key practice by developing and requiring the use of standard product-oriented work breakdown structures. Four agencies did not meet this key practice, while the other three only partially complied. Of those agencies that partially complied, National Aeronautics and Space Administration (NASA) policy requires mission (or space flight) projects to use a standardized product-oriented work breakdown structure; however, IT projects do not have such a requirement. NASA officials reported that they are working to develop a standard structure for their IT projects; however, they were unable to provide a time frame for completion. Homeland Security and Justice have yet to standardize their product structures.
- Integrated baseline review: All eight agencies required major IT investments to conduct an integrated baseline review to ensure that program baselines fully reflect the scope of work to be performed, key risks, and available resources. For example, DOD required that these reviews occur within 6 months of contract award and after major modifications have taken place, among other things.
- Training requirements: Commerce was the only agency to fully meet this key practice by requiring and enforcing EVM training for all personnel with investment oversight and program management responsibilities. Several of the partially compliant agencies required EVM training for project managers—but did not extend this requirement to other program management personnel or executives with investment oversight responsibilities. Many agencies told us that it would be a significant challenge to require and enforce EVM training for all relevant personnel, especially at the executive level. Instead, most agencies have made voluntary EVM training courses available agencywide. However, without comprehensive EVM training requirements and enforcement, agencies cannot effectively ensure that programs have the appropriate skills to validate and interpret EVM data, and that their executives will be able to make fully informed decisions based on the EVM analysis.
- Rebaselining criteria: Three of the eight agencies fully met this key practice. For example, the Justice policy outlines acceptable reasons for rebaselining, such as when the baseline no longer reflects the current scope of work being performed, and requires investments to explain why their current plans are no longer feasible and to develop realistic cost and schedule estimates for remaining work. Among the five partially compliant agencies, Agriculture and Veterans Affairs provided policies, but in draft form; NASA was in the process of updating its policy to include more detailed criteria for rebaselining; and Homeland Security did not define acceptable reasons but did require an explanation of the root causes for cost and schedule variances and the development of new cost and schedule estimates. In several cases, agencies were unaware of the detailed rebaselining criteria to be included in their EVM policies. Until their policies fully meet this key practice, agencies face an increased risk that their executive managers will make decisions about programs with incomplete information, and that these programs will continue to overrun costs and schedules because their underlying problems have not been identified or addressed.
- System surveillance: All eight agencies required ongoing EVM system surveillance of all programs (and contracts with EVM requirements) to ensure their continued compliance with industry standards. For example, Agriculture required its surveillance teams to submit reports—to the programs and the Chief Information Officer—with documented findings and recommendations regarding compliance. Furthermore, the agency also established a schedule to show when EVM surveillance is expected to take place on each of its programs.
- Agencies’ Key Acquisition Programs Are Using EVM, but Are Not Consistently Implementing Key Practices
- Nine programs did not adequately determine an objective measure of earned value and develop the performance baseline—that is, key practices most appropriately addressed through a comprehensive integrated baseline review, which none of them fully performed. For example, the Air and Space Operations Center—Weapon System program conducted an integrated baseline review in May 2007 to validate one segment of work contained in the baseline; however, the program had not conducted subsequent reviews for the remaining work because doing so would preclude staff from completing their normal work activities. Other reasons cited by the programs for not performing these reviews included the lack of a fully defined scope of work or management’s decision to use ongoing EVM surveillance to satisfy these practices. Without having performed a comprehensive integrated baseline review, programs have not sufficiently evaluated the validity of their baseline plan to determine whether all significant risks contained in the plan have been identified and mitigated, and that the metrics used to measure the progress made on planned work elements are appropriate.
- Four programs did not define the scope of effort using a work breakdown structure. For example, the Veterans Health Information Systems and Technology Architecture—Foundations Modernization program provided a list of its subprograms; however, it did not define the scope of the detailed work elements that comprise each subprogram. Without a work breakdown structure, programs lack a basis for planning the performance baseline and assigning responsibility for that work, both of which are necessary to accomplish a program’s objectives.
- Earned Value Data Show Trends of Cost Overruns and Schedule Slippages on Most Programs
- modify policies governing EVM to ensure that they address the weaknesses that we identified, taking into consideration the criteria used in this report;
- direct key system acquisition programs to implement the EVM practices that address the detailed weaknesses that we identified in appendix II, taking into consideration the criteria used in this report; and
- direct key system acquisition programs to take action to reverse current negative performance trends, as shown in the earned value data, to mitigate the potential cost and schedule overruns.
- In e-mail comments on a draft of the report, officials from the U.S. Department of Agriculture’s Office of the Chief Information Officer stated that the department has begun to address the weaknesses in its EVM policy identified in the report.
- In written comments on a draft of the report, the Secretary of Commerce stated that, regarding the second and third recommendations, the Department of Commerce was pleased that the Decennial Response Integration System was found to have fully implemented all 11 key EVM practices, and that the Field Data Collection Automation program fully implemented six key practices. The department added that its recent actions on the Field Data Collection Automation program should move this program to full compliance with the key EVM practices. Furthermore, regarding the first recommendation, the Secretary stated that while the department understands and appreciates the value of standardized work breakdown structures, it maintained that the development of these work structures should take place at the department’s operating units (e.g., Census Bureau), given the wide diversity of missions and project complexity among these units. As noted in our report, we agree that agencies could develop standard work structures based on the kinds of work being performed by the various component agencies. Therefore, we support these efforts described by the department because they are generally consistent with the intent of our recommendation. Commerce’s comments are printed in appendix III.
- In written comments on a draft of the report, the Department of Defense’s Director of Defense Procurement and Acquisition Policy stated that the department concurred with our recommendations. Among other things, DOD stated that it is essential to maintain the appropriate oversight of acquisition programs, including the use of EVM data to understand program status and anticipate potential problems. DOD’s comments are printed in appendix IV.
- In written comments on a draft of the report, the Department of Justice’s Assistant Attorney General for Administration stated that, after discussion with our office, it was agreed that the second recommendation, related to implementing EVM practices that address identified weakness, was inadvertently directed to the department, and that no response was necessary. We agreed because the case study program reviewed fully met all key EVM practices. The department concurred with the two remaining recommendations related to modifying EVM policies and reversing negative performance trends. Furthermore, the Assistant Attorney General noted that Justice had begun to take steps to improve its use of EVM, such as modifying its policy to require EVM training for all personnel with investment oversight and program management responsibilities. Justice’s comments are printed in appendix V.
- In written comments on a draft of the report, the National Aeronautics and Space Administration’s Deputy Administrator stated that the agency concurred with two recommendations and partially concurred with one recommendation. In particular, the Deputy Administrator agreed that opportunities exist for improving the implementation of EVM, but stated that NASA classifies the projects included in the scope of the audit as space flight projects (not as IT-specific projects), which affects the applicability of the agency’s EVM policies and guidance that were reviewed. We recognize that different classifications of IT exist; however, consistent with other programs included in the audit, the selected NASA projects integrate and rely on various elements of IT. As such, we reviewed both the agency’s space flight and IT-specific guidance. Furthermore, the agency partially concurred with one recommendation because it stated that efforts were either under way or planned that will address the weaknesses we identified. We support the efforts that NASA described in its comments because they are generally consistent with the intent of our recommendation. NASA’s comments are printed in appendix VI.
- In e-mail comments on a draft of the report, the Department of Transportation’s Director of Audit Relations stated that the department is taking immediate steps to modify its policies governing EVM, taking into consideration the criteria used in the draft report.
- In written comments on a draft of the report, the Secretary of Veterans Affairs stated that the Department of Veterans Affairs generally agreed with our conclusions and concurred with our recommendations. Furthermore, the Secretary stated that Veterans Affairs has initiatives under way to address the weaknesses identified in the report. Veterans Affairs’ comments are printed in appendix VII.
Background
EVM Provides Insight on Program Cost and Schedule
Federal Guidance Calls for Using EVM to Improve IT Management
Prior Reviews on Agency Use of EVM to Acquire and Manage IT Systems Have Identified Weaknesses
Agencies’ EVM Policies Are Not Comprehensive
Most Programs Did Not Fully Establish Comprehensive EVM Systems
Many Programs Did Not Fully Implement Practices to Ensure Data Reliability
Most Programs Used Earned Value Data for Decision-making Purposes
Inconsistent Implementation Is Due in Part to Weaknesses in Policy and Lack of Enforcement
Conclusions
Recommendations for Executive Action
Agency Comments and Our Evaluation
Appendix I: Objectives, Scope, and Methodology
Appendix II: Case Studies of Selected Programs’ Implementation of Earned Value Management
Farm Program Modernization
Decennial Response Integration System
Field Data Collection Automation
Air and Space Operations Center—Weapon System
Joint Tactical Radio System—Handheld, Manpack, Small Form Fit
Warfighter Information Network—Tactical
Automated Commercial Environment
Integrated Deepwater System—Common Operational Picture
Western Hemisphere Travel Initiative
Next Generation Identification
James Webb Space Telescope
Juno
Mars Science Laboratory
En Route Automation Modernization
Surveillance and Broadcast System
Veterans Health Information Systems and Technology Architecture—Foundations Modernization
Appendix III: Comments from the Department of Commerce
Appendix IV: Comments from the Department of Defense
Appendix V: Comments from the Department of Justice
Appendix VI: Comments from the National Aeronautics and Space Administration
Appendix VII: Comments from the Department of Veterans Affairs
Appendix VIII: GAO Contact and Staff Acknowledgments
GAO Contact
Staff Acknowledgments
Related GAO Products
Earned Value Management at NASA:
An Integrated, Lightweight Solution
Peter Putz1, David A. Maluf2, David G. Bell1, Mohana M. Gurram1, Jennifer Hsu3, Hemil N. Patel3, Keith J. Swanson2
1Universities Space Research Association
NASA Ames Research Center
Moffett Field, CA 94035
650-604-2137
pputz@riacs.edu
dbell@riacs.edu
mgurram@riacs.edu
2NASA Ames Research Center
Moffett Field, CA 94035
david.a.maluf@nasa.gov
keith.j.swanson@nasa.gov
3QSS Group Inc
NASA Ames Research Center
Moffett Field, CA 94035
jennifer@email.arc.nasa.gov
patel@email.arc.nasa.gov
Abstract—This paper describes a fresh approach to
Earned Value Management (EVM) at the U.S. National
Aeronautics and Space Administration (NASA). The goal
of this approach is to provide a lightweight tool that allows
project managers to apply earned value performance
measurements with minimal effort in terms of data entry,
and without the need to learn the highly specialized jargon
that mystifies many EVM solutions. The presented technical
and managerial solution addresses the practical challenges
of applying EVM in the messy realm of project
management. An empirical case study involving five
projects at the NASA Ames Research Center illustrates the
challenges of creating a consistent performance
measurement baseline under the constraints of schedule,
budget, and labor requirements, and of matching actual
costs with budgeted costs on the level of granularity needed.
The case study also highlights the benefits of using the
implemented EVM solution in terms of data quality and
time savings. The paper concludes with general
recommendations for the design and application of EVM
tools with the focus on ease of use.
TABLE OF CONTENTS
1. INTRODUCTION
2. CREATING A PERFORMANCE BASELINE
3. REPORTING MONTHLY PROGRESS
4. CUSTOMIZING EVM REPORT FORMATS
5. LESSONS LEARNED
6. CONCLUSIONS
REFERENCES
BIOGRAPHY
1. INTRODUCTION
Earned Value Management (EVM) is a management
technique that allows project managers to determine the true
cost and schedule variance between planned and actually accomplished work on a given project at any time. More importantly, it allows management to predict the total costs
at completion and the date of completion. A key benefit of
EVM is that it serves as an early-warning system against
cost overruns and schedule delays. This is the main reason
why NASA and other government agencies—most
prominently the Department of Defense—require projects to
produce EVM reports on a regular basis (NPR 7120.5C
[1]).
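For readers who want to see the arithmetic behind these claims, the core EVM relationships can be sketched in a few lines of Python; the variable names and dollar amounts below are purely illustrative and are not taken from any NASA or GAO program.

# Core EVM quantities for one reporting period (illustrative numbers only)
bcws = 400_000.0    # planned value: Budgeted Cost for Work Scheduled
bcwp = 360_000.0    # earned value: Budgeted Cost for Work Performed
acwp = 420_000.0    # Actual Cost of Work Performed
bac = 2_000_000.0   # Budget At Completion for the whole project

cost_variance = bcwp - acwp        # negative means over cost
schedule_variance = bcwp - bcws    # negative means behind schedule
cpi = bcwp / acwp                  # Cost Performance Index
spi = bcwp / bcws                  # Schedule Performance Index

# A common forecast of total cost assumes the cost efficiency observed so far
# continues for the remaining work.
estimate_at_completion = bac / cpi

print(f"CPI={cpi:.2f}, SPI={spi:.2f}, EAC=${estimate_at_completion:,.0f}")

An index below 1.00 is the early-warning signal mentioned above: the further CPI falls below 1.00, the larger the projected overrun at completion.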
Project managers reacted with skepticism when, in 2004,
NASA’s newly founded Exploration Systems Mission
Directorate (ESMD) mandated that even relatively small
projects with a budget from 1 to 10 million dollars had to
account for their project performance in the form of a highly
specific monthly EVM report. They did not understand the
details of the EVM method and, worse, they were not provided with the necessary tools to integrate their schedule and financial data. After attending multiple-day-long EVM training sessions, most of them ended up either
creating their own set of Excel spreadsheets to manually
calculate the required metrics or struggling with oversized
and rigid software packages. These helped in getting the
core EVM metrics, but project managers were still not able
to automatically produce the customized report formats
required by ESMD.
For some reason the practical difficulties of a highly
specialized language paired with a lack of adequate tools
seem to keep haunting EVM. Fleming and Koppelman [2]
complain: “What started out originally as a simple concept
on the factory floor has evolved into a sort of vocational
cultist confederation in which one must be specifically
trained to use a foreign language in order to be a member of
the team” (p. 73). Fleming and Koppelman continue on a
positive note: “There is nothing difficult or complicated
about the earned value concept. It does not require highly
trained people to grasp the fundamentals. In fact, many
people use the concept in their daily routines and are not
even aware they are employing earned value” (p. 73).
This paper describes a novel solution to creating EVM
reports in a way that is as easy to understand as the earned
value concept itself. After ESMD issued the mandatory
EVM requirement, some project managers found that the
existing NASA Program Management Tool (PMT) [3],
which they were using for project planning and reporting,
already captured most of the financial and schedule data
necessary for EVM reporting. What was missing was the
report itself and some minor changes in the data input
templates. Given this assessment, PMT was used to
implement a full set of EVM reporting capabilities.
PMT is a NASA-developed program management tool suite
that supports program and project managers in all their
essential activities, like creating and monitoring annual task
plans, analyzing variances between budget plans and actual
costs, creating periodic status reports on technical, schedule,
budget, and management status, identifying program risks
and tracking mitigation strategies, creating aggregated
program dashboard views and other customized reports for
single subprojects or the entire program.
The PMT software architecture is built around two
distinguishing features:
(1) The main user interfaces are standard business
documents like spreadsheets, presentation slides, and
text documents. The use of standard business
document templates not only enables automated
exchange of information between machines, but also
supports natural and often ad-hoc on-line and off-line
workflows between humans gathering data or making
decisions.
(2) The backend is a ‘schema-less’ XML (Extensible Markup Language, http://www.w3.org/XML/) database, which
enables easy data integration and query-based
document composition. At the same time it eliminates
the need for database administration by automating the
integration of information that has diverse schemas.
The following graphical representation illustrates the typical
PMT document workflow:
Figure 1 – PMT Document Workflow (documents arrive by drag and drop, email, or web upload; they are decomposed to XML and stored in the XML database; reports are then composed from XML query results)
PMT is an ideal software platform for EVM reporting for
the following reasons:
(1) Seamless integration of heterogeneous and distributed
information: EVM reporting requires the integration of
a variety of data which are partly user entered and
partly retrieved from existing databases like the
financial system, scheduling tools, risk management
systems and so on. The XML database allows adding
new data elements without the need of changing any
predefined schema, in fact, without even touching the
database at all.
(2) Automatic composition of analyses and reports: EVM
reports are often multi-page documents covering not
only core EVM metrics, but also graphical
representations of the work breakdown structure, the
project master schedule, risk and mitigation status and
financial performance. PMT has the capability to
automatically produce comprehensive slide decks or
multi-page spreadsheet documents. The document
composition is built upon an advanced XML query
module.
(3) Easy communication of complex information among
diverse subject matter experts and stakeholders: The
gathering of financial, schedule, and various other
project status data is usually an effort involving the
collaboration of multiple subject matter experts. With
PMT, data entry and report templates can be accessed,
distributed, and archived in a variety of ways. As standard business documents they can be downloaded to a local desktop for off-line access, they can be distributed as email attachments (e.g. to resource analysts who do not even need to be aware that a particular template is a
PMT input form), and they can be archived in third-
party document repositories.
Given those features, PMT seemed to be an ideal software
platform for the implementation of an Earned Value
Management system. The following sections discuss in
detail how users can set up and use PMT for EVM reporting, including empirical results on the system’s use in
practice.
2. CREATING A PERFORMANCE BASELINE
A consistent project baseline is an indispensable
prerequisite for any kind of project performance assessment.
For Earned Value Management a performance baseline has
to include a work breakdown structure (WBS) for the entire
project; a master schedule of work packages with budgeted
costs for each work package; and a time-phased budget plan
for each WBS element.
Figure 2 – Building a Performance Baseline
Creating a work breakdown structure
The work breakdown structure defines the scope of a given
project. A WBS is a hierarchical structure of sub-projects,
sub-tasks, or sub-products. Graphical representations of a
work breakdown structure look very much like an
organizational chart, even though its elements are not
organizational units like groups, divisions, directorates etc.,
but sub-tasks.
In PMT the WBS is entered into a configuration spreadsheet
in the form of a flat list.
Figure 3 – Defining a WBS in PMT
Figure 3 shows the data elements for the WBS structure.
Noteworthy are the columns “Level”, “PMT WBS #”, and
“BW WBS #”. “Level” indicates the hierarchical position of
a particular WBS element, “PMT WBS #” is a string of
characters chosen by a project manager to systematically
name the WBS elements, and “BW WBS #” links each
WBS element to NASA’s financial system. Fortunately the
accounts in NASA’s financial system—or Business
Warehouse (BW)—are structured as a work breakdown
structure including all current space missions and programs.
In most instances this makes it very easy to link a PMT
WBS element to an account in the financial system. We will
describe in section 4 below how this connection between
PMT and the financial system enables an automatic
calculation of actual costs for each WBS element.
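As an illustration only (the WBS numbers, titles, and BW account codes below are invented and not taken from a real NASA project), the flat list can be pictured as rows whose hierarchy is implied by the "Level" column:

# Hypothetical flat WBS list, as it might appear in the configuration spreadsheet
wbs_rows = [
    {"level": 1, "pmt_wbs": "1",     "bw_wbs": "123456.01", "title": "Example Project"},
    {"level": 2, "pmt_wbs": "1.1",   "bw_wbs": "123456.02", "title": "Flight Software"},
    {"level": 2, "pmt_wbs": "1.2",   "bw_wbs": "123456.03", "title": "Ground Systems"},
    {"level": 3, "pmt_wbs": "1.2.1", "bw_wbs": "123456.04", "title": "Data Pipeline"},
]

# The parent of each element is the closest preceding row with a smaller level.
def parent_of(index, rows):
    for j in range(index - 1, -1, -1):
        if rows[j]["level"] < rows[index]["level"]:
            return rows[j]["pmt_wbs"]
    return None

for i, row in enumerate(wbs_rows):
    print(row["pmt_wbs"], "-> parent:", parent_of(i, wbs_rows))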
The users upload the configuration spreadsheet to the PMT
server and the system automatically produces three data
entry spreadsheets per WBS element: (1) a Taskplan for
entering schedule and budget information; (2) a Monthly
Report for entering the progress accomplished on each work
package; and (3) a Budget Report for analyzing cost plans
versus actual costs and for entering subjective cost
estimates.
Entering schedule information
Schedule information for work packages—which are called
deliverables in PMT—is entered into the Taskplan
spreadsheets. Breaking down the total project schedule into
work packages per WBS element facilitates the distributed
collaboration between sub-project managers who are
accountable for their on-time completion.
Figure 4 – Scheduling and Costing Work Packages
Entering time-phased budget information
The Taskplan includes a spreadsheet called “Budget
Section” for entering the time-phased budget plan (or
phasing plan). This again is done on the level of each WBS
element. The phasing plan section is custom designed to
match NASA’s accounting system. A phasing plan is
typically:
(1) broken down into full cost elements (civil service
labor, civil service travel, procurement, etc.)
(2) entered for multiple fiscal years
(3) broken down into the various NASA Centers involved
in the sub-project.
In EVM jargon, the phasing plan total is called “planned
value” or “Budgeted Costs for Work Scheduled (BCWS)”.
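As a sketch of what such a plan looks like (the cost elements and dollar figures below are invented for illustration), the phasing plan can be read as a grid of cost elements by month whose cumulative total at any date is the planned value:

# Hypothetical phasing plan for one WBS element, one fiscal year (dollars per month)
phasing_plan = {
    "civil_service_labor": [30_000] * 12,
    "civil_service_travel": [2_000] * 12,
    "procurement": [10_000, 10_000, 50_000, 10_000] + [5_000] * 8,
}

def planned_value_through(month_index, plan):
    """Cumulative BCWS from the start of the year through the given month (0-based)."""
    return sum(sum(series[:month_index + 1]) for series in plan.values())

print(planned_value_through(2, phasing_plan))  # BCWS at the end of the third month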
Validating the performance baseline
In the context of Earned Value Management it is crucial that
the schedule information and the time-phased budget
information are integrated into a consistent performance
plan. The following graphic shows the connection between
schedule and budget data.
In the (messy) reality of project management it is not a
trivial task to create a consistent performance baseline.
Depending on the particular project environment, the
concrete steps for this planning process may vary
substantially. Some projects might start by planning in
terms of work hours, others in terms of dollar-values; some
projects might optimize their schedule around predefined
delivery deadlines and assign the required workforce
afterwards, others might optimize the schedule for a given
level of workforce utilization and move the work packages
around to accommodate that. Many other variants are
feasible, too.
Figure 5 – Creating a Consistent Performance Baseline
Given the variety of project planning methods, we found it
not helpful to provide the project managers with a one-size-
fits-all planning tool within PMT. Rather, we left it to the
individual project managers to use tools of their own
choice. However, once the final schedule and phasing plan
information is entered, PMT provides a baseline validation
tool.
The baseline validation tool is a spreadsheet that calculates
the Schedule Performance Index (SPI) for all WBS
elements of a given work breakdown structure under the
assumption of the following best-case-scenario:
(1) all work packages are finished on time;
(2) for all periods, the actual costs are equal to the planned
costs.
Under these assumptions the SPI and the Cost Performance Index (CPI) are always 1.00 for consistent performance baselines (in fact, under assumption (2)—budgeted value equals actual cost—CPI and SPI are identical). However, if schedule and budget information do not match up, the SPI is either bigger or smaller than 1.00.
In such a case a project manager needs to go back and
revise the plan. Section 5 below will describe our lessons
learned in this regard.
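The idea behind the validation check can be expressed in a short Python sketch; this is our simplification of the concept, not the PMT implementation, and the month-by-month figures are invented:

# Best-case consistency check for one WBS element.
# planned_value[m]      = cumulative BCWS through month m (from the phasing plan)
# earned_if_on_time[m]  = cumulative budgeted cost of work packages scheduled to
#                         finish by month m (from the Taskplan schedule)
def best_case_spi(planned_value, earned_if_on_time):
    return [e / p if p else None for p, e in zip(planned_value, earned_if_on_time)]

# For a consistent baseline every ratio is 1.00; values far from 1.00 mean the
# schedule for earning value and the time-phased budget do not line up.
spi_by_month = best_case_spi(
    planned_value=[100, 250, 400, 600],
    earned_if_on_time=[80, 250, 450, 600],
)
print(spi_by_month)  # [0.8, 1.0, 1.125, 1.0] -> the first and third months need revision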
3. REPORTING MONTHLY PROGRESS
Earned Value Management is nothing more than comparing, for a given point in time, the planned value (Budgeted Cost for Work Scheduled, BCWS) with two other numbers: the earned value (Budgeted Cost for Work Performed, BCWP) and the actual costs (Actual Cost for Work Performed, ACWP).
With PMT the actual costs are loaded automatically from
NASA’s financial system (see above) and do not require
any manual data input into PMT.
Figure 6 – Entering Percentage Complete
The Earned Value (BCWP) is a dollar value representing
the material progress made on a particular work package.
For each given WBS element it is the sum of the budgeted
costs per work package multiplied by the percentage
complete. As pointed out above, in PMT the budgeted cost
per work package is entered into the schedule section of the
Taskplan. Hence the only missing data point is the
percentage complete per work package. This is done in a
spreadsheet entitled “Monthly Report”.
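In other words (with work packages and figures invented for illustration), the earned value of a WBS element reduces to a weighted sum:

# Hypothetical work packages for one WBS element
work_packages = [
    {"name": "Requirements review", "budgeted_cost": 50_000, "percent_complete": 100},
    {"name": "Prototype build", "budgeted_cost": 120_000, "percent_complete": 75},
    {"name": "Field test", "budgeted_cost": 80_000, "percent_complete": 0},
]

earned_value = sum(wp["budgeted_cost"] * wp["percent_complete"] / 100.0
                   for wp in work_packages)
print(f"BCWP = ${earned_value:,.0f}")  # 50,000 + 90,000 + 0 = $140,000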
Figure 6 shows that the individual work packages are
graphically depicted as timelines. The percentages complete
are updated once a month by the project manager and
entered as numbers directly under the timeline for a given
work package. In the same graphical display start and end
dates for milestones can be moved if necessary.
Note that after the performance baseline is created, at a
minimum all a project manager needs to do for EVM
reporting is to access the Monthly Report and update the
percentage complete!
If the final EVM report contains additional information besides the core EVM metrics, it can also be entered into the “Monthly Report” spreadsheet. For example, the EVM report for NASA’s Exploration Systems Mission Directorate required entering project risks.
4. CUSTOMIZING EVM REPORT FORMATS
Figure 7 shows how PMT integrates the performance
baseline data and the monthly progress data in order to
calculate the core EVM metrics.
Figure 7 – Calculation of Core EVM metrics
The system is built such that an EVM report can be run
against any individual WBS element. By default PMT
calculates the core EVM metrics for the selected WBS
element (e.g. the level 1 project) and for the next level
down. This level is often called Control Account Plan
(CAP), which is simply the management control point
where the performance measurement has to take place. PMT
dynamically rolls up all lower level schedule, budget and
cost information to the CAPs.
Many organizations follow widely accepted standard
formats like the Contract Performance Report, which was
originally mandated for Department of Defense (DoD)
acquisition contracts. However, in some other cases,
executive management wants to see the EVM metrics in the
broader project context including financial reports, risk
reports and other status updates. This is a challenge for
project managers since it requires time consuming and
error-prone manual data manipulation. For this reason PMT
was designed to support the production of highly
customized EVM reports. The XML database and the XML
query protocol used in PMT enable the composition of
customized reports with a minimum of software coding
efforts.
5. LESSONS LEARNED
The EVM reporting capabilities were implemented in 2004
when NASA’s newly-founded Exploration Systems Mission
Directorate mandated earned value project management
even for relatively small-sized research projects. The
software development was done in close cooperation with
five ESMD project managers who provided valuable
feedback and requirements for a system that is easy to use in
a NASA environment.
Inconsistent Baselines
It turned out that challenge number one was the formulation
of a consistent performance baseline. In a small case study
we analyzed the originally submitted baselines that were
produced with a variety of ad-hoc tools—mostly self-
created spreadsheets.
We applied a best-case scenario—assuming that a) all work
packages were delivered exactly on schedule and that b) the
actual costs equal the budgeted costs at any point in time—and calculated the predicted cost and schedule indexes. For a consistent baseline SPI and CPI necessarily need to be 1.00 for all months. However, if the baseline is inconsistent—that is, the schedule for earning value and the time-phased budget do not match—SPI and CPI would be either higher or lower than 1.00.
In accordance with the ESMD EVM format we coded the
SPI/CPI numbers with green (0.9 ≤ index ≤ 1.1), yellow (1.1 ≤ index ≤ 1.2 or 0.8 ≤ index ≤ 0.9), and red (index > 1.2 or index < 0.8) color, indicating the amount of the
variance from the performance baseline. The following
table shows the results:
Figure 8 – Case Study Results
The results indicate that the performance baselines were
highly inconsistent. Even under the best case scenario of the
project being on budget and on schedule at all times:
(1) not a single project would achieve green each month,
meaning not a single project had a consistent baseline
(2) 31.7% of all reported months were in the red or yellow
(3) one particular performance baseline even had 58% of reported months in the red or yellow at the project level.
These results proved the urgent need to implement a
performance validation tool that project managers could use
before they submitted the final plan.
Need for Subjective Cost Estimates
When we showed our EVM pilot implementation to the
NASA project managers, their immediate question was:
Where do you get the actual costs from? When they learned
that we imported the costs directly from NASA’s financial
system, they responded: “Then we can’t use it”.
As it turned out the project managers had good reasons for
their objection. In fact, under certain circumstances the
financial data out of the accounting system are problematic
for performance measurement. For example, when contractors complete a job it can take up to a few weeks until the contractor’s invoice is received, reviewed, paid, and entered into the accounting system. Other examples are
arbitrarily timed assessments of organizational overhead
costs or costs charged to the wrong WBS element within a
project.
In those cases the accounting system is not “wrong” but the
time lag in the accounting data is too great for someone who
wants to do project performance measurement. Therefore,
project managers need a way to “adjust” the accounting
data. With PMT we took the approach of allowing project managers to enter ‘estimated costs’ that replace the system-of-
record data. However, for reasons of data transparency it is
essential that the EVM system keeps the numbers of the
accounting system and the ‘subjective’ cost estimates
logically and visually strictly separated.
Figure 9 – Entering Estimated Costs
The figure above shows the PMT Budget Report. The actual
costs out of NASA’s accounting system are depicted as
thick vertical bars. The subjective cost estimates are added
as thin vertical lines on top of the bars.
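A minimal sketch of this separation (using our own field names rather than the actual PMT data model) keeps both numbers side by side and lets the EVM calculation prefer the manager's estimate only when one has been entered:

# Monthly actuals for one WBS element: the accounting figure is always retained;
# a subjective estimate is stored separately and never overwrites it.
monthly_costs = [
    {"month": "2006-10", "accounting_actual": 95_000, "manager_estimate": None},
    {"month": "2006-11", "accounting_actual": 40_000, "manager_estimate": 110_000},  # invoice lag
]

def acwp_for_reporting(row):
    """Use the subjective estimate if provided, otherwise the system of record."""
    est = row["manager_estimate"]
    return est if est is not None else row["accounting_actual"]

total_acwp = sum(acwp_for_reporting(r) for r in monthly_costs)
print(f"ACWP used for EVM = ${total_acwp:,.0f}")  # 95,000 + 110,000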
6. CONCLUSIONS
The pilot implementation of the described EVM solution at
the NASA Ames Research Center was highly successful in
terms of time savings and in terms of reliability and
consistency of reports across very different projects.
Before project managers started to use the PMT EVM
module it took them between one and two weeks to gather
the relevant schedule and cost data from all the sub-projects,
to manually calculate the EVM metrics, to create the
required charts, and to integrate everything into the final
slide deck. With PMT this time span came down to one to
two days and instead of data gathering and manual data
manipulation the project managers could focus on variance
analysis and on creating actions where necessary.
We feel that our experiences with the presented approach
are encouraging. The positive results are based on the
following factors, which can be seen as general
recommendations for EVM tools:
(1) Use standard business documents as the main user
interface. Project managers live and breathe
spreadsheets and slide sets. A tool that uses standard
business documents significantly increases user
acceptance, reduces the need for training, and allows
for complex off-line workflows.
(2) Provide the capability to automatically produce
custom report formats. Senior management wants to
see high-level project status information in a
customized, easy-to-read format. Since those standards change frequently, an EVM tool should provide the
flexibility to use any output format required.
(3) Minimize the need for manual data entry. Project
managers do not like to enter information twice. A
seamless integration with existing financial and
scheduling systems is highly recommended.
(4) Provide a baseline validation tool. There are two ways
to support project managers in creating a performance
measurement baseline. One is to provide a set of
predefined input templates that automatically produce a consistent baseline. The other way is to have a
baseline validation tool. In a highly heterogeneous
project environment it might be more successful to
leave it to the project managers to choose their own
tools and processes for creating a performance
baseline. However, to ensure that the performance
baseline is self-consistent, a baseline validation tool
has proved highly valuable.
(5) Provide a means for entering “cost estimates”. As
shown above, corporate accounting systems do not provide the right cost information for performance
reporting in all cases. It is frustrating for project
managers to explain performance variances that are
merely undesired accounting artifacts. One approach
to avoid that is to “allow” project managers to enter
“cost estimates” based on their accurate knowledge of
project activities.
REFERENCES
[1] NASA Program and Project Management Processes and
Requirements NPR 7120.5C, Web site:
http://www.hq.nasa.gov/office/codeq/doctree/71205.htm
[2] W. Quentin Fleming and Joel M. Koppelman, “The
Essence and Evolution of Earned Value,” Transactions of
AACE International, 73–79, 1994.
[3] Bell, David G., et al. “The NASA Program Management
Tool: A New Vision in Business Intelligence,” 2006 IEEE
Aerospace Conference Proceedings, March 4-11, 2006.
BIOGRAPHY
Peter Putz is a management scientist with the Research
Institute for Advanced Computer Science (RIACS) at the
NASA Ames Research Center. Previously he was a member
of research staff with the Xerox Palo Alto Research Center
(which is now PARC Inc.) where he was doing research on
learning and knowledge sharing strategies together with an
interdisciplinary group of social scientists in the
Knowledge, Interaction and Practice Area. Peter received
his Ph.D. from the Johannes Kepler University Linz, Austria.
There he was an assistant professor with the Department of
Business Information Systems and the Department of
Management for more than ten years.
David A. Maluf leads the NASA Advanced Exploration
Network laboratory (AEN), a laboratory consisting of 20+
staff with an average of 12 projects/year. He has over 70
technical publications in journals and conference
proceedings, hundreds of presentations at
international conferences and is an inventor of numerous
patents. He has taught courses on system engineering and
databases, and has written two books. He received his PhD
from McGill University and conducted post-doctoral
research in information integration at Stanford University.
David G. Bell is Director and Senior Scientist at the
Research Institute for Advanced Computer Science, located
at the NASA Ames Research Center. Prior to working at
NASA, David worked for ten years at the Xerox Palo Alto
Research Center, and previously held an appointment at
MIT where he led a research program in the Center for
Innovation in Product Development. David is co-inventor
of multiple patented and patent-pending information system
technologies, including XML query technologies related to
NETMARK and the NASA Program Management Tool,
extensible blog technology called Sparrow Web, and
distributed knowledge management software called Eureka.
David received his Ph.D. from Cornell University with a
dissertation on the dynamics of product development
processes.
Mohana M. Gurram is a computer scientist for Universities
Space Research Association at NASA Ames Research
Center. He has worked on the Mars Exploration Rover mission and has received NASA awards such as TIGR and the Honors Award. His area of interest is Data
Visualization especially context-sensitive data. He earned
his Masters in Computer & Information Science.
Jennifer Hsu is a Systems Analyst with QSS Group Inc.
She is a member of the Program Management Tool
Development Team at Advanced Exploration Networks
Laboratory, NASA Ames Research Center. She also worked
on Mars Exploration Rover Mission’s Rover Activity
Planner, MAPGEN project. She received her Ph.D. degree
in Biochemistry from Cornell University.
Hemil N. Patel is a computer scientist with the QSS Group
at the NASA Ames Research Center. Previously he worked
on the Aviation Data Integration System (ADIS) where he
received numerous awards including NASA’s Space Act
Board Award. Hemil has a Master’s Degree in Electrical
Engineering from the University of North Carolina at
Charlotte.
Keith Swanson is a computer scientist in the Advanced
Exploration Networks laboratory of the Intelligent Systems
Division at NASA Ames Research Center. He has over 20
years of technology management and development
experience in the areas of system health management,
planning and scheduling, and knowledge-based systems.
Keith has a Master’s degree in Computer Science from
Stanford University and a Master’s Degree in Engineering
from UC Berkeley.