Accuracy Has Improved, and Additional Efforts Are Under Way to Better Inform Decision Making



Why GAO Did This Study

Each year the federal government spends billions of dollars on information technology (IT) investments. Given the importance of program oversight, the Office of Management and Budget (OMB) established a public website, referred to as the IT Dashboard, that provides detailed information on about 800 federal IT investments, including assessments of actual performance against cost and schedule targets (referred to as ratings). According to OMB, these data are intended to provide both a near-real-time and historical perspective of performance. In the third of a series of Dashboard reviews, GAO was asked to examine the accuracy of the Dashboard’s cost and schedule performance ratings. To do so, GAO compared the performance of eight major investments undergoing development from four agencies with large IT budgets (the Departments of Commerce, the Interior, and State, as well as the General Services Administration) against the corresponding ratings on the Dashboard, and interviewed OMB and agency officials.

What GAO Found

Since GAO’s first report in July 2010, the accuracy of investment ratings has improved because of OMB’s refinement of the Dashboard’s cost and schedule calculations. Most of the Dashboard’s cost and schedule ratings for the eight selected investments were accurate; however, they did not sufficiently emphasize recent performance for informed oversight and decision making.

  • Cost ratings were accurate for four of the investments that GAO reviewed, and schedule ratings were accurate for seven. In general, the number of discrepancies found in GAO’s reviews has decreased. In each case where GAO found rating discrepancies, the Dashboard’s ratings showed poorer performance than GAO’s assessment. Reasons for inaccurate Dashboard ratings included missing or incomplete agency data submissions, erroneous data submissions, and inconsistent investment baseline information. In all cases, the selected agencies found and corrected these inaccuracies in subsequent Dashboard data submissions. Such continued diligence by agencies to report complete and timely data will help ensure that the Dashboard’s performance ratings are accurate. In the case of the General Services Administration, officials did not disclose that performance data on the Dashboard were unreliable for one investment because of an ongoing baseline change. Without proper disclosure of pending baseline changes, OMB and other external oversight bodies may not have the appropriate information needed to make informed decisions.
  • While the Dashboard’s cost and schedule ratings provide a cumulative view of performance, they did not emphasize current performance, which is needed to meet OMB’s goal of reporting near-real-time performance. GAO’s past work has shown cost and schedule performance information from the most recent 6 months to be a reliable benchmark for providing a near-real-time perspective on investment status. By combining recent and historical performance, the Dashboard’s ratings may mask the current status of the investment, especially for lengthy acquisitions. GAO found that this discrepancy between cumulative and current performance ratings was reflected in two of the selected investments. For example, a Department of the Interior investment’s Dashboard cost rating indicated normal performance from December 2010 through March 2011, whereas GAO’s analysis of current performance showed that cost performance needed attention for those months. If fully implemented, OMB’s recent and ongoing changes to the Dashboard, including new cost and schedule rating calculations and updated investment baseline reporting, should address this issue. These Dashboard changes could be important steps toward improving insight into current performance and the utility of the Dashboard for effective executive oversight. GAO plans to evaluate the new version of the Dashboard once it is publicly available in 2012.
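The masking effect described above can be illustrated with a minimal sketch. This is not OMB’s actual Dashboard formula; the monthly figures, the variance calculation, and the rating thresholds are all hypothetical, chosen only to show how a cumulative rating can dilute a recent cost overrun that a 6-month window would surface.

```python
def cost_variance_pct(planned, actual):
    """Percent cost variance over a period: negative means over budget."""
    total_planned = sum(planned)
    total_actual = sum(actual)
    return 100.0 * (total_planned - total_actual) / total_planned

def rating(variance_pct):
    """Hypothetical thresholds, not the Dashboard's real cutoffs."""
    if variance_pct >= -5:
        return "normal"
    if variance_pct >= -10:
        return "needs attention"
    return "significant concerns"

# 24 months of hypothetical data: on budget for 18 months,
# then a 10% monthly overrun during the most recent 6 months.
planned = [100] * 24
actual = [100] * 18 + [110] * 6

cumulative = rating(cost_variance_pct(planned, actual))
recent = rating(cost_variance_pct(planned[-6:], actual[-6:]))

print(cumulative)  # the overrun is diluted across all 24 months
print(recent)      # the 6-month window reflects current status
```

Under these assumed numbers, the cumulative variance is only -2.5% ("normal") while the 6-month variance is -10% ("needs attention"), mirroring the Interior investment example: the longer the acquisition's history, the more a cumulative rating can hide recent slippage.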