
GOVERNMENT IT REPORT

Federal Data Center Reductions May Not Shrink Costs

In a drive to reduce government computing costs, federal agencies have been required to cut and consolidate data centers. So far, agencies have closed hundreds of operational centers, but their plans still fall short of a five-year target set when the consolidation program was launched in February 2010.

In addition to the consolidation and closure target, the Office of Management and Budget set a financial savings goal for federal agencies. However, progress toward that goal hasn't been determined, mainly because agencies lack a consistent way to follow the money.

“The 24 agencies participating in the federal data center consolidation initiative have made progress towards the goal to close 40 percent, or 1,253 of the 3,133 total federal data centers, by the end of 2015,” said David Powner, director of information technology management issues at the Government Accountability Office, at a House Oversight and Government Reform Committee hearing in May. Agencies closed 420 data centers by the end of December 2012, and have plans to close an additional 548 to reach 968 by December 2015.

Big Financial Data Gap

While that is an impressive record, it is still short of the OMB target by 285 closures. Also, OMB has not determined how much — or how little — progress the agencies have made in meeting its other goal of achieving US$3 billion in cost savings by 2015, according to the GAO. The reason is that OMB has failed to develop “a consistent and repeatable method for tracking cost savings,” said GAO.

“This lack of information makes it uncertain whether the $3 billion in savings is achievable by the end of 2015. Until OMB tracks and reports on performance measures such as cost savings, it will be limited in its ability to oversee agencies’ progress against key goals,” Powner said at the hearing.

The lack of documentation on consolidation savings also was a major finding in a survey of federal agencies conducted by MeriTalk. The survey, released in May, was sponsored by NetApp.

On a positive note, 60 percent of federal agency respondents to the survey reported better use of IT staff as a result of data consolidation. Another 57 percent reported reduced energy consumption, while 47 percent reported increased use of innovative and more efficient computing platforms and technologies. Nearly 40 percent reported improved IT security.

However, hard data on actual savings remained elusive, at best. Only 32 percent of respondents to the survey reported quantifiable cost savings. The inability to track cost savings is frustrating agency IT leaders in their efforts to encourage cooperation in pursuing more efficient management of data centers, according to the MeriTalk report. Just over half of federal IT professionals are unsure if the cost of closing data centers outweighs the savings their agency will realize.

Building federal managers' confidence in improved data storage and management is important for vendors offering platforms and other resources designed to achieve those efficiencies.

“Agency mission owners want to know what’s in it for them. It is a fair question,” said Mark Weber, president of U.S. public sector at NetApp.

“Data center consolidation benefits are real, but federal CIOs need more and better tools to help draw a clear, direct line between cost savings from consolidation and payback to mission owners,” he said.

“It’s true that several of the measures that commercial data centers employ are difficult for federal data centers to measure, since they do not necessarily have access to information like energy and real estate costs. Still, it’s possible to create a model that estimates savings based on square footage and amount of equipment,” Weber told the E-Commerce Times.
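The square-footage-and-equipment approach Weber describes can be roughed out as a back-of-the-envelope calculation. The sketch below is a hypothetical illustration only; the cost rates and example inputs are placeholder assumptions an agency would replace with its own figures, not numbers from NetApp, OMB or GAO.

```python
# Hypothetical back-of-the-envelope model for estimating consolidation savings
# from square footage and equipment counts. All rates below are placeholder
# assumptions, not figures from GAO, OMB, or NetApp.

COST_PER_SQFT_PER_YEAR = 200.0     # assumed facility cost (power, cooling, real estate)
COST_PER_SERVER_PER_YEAR = 1500.0  # assumed per-server operating cost

def estimate_annual_savings(sqft_closed: float, servers_retired: int,
                            servers_migrated: int, migration_gain: float = 0.3) -> float:
    """Estimate yearly savings from closing floor space and retiring servers.

    Migrated (not retired) servers still incur cost elsewhere, so only an
    assumed fraction of their cost is counted as an efficiency gain.
    """
    facility_savings = sqft_closed * COST_PER_SQFT_PER_YEAR
    retired_savings = servers_retired * COST_PER_SERVER_PER_YEAR
    migrated_savings = servers_migrated * COST_PER_SERVER_PER_YEAR * migration_gain
    return facility_savings + retired_savings + migrated_savings

# Example: closing 10,000 sq ft, retiring 300 servers, migrating 200 more.
print(f"${estimate_annual_savings(10_000, 300, 200):,.0f} per year (illustrative)")
```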

“One other place where we think agencies could quickly and easily measure progress is in the area of storage utilization. Storage accounts for a significant portion of data center costs — nearly 30 percent, according to industry analysts — but many storage systems are only at 20-40 percent utilization,” he pointed out. “That’s a lot of capacity that is sitting unused. We would recommend measuring the current utilization and working to bring that utilization up to a much higher level to create a more efficient infrastructure.”
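The utilization math behind that recommendation is straightforward to check. The minimal sketch below uses the 20-40 percent utilization range Weber cites; the installed capacity and target utilization are assumptions chosen for the example.

```python
# Illustrative storage-utilization check based on the range Weber cites:
# many systems run at only 20-40 percent utilization. The capacity and
# target values here are assumptions for the example only.

def reclaimable_capacity(raw_tb: float, current_util: float, target_util: float) -> float:
    """Terabytes of purchased capacity that could be avoided or repurposed
    if the same data were stored at the target utilization level."""
    data_stored = raw_tb * current_util
    capacity_needed_at_target = data_stored / target_util
    return raw_tb - capacity_needed_at_target

# Example: 1,000 TB purchased, running at 30% utilization, targeting 70%.
print(f"{reclaimable_capacity(1000, 0.30, 0.70):.0f} TB of capacity could be freed")
# -> roughly 571 TB, i.e. more than half the installed storage
```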

2 Departments Show the Way

“We would prefer to have the documentation on cost reductions, but the lack of that information at present doesn’t mean there hasn’t been any saving. The sheer reduction of centers implicitly indicates efficiency, especially in the data storage aspect,” GAO’s Powner told the E-Commerce Times.

Both the Defense Department and the Department of Homeland Security stood out as exceptions that had adequately developed cost-savings data, he noted.

“If those agencies can generate performance information, then others can look to them as models. It may take until 2017 or longer to meet the original 2015 goal, but the goal is achievable,” Powner said.

“The important thing to keep in mind is that you have to do what’s right for the operation, not just think about checking the box on the OMB mandate,” noted NetApp’s Weber.

“If you think in that way, you’ll make real progress. There have been stories about people taking credit for closing data centers when they’re really just decommissioning some equipment in the closet. That’s not really in the spirit of the initiative,” he said.

“Thinking like a business person and picking real measures of success and then taking steps to achieve it is what needs to happen to make significant progress — and that’s in everyone’s reach,” Weber maintained.

The Cloud Connection

Discussions of data center consolidation inevitably lead to a consideration of cloud computing as a tool in the storage and consolidation process — a point that was addressed at the House hearing in May.

“The reality is that cost savings are only part of the picture, and that is why we think that it’s fundamental to clearly link the transition to cloud computing with federal data center consolidation in order to achieve the maximum benefits of federal data center optimization,” Teresa Carlson, vice president of worldwide public sector at Amazon Web Services, said at the hearing.

“One of the main reasons customers are adopting the cloud is that it allows you to trade capital expense for variable expense. Instead of having to lay out all that capital for data centers and servers, before you know how you’re going to use them, you just pay as you consume resources on a variable basis,” AWS said in a statement provided to the E-Commerce Times by spokesperson Rena Lunak.

“Customers love the fact that in the cloud they get to pay a lower variable expense than they could do on their own data center. Government agencies today should continue to look to the cloud as a way to shift expenses from capital investments and move them to innovative new projects,” AWS said.
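The capital-versus-variable-expense trade-off AWS describes can be illustrated with a simple comparison. The figures below are hypothetical placeholders, not AWS pricing or agency data; the point is only that buying for peak demand is paid up front, while variable spending tracks actual use.

```python
# Hypothetical comparison of an upfront capital purchase vs. pay-as-you-go
# cloud spending. Prices and utilization are illustrative assumptions only.

def on_prem_cost(servers_purchased: int, cost_per_server: float,
                 annual_opex_per_server: float, years: int) -> float:
    """Total cost of buying capacity up front, paid whether or not it is used."""
    return servers_purchased * (cost_per_server + annual_opex_per_server * years)

def cloud_cost(avg_servers_used: float, hourly_rate: float, years: int) -> float:
    """Pay only for what is consumed, on a variable basis."""
    return avg_servers_used * hourly_rate * 24 * 365 * years

# Example: buy 100 servers to cover peak demand vs. pay for an average of 40.
capex = on_prem_cost(100, 6000, 1200, years=3)
opex = cloud_cost(40, 0.25, years=3)
print(f"on-prem: ${capex:,.0f}  cloud: ${opex:,.0f}  (illustrative)")
```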

“I’d say that the cloud is very helpful, but it’s not the only answer — and it’s not just servers that you need to think about, of course. When you’re talking about computing resources, it makes a ton of sense to scale up in the cloud, especially when you have varying demands,” NetApp’s Weber said.

“For storage, though, we’d argue that there are very few situations where you need to store less. In that case, it makes sense to have a level of ownership and physical resources,” he added. “Now, whether you need to keep those on your own premises or own a portion of another agency’s resources or house them with a provider — that all depends on your agency’s demands, resources and requirements.”

John K. Higgins is a career business writer, with broad experience writing for a major publisher on a wide range of topics, including energy, finance, the environment and government policy. In his current freelance role, he reports mainly on government information technology issues for ECT News Network.
