Federal data center optimization an ‘evolution’ to cloud computing

FedRAMP isn’t a silver bullet for government cloud computing, but the Office of Management and Budget’s latest data center initiative might as well be an on-ramp.

Federal enterprise leaders urged an audience of private sector and government employees, gathered at the July 26 ATARC Federal Cloud Computing Summit in Washington, not to be afraid of the Data Center Optimization Initiative (DCOI), but instead to embrace it for its promotion of cloud services.

“I think you’re going to hear any time we utter ‘data center optimization’ it’s always going to be ‘… and the subsequent move to the cloud,'” said Dominic Sale, deputy associate administrator for information integrity and access at the General Services Administration. “I tell my team this is the biggest on-ramp to the cloud since FedRAMP was set up. I think this is going to be another key component of that cross-agency cloud initiative.”

While still a draft policy, the DCOI establishes a framework for “data center consolidation and optimization,” as required by the Federal Information Technology Acquisition Reform Act, according to the memo.

The DCOI is based on a four-part strategy, Sale said: freezing the current footprint by halting future data center expansions, closing existing data centers, moving to the cloud and promoting shared services.

“Those last two kind of speak to the data center marketplace, shared services and industry providers, and how do we bring agencies closer to those and bring the tools to bear to make their jobs easier, because data center owners have a very tough job ahead of them here,” Sale said. “I think data center optimization is a great way, kind of an enticement, to get on the cloud.”

OMB reported in November that agencies had more than 11,700 data centers, up from 9,000 in 2014 and 1,100 in 2009 when the administration initially set the goal to reduce this infrastructure.

The DCOI objective is three-fold, Sale said: cost reduction through efficiency gains, cost reduction through the elimination of facilities, and sustainability.

“Just the focus, just the wording of the initiative itself, optimization versus consolidation, speaks to the nuances required to make this successful,” Sale said. “This is not just moving servers around on the chessboard. It’s all about carefully choosing what we’re measuring here and then really holding agencies, holding ourselves to account on hitting those measures.”

Sale said it’s very important to engage with industry to get better visibility into the market, and that at the end of the day, data center optimization is about “cleaning out the closet, going through every item and figuring out what to do with it. The unit of analysis here is the application, not the data center.”

Sale said he does not see a future — at least not in the next decade — where data centers are non-existent, but that doesn’t mean there won’t be changes.

“The whole ecosystem is going to need to adapt and change and everyone’s going to need to adapt those skillsets,” Sale said. “This is a change management, HR effort. We should think about all the things that we need from an IT perspective and retrain our folks, versus think of them as obsolete when a data center closes. How can we adapt them to the future of IT within the government?”

Melonie Parker-Hill, division chief at the State Department’s Enterprise Operations Center, echoed those sentiments, saying it was unnecessary to “throw the baby out with the bathwater” as agencies make advancements with data center consolidations.

“Actually look at retraining staff members, retraining ourselves,” Parker-Hill said. “One of the things here at the Department of State, we have to get the buy-in from stakeholders … we have to be forward thinking.”

Bang for the buck

John Hale, chief of the Defense Information Systems Agency (DISA) cloud portfolio, also acknowledged the need to embrace change.

“With mobile … being the number one technology disrupter in IT, what we’re doing from a cloud computing perspective is really facilitating all of that information sharing, which our mission partners need,” Hale said. “We’re still pushing through all of the issues related to policy and security and how we do cloud computing at the enterprise level. There is no silver bullet when it comes to the cloud.”

When it comes to DoD, there’s a three-pillar plan for how to move forward with cloud computing.

The first pillar relates to traditional computing, Hale said, because there is still work that can’t be modernized or there is no budget to update it. The second covers work Defense doesn’t feel comfortable holding in a commercial cloud, such as nuclear capabilities, which Hale said the department would prefer having “on our concrete.”

The third pillar is commercial off-premises private cloud, Hale said, and “that’s really where we see the biggest bang for our buck.”

“We have to facilitate cost savings, reduction of overhead within the department,” Hale said. “Our budget, much like the rest of the federal government’s budget, continues to shrink every year. If we don’t take the right steps now in order to facilitate cost savings, we’re going to be in big trouble here in a couple of years.”