An interagency working group is about to turn the government’s concept of cloud computing on its head.
The Cloud Center of Excellence this week will release a draft best practices guide — please don’t use playbook, it’s so last year — that will give agency contracting officers, chief information officers and CFOs a new way of thinking about buying cloud services.
The guidance tries to bring together seven years of mistakes, successes and false starts around cloud computing.
“It’s about how can we do acquisition faster? How can we identify the opportunities for bulk buys, much like we do for hardware?” said David Bray, chairman of the interagency working group and the chief information officer of the Federal Communications Commission.
Bray said instead of agencies negotiating commodity services such as collaboration or office productivity as one-off procurements, GSA or another agency should develop a vehicle that lets agencies buy “seats” off a governmentwide contract at lower rates.
“The guide will share what the interagency group recommends for OMB to tee up,” he said.
Bray said the center of excellence wants comments on the draft document from government and industry alike.
GSA launched the center of excellence in January with a goal of accelerating agencies’ moves to cloud services by simplifying the contracting and funding approaches. About a dozen agencies are part of the center, drawing on their own experiences in moving simple and complex systems to the cloud.
The new document comes as the White House plans to make a full-court press for agencies to utilize shared services, including the cloud, particularly for cybersecurity. President Donald Trump signed a cyber executive order May 11 with a strong preference for cloud and shared services.
Additionally, the House will vote this week on Rep. Will Hurd’s (R-Texas) Modernizing Government Technology (MGT) Act. Majority Leader Kevin McCarthy (R-Calif.) said on Twitter Sunday night the bill will help usher in a new era of constituent services and “restructures the government’s flawed approach to IT.”
This bill would create in each agency a working capital fund based on the savings from moving to the cloud.
Telling agencies to favor the cloud is one thing; changing the acquisition and funding rules is much more difficult. The cyber EO provides the policy hammer, while MGT would address some of the funding challenges.
But the center of excellence is trying to pull all the pieces together.
Bray, who moved the FCC’s data center to a commercial cloud almost two years ago and now is spending 35 percent less on legacy IT, wants to help agencies move beyond the “low-hanging fruit” that many already have moved to the cloud.
One way to do that would be changing the funding mechanism for cloud.
“As a result, it currently accepts a set of funding mechanisms that cost (at best) the federal government 67 percent more than necessary for those services or routinely accepts risk of anti-deficiency,” the draft guide stated. “The current structure of federal funds systems works directly against the advantages and intended business and value elegances of cloud computing.”
The draft guidance describes the pros and cons of three current approaches.
One approach is to buy cloud services through the GSA schedule using a firm fixed price contract.
A second way agencies are currently buying cloud is through “drawdown” accounts, where an agency obligates a certain amount of money per year and either the agency or the vendor monitors usage; when the account gets below a certain threshold, the department adds more money.
The third approach is the subscription model where the agency pays a monthly bill based on a standard set of services.
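The drawdown mechanic described above is essentially a metered balance with a replenishment trigger. As a very rough sketch (all class names, dollar figures and thresholds here are illustrative, not drawn from any actual agency contract), the logic looks something like this:

```python
# Hypothetical sketch of the "drawdown" funding model: an agency obligates
# a pot of money, metered cloud usage is deducted from it, and the account
# is topped up whenever the balance falls below a set threshold.
# All names and numbers are illustrative only.

class DrawdownAccount:
    def __init__(self, obligated, threshold, replenish_amount):
        self.balance = obligated              # money obligated up front
        self.threshold = threshold            # replenishment trigger point
        self.replenish_amount = replenish_amount
        self.replenishments = 0               # how many top-ups occurred

    def record_usage(self, cost):
        """Deduct metered cloud usage; top up if below the threshold."""
        self.balance -= cost
        if self.balance < self.threshold:
            self.balance += self.replenish_amount
            self.replenishments += 1

account = DrawdownAccount(obligated=100_000, threshold=20_000,
                          replenish_amount=50_000)
for monthly_cost in [30_000, 30_000, 30_000]:
    account.record_usage(monthly_cost)
# Three months at $30K draws the balance down to $10K, which trips the
# threshold and triggers one $50K top-up, leaving $60K.
```

The sketch also shows why the draft guide calls these models incomplete: the agency still pays for a pre-obligated pot rather than only for what it actually consumes.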
“The federal government’s existing methods of buying cloud services (e.g., optional CLINs, drawdown accounts, and subscription models) do not effectively address the problem of demand elasticity and portability. They are ultimately minor variants in contracting structure, business financial process emphasis, or product re-characterizations that help only incrementally by shifting trade-offs without providing complete solutions. None of these methods provide for a complete realization of benefits of cloud computing by providing effective means for the government to both consume and pay only for the resources it needs and uses,” the draft guidance stated. “A potential solution to this might explicitly allow for cloud computing resource units to be treated, including associated oversight risk, like labor hour rates (fixed unit price) in T&M contracts.”
Several agencies are trying to solve this acquisition challenge. The Defense Information Systems Agency (DISA) is following the working capital fund approach where vendors bill against an existing pot of money.
John Hale, DISA’s chief of the cloud portfolio, called it the “Easy Pass” approach where more money is added to the fund if it ever drops below a certain threshold.
But Hale said DISA also is facing the challenge of not being able to directly contract with the Microsofts or Amazons of the world.
“The way DoD is acquiring commercial cloud is through third parties. We have no relationship with Microsoft or Amazon or any cloud provider,” he said. “I have a contract with Lockheed Martin or another integrator and they use other direct costs (ODCs) to buy the services. So Lockheed has the relationship with Amazon, not us. It’s a bad model because when something goes wrong, I can’t call a cloud provider. I have to call the third party.”
Bray said the center of excellence may end up recommending that GSA or another agency create a governmentwide cloud contract.
“What would a software-as-a-service schedule look like and what would feed into that?” Bray said. “Another thing we have seen is the National Institute of Standards and Technology apparently has the prototype ability to express service level agreements (SLAs) in machine code so one of the challenges I want to put forward, and it may take a while to do, but show we can do a new cloud procurement in two weeks or less. If NIST has the ability to express an SLA in code up front and vendors do that, then I as a CIO or another CIO could check for things I’m looking for. We could survey those SLAs that vendors submitted and make sure they are still valid. Vendors could get back to GSA within a week and then you’ve procured cloud services.”
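The machine-readable SLA idea Bray describes could, in a very rough sketch, amount to vendors publishing their SLA terms as structured data that an agency’s tooling filters automatically against its own minimums. The data format, field names and thresholds below are entirely hypothetical, not NIST’s actual prototype:

```python
# Rough sketch of machine-readable SLA matching: vendors declare SLA terms
# as structured records, and an agency filters offers against its own
# requirements automatically. Field names and values are hypothetical.

vendor_slas = [
    {"vendor": "A", "uptime_pct": 99.99, "support_response_hours": 1, "fedramp": True},
    {"vendor": "B", "uptime_pct": 99.5,  "support_response_hours": 4, "fedramp": True},
    {"vendor": "C", "uptime_pct": 99.99, "support_response_hours": 2, "fedramp": False},
]

# An agency's minimum requirements for this procurement.
requirements = {"uptime_pct": 99.9, "support_response_hours": 2, "fedramp": True}

def meets_requirements(sla, req):
    """Check a vendor's declared SLA against the agency's minimums."""
    return (sla["uptime_pct"] >= req["uptime_pct"]
            and sla["support_response_hours"] <= req["support_response_hours"]
            and sla["fedramp"] == req["fedramp"])

qualified = [s["vendor"] for s in vendor_slas
             if meets_requirements(s, requirements)]
# Only vendor A clears all three bars here.
```

The point of the sketch is the time savings Bray alludes to: if SLA terms are comparable data rather than prose, the survey-and-filter step that today takes months of manual review collapses into an automated check.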
Bray said the goal is to get the different players in the space — GSA, NASA and the National Institutes of Health — to rethink how they deliver software-as-a-service.
Another major challenge with cloud is security. While the Federal Risk and Authorization Management Program (FedRAMP) created standards and a baseline of security controls, agencies continue to struggle with reciprocity.
Bray said another agency’s FedRAMP documentation often isn’t as useful as it could be because risk tolerances vary so widely among agencies, so agencies feel compelled to redo a significant portion of the authority to operate (ATO).
“Over the long term, if we have some common cloud services that are available through the GSA schedule or another contract, DHS should be the one who scans it and gives it a governmentwide ATO instead of each agency doing it on their own,” he said. “DHS would test a minimum set of controls, and then the CIO makes the choice if the controls are enough or if they want more.”
If nothing else, Bray said the draft guide should get the community talking and debating how to change the current process that is not working.
“The biggest challenge to rapid cloud adoption with speed with the incumbent contractors is how they are profiting off the on-premise model,” he said. “The challenge is how do we engage the private sector to make the jump from the on-premise model, which is the known, safe way to work with government and make a profit, versus adopt cloud, which is the unknown, potentially disruptive model, which we have to do for the good of the taxpayers, but incumbents may not be as inclined to adopt.”