Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

2 days of DoD cloud chatter leaves us all continuing to guess

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Defense Department is leaving almost every technology vendor on edge for another few days. It posted a message on the FedBizOpps.gov website saying it would post version 2 of the Joint Enterprise Defense Infrastructure (JEDI) cloud contract sometime this week.

Part of the reason for the delay is DoD received 1,089 comments from 46 vendors, two associations, and three government agencies in response to the first draft solicitation.

While DoD said it wouldn’t be sharing any details of the comments or commenters, that didn’t stop a full court press from Capitol Hill and vendors to influence and change the Pentagon’s plans for a single-source award for cloud services that many estimate could be worth as much as $10 billion over 10 years.

Contractors and former Defense officials have been ringing alarm bells over what many see as an internal preference for the military to move to Amazon Web Services.

These concerns made Thursday and Friday of last week even more interesting when there was a series of non-response responses from Defense leaders, former Defense officials, lawmakers and even Amazon’s head of its global public sector.

Thursday, April 12, 10 a.m.

During a House Armed Services Committee hearing, Rep. Jacky Rosen (D-Nev.) asked Defense Secretary James Mattis about DoD’s plans for the cloud contract.

Rosen: What are the cloud’s implications if we do public and private partnerships, as we — if we move to the cloud, who’s going to own some of that proprietary information? What if some of those private businesses go out of business?

Mattis: The movement to the cloud, congresswoman, is to enhance the availability of the information among us right now. We have to also quickly advance our security. We have over 400 different basic data centers that we have to protect, and we have watched very closely what CIA got in terms of security and service from their movement to the cloud.

It is a fair and open competition for anyone who wants to come in. It’s only two years. If you’ve read something about 10 years in the press, that’s not the case at all.

So it will be a full and open competition. Not sole sourced, by the way, to make certain we don’t fall into just one and I’m very confident that we can get it to your horizon on anyone bidding ought to know with certainty, they will not be folding.

So Mattis reaffirmed to lawmakers that the JEDI procurement will be fair and it’s only for two years. Now that may be a two-year base with several option years, but no matter, Mattis, I’m sure, is hearing the concerns from Congress and industry alike.

April 12, 12 p.m.

The Hudson Institute hosted a panel discussion with two former Defense officials — John Stenbit, who served as the assistant secretary of Defense for command, control, communications and intelligence during the George W. Bush administration, and Stephen Bryen, who served as the director of the Defense Technology Security Administration from 1981 to 1988.

Both Stenbit and Bryen as well as William Schneider, a senior fellow at Hudson and a former staff member in the House and Senate and State Department executive, expressed deep concerns about DoD’s acquisition strategy.

“The DoD has laid down its own standards or guidelines, if you want to call them that, on what it expects the security of the system it will procure should look like,” Bryen said. “Basically, what they’ve done, for the most part, is two things: one, of course, is to make sure the employees who are working in the cloud environment that they proposed are cleared American employees. That, by the way, creates a significant problem in being able to find enough cleared American employees to do the job. I’m not sure they are so readily available so that is definitely a challenge that is out there. The second is to take some of the procedures that are used to procure DoD’s existing computers, servers and equipment and apply that to the cloud. I’m wondering if DoD has such confidence in these standards. There is not a new standard for the cloud. They are just taking what they have in the Security Technical Implementation Guidelines (STIG). Basically, there are about 400 of these and they are massive checklists that you go through and make sure you are in compliance.”

Bryen said it’s unclear how the STIGs would apply to the cloud, and that’s a serious problem because fixing cyber vulnerabilities can require taking a system offline.

Additionally, Bryen questioned DoD’s approach because it’s not clear what or who the backup is if Amazon’s services go down.

“My guess is the backup is actually the existing system, and what they really are trying to do is keep two systems going — a cloud system over here, and the old system here. We already know the old system has a set of problems. We don’t know all the set of problems with new cloud system,” he said. “If you can do denial of service attacks on a cloud, which is one risk, and shut it down, you could shut down DoD if it was only on one [provider.]”

Bryen said keeping the old system online as the backup also would require having skilled and cleared employees run those systems, which adds to the first challenge.

“I think this whole thing is really in need of a lot more study, a lot more investigation and particularly on the security side, which I think what we have is a simplistic approach to security right now that says we can put the old standards to the new system, it will work and everything will be fine. I think that is wishful thinking,” he said. “It seems to me that a much more ambitious effort should be made. I think cloud computing makes sense, but I think it has to be secure computing.”

Stenbit added that he would suggest DoD consult the Defense Science Board, which includes 45 private sector and academic experts who give the Pentagon advice and recommendations.

DoD also created a Defense Innovation Board, which includes private sector experts such as Neil deGrasse Tyson, Eric Schmidt of Google and Marne Levine of Instagram.

What’s even more interesting about the Hudson Institute event is it was sponsored by Oracle. Industry sources say Oracle is aggressively lobbying against DoD’s single source strategy. The software giant may also be driving a wedge across industry as all the large cloud and technology players are paying close attention to JEDI.

Bloomberg reported on April 13 that Oracle “is holding regular calls with tech allies, courting trade and mainstream media and lobbying lawmakers, defense officials and the White House.”

Of course, this wouldn’t be the first time Oracle played the role of aggressor. When the Trump administration released its draft IT modernization strategy in September, Oracle submitted comments that trashed Obama administration efforts to move off of legacy IT.

Friday, April 13

Less than 24 hours later, Teresa Carlson, vice president of worldwide public sector for Amazon Web Services, stood before a packed room in McLean, Virginia, during a Northern Virginia Technology Council (NVTC) breakfast and said nothing about JEDI or the ongoing e-commerce portal effort at the General Services Administration.

But if you read between the lines, Carlson’s points certainly were there to send a message.

“We have a leadership principle called customer obsession. It’s the one thing we think about all the time. It’s the way that listening to our customer has allowed us to move fast in this [public sector] community. We listen and then innovate on behalf of the customer. We don’t go in with preconceived ideas and we are pretty flexible on the way we are actually dealing with them,” Carlson said. “The one thing I’ve told my team from day one is that we are not going to settle to do things lesser than we should be because we are disruptive and we are changing the way our customers are thinking and taking advantage of technology.”

Carlson then went right after the government’s current approach to technology, acquisition and innovation, and maybe even all of those contractors who are pushing back against DoD’s approach.

“When you are creating new technologies — and a lot of people in this room are doing these kinds of things — you can’t settle for old and outdated policy or acquisition legislation. You can’t settle for security controls and modules that don’t meet the needs of our nation anymore,” she said. “It’s important we take a stand and we are proactive in how we are doing that. So we listen, we’ve innovated and we’ve brought those tools available to our customers.”

Later on in the speech, Carlson took another shot at the status quo companies.

She said AWS has dropped its prices more than 65 times since 2006, and increased the number of capabilities provided through the cloud.

“In 2012, we released 160 significant services and features. Fast forward to today, in 2017, we’ve launched over 1,400 new services. Why is that important? That’s important because it shows you with cloud computing how fast you can move, and how our partners and customers can take advantage of that. You don’t have to sacrifice innovation for speed or security. You can have all of those,” Carlson said.

It’s easy to see how all of these facts and figures are direct messages to lawmakers and DoD officials about why Amazon is the right choice.

So what does all this mean for JEDI?

Several industry experts have told me they believe JEDI will never get off the ground in its current incarnation. If DoD goes down the path of a single award, the congressional inquiries and the bid protests will keep this initiative tied up in knots for the next 12-18 months.

The second draft of the RFP expected this week will show whether the pressure from vendors and lawmakers is getting through, or whether the Amazon supporters remain in control.

Read more of the Reporter’s Notebook.


70,000 contractors must get notarized letters in next 60 days to continue working for government

Up to 70,000 federal contractors are heading to their local notary to get that special stamp on a letter that’s destined for the General Services Administration to authenticate the vital details of their business, including who is the authorized “entity administrator associated with the DUNS number.”

These are the first details of the impact on vendors emerging from the latest case of fraud to affect GSA’s System for Award Management (SAM).

A GSA spokeswoman confirmed the agency already received 7,500 notarized letters.

“GSA is making internal business process improvements based on our analysis of the first set of letters received,” the spokeswoman said in an email to Federal News Radio. “We are continuously updating the instructions on SAM.gov to make it easier for entities to be in compliance. Additionally, we have posted templates on SAM.gov for entities to use when submitting their notarized letters.”

GSA is requiring notarized letters from several thousand contractors immediately, and then from any vendor whose existing registration on SAM.gov needs to be updated after April 27.

This all stems from the third incident in the last five years in which a third party either stole or changed contractor data. GSA alerted vendors on March 22 after it found a third party changed the financial information of “a limited number” of contractors registered on the governmentwide SAM.gov portal.

GSA issued initial details of the fraud at that time and then updated the frequently asked questions on April 4.

An internal presentation from April 12, obtained by Federal News Radio, sheds even more light on the impact of the SAM.gov fraud incident.

GSA officials said more than 33,000 contractors needed to confirm a change in their bank account information in the past year. Now this is not to say all 33,000 vendors were potential victims of fraud, but it’s not a stretch that many of them were swept up in this incident as we all know how difficult it is to change bank accounts. I’m not sure anyone would voluntarily change banks.

The GSA spokeswoman wouldn’t confirm how many vendors were victims of this latest fraud, citing an active law enforcement investigation.

It seems vendors are struggling with GSA’s notarization process. Of the 7,500 notarized letters received, GSA processed more than 3,300 and rejected almost 56 percent of them (1,910) for one reason or another.

GSA said it added staff to its Federal Service Desk to support the response and continues to evaluate the overall impact of this fraud incident, including call volume and wait times.

The presentation shows GSA plans to take several other steps to further improve the process. GSA said it modified its process for international entities and partially masked sensitive data elements on SAM.gov.

By the end of June, GSA plans to end the requirement for a notarized letter “by implementing a data-driven, risk-based approach” by combining technical and analytic processes “to reduce risk and focus any additional burden only on those entities with the highest risk profile.”

The goal, GSA said, is to “provide confidence” in that approach so it would deter known fraud paths.

Finally, GSA said by April 30 it would present details to improve the governance of SAM.gov to the joint governance board.

Beyond the impact on contractors and GSA, the SAM.gov modernization effort also now will be delayed.

The presentation said “the combined fraud response will have cost/schedule impacts on modernization,” and GSA will know about the extent of the impact by mid-May.

The GSA spokeswoman said the agency remains committed to ensuring that the existing SAM.gov and the future SAM.gov systems are reliable.

“We are continuing to make progress on the modernization and utilization of beta.SAM.gov. Users may provide feedback on the new beta SAM website at beta.SAM.gov,” the spokeswoman said.

The presentation provides a bit more detail about the beta.SAM.gov initiative.

By the end of May, GSA expects to decommission the current site that hosts the Catalog of Federal Domestic Assistance (CFDA). The CFDA provides a full listing of all federal programs available to state and local governments, tribal entities and public and private organizations.

Also starting in May, GSA plans to begin “alpha testing” the reports, opportunities, federal hierarchy and wage determinations modules of SAM.gov.

GSA has been trying to improve and consolidate the 10 portals that are part of SAM.gov for almost a decade. During that time, it has suffered three incidents, both cyber and fraud. Maybe GSA should consider adding two-factor authentication to SAM.gov, perhaps even the new Login.gov capabilities it added to USAJobs.gov earlier this year, and limit the number of vendors who have to go through the notarization process.



Draft policy pushes interagency committee to better address building security

The Office of Management and Budget missed its own deadline to issue new identity management policy 75 days after the final IT modernization report came out in December.

OMB took 38 days extra. But it was well worth it, according to former federal executives and other identity management experts.

The draft Identity, Credential, and Access Management (ICAM) policy update hits many of the right notes, rescinds five outdated policies and clearly addresses where the future of identity management is heading in the federal government.

“I want to commend OMB for putting this out as a public draft and offering the public the opportunity to comment on and improve it. OMB Policy Memos don’t usually include an opportunity for review and comment, and it’s nice to see them taking this approach,” said Jeremy Grant, the managing director of Technology Business Strategy at Venable and the lead of the Better Identity Coalition. “Just from a ‘housecleaning’ perspective, this is an important memo. OMB has issued a number of different memorandum over the years covering different aspects of identity, some of which have grown a little long in the tooth. The idea of rescinding several old memos and replacing them with a new, overarching policy makes a lot of sense.”

One area where the draft policy hits a home run is finally bringing physical and logical security together.

“Agencies shall require use of the personal identity verification (PIV) credentials as the common means of authentication for federal employee and contractor access to federally-controlled facilities. Agencies shall ensure that use of the PIV credential for physical access to federal buildings are implemented in accordance with The Risk Management Process for Federal Facilities: An Interagency Security Committee Standard and NIST SP 800-116, A Recommendation for the Use of PIV Credentials in Physical Access Control Systems (PACS),” the draft policy states. “This publication provides additional information on the use of PIV credentials, the governmentwide standard identity credential, in physical access control systems.”

Additionally, it requires the Interagency Security Committee to develop a risk management standard for federal facilities and ensure it is aligned with governmentwide policy for PIV implementation.

“The standard defines the criteria and processes that those responsible for the security of a facility should use to determine its facility security level, and provides an integrated, single source of physical security countermeasures,” the draft policy states.

The General Services Administration also will work with the National Institute of Standards and Technology, the Office of Personnel Management and the Homeland Security Department to develop and publish a physical access control system (PACS) security and privacy control overlay to help agencies identify core controls for PACS.

Randy Vanderhoof, the executive director of the Secure Technology Alliance, an industry association that has been involved with federal identity management issues for more than two decades, said he is pleased to see the draft policy call for improved integration of computer and front door access with the PIV card.

“The point raised about the role of DHS to step up its role in the Interagency Security Committee is important. One of shortcomings of this effort is agencies held back more aggressive procurement of approved PACs solutions because the ISC has not really provided much guidance for agencies about how to go about making those procurements work well. They have relied on agencies to manage that process themselves through the RFP process and through the hiring third parties to do security evaluations and make recommendations on designing their needs for PACS,” Vanderhoof said. “It would’ve been helpful for government resources to have offered some guidance and direction on how to do that. This document finally does that by identifying DHS as the one which is responsible for leading that effort under the ISC.”

For much of the past decade, the smart identity cards have been used as fancy flash passes when it came to physical security. Only in the last few years have agencies started to implement physical access security that requires the card to pass through gates.

Vanderhoof said one major issue for the integration of physical and logical security has been the bifurcation of how agencies bought the technology. Under GSA Schedule 70, agencies could purchase logical access control systems, but they had to use Schedule 84 for physical access control systems. He said the two schedules were not aligned properly, which made it more difficult for security officers to know how to get effective help.

Vanderhoof said he hopes when the memo is final it will address many of these physical vs. logical security challenges.

Waiting for the OMB update

The much-anticipated draft memo, which has received no comments in 10 days, also addresses governance, capabilities and emphasizes shared services. OMB is accepting comments on the draft through May 6.

Among the five areas under governance, OMB wants agencies to create a single point to oversee and implement ICAM.

“Designate an integrated ICAM office, team, or other governance structure in support of its Enterprise Risk Management capability that includes personnel from the offices of the chief information officer, chief security officer, human resources, general counsel, senior agency official for privacy and component organizations that manage ICAM programs and capabilities,” the draft states. “These offices, as well as program managers and acquisition offices, should regularly coordinate to ensure that the agency’s ICAM policies, processes and technologies are being implemented, maintained and managed consistently. This includes coordinating the deployment of capabilities and functionality provided through the continuous diagnostics and mitigation (CDM) Program.”

OMB also is pushing for sharing of services and the use of application programming interfaces (APIs).

In the third area, shared services, the draft says agencies should use the credential management services supplied by GSA, rely on CDM to further identity management capabilities and take advantage of shared identity assurance and authentication services, which enhance online trust and safety for citizens.

Joseph Stuntz, the vice president of cybersecurity at One World Identity (OWI) and a former policy lead for OMB’s cyber and national security unit, said all of these actions are needed, but the draft policy doesn’t go far enough.

“The release of this new guidance hopefully leads to a series of agency policy and process updates that create flexibility for departments and agencies to be able to adopt the most modern secure identity technology in order to support broader IT modernization efforts and save resources being spent on legacy solutions,” he said. “Hopefully, this is the start to move federal identity forward in an even more substantial way by looking at HSPD-12. When it was written, it addressed a serious issue and it still addresses many similar issues today. But, by combining physical access, logical access and suitability, it led to an inflexible solution that does not address many current use cases like cloud and mobile. I think OMB and the White House have a chance to capture the momentum created by the update of this policy to lay out plans and next steps to address all three of these important areas in ways that promote innovation instead of restrict it.”

Door opens for third-party credentials

Stuntz brings up the one question in many private sector circles of whether OMB would begin to move agencies away from PIV and toward newer technologies.

But the draft memo didn’t move away from PIV; instead, it reiterated its importance. At the same time, however, OMB opened the door even wider to the use of derived credentials as well as those provided by third parties, including, but not limited to, GSA’s Login.gov platform.

Venable’s Grant said one potential approach to address the challenge of high-assurance systems is the use of the Fast Identity Online (FIDO) Alliance web authentication standard in browsers.

Grant said the inclusion of the FIDO standards “will open up some much easier way for agencies to deliver strong authentication.”

He said the fact that OMB also told NIST to make changes to NIST SP 800-157 — the Guidelines for Derived PIV Credentials — to support some of these newer, innovative approaches to authentication in mobile devices also is a good sign.

The Secure Technology Alliance’s Vanderhoof said moving away from the PIV card would’ve been just too difficult for agencies.

“In the past, policies were focused on government-to-government or federal employee’ usage of federal information systems only, but this is the first memo that I’ve seen where they’ve actually called out the role of business-to-government and consumer-to-government identity and authentication,” he said. “That has always been the next phase of this effort, and this is the first time I’ve seen a document that actually calls out that federal agencies should be looking at other shared service providers to provide those identities. At the same time, OMB also calls out that GSA is responsible to make sure whatever those shared services that agencies use are tested and approved for the meeting of specifications and standards from NIST. Right now there aren’t any so they are putting a path toward that end.”

Grant added that the one place where OMB needs to further clarify the guidance is the role of Login.gov and whether it will support federation with third-party credentials.

“Keep in mind that the sole reason government previously focused on Connect.gov was that it is hard for agencies to integrate with multiple third party identity providers — at its core, Connect.gov was a service to make support for federation easier. So when Connect.gov was killed, that was a pretty strong signal to agencies that federation was dead,” he said. “In this new memo, however, federation requirements are back — and that’s a good thing, in my view.  But it’s unclear how all the pieces come together.  On that note, the memo places a heavy emphasis on use of ‘shared identity assurance and authentication services’ — and notes that ‘agencies should leverage private or public sector shared services.’ But it stops short when it comes to actually directing any agencies outside of GSA to stand up shared services. This may be a missed opportunity. Other agencies are sitting on stores of authoritative attributes that could be used here to assist government with identity vetting for online services. Citizens should be able to ask a government agency that issued them a physical credential to stand behind it in the online world, by validating the information from the credential.”



Rep. Hurd says CDM is a software implementation problem; he’s only partly correct


Rep. Will Hurd (R-Texas) is one of the few members of Congress who actually gets technology. Unlike most, he understands the internet is not a series of tubes.

Hurd, the chairman of the Oversight and Government Reform Subcommittee on IT, said implementing the continuous diagnostics and mitigation (CDM) program is just about installing software into an agency’s systems, so any delays are both unnecessary and unacceptable.

Hurd said he puts the onus on the agencies to make it happen.

“This is absolutely on the agencies side. DHS is doing everything they can to have this resource. When you have an agency partner that has the capabilities and manpower to do it, it’s being implemented. My concern is that in so many agencies you don’t have people that are even capable of handling the basic tools from CDM, which are basic tools in any industry,” Hurd said in an interview during a break in the March 20 hearing on CDM with the House Homeland Security Committee. “That is what I get frustrated with. The first two-years of the program are of no cost to the agency, so this is about implementation. You are running a piece of software, and asking questions of the software and responding to alerts. This is not something that takes and requires a significant amount of manpower, so there is no justification for an agency not to introduce this into their network because you have the funding for the first two years sorted out.”

But Hurd’s comments got me thinking: Is the Homeland Security Department’s signature cybersecurity program really just a software problem? Is Hurd just simplifying what it takes to implement CDM or are agencies making it more complicated than it needs to be?

I checked in with some vendors to get their perspectives on Hurd’s comments.

Pete Morrison, the senior director of sales for security solutions for the U.S. public sector and Canada for CA Technologies, said any large program such as CDM will have its challenges.

“DHS is trying to educate the agencies on how this funding can provide real improvements to the security posture of the departments. Security is complex and DHS may not have anticipated the level of education and training it would have to provide for agency personnel to expeditiously roll out this program,” Morrison said in an email to Federal News Radio. “Cabinet level agencies, especially those with multiple sub-agencies, are not used to working on enterprisewide programs. These agencies will need to manage through that process, which includes competing priorities as well as concerns around out-year maintenance and support of the solutions. DHS can help provide some guidance around how an agency can fund and support the program after the initial two years. This is what has caused delays. More clarity around all of this will help.”

Niels Jensen, a senior vice president of Americas for ForeScout Technologies, said doing the basics around cyber hygiene at the scale of the government creates challenges that were unheard of previously.

“Where Kevin Cox, CDM program manager, originally cited 44 percent of devices found by the program were unknown or unclassified, that number has climbed to 75 percent according to his testimony. By focusing on the (sometimes painful) hygiene of Phase 1, DHS is adhering to best practices established by NIST (800-53) and reinforced by industry best practices (SANS CIS Critical Security Controls),” he said. “The number of unsecured, unmanaged or invisible devices is growing and making it more difficult for departments and agencies to keep pace since an increasing majority of these devices cannot support management and security agents.”

Rep. John Ratcliffe (R-Texas), chairman of the Homeland Security Subcommittee on Cybersecurity and Infrastructure Protection, said it’s clear CDM isn’t moving as fast as anyone would like, but he’s not displeased with the program.

“This is the quintessential problem of matching government bureaucracy with improving technology and trying to match those up to use those in a cost effective and efficient way. I think it’s a struggle that’s probably going to continue,” he said. “That’s why you have hearings like this to bring urgency and focus to that and hopefully change the dynamics of that equation.”

So it seems Hurd is simplifying the challenges around CDM. It’s not “just a software implementation problem.”

But Hurd’s other points about having the people who are trained and capable to implement CDM as well as top leadership support seem to be the real gauge of success.

The Office of Personnel Management is the perfect example. After the massive data breach in 2015, OPM became the first agency to implement phase 1 of CDM, and will complete phase 2 this summer.

“For OPM, this has meant gaining greater insights into our connection points within our network. In addition, OPM has made use of CDM technologies to identify and strategically resolve potential vulnerabilities, which has resulted in better overall risk management and response,” said David Garcia, OPM’s CIO, at the hearing. “The use of CDM has set the stage for OPM to move into a continuous monitoring approach that enhances OPM’s ability to manage its systems and continually to evolve its systems security in real time.”

OPM not only received the funding, but brought in leaders with strong cyber and program management skills such as Cord Chase and Lisa Schlosser.

On the other end of the spectrum is the Energy Department. Max Everett, Energy’s CIO since July 2017, said his agency is behind in implementing CDM because it focused on only a small part of the department.

“We’ve gone back at the direction of our secretary and deputy secretary, and we are looking to cover all of phase 1 and phase 2 for the entire department,” he said. “A number of areas in our department have CDM capabilities, meaning they’ve got tools that do those capabilities that we talk about in the phases, and they may or may not necessarily be the tools that are part of those procurements. Our role right now is filling all of those gaps, and my goal over time is to sunset some of those existing tools, as we can, but integrate all the data in our dashboard that goes back to DHS.”

Everett said they have figured out the cost to fill those gaps and the cost of sustainment, which is about $8 million a year.

“I’m making sure in our out-year budget, we pay for that as a department, because it’s a departmental tool,” he said.

Energy struggled because previous CIOs didn’t have the leadership support, and possibly the internal skill sets, to bring CDM to the national labs and other offices.

Cox, the CDM program manager, told the committees that despite the challenges, the program continues to make significant progress.

He said 90 to 95 percent of all federal assets will be going through the CDM governmentwide dashboard by the end of April. As of mid-March, Cox said, about 25 percent of all federal assets were reporting to DHS’s dashboard.

Cox said DHS is working with the Office of Management and Budget to update the CDM memo from November 2013. He didn’t offer any details on what the update would include.

Additionally, Cox said his office is working with DHS’s Federal Network Resilience office on how phases 3 and 4 can work in parallel, with phase 4 pilots starting this summer. Phase 4 of CDM is focused on protecting the data on the network.

And finally, Cox said DHS is working with the Defense Department, including a meeting in late March, to discuss the concept of implementing a new approach called “comply-to-connect.”

“We want to look across government to see how software-defined networking and zero trust networks could work with comply-to-connect,” Cox said. “We are building that partnership up so we can share back and forth best practices and lessons learned with DoD.”

Ratcliffe said he was pleased with the optimism CDM is bringing to the government.

“The next major step is legislation. That’s why we are having multiple hearings on this issue,” he said. “The plan is to discuss this, and the subcommittee staff and team will work together toward legislation that will help with some of the issues.”

Read more of the Reporter’s Notebook.


Why women in senior positions could be the difference maker for IT modernization

The path to becoming a federal chief information officer is full of zigs and zags.

Sylvia Burns, the Interior Department’s CIO, was almost born for the position. Burns was part of the audio-visual (AV) group in junior high school and high school — one of two women to hang out with the “geeky” boys watching Star Trek and reading authors like Isaac Asimov.

“I loved it just because it was this thing about how the future could be and technology was such a part of it, about bringing civilization to a place where we’d never been and moving the boundaries always,” Burns said.

Now Burns is celebrating her four-year anniversary as Interior’s CIO, making her the longest-serving CIO among the CFO Act agencies — by a few months over Joe Klimavicz at the Justice Department and Swarnali Haldar of the National Archives and Records Administration, both of whom came on later in 2014 and are approaching their four-year anniversaries too.

Maria Roat, the Small Business Administration’s CIO, who was brave enough to undertake the transformation of a broken-down agency and turn it into a leading-edge one from a technology perspective, was introduced to computers in high school, where she worked on a Burroughs mainframe. Roat joined the Navy out of high school and worked on mainframes, key punches and eventually PCs, engineering and global enterprise networks.

Suzette Kent, federal CIO at OMB, presents opening remarks for the Women in Federal IT & Cyber event in the Jefferson Auditorium, U.S. Department of Agriculture, in Washington, D.C., on March 29, 2018. (USDA photo by Tom Witham)

“I’ve held pretty much every job except for programmer, which I really suck at,” she said. “I am a bad coder. I will troubleshoot all day long, but don’t ask me to sit still and write something. Being curious, asking questions, ‘Why not? Why can’t we do that?’ I’ve been able to move around and do different things. I worked all over the place at DHS. Sometimes you are not looking for things, but somebody says, ‘Hey, take a look at this job,’ and that’s how I ended up in many of my jobs.”

And Roat did it without a college degree. In fact, she earned her bachelor’s degree just four years ago.

Burns and Roat are part of a growing community of CIOs, who, one, are impacting the government, and two, happen to be women.

And with the federal CIO and cyber positions staffed by women at a rate of 40 percent, administration officials see a real opportunity to make change.

“We actually are embarking on the largest transformation in the world, and more importantly, results have proven that women are more effective collaborators and what you deliver is actually better,” said Suzette Kent, the federal CIO. “A recent Harvard study found that one of the best ways for companies to boost their ability to transform themselves and their products is by involving women and having a culturally diverse team. Another study looked at Fortune 500 companies and showed teams with equal numbers of men and women were more likely to be creative, share knowledge and fulfill the team’s objectives.”

Kent said these studies only underscore the federal government’s place as well ahead of the private sector when it comes to bringing more diversity into the technology sector.

“Last year, it was announced in the private sector the number of women who held CIO positions declined by 3 percent. That’s pretty sad,” she said. “But I’m happy to report here in the federal government, the number of women in cyber and IT positions is 40 percent. In the CIO Council, that’s a picture of diversity. In the entire team, 35 percent of the leaders are women.”

Kent and other federal CIOs say it’s not just about their journey to how they rose to be technology leaders, but ensuring the pipeline behind them remains strong.

Education Department CIO Jason Gray said his deputy CIO, Ann Kim, plans to hold quarterly get-togethers with the women in the office to discuss challenges and goals as a way to ensure growth of the staff.

“My deputy, who is a woman and phenomenal, led the meeting. It was interesting because, on their own, with no agenda, they decided to use this as a medium to address challenges of the organization,” he said. “I’m looking forward to what they come up with.”

At the Department of Health and Human Services, CIO Beth Killoran said when she first got to the agency, technology was dominated by men: the workforce was 57 percent white male, with an average age of 56.

Now, almost three years later, HHS has a 51-to-49 percent ratio of female to male in technology positions, and the diversity of staff doubled as well.

“We are making sure we are hiring a diverse leader workforce. They understand and value the role diversity plays and are instilling it throughout the entire organization,” Killoran said. “We are showing the value of diversity and hiring the right leaders that respect the value of diversity.”

Additionally, she said the HHS CIO Council is shifting to focus more on the mission instead of the technology. The goal is to drive change from a broader enterprise perspective so the mission folks know why the technology will have an impact.

From her perch as federal CIO, Kent said diversity comes from people.

“What we are doing at the federal level to drive that diversity of thought is the way we are trying to approach design, and including the people who are using the services and the people who are delivering the services in that dialogue,” she said. “An easy way to measure can sometimes be different criteria and counting who is where, but what we are really going after is diversity of perspectives.”

Kent said one example of this is with the President’s Management Agenda’s cross-agency priority goals where OMB sought employees from different agencies to be sponsors of the objectives and members of the associated working groups.

Read more of the Reporter’s Notebook.


Agencies, vendors under increasing pressure to secure their supply chains

It’s always interesting to see how trends emerge in the federal market. Sometimes they come from a policy issued by the Office of Management and Budget — think the cloud-first policy — and other times, they’re driven by a single office or person in the government who believes so strongly in the issue that they almost create the enthusiasm — think the General Services Administration’s excitement over blockchain or robotics.

Let me digress for a second: one vendor told me at last week’s KNOW Identity conference in Washington, D.C., that blockchain has been around for two years or more, so I fully realize GSA, or any federal agency for that matter, was really just catching up to some parts of industry.

But back to the point of the story: agencies and vendors alike should recognize that when an issue comes up at nearly every conference, whether it’s technology, acquisition or financial management, you should pay close attention to it. Supply chain risk management is that topic for now.

Take what happened at a panel on cyber risk management at the KNOW conference, which One World Identity sponsored. Two congressional staff members emphasized the need for vendors to take real responsibility and ensure the safety of their entire supply chains. The message was clear to an audience of traditional and non-traditional federal contractors as well as agency officials: congressional interest in supply chain risk management is only increasing.

“What we are expecting from vendors, at least from our committee’s perspective, is transparency,” said Jessica Wilkerson, a professional staff member for the House Energy and Commerce Committee. “If you look back at some of the things we have been doing recently, one of the biggest ones is software bill of materials, where we are asking the Department of Health and Human Services to convene the health care sector to come up with a way to deploy software bill of materials. This is around the WannaCry incident. This is saying, essentially, if you hand me a black box, you kneecap my ability to protect myself.”

Wilkerson said HHS and the health care sector need to come up with a way to understand the technology code, the source of the code, and how to patch, upgrade and secure that piece of hardware to ensure patients and health care providers are protected.
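The software bill of materials Wilkerson describes is, in practice, a machine-readable inventory of the components inside a product — formats like CycloneDX and SPDX exist for exactly this. As a rough illustration of how an agency might cross-check such an inventory against an advisory feed (the SBOM contents and advisory data below are hypothetical, not from any HHS effort):

```python
import json

# A minimal CycloneDX-style SBOM. The components listed here are
# hypothetical examples, not taken from any real medical device.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "openssl", "version": "1.0.2g"},
    {"type": "library", "name": "zlib", "version": "1.2.11"}
  ]
}
"""

# Hypothetical advisory feed: component name -> versions known to be affected.
advisories = {"openssl": {"1.0.2g"}}

def flag_components(sbom, advisories):
    """Return 'name version' strings for components matching the advisory feed."""
    flagged = []
    for component in sbom.get("components", []):
        if component["version"] in advisories.get(component["name"], set()):
            flagged.append(f'{component["name"]} {component["version"]}')
    return flagged

print(flag_components(json.loads(sbom_json), advisories))  # → ['openssl 1.0.2g']
```

Without the bill of materials — the “black box” Wilkerson warns about — this kind of check is impossible, which is her point about kneecapped defenses.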

Nick Leiserson, the legislative director at the Office of Rep. James Langevin (D-R.I.), said lawmakers can’t sit back and trust vendors to manage their supply chains any longer against threats.

“You can see that is gradually happening in the Defense acquisition rules, the DFARS, in terms of pushing down requirements from the government to say, ‘We need to know about risks in your supply chain too.’ You can’t just look at something and say, ‘This didn’t directly affect the network that is connected to it now, so you are fine and you don’t need to know about this,’” he said. “There is increasing awareness in the federal government and in Congress that this third-party risk is an enormous problem.”

The Federal Communications Commission, last week, also took up the issue of supply chain.

FCC Chairman Ajit Pai issued a proposal asking telecommunications vendors to work with their suppliers to better protect their supply chains.

“Specifically, the draft Notice of Proposed Rulemaking, if adopted, would propose to bar the use of money from the FCC’s Universal Service Fund to purchase equipment or services from companies that pose a national security threat to United States communications networks or the communications supply chain,” the FCC states in its release.

The FCC will vote on this proposed rule at its April 17 meeting.

Pai said in the release that he is proposing to prohibit telecommunications providers from using money they received from the FCC’s $8.5 billion Universal Service Fund to purchase equipment or services from any company that poses a national security threat to the integrity of communications networks or their supply chains.

“The money in the Universal Service Fund comes from fees paid by the American people, and I believe that the FCC has the responsibility to ensure that this money is not spent on equipment or services that  pose a threat to national security,” he said.

The concept of supply chain risk management isn’t new by any means. The National Institute of Standards and Technology issued a report on the topic in 2012. The Senate Armed Services Committee’s 2011 report on the Defense Department supply chain exposed serious problems.

But only in the last six months or so, particularly with the concerns about Kaspersky Lab and now the Chinese companies Huawei Technologies and ZTE Corp., has supply chain risk management become a hot topic.

The Homeland Security Department recently announced a new initiative aimed at identifying some of the cyber defense gaps between the federal government and its contractors.

All signs point to an increased pressure on agencies and vendors to understand, be transparent and protect their supply chains.

Wilkerson said the Energy and Commerce Committee wrote to HHS in November asking what steps it is taking to protect medical equipment.

She said HHS has responded to the committee, which will continue to work with the department.

Leiserson said his boss, Langevin, the ranking member of the Armed Services Committee’s Emerging Threats and Capabilities Subcommittee, is closely watching DoD’s implementation of the DFARS provisions.

“Congress has been saying, ‘We need to have a better understanding of the problem,’” Leiserson said. “Some of the IT modernization reports about shared services is really a great example of how we can look at this and not silo cyber by telling each agency to look at their piece of cyber by itself. It’s also tied to the critical infrastructure piece because there are interdependencies that we don’t understand. We can’t have something that is happening in pipelines hit the power grid and we didn’t know about it.”

Both said while nothing is specifically scheduled, vendors and agencies should expect continued oversight of supply chain.

Josh Moses, the chief of OMB’s cyber and national security unit, may have summed up the focus on supply chain risk management from a whole of government approach.

“There is a much greater responsibility on the part of the agency to have that fundamental understanding before you acquire a particular service from a vendor,” he said. “We are really pressing on that and that has been much of the public disclosure of late, and I would say from our end, expect to see more of that for the next couple of months and years.”

Read more of the Reporter’s Notebook.


GSA’s central contractor website victimized by fraud for second time

Best listening experience is on Chrome, Firefox or Safari.  Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Potentially tens of thousands of government contractors could be impacted by fraud.

The General Services Administration began alerting vendors on March 22 after it found a third-party changed the financial information of “a limited number” of contractors registered on the governmentwide System for Award Management (SAM.gov) portal.

“Entities should contact their federal agency awarding official if they find that payments, which were due their entity from a federal agency, have been paid to a bank account other than the entity’s bank account,” GSA writes in a SAM update on its website.

GSA says it has taken several steps to further limit any attempts to defraud the government by now “requiring an original, signed notarized letter identifying the authorized entity administrator for the entity associated with the DUNS number before a new SAM.gov entity registration will be activated.”

Vendors that haven’t been notified of fraudulent activity should log into SAM.gov and review their information to make sure it’s correct.

This is at least the third time SAM.gov has struggled to keep its information secure. In 2013, SAM.gov potentially exposed users’ information, including some Social Security numbers and bank-account information, to the public because of a cybersecurity vulnerability.

In 2016, the Justice Department unsealed charges against Dwayne C. Hans, a U.S. citizen, who was charged with wire fraud, computer fraud and money laundering. Part of the fraud was breaking into SAM.gov.

“During this unauthorized website intrusion, the defendant changed information in entries pertaining to the financial institution, including by replacing the bank account information for the financial institution with the defendant’s personal bank account information,” Justice stated. “As a result, the Pension Benefit Guarantee Corporation sent more than $1.5 million to the defendant instead of the financial institution. These fraudulent wire transfers were reversed once they were detected.”

Hans pleaded guilty in October and will be sentenced in April.

“In April 2016, Hans accessed a website maintained by the GSA that allowed companies that worked with the U.S. government to provide information about how the government should disburse money to those companies.  Hans modified payment information in an entry associated with Financial Institution 1 in order to redirect payments to accounts he controlled. As a result, a U.S. government agency transferred approximately $1.521 million to Hans instead of to Financial Institution 1,” Justice states. “Those transfers were ultimately detected and disrupted before the defendant withdrew or transferred the money.”

GSA’s inspector general is investigating this most recent fraud.

Additionally, a GSA source says this fraud has garnered attention within the agency at the highest levels, including the formation of a “Tiger Team” across GSA to help deal with the fallout from the fraud. The source says addressing the fraud issues must involve Dun & Bradstreet, which is the entity verification service and has its own process for vendors verifying who they are.

The source said it’s unclear how many vendors were impacted by the fraud. But the new approach to entity verification is a lot of work and a cost GSA was not prepared for.

Sources say GSA executives briefed the Senate Homeland Security and Governmental Affairs Committee on Friday, but it was unclear whether the briefing already was planned or called for because of the SAM.gov fraud issues.

An email to the committee seeking comment on the briefing was not returned.

A GSA spokesman emphasized to Federal News Radio that what happened to SAM.gov was not a cyber or technical breach, but a case of fraud.

But Jeremy Grant, the founder and former director of the National Strategy for Trusted Identities in Cyberspace (NSTIC), housed in the National Institute of Standards and Technology (NIST), and now managing director of technology business strategy at Venable in Washington, D.C., said this certainly sounds like a cyber incident.

“If passwords to the SAM accounts were phished, then that is the definition of a cyber incident. It just happens to be a cyber incident that was used to perpetrate fraud. The fact that money was stolen instead of data does not change the fact that the attack method was based on exploiting weaknesses in the SAM authentication system,” Grant said in an email to Federal News Radio. “Symantec just this week released their 2018 Internet Security Threat Report, which noted that 71 percent of cyberattacks last year began with spear phishing. Given that the SAM system is critical to how government contracts are managed — and how contractors get paid — it’s not surprising to see that SAM accounts would be a target for phishing attacks.”

Grant said any federal website that depends on passwords alone for authentication and protection is destined to be a victim of a cyber attack.

“Passwords are the single most commonly exploited attack vector in cyberspace,” he said. “Given the importance of SAM, GSA should follow NIST guidance (SP 800-63-3) and require use of multi-factor authentication to protect accounts — preferably ‘high assurance strong authentication’ where at least one factor leverages public key cryptography. That doesn’t require full-blown Public Key Infrastructure (PKI) — SAM could follow the lead of firms like Google and Bank of America, and turn on support for FIDO Alliance’s Security Keys that deliver unphishable authentication.”
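To make Grant’s point concrete: even the weakest common second factor, a time-based one-time password (TOTP) of the kind authenticator apps generate, can be implemented from the open standards (RFC 4226 and RFC 6238) in a few lines of standard-library Python. This is an illustrative sketch of the algorithm, not anything from SAM.gov, and TOTP is still phishable — which is why Grant recommends FIDO security keys instead:

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    # HMAC-SHA1 over the big-endian 8-byte counter (RFC 4226)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(key, digits=6, step=30, at=None):
    # Time-based counter (RFC 6238): 30-second steps since the Unix epoch
    now = time.time() if at is None else at
    return hotp(key, int(now // step), digits)

# RFC 6238 test vector: ASCII secret, T=59s, 8 digits -> "94287082"
print(totp(b"12345678901234567890", digits=8, at=59))
```

The server stores the shared secret and compares codes; an attacker who phishes both the password and the current code can still log in during that 30-second window, which is the gap unphishable public-key authenticators close.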

Whether it’s a cyber incident or not, acquisition experts say this is a big deal for contractors.

Christoph Mlinarchik, a government contracts expert and owner of ChristophLLC.com, a consulting firm, said GSA’s temporary fix of requiring a notarized letter shows the agency doesn’t have a real solution for this problem.

“It’s a Band-Aid on a gunshot wound. GSA can say they’re doing something while they figure out how to actually prevent this in the future,” he said. “If you read between the lines, there was a rumor that bad actors breached SAM.gov and re-routed payments to themselves, although the payments were actually owed to contractors from the federal government. GSA seems to have confirmed this rumor. Here’s how it supposedly works. Company X wins a federal contract and performs the work, but the bad actors route the payments owed to their own bank accounts. Meanwhile, Company X and the federal government are clueless as to why Company X hasn’t been paid, not knowing the information in SAM.gov was changed to send payments to the bad actors. A financial scheme worthy of a Hollywood blockbuster!”

Larry Allen, president of Allen Federal Business Partners and a long-time GSA observer, said the fact that someone figured out a way to falsify SAM.gov information should put all vendors on high alert.

Another industry source with knowledge of GSA, but who requested anonymity because they do work for the agency, said while it’s unclear whether this latest problem for SAM.gov is part of the program’s overall troubled history, the attack is part of the problem every agency faces in securing legacy systems.

“SAM is an important system to secure and like a number of systems around the government, starting with OPM’s systems that were breached by the Chinese where before they were breached, they weren’t considered terribly important or strategic. Only when you do the analysis of what can be lost does it become clear how strategic these systems are. There may not be national security plans in OPM or SAM systems, but it does contain information that is important and sensitive like many systems and you have to consider it a target,” said the source. “It goes again to the importance of finding a secure identity management approach. A lot of problems still spring from transitioning from old to new systems and having strong access and identity management during those transitions can help.”

GSA is trying to upgrade SAM.gov, launching beta.sam.gov in 2017. The goal is to create a common acquisition platform where vendors can find all the information they want in one place instead of more than five different sites.

This second case of fraud and third overall problem is part of a long running saga to improve GSA’s acquisition systems. GSA has struggled to improve SAM.gov as part of the Integrated Acquisition Environment (IAE), first hiring IBM and then bringing in Booz Allen Hamilton after its initial effort failed.


Exclusive

Does a new association to promote digital innovation have Amazon’s fingerprints all over it?

Amazon Web Services is taking its ball and going home.

Amazon is unhappy with how its point of view around electronic commerce, cloud computing and digital innovation more broadly is being represented on Capitol Hill and within the executive branch by existing industry associations and organizations. So it’s leading the effort to create a new organization.

Federal News Radio has learned AWS is the driving force behind the soon-to-be-launched Alliance for Digital Innovation (ADI), a new industry organization that many in the federal contractor community believe will be made up primarily of Amazon partners and resellers.

Rich Beutel, a former House Oversight and Government Reform Committee lead staff member and principal behind the Federal IT Acquisition Reform Act (FITARA), at least at the staff level, will lead ADI. Beutel lobbied for AWS for the last three years, earning about $250,000, according to the Senate’s Lobbying Disclosure Database.

Beutel stopped working directly for AWS this month, but did receive a final payment of $20,000 for work done in the first quarter of 2018.

“The Alliance for Digital Innovation (ADI) is a coalition of customer-focused commercial companies helping to shape government IT modernization efforts through advocacy and thought leadership efforts,” according to a one-page brochure obtained by Federal News Radio. “Citizens deserve the same technological experience in their civic lives that they have come to expect in their personal and professional lives, and government mission owners deserve fast and unimpeded access to commercial technologies. ADI members are engaging government leaders and policymakers to advocate for citizen-centric commercial innovation to enable a productive digital experience and faster access to emerging technologies by mission owners.”

While the brochure doesn’t mention AWS by name, the properties of the one-page document show Beutel as the author and “Amazon.com” as the company name — meaning Beutel apparently created the document while lobbying for Amazon, using software registered to Amazon.com.
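The metadata trail described above is easy to reproduce. Office Open XML files (.docx and kin) are zip archives whose `docProps/core.xml` records the author and whose `docProps/app.xml` typically records the company name set in the editing software. A standard-library sketch, using a fabricated stand-in file rather than the actual ADI brochure:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def read_author(docx_bytes):
    """Read the dc:creator (author) field from an OOXML file's core properties.
    Word fills this in automatically, which is how authorship can leak from
    a distributed document. (Company name lives in docProps/app.xml.)"""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as archive:
        root = ET.fromstring(archive.read("docProps/core.xml"))
    creator = root.find("dc:creator", NS)
    return creator.text if creator is not None else None

# Build a minimal stand-in "document" with hypothetical metadata to demonstrate:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as archive:
    archive.writestr(
        "docProps/core.xml",
        '<cp:coreProperties '
        'xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties" '
        'xmlns:dc="http://purl.org/dc/elements/1.1/">'
        "<dc:creator>Jane Lobbyist</dc:creator></cp:coreProperties>",
    )
print(read_author(buf.getvalue()))  # → Jane Lobbyist
```

The same fields are visible without code via a file’s Properties dialog, which is presumably how the brochure’s provenance surfaced.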

Multiple emails to AWS and Beutel seeking comment on ADI were not returned.

Industry sources say ADI is pursuing mostly AWS partners with a goal of showing lawmakers that advocating for a certain legislative or policy position isn’t just coming from Amazon.

Industry observers say the decision to start ADI comes at a key time for AWS with the commercial e-marketplace effort at the General Services Administration and the commercial cloud program at the Defense Department just getting started.

“Amazon doesn’t play well with others,” said one industry source, who requested anonymity in order to talk about a competitor. “Their agenda is very focused on their needs. On a lot of policy issues, vendors have to be collaborative and Amazon doesn’t always take that approach. They want it their way 100 percent and if not, they do a good job in making it uncomfortable for others, and even when they do agree, there are concerns they undercut the agreed upon position.”

Another industry expert said frequently AWS is the outlier and is driving a dynamic of conflicting interests in the narrative.

“My sense of where they are going and what they are talking about is creating a different voice to come in and say other groups don’t speak for [the] government IT sector,” the expert said. “They want to be able to tell Congress, ‘Look at who we are speaking with, and we speak for the sector.’ That is the direction they are moving in.”

Amazon is a member of several existing industry organizations already, including the IT Alliance for Public Sector (ITAPS), the Professional Services Council (PSC), TechNet, the Cloud Security Alliance and the Industry Advisory Council (IAC).

Still, the move to create a new organization or association when there is disagreement over direction with other associations is not unheard of. Sources say the Business Software Alliance emerged from a disagreement among industry more than 20 years ago. Other organizations that haven’t been IT-focused have created sub-organizations to address their sector’s IT challenges and needs as well, including the National Defense Industrial Association and the U.S. Chamber of Commerce.

But to many observers, what is different this time is AWS using its strength in the federal market to extend its already strong standing.

“We understand what they are trying to do is create a voice that is supportive of the AWS perspective. AWS has not found consensus over the last few years with other groups, so as I understand it, they are trying to create a counter voice for some of those things,” said the second industry expert. “Just about everything AWS has pursued as a business opportunity, the e-commerce portal or JEDI cloud at DoD, multiple groups are posing questions to policy stakeholders about how they are being formed. AWS pushed back and said they don’t need to answer those questions or those questions have been answered so let’s move on, or there isn’t a lot of veracity in the answers from government.”

The first industry source said while it’s true to some extent that every company pushes its own positions, few have gone to the extent AWS seems to be heading toward.

“The dynamic is not the same because we are in interesting times. AWS is a dominant cloud and e-commerce marketplace provider with monopolistic power in many ways,” the source said. “That’s creating a different dynamic and they want more, and others are pushing back creating tension. I don’t think the government recognizes that tension yet.”

The one-page brochure promotes the benefits of ADI as education and advocacy.

“ADI identifies and creates opportunities for dialogue between and amongst government leaders and policy stakeholders on the benefits of government adoption of commercial solutions and citizen-centric methodologies,” the document states. “ADI advocates for the enactment of policies and legislation that promote and enhance public sector adoption of commercial solutions that will drive broader digital transformation, innovation, rapid and iterative modernization, and improved security to modernize the delivery of enhanced citizen services.”

It’s unclear how much it will cost to belong to ADI, and whether it will be open to non-AWS partners.

The industry experts were unsure how much of a threat ADI will be to other associations or organizations, and details about the alliance’s plans remain vague.

But the fact that AWS seems to be the driving force behind it likely begins a new competition for attention on Capitol Hill, at a time when the federal government is just gaining momentum for IT modernization.


5 ways the 2018 omnibus promotes IT modernization, cybersecurity

Rep. Will Hurd (R-Texas) said about 10 days before the end of the latest continuing resolution that he was optimistic that congressional appropriators would find some money for the Technology Modernization Fund.

Margaret Weichert, the deputy director for management at the Office of Management and Budget, told the House Oversight and Government Reform Committee the week before that she was hopeful that lawmakers would come through with funding for the governmentwide IT modernization fund.

This is one of those occasions where optimism and hopefulness were not for naught. Of course, those leaders in the federal IT and acquisition communities — OMB, lawmakers, chief information officers, deputy secretaries, acquisition professionals — use those terms because they can’t say the opposite. Nothing gets done if you say the program is doubtful to work, or the contract is unlikely to be awarded anytime soon.

This is why many times the media, contractors and even lawmakers are wary when federal officials regale us with optimism.

So let’s celebrate the optimism of Hurd, Weichert and many others this time, because for once it was genuine.

In the fiscal 2018 omnibus spending bill signed into law Friday by President Donald Trump, Congress funded the TMF with $100 million.

The report language and the bill offered no details and no instructions for how OMB should manage it. Those details were already laid out in the Modernizing Government Technology (MGT) Act, passed in December as part of the Defense authorization bill, and in subsequent OMB guidance, so the absence of new strings is a sign that Congress is optimistic about the fund.

The $100 million is less than half of what the administration asked for, but it’s better than what lawmakers originally allocated: nothing. So it’s a good start for OMB and the TMF Board to prove its processes and move agencies off legacy IT systems.

“I am grateful for my colleagues who recognized not only extensive security risks, but also the tremendous opportunity we have to make government more efficient and reliable to taxpayers,” Hurd said in a release.

Along with the TMF, here are four other budget reasons to be optimistic about IT and acquisition in 2018:

Even more money for governmentwide IT initiatives

While Hurd, OMB and many others focused on the TMF, Congress was generous with several other IT-related funds.

The Federal Citizen Services Fund receives $50 million and may “bank” as much as $100 million to “enhance [the government’s] ability to conduct activities electronically, through the development and implementation of innovative uses of information technology.” This fund includes money for the Electronic Government (E-Gov) fund, and Congress wants continued oversight over how agencies plan to spend money on these projects. In the bill’s report, lawmakers say: “Any deviation from the spending plan required for Electronic Government projects shall require a notification within 30 days to the Committees on Appropriations of the House and Senate.”

The General Services Administration runs the fund and requested $54 million in 2018 after receiving more than $57 million the last two years.

GSA says the citizen services fund will be used to support cloud computing security through the Federal Risk and Authorization Management Program (FedRAMP) as well as portals such as data.gov, the digital analytics program and challenge.gov.

“The fund supports agency-facing programs that drive governmentwide transformation to secure, digital government through shared services, platforms and solutions, and by providing technical expertise to agencies on projects that leverage digital technologies,” GSA wrote in its 2018 budget justification. “Extensive communities of practice in key areas including social media, mobile computing, user experience, prize and challenge competitions, and contact centers serve as a catalyst to drive adoption and improvement of digital services through development and sharing of best practices, training, and establishment of working groups to address tactical needs.”

The IT Oversight and Reform (ITOR) Fund receives $19 million, and OMB “may transfer these funds to one or more other agencies to carry out projects” to ensure the integrated, efficient, secure and effective use of IT. OMB had requested $25 million in 2018, which was almost $5 million less than in 2017.

“OMB will use ITOR funding in FY 2018 to enhance transparency, data collection, analytics, and technical assistance in federal IT investments. ITOR oversight activities will support continued operations and enhancements to the federal IT Dashboard and PortfolioStat reviews, identifying underperforming and duplicative investments and taking corrective actions,” OMB wrote in its 2018 budget justification. “Additionally, ITOR funds will support policy analysis and development efforts to support federal IT reform including FITARA oversight. ITOR funds will also support IT acquisition reform, including IT Category Management to improve the acquisition and management of common IT goods and services to drive us to greater performance, efficiencies and savings. For example, these oversight activities will increase the productivity of IT investments by optimizing and consolidating data centers, continuing the adoption of cloud computing, and increasing the use of intra-agency and interagency shared services.”

OMB also uses ITOR funding to support the U.S. Digital Service and its Cyber and National Security Unit.

Governmentwide cybersecurity awash in money

The Homeland Security Department received $722 million for cybersecurity efforts, which was about $2 million more than requested in 2018.

Specifically, federal cyber programs saw a $45 million reduction in the final appropriation as compared to the request — $432 million instead of $477 million.

Cyber readiness and response. The National Cybersecurity and Communications Integration Center (NCCIC) gets $244 million, including $174 million for the Computer Emergency Response Teams (CERT) and $17 million for training, malware analysis, safety systems vulnerability analysis, incident response and assessments of Industrial Control Systems in emerging sectors and subsectors.

“The NCCIC is directed to continue providing technical assistance to other federal agencies, upon request, on preventing and responding to data breaches involving unauthorized access to personally identifiable information,” lawmakers stated in the report on the DHS section of the bill. “GAO made several recommendations designed to ensure that the NCCIC is adhering to its nine implementing principles under the National Cybersecurity Protection Act. Specifically, the report noted that the NCCIC had yet to determine whether those implementing principles are applicable to its eleven statutory cybersecurity functions and had yet to establish performance metrics for the principles. Not later than 90 days after the date of enactment of this Act, NPPD shall brief the committees on its specific plans to address these GAO recommendations.”

Federal cybersecurity. DHS received $102 million for the continuous diagnostics and mitigation (CDM) program, which is almost $9 million more than requested. Congress wants DHS to use the extra money “to accelerate deployment of CDM to federal departments and agencies. NPPD is directed to provide a briefing to the committees on the current CDM program acquisition strategy and schedule not later than 30 days after the date of enactment of this Act.”

The National Cybersecurity Protection System (NCPS), which protects federal networks and data from cyber intrusions through the EINSTEIN suite of tools, received $287 million. Similar to CDM, Congress wants semiannual briefings on the progress of the program and any obstacles DHS is finding.

Additionally, Congress approved $3 million for DHS and the National Institute of Standards and Technology to conduct pilot programs for regular assessments of advanced protective technologies.

At the same time, Congress also allocated CDM and the NCPS additional money for acquisition efforts.

CDM received $246 million, more than $50 million more than it requested, while the NCPS received $115 million, almost double its request of $56 million.

“The total reflects a realignment of $58 million from Operations and Support for the National Cybersecurity Protection System, as requested,” the bill report stated. “The total includes an additional $61.8 million to support acceleration of CDM capabilities to a broader set of non-CFO Act agencies and to accelerate mobile/cloud computing visibility across the dot-gov domain.”


3 reasons why CIOs will feel more heat in 2018


When Rep. Will Hurd (R-Texas) opened the House Oversight and Government Reform Subcommittee on IT’s hearing about the State of Federal IT last Wednesday, he focused on not losing momentum that built up over the last few years.

From the Office of Management and Budget’s IT modernization strategy to the CIO Council’s State of Federal IT report to Congress passing the Modernizing Government Technology (MGT) Act as part of the Defense authorization bill, agencies have tools and data to continue to swing the pendulum away from unsecured legacy technologies.

In fact, Hurd and OMB Deputy Director for Management Margaret Weichert, who testified at the hearing, both remain optimistic that Congress will fund the Technology Modernization Fund (TMF) for fiscal 2018.

“We have had conversations [with appropriators]. It will get populated. We still are having conversations on where that’s going to be. I feel pretty good we will have something there,” Hurd said.

Weichert said during the hearing that the administration is hopeful that the appropriators will fund the TMF.

The Trump administration requested $228 million for the TMF in 2018 and another $210 million for 2019. OMB recently released details and a memo about how agencies can apply for those funds, should they eventually get approved by lawmakers, and the board overseeing the fund held its first meeting.

At the same time, Hurd’s concerns go beyond the actions of the appropriators. One of the biggest concerns for the chairman of the subcommittee — and one of the most active members in the House when it comes to IT and cybersecurity — is the continued high number of open recommendations from the Government Accountability Office.

And it’s that issue that will turn up the pressure on CIOs and other IT executives for 2018 and beyond.

So with that in mind, here are three reasons why CIOs will feel more IT heat in 2018:

GAO’s deep dive

One of the most interesting things about hearings is when the Government Accountability Office offers a preview of its ongoing work. And it looks as if David Powner, the director of IT management issues at GAO, and his staff will be extra busy this year.

Powner offered insights into at least three major efforts around IT modernization.

One of the most interesting ones is reviewing why some of the most critical and largest IT programs, such as the Federal Aviation Administration’s Next Generation Air Transportation System (NextGen) or the IRS’s CADE 2 effort, continue to struggle.

“We have a review underway where we are identifying and profiling these most critical acquisitions,” he said. “The reason these acquisitions need OMB’s attention is because these agencies left alone haven’t managed them well. The administration’s attention to Veterans Affairs’ electronic health record solution is spot on. We just need more of this.”

History has shown when OMB gets involved in a program, the chances for success are much higher — think of the post-data breach cyber sprint or the Healthcare.gov website.

OMB’s push for TechStat and then PortfolioStat under the Obama administration was in part an answer to these ongoing problematic IT projects.

It’s unclear how the Trump administration is conducting its program and project oversight efforts, but GAO’s review will, once again, make it clear the White House’s attention is required.

A second ongoing review will identify and profile the systems across government that are 30, 40, 50 years old or older.

“The nation’s most mission critical legacy systems that are costly to maintain and pose significant cyber risks due to unsupported software need to be replaced with modern, secure technologies and ultimately decommissioned,” Powner said. “OMB needs to have an active role here to ensure these old systems like VA’s VistA system and IRS’s individual master file have plans to replace and decommission. The administration’s recent modernization strategy was solid on network modernization, shared services and cyber, but light on tackling these most challenging modernization efforts.”

Additionally, GAO says CIOs with short tenures don’t always tackle these legacy systems, which is why OMB’s attention to them is so critical.

GAO did similar work in 2016 for the full committee, finding, for example, that the IRS runs two systems that are 56 years old, the individual master file and the business master file, as well as a 53-year-old system DoD uses to coordinate its nuclear forces, which runs on an IBM Series/1, a 16-bit minicomputer introduced in 1976.

Agencies need a workforce that is properly trained to support all of these modernization efforts. That is the third area GAO is reviewing: cybersecurity and IT workforce gaps, as well as CIO authorities.

Powner said agencies still need to properly identify and tackle these shortcomings.

“Properly addressing many of these needs with contractors is a critical part of this solution here,” he said. “CIO authorities still need to be strengthened despite significant improvements from Federal IT Acquisition Reform Act. Your push to elevate these positions in departments and agencies is still needed. Currently 13 of the 24 CIOs report to the deputy secretary or higher. OMB plays a critical role here, especially with the recent focus on agency reorganizations.”

The White House floated a draft executive order on reemphasizing CIO authorities back in January, but it’s unclear whether that order still is in the works.

Hurd said the subcommittee has received responses from those 11 agencies where the CIO doesn’t report directly to the head or deputy head.

“A lot of the responses were things that, ‘oh, it’s basically already the case.’ Well, if it’s basically already the case, make it the actual case,” Hurd said in an interview after the hearing. “This is a simple fix that goes a long way in ensuring that an agency is making cybersecurity and good system hygiene a priority.”

Hurd said the subcommittee wasn’t familiar with the draft EO.

The Office of Personnel Management and the National Institute of Standards and Technology are working with agencies to identify and recode the IT and cyber roles as required under the Federal Cybersecurity Workforce Assessment Act of 2015. The deadline was December but few, if any, agencies accomplished the goal.

Hurd also reiterated his plans to introduce a bill to establish a U.S. Cyber-Reserves public-private sector rotational workforce.

Red teams on the hunt

Hurd plans to pressure agencies to make better use of the Homeland Security Department’s security architecture reviews and risk and vulnerability assessments, or obtain similar services from contractors to discover and mitigate cyber vulnerabilities.

“When it comes to penetration testing, a passive scan is not a penetration test, and making sure that a good best practice is to use on a regular basis a third-party security folks to come in and do a technical vulnerability or penetration test,” Hurd said after the hearing. “That level of engagement is not happening as much as I previously thought.”

So now the subcommittee plans to send a letter to agencies asking about penetration testing and several other unresolved digital hygiene questions, as Hurd described them.

One reason for Hurd’s letter is to understand more about the inconsistencies around penetration testing.

Jeanette Manfra, the DHS assistant secretary for the Office of Cybersecurity and Communications, said there is no common definition of what people mean by penetration testing.

“Our risk and vulnerability assessments…which is actively going to identify and exploit vulnerabilities,” she said. “We haven’t previously taken statistics on which agencies are using penetration testing. In the last fiscal year, we’ve done 42, and we’ve prioritized high value assets.”

Crystal Jackson, the high value asset program manager at DHS’s Cybersecurity and Communications Office, said on Thursday at the Information Security and Privacy Advisory Board (ISPAB) meeting in Washington, D.C., that over the last two years, DHS conducted 100 assessments, and plans to do another 60 in 2018 alone.

“We are working with agencies to rescope the HVA program within their own agency and how it ties into a bigger construct of the federal enterprise,” Jackson said at the meeting. “We are taking the list of HVAs and delving deeper into the critical assets of all agencies, what they touch, and how they impact the rest of government. We are looking to identify what is the federal government’s risk profile.”

Jackson said the HVA program has shown trends among agencies around problems with network segmentation and patch management, both of which are common hygiene problems Hurd is worried about.

She said agencies need more network rigor so if one part of their system is breached, the hacker can’t jump to another section and maybe steal more valuable information.
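The segmentation idea Jackson describes can be sketched in the abstract: model network zones and an allow-list of permitted flows, then check which zones an intruder could pivot to from a breached one. The zone names and flows below are hypothetical, purely for illustration, not drawn from any agency network.

```python
# Illustrative sketch of network segmentation (hypothetical zones and rules):
# model zones and an allow-list, then check where an attacker could pivot.
from collections import deque

# The only flows the hypothetical policy permits (source zone -> destination zone).
ALLOWED_FLOWS = {
    ("web_dmz", "app_tier"),
    ("app_tier", "database"),
    ("admin_vpn", "app_tier"),
}

def reachable_zones(start: str) -> set:
    """Zones an attacker could pivot to from `start` via allowed flows (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        zone = queue.popleft()
        for src, dst in ALLOWED_FLOWS:
            if src == zone and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return seen - {start}

# A breach of the web DMZ exposes the app tier and, transitively, the database;
# a flat network with no segmentation would expose every zone at once.
print(sorted(reachable_zones("web_dmz")))  # ['app_tier', 'database']
```

The point of the exercise is the one Jackson makes: tighter allow-lists shrink the set of zones a single breach can reach, so the blast radius of any one compromise stays small.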

Manfra said agencies have made a lot of progress post-cyber sprint, reducing the time to patch critical vulnerabilities to 10-to-15 days on average down from 200 days in 2014.

The whole of government approach to understanding the network connections and risks associated with them is a major reason why Hurd and DHS want agencies to do more penetration assessments.

Jackson said DHS tends to focus a lot on the high value assets at the larger CFO Act agencies, which is why it is working with the General Services Administration to create capabilities under the cybersecurity special item number.

“We are working with vendors to offer the same types of assessments using the same methodologies DHS uses,” she said. “We want to get the same kind or more detailed assessments.”

DHS also is establishing a community of interest around high value assessments to share ideas, best practices and address common challenges.

She said DHS and OMB are developing the charter for the COI over the next few months.

“We want to make sure all the agencies have a voice in this program and the ability to share ideas,” Jackson said.

A scorecard evolution

Agency senior leaders should expect the committee to continue to bring up CXOs to find out how they are implementing the Federal IT Acquisition Reform Act (FITARA).

Hurd said he was specifically frustrated with the Defense Department’s lack of transparency over its IT budget in the 2019 funding request to Congress as well as its continued low grades on the FITARA scorecard.

In the November 2017 scorecard, DoD received three “Ds” and two “Fs” across the five areas.

“As Rep. Gerry Connolly said, ‘If the boss doesn’t care, then nobody else will care.’ I’m going to continue, when we do the next FITARA scorecard hearing, to bring in the CIO, the CFO and the deputy agency head,” Hurd said.

OMB’s Weichert added she understands the committee’s frustration over CIOs not having all the authority they need.

“We are looking closely at how we address CIO authorities through the President’s Management Agenda. We are laying out how all components of various authorities will work together and align their efforts while avoiding duplication while also giving maximum capabilities to CIOs. These IT modernization efforts have to include the CFO, the chief procurement officer, the chief human capital officer, as all of them need to be in lock step with the CIO.”

Hurd said he would like to penalize agencies on the scorecard if their CIO is not reporting directly to the agency head or deputy head.

Additionally, Hurd said he still plans to transition the FITARA scorecard to one that focuses on digital hygiene.

Connolly (D-Va.), who co-authored FITARA, said he is supportive of the transition, but not until agencies have made more progress in meeting the requirements under FITARA.

Read more of the Reporter’s Notebook.

