Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

GSA-OPM set March 2019 timeline to complete initial merger


September will mark 13 years since the General Services Administration merged the Federal Technology Service and the Federal Supply Service to create the Federal Acquisition Service.

It’s a good time to remember what GSA went through in 2005 because once again, the agency is going down a similar path of trying to bring employees, their cultures and the services together. GSA is implementing the Trump administration’s proposal to bring the HR Solutions office at the Office of Personnel Management into FAS.

Back in 2005, GSA hired a contractor to help with the merger, chose leaders of the organizations to develop and implement a plan, received congressional approval and then completed the blending of the two organizations.

Today, GSA is following a comparable playbook. The agency first named Mary Davie, the deputy commissioner of FAS, to lead the planning effort. Now, GSA is looking for vendor help. It released a request for information last week seeking feedback across five broad areas.

“This is a very ‘GSA’ way of going about things to hire contractors and get recommendations,” said one former OPM official, who also is familiar with GSA and who requested anonymity because they did not get permission to speak to the press. “Part of the reason for needing help is OPM doesn’t have the acumen to support this merger.”

The RFI also gives the first timeline for the merger. GSA said it expects the vendor to complete its work, which includes the rebadging of 462 employees in HRS and other actions, by March 30, 2019, to prepare for operations day one, which is scheduled for later in the spring. By 2020, GSA said it expects to complete the full merger.

“Contractor support for this effort is envisioned as an extension of the GSA/OPM task force, to support the transition and transformation activities,” the RFI states. “The scope of this initial effort includes: Supporting the planning and execution of the initial transition of the HRS organization and personnel into GSA; optimizing services and costs, improving alignment with GSA offices and functions and identifying opportunities to reduce duplication and overlap; and identifying further opportunities to transform the delivery of services that today comprise HRS and other, related services currently offered today by OPM and GSA.”

Responses to the RFI are due Aug. 17, and GSA said it expects to release a final solicitation before Sept. 30.

OPM-GSA merger makes sense

The eventual contractor will have its work cut out for it, as former OPM and GSA officials say that, like any merger, there are culture and process obstacles that will need to be addressed.

“I think that contractors are very good [at] being objective listeners and facilitators of communications, but internally people have to want to listen,” the source said. “A contractor is not going to change what a federal employee has been living with for years and years. That will have to be driven internally by leadership. They will have to believe in it.”

A government source with knowledge of OPM said moving some parts of HRS into GSA makes sense for several reasons. The source, who requested anonymity because their current agency didn’t give them permission to talk to the press, said OPM already transferred the contracting capabilities of HRS to GSA to run the $11.5 billion Human Capital and Training Solutions (HCATs) multiple award contract. In fact, as a bit of an aside, it’s interesting that GSA and OPM also announced last week they were reducing the fee under HCATs to 0.75 percent from 2 percent as a way to make the contract more competitive and less costly than the schedules program.

Without a doubt, over the last two years HCATs has not lived up to expectations. So dropping the fee is an attempt to make the contract more attractive over the last two months of 2018 and into 2019.
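For a rough sense of what the fee change means in practice, here is a minimal sketch. The $1 million task order value is a hypothetical; only the 2 percent and 0.75 percent rates come from the announcement described above.

# Rough illustration of the HCATs fee reduction described above.
# The $1 million task order value is hypothetical; only the fee
# rates (2 percent old, 0.75 percent new) come from the article.

def contract_access_fee(order_value: float, fee_rate: float) -> float:
    """Return the fee collected on a task order at the given rate."""
    return order_value * fee_rate

order_value = 1_000_000  # hypothetical task order
old_fee = contract_access_fee(order_value, 0.02)    # $20,000
new_fee = contract_access_fee(order_value, 0.0075)  # $7,500

print(f"Old fee: ${old_fee:,.0f}")
print(f"New fee: ${new_fee:,.0f}")
print(f"Savings: ${old_fee - new_fee:,.0f}")  # $12,500 per $1 million ordered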

The source said it also makes sense for GSA to take over the shared services that HRS provides, such as USAJobs.gov, USAStaffing.gov and other related systems.

“Probably the hardest overall issue will be figuring out the cost recovery model. In the past, the training and management assistance (TMA) covered a lot of the costs for the rest of the HRS organization. When TMA integrated with GSA [through HCATs] that model was not sustainable in setting fees,” the government source said. “My guess is the other HRS programs probably have questionable financials now. GSA will likely consider each of these HRS components on a standalone basis and that will likely lead to a hard look at staffing and making investment decisions based on profitability versus an OPM culture that prioritizes HR programs, even unprofitable ones, and carrying people, even unproductive ones.”

At the same time, however, figuring out where HRS’s traditional training efforts, such as the Federal Executive Institute, or its fee-based consulting services for human resources strategies and development would fit inside GSA is not as clear.

“One of the things they are trying to do is make a case for the value proposition for why they would do this, so it will help to have a third party do that work,” the source said. “What’s interesting to me is there have been opportunities for OPM and GSA to partner before. But it was not until HCATs that they made it work. OPM really takes an OPM-centric view and in the past didn’t see themselves as a shared services provider, but rather as doing their mission. In a sense, they weren’t excited for shared services and may have missed some opportunities early on. Now that the merger is actually happening, GSA and OPM will have to deal with the culture change that comes with any big change like this one.”

OPM heading toward a culture clash?

The first source agreed that OPM is a unique place compared to GSA.

“GSA has a much more results-oriented approach to delivering programs than OPM,” the source said. “Overall, especially in parts of HRS that use contractors, this will be a much better way to do business. I think especially in intermediate and senior level staff the move to GSA will be a bit of a shock and they will have challenges adapting to GSA’s culture.”

One way GSA is trying to address these culture challenges is by bringing in David Vargas, the former OPM director of the HR Line of Business and acting chief information officer and deputy CIO.

One industry expert, who also requested anonymity because their company does business with both GSA and OPM, said the merger is concerning for several reasons.

First, by adding HRS from OPM, GSA becomes that much bigger of an agency and will play a larger role in other agencies’ day-to-day operations, which raises the question of what impact those changes will have across government.

“The government needs to focus on decentralizing and distributing key OPM authorities out to the agencies,” the source said. “The track record of federal HR shared services is poor and has increased costs and added layers of inefficiency instead of the other way around – how does this help? What’s the business case for it? What does this do to free agencies to access better technologies to improve talent acquisition and talent management business processes and outputs inside the federal hiring agency?”



Oracle, DoD face-off over cloud contract rests on single award rationale

If nothing else, Oracle Corp. has kept the federal community entertained over the last few years. First, the software giant submitted a stingingly candid commentary on the Obama administration’s IT modernization efforts, detailing three false narratives perpetuated over the last decade.

Now, Oracle offered a biting rejection of the Defense Department’s Joint Enterprise Defense Infrastructure (JEDI) cloud contract just 11 days after the Pentagon released the final solicitation.

Oracle’s protest of DoD’s decision to make JEDI a single-source, single-award contract is almost as aggressive a takedown as the company’s IT modernization comments were.

“This anti-competitive RFP violates law and regulation, and creates significant risk that DoD will award a 10-year, $10 billion contract to a company that will not offer the best value for all the potential JEDI Cloud user’s current and future cloud service needs,” Oracle’s lawyers write.  “The DoD determination and findings oddly intimates that DoD will receive proposals for firm fixed prices to meet DoD’s future, unarticulated tactical cloud computing needs (classified and unclassified) for the next 10 years and today can determine the single best value cloud computing technological leader over the next 10 years when some — if not most — of the impactful technology has yet to be developed.”

While the Government Accountability Office reviews and analyzes the JEDI case under a November deadline, we asked three federal acquisition lawyers for their take on the case. None are associated with Oracle or the case, and all are offering their opinions based on Oracle’s public filings.

Federal News Radio talked to Steve Schooner, Nash & Cibinic professor of Government Procurement Law and co-director of the Government Procurement Law Program at the George Washington University. FNR also spoke with Antonio Franco, a partner with PilieroMazza PLLC, and Charles Tiefer, a professor of Government Contracting at the University of Baltimore Law School.

Of Oracle’s three arguments about why the RFP is flawed, which is strongest?

Franco: Oracle’s strongest argument is that the Department of Defense has not established that a single award is appropriate for such a large, long term contract. With multiple award contracts offering more options to accommodate technological innovations, competition among multiple contractors seems more appropriate under the FAR. Putting aside questions about the rationale for the large single award, Oracle also raises legitimate questions about compliance with the FAR.

Tiefer: Oracle’s strongest argument is that Defense’s restriction of JEDI to a single awardee violates the policy of both Congress, by the Federal Acquisition Streamlining Act (FASA) law, and the FAR regulation, to compete task and delivery orders.  Congress, by the FASA law, and the FAR regulation do not intend that there be only one competition at the beginning, and then monopoly award of delivery orders, in a 10-year contract for billions of dollars of services.

The JEDI contract concerns services that are very rapidly changing and evolving, and Defense just doesn’t justify skipping competitions over task orders for the next decade. It violates common sense to set up what is basically a monopoly contract for such services.

Schooner: If GAO determines that the contracting officer’s underlying determinations and findings was defective, that’s a much easier case. Maybe my favorite/best technical/micro-level argument is the lack of finite scope criticism. The RFP does not purport to identify the “specific tasks to be performed” in contract year one, much less over the 10-year period of performance. In other words, there is no baseline for apples-to-apples comparison.

Oracle writes, “The RFP neither establishes the prices that DoD will use across the contract term nor identifies the specific tasks to be performed. Both the pricing and the cloud service offerings are dynamic. The same is true of the commercial marketplace of third party software DoD seeks to access.” That’s all part and parcel of the “RFP does not provide a reasonable basis to assess the relative price to DoD of making a single award …” Oracle writes.

Did Oracle make a good enough case to show that they were actually prejudiced enough by the RFP?

Tiefer: Oracle made a convincing case that they are actually prejudiced. They show that the only way DoD could find a loophole to make a single award was to find the unbelievable. Namely, Defense would have to credibly set out now the details for competing on a firm fixed price for all future task orders. But, Oracle cannot be obliged to set genuine firm fixed price bids for the highly diverse future task orders, impossible to anticipate this far in advance, for which the details are, of course, quite unavailable.

Franco: Oracle has made a credible claim of prejudice as the GAO does resolve doubts about prejudice in favor of the protester when an agency allegedly violates procurement regulations. Oracle has raised legitimate issues for the GAO’s review. If the GAO ultimately finds that no procurement regulations have been violated, the GAO can decide the case on the merits. Finding that Oracle has not been prejudiced by the alleged procurement violations to avoid a decision on the merits would be unfortunate as Oracle raises important questions about a huge contract.

Schooner: Neither DoD nor commercial technological marketspace leaders can accurately predict where the still nascent cloud computing industry will be or who will lead it five years from now, much less 10. With quantum computing, blockchain, artificial intelligence and machine learning, internet of things and other technologies actively disrupting a disruptive technology, the only constant is change. DoD knows this, which leads to this obvious “duh” moment. Firm fixed price contracts looking ahead 10 years for evolving technologies is professional guesswork.

Relying on, or expecting to hold a contractor to, those FFP prices is unrealistic, a fool’s errand, inconsistent with experience, delusional, irresponsible — fill in the blank with something that rhymes with arbitrary and capricious. The DoD determination and findings oddly intimates that DoD will receive proposals for FFPs to meet DoD’s future, unarticulated tactical cloud computing needs — classified and unclassified — for the next 10 years and today can determine the single best value cloud computing technological leader over the next 10 years when some – if not most – of the impactful technology has yet to be developed.

What legal questions remain outstanding about this entire procurement?

Schooner: Challenging the nature of a stated requirement or the basic acquisition strategy – in a vacuum – is a tough row to hoe. Agencies are entitled to a fair amount of discretion in most acquisition planning disciplines. The bar should be relatively low for DoD to demonstrate there was a rational basis for this strategy. The Competition in Contracting Act of 1984 and the Federal Acquisition Regulation suggest a strong open market, i.e. full and open competition, bias. Nonetheless, closing a market, even a large market, for a long period of time is not in and of itself objectionable. DoD frequently closes massive markets through industry consolidation.

The Navy relies on a single shipyard for nuclear aircraft carriers and two shipyards for nuclear submarines. The Air Force selected a single air frame for the next generation of in-flight refueling. The Army did the same with its new sidearm, or modular handgun system. The reality is that sustaining competitive markets typically involves purchasing excess capacity or subsidizing potential competitors, longstanding, common practices that have fallen out of favor over the last few decades.

Franco: This procurement likely raises a number of other legal questions which I have not been able to evaluate. The one question that does come to mind was how the DoD is going to reconcile its supposed need to support a “common environment” for artificial intelligence and machine-learning requirements when the department has recognized it will always have multiple cloud environments. The DoD contemplates cloud users to include all of the DoD, including the Coast Guard, Intelligence Community, countries with which the United States has defense arrangements, and federal government contractors. How does DoD reconcile the conflicting goal of having multiple cloud environments with the common environment the procurement seeks to develop?  If the agency decides that it needs to go with one common environment, it means that the huge cloud market may be the domain of one contractor for the next 10 years. It is questionable whether that serves the government’s best interest and offers the best value.

Tiefer: The key remaining legal question is how much backing the top leadership of DoD can give to its nominal findings that the RFP can justify a single award for 10 years based on firm fixed prices. Another legal question for the JEDI contract, not covered in this protest, is whether DoD is setting up this RFP so that it is tilted in favor of Amazon because of Amazon’s unique credential of a past CIA contract. The super-high level of classified treatment for the CIA is unnecessary for 99 percent of what DoD handles, and for Amazon to have a high preference just for that limited aspect is for the tail to wag the dog.

Oracle’s long term view

GAO has until November to make a decision. But Schooner said this initial protest by Oracle may be more about setting the groundwork for future complaints.

“Also, even if Oracle doesn’t prevail, the protest may build a record that could potentially constrain DoD or tie DoD’s hands once the competition actually begins,” Schooner said. “In other words, Oracle may simply be seeking to obtain additional information and create a record upon which to build a subsequent protest, once Oracle is excluded from the competition or DoD actually chooses its JEDI partner.”

One thing is clear in all of this: It will be a lot of fun watching this saga unfold, and hopefully all parties involved will learn something about listening.



5 lessons CIOs can take from confusion around FCC’s alleged cyber attack

After reading the Federal Communications Commission’s inspector general report that the commission’s then-chief information officer overstated — maybe even lied — in 2017 about suffering from a distributed denial of service (DDOS) attack when HBO’s John Oliver did a segment on the dangers of ending net neutrality, I had a simple question: Who cares?


It’s not that anyone should condone the hyperbole or the outright attempt to misinform the public by a federal employee. But I just couldn’t get too excited about the report, especially given the fact that David Bray, the FCC CIO at the time, has been gone for more than a year and the net neutrality debate is over for now. On top of that, Bray’s reputation as an honest and caring federal employee leaves me wondering if this was more politics than problem.

So instead of going over all the unpleasant details of the IG’s 106-page report and trying to use our 20/20 hindsight to place blame on Bray and his staff for jumping to conclusions or sticking to a narrative that obviously had gone off course, I’ve asked former federal CIOs to offer some lessons learned to current federal executives about dealing with a similar situation that is likely to happen again.

Lesson No. 1: Stay calm

Sounds obvious, but panicking or jumping to conclusions based on an emotional reaction or political pressure is a real threat. Jonathan Alboum, the former CIO at the Agriculture Department and now chief technology officer at Veritas, said CIOs need space to assess the situation.

“It’s too easy to jump to conclusions, which is a sign that you don’t have a great understanding of your IT environment,” he said. “The better you understand what and where your data is and how your systems function, the faster you can get to the root cause.”

Tony Scott, the former federal CIO, said IT executives need to be careful in what they say without a thorough understanding of the facts because Congress is listening. Be cognizant that stating something as fact could have long-term repercussions.

Lesson No. 2: Have a plan

Several former CIOs highlighted the need to know what to do when something bad happens. It’s more than knowing who takes the system offline or what the best number is to reach the Homeland Security Department; you need to know all of the things that happen a few days, a few weeks and a few months after a cyber event.

“Ensure that effective incident response policies and standard operating procedures exist,” said Simon Szykman, the former Commerce Department CIO and now chief technology officer at Attain. “A strong policy/process framework will help ensure that incident response actions and reporting match the actual circumstances surrounding an incident, making it less likely that follow-on activities or communications get off track.”

The plan also should dip into the system architecture environment, said Shawn Kingsberry, the former CIO at the Recovery Accountability and Transparency Board and now vice president for digital government and citizen services for Unisys Global Public Sector.

“It is also clear that a plan should be in place to protect expected demand volumes, and a contingency plan should be also available, should the volumes greatly exceed anticipated demand,” he said. “This should become a part of the DNA of the organization and how they execute.”

Alboum said while outages are not common and can happen for a variety of reasons, having a strong resiliency program will help agencies turn their focus from what happened to the speed with which the organization can respond, correct the issue and resume successful operations.

Lesson No. 3: Obtain multiple points of view

The one big mistake Bray and the FCC made was failing to alert the Homeland Security Department’s U.S. Computer Emergency Readiness Team (US-CERT) if this was indeed a DDOS attack. Federal law and regulations require agencies to contact US-CERT should a major incident happen, and the taking down of the commission’s comment system would seem to fit the bill. The former CIOs said having that third party review audit logs and data will help put you on more solid ground.

“You need to get multiple points of view or multiple analyses of the problem,” Scott said. “You shouldn’t rely on a single source. Triangulate what actually happened because it could easily be a different technical issue than you first thought.”

Szykman added that having an independent, objective analysis, whether it’s through an informal request to a chief information security officer from another agency or through formal channels to US-CERT, puts the CIO in a better place when discussing what happened with agency executives, Office of Management and Budget officials or lawmakers.

Lesson No. 4: Inspire trust

This is the one thing Bray always had going for him: The federal IT community and FCC leadership had confidence in what he did and what he said. So when he said the attack appeared to be a DDOS type of attack, FCC commissioners and lawmakers had every reason to believe him.

Alboum said trust comes from being correct as much as it does from simply saying “I don’t know right now.”

“At the same time, the leader must be able to articulate the things they are doing to get to the bottom of the situation,” he said. “However, if you don’t have visibility into your data or understand how your systems are working, then it’s much harder to articulate how you are going to resolve a crisis.”

Scott said sometimes you can inspire trust by admitting you were wrong with your first conclusion.

“CIOs have learned over and over again that what things first appear to be often change when the full facts become available,” he said. “Most of the time, your first inclination is probably wrong.”

Lesson No. 5: Data rules the day

This was another shortfall of Bray and his staff. FCC IT staff told auditors they didn’t have enough information from event logs to make any thorough determination of what happened. That led auditors and other experts to question why they were so sure the comment system suffered a DDOS attack versus just a significant spike in usage after the John Oliver show.

Every former CIO said having the right data and being able to present it to agency leadership and lawmakers is how you keep the conversation moving forward.

“Government organizations today need to be prepared and make strategic choices around data collection, data storage, utilization and the location of that data – and how it can be safeguarded against system outages or cyber-attacks. Recovery from a system or data center outage is exacerbated by the complexity that has been built into our IT environments. Getting to the bottom of a problem in our complex, hybrid, multi-cloud world is not easy. There are too many moving pieces,” Alboum said. “To understand the true cause of a system or data center outage or cyber-attack takes time. It also takes understanding of how all the pieces fit together. If you don’t understand where your data is located, you will be at a loss to answer questions from your leadership or respond to the public in the fastest and best way possible.”

Kingsberry added that if agencies are building systems using digital standards, then when looking at an incident, they will have consistent development and security operational services to understand the problem.

“Logs from applications, network devices and security tools can provide data to confirm or invalidate assumptions regarding the nature/source of an incident,” Szykman said. “Don’t draw firm conclusions prior to completing a sound analysis, and be open to revising hypotheses if necessary.”

And finally, Scott said having the right data also lets you fill all the gaps in communication with auditors, Congress, agency leaders and cyber staff.

In the end, communicating with all the stakeholders from a position of knowledge will ensure you can avoid getting caught up in a political firestorm.



DHS moving quickly to get National Risk Management Center off the ground

The Homeland Security Department isn’t waiting around to get its new National Risk Management Center up and running. DHS named Bob Kolasky to serve as the center’s first director. Kolasky currently is the assistant secretary for infrastructure protection in the National Protection and Programs Directorate (NPPD).

“Bob is uniquely qualified to lead this significant undertaking and I am confident he is ready for the challenge,” wrote Chris Krebs, the DHS undersecretary of NPPD, in an email to staff obtained by Federal News Radio. “Bob will stand up a planning team and begin his transition to lead the center.”


DHS Secretary Kirstjen Nielsen announced the new National Risk Management Center on July 31 at DHS’ Cybersecurity Summit. Nielsen said the new center would help break down some of the communication barriers that exist between the government and sectors when it comes to sharing cybersecurity threats.

“Our goal is to simplify the process, to provide a single point of focus for the single point of access to the full range of government activities to defend against cyber threats,” Nielsen said. “I occasionally still hear of companies and state and local [governments] who call 911 when they believe they’ve been under a cyber attack. The best thing to do would be to call this center — this will provide that focal point.”

Krebs said Steve Harris will serve as acting principal deputy assistant secretary alongside Scott Breor who will continue to serve as acting deputy assistant secretary.

“Steve, along with the rest of the IP leadership team, will continue the work Bob has undertaken, including enhancing our physical security capability and continuing regionalization efforts,” Krebs wrote.

Kolasky has been with DHS since 2007, serving in a variety of roles, including the assistant director in the Office of Risk Management and, for the last six years, in the Office of Infrastructure Protection. He also spent time as an analyst for the Government Accountability Office and worked in industry.

In addition to naming Kolasky and making other related personnel moves, Krebs offered more insight into how the new center will work with existing DHS programs, including the National Cybersecurity and Communications Integration Center (NCCIC) and the National Infrastructure Coordinating Center (NICC).

“[W]e identified a clear need for tighter collaboration across industry and government, not just in cybersecurity efforts, but in generally understanding and addressing existing and emerging risks. So as we continue to integrate the watch and warning functions of the NCCIC and NICC, we must also enhance efforts to understand holistic risk conditions across our nation’s infrastructure, whether cyber or physical — what’s essential, what’s a potential single point of failure, and what functions and services underpin our very society, government, and economy,” he wrote. “The NCCIC will continue to be our eyes and ears for cyber and the NICC for physical threats. The National Risk Management Center will be the engine for how we understand and the platform by which we’ll collectively defend our infrastructure.”

Krebs said pulling the eyes, ears and body together is part of how DHS will operationalize risk management.

“That higher order understanding of risk, criticality, and how to increase resilience has been at the heart of Office of Cyber and Infrastructure Analysis’ (OCIA) mission since its inception,” he wrote. “The establishment of the center represents the elevation of that mission and the operationalization of the secretary’s authorities to lead and coordinate national critical infrastructure protection efforts alongside our government and industry partners.”

From FERC to FDIC

Along with the changes at DHS, there are several other important people on the move in the federal technology and acquisition communities.

Mittal Desai moved to the Federal Deposit Insurance Corporation (FDIC) from the Federal Energy Regulatory Commission (FERC) to be the deputy chief information security officer.

Desai had been at FERC for 11 years, including the last four as its CISO.

The FDIC has replaced a good portion of its CIO executives over the last year, with Howard Whyte rising to be FDIC’s chief information officer in October. Whyte hired Zach Brown from the Consumer Financial Protection Bureau in April to be the permanent CISO, and now Desai.

The Marine Corps named Brig. Gen. Lorna Mahlock as its new CIO in July. Mahlock is the first African-American woman to achieve the rank of brigadier general in the Marines. She received the promotion in April.

Before becoming CIO, Mahlock served as the deputy director for plans, policy and operations and commanding officer of the Marine Air Control Group 18 in Okinawa, Japan.

The Marine Corps had been without a permanent CIO since Brig. Gen. Dennis Crall left in February for a new position in the Office of the Secretary of Defense. Ken Bible, deputy director of C4 and deputy CIO, had been acting CIO.

New executive at NIST and former VA CIO lands

Additionally, Andy Blumenthal started a new position as program manager in the Office of the Associate Director for Management Resources (ADMR) at the National Institute of Standards and Technology.

Blumenthal joined the government in 2000 and served in a variety of senior IT roles including chief enterprise architect for the Secret Service and the State Department’s CIO for Global Information Services. Since 2017, he worked in the Department of Health and Human Services as the deputy chief operating officer in the Office of the Assistant Secretary for Preparedness and Response.

Finally, former Veterans Affairs CIO Scott Blackburn returned to his former company, McKinsey & Company, after leaving the government in early July.

Blackburn, who previously spent nine years with McKinsey, came back to the consulting firm in the public sector office focusing on health care, technology and large-scale transformation.

He spent more than three years at VA, including eight months as interim CIO.



GSA wants to rescue innovations from the ‘valley of death’

Pallabi Saboo is the CEO of Harmonia Holdings Group, a small, women-owned business with revenues approaching $20 million.

Harmonia Holdings Group is working for the Marine Corps to modernize its legacy system by using a tool the company developed to automatically migrate old software code into leading-edge languages.

Saboo said Harmonia Holdings couldn’t have won the contract without the Small Business Innovation Research (SBIR) program. Harmonia Holdings has won 89 SBIR grants over the last 15 or so years.

But Saboo said her company won the Marine Corps contract and many others not because agencies are using the SBIR program like Congress intended, but in spite of it.

“The best kept secret in Phase III of the SBIR program is that any agency can put millions of dollars on the contract without competition. It’s very similar to an 8(a) contract,” Saboo said in an interview with Federal News Radio. “The problem is most contracting officers have no clue how to do that. I’ve tried to sell that until I was blue in the face. I have 89 different SBIRs, but I can’t get them to use it. That is challenging.”

Instead, Harmonia Holdings is using SBIR as a kind of venture capital fund where Saboo can obtain funding for research and development and then once the tool or software is ready, offer it back to the government through other contract opportunities.

“I use SBIR technology opportunities as areas where I want to develop competencies for my company,” she said.

GSA to offer Assisted Acquisition Services

And this is why the General Services Administration’s new pilot, which uses its Assisted Acquisition Services in the Federal Acquisition Service, aims to connect companies like Saboo’s with federal agency buyers.

“We partnered with the Small Business Administration to make this pilot program happen,” said Mark Lee, GSA’s assistant commissioner of the FAS Office of Policy and Compliance, during a press briefing on July 30. “We think AAS is positioned to set up a shared service across the government to help support innovative solutions throughout the marketplace and that can lead to the job creation the program is designed to bring. There is a unique opportunity to work with companies early on and bring them to a contract vehicle and get them opportunities.”

Lee said the client support centers in Region 5 and the FedSIM office will run the pilots and have dedicated support to help expand the use of companies under the SBIR program throughout the rest of the federal acquisition offerings by GSA.

Saboo said while any additional attention Phase III SBIR companies can receive to help commercialize their technologies is welcome, the real key is whether FAS trains its acquisition workforce to use the program to the full extent the law allows.

“Once you finish Phase III, you are in the valley of death because there is no specific need anymore,” she said. “If you find someone who is interested in your project, you need money to really expand the project to do what they want, and the customer, especially in the Defense Department, may not be able to get you money for two years because of how the budgeting process works. We, and many other companies, have never been able to use the Phase 3 contract because contracting officers don’t know how to use it and the gap between funding is too much.”

Lee said part of the reason why GSA is stepping in is to address similar problems Saboo is referring to.

He said 13 agencies participate in the SBIR program and many lack a dedicated contracting shop, so having AAS support SBIR contracts with dedicated contracting expertise could help fill that gap.

SBIR participating agencies

Jeff Koses, GSA’s senior procurement executive, said AAS currently is offering access to SBIR Phase III companies, and the pilot runs through fiscal 2019.

“We will have a governance process in place to ensure we understand the rules and are making the most effective use of the program,” he said. “We will communicate with SBA about future possibilities, such as extending the pilot to other parts of GSA or other areas of the SBIR program. When a customer agency comes to us with authority to use a company in the SBIR program, GSA will be the acquisition arm that makes the connection between the agency need and the industry partner.”

John Shoraka, a former SBA associate administrator for government contracting and business development and now managing director of GovContractPros, said GSA’s idea is a good one, but the agency and SBIR companies also need to keep in mind several things.

“There should be significant SBA involvement as their Office of Innovation and Investment (OII) manages the SBIR program and reports on it to Congress,” he said. “The reporting of SBIR awards is unclear, and it is also unclear how many agencies use all of their dedicated SBIR funding, and how successful the funding is. There should be a report card grading system like the small business report cards to Congress for SBIR.”

Saboo said another way to improve the SBIR program would be for agency acquisition officers and commercialization assistance programs to spread the word about how the program works consistently and persistently.

“GSA doesn’t have to do matchmaking. I can take products and create opportunities if contracting officers are open to listening and understanding what they can do with the SBIR program,” she said.



GSA, NGA shrink time to cyber-approve systems from year to month


Quick, which federal cybersecurity process is most costly, takes the longest and is hated by most federal chief information security officers, program managers and chief information officers?

If you answered the authority to operate (ATO), you’d probably be correct.

For an agency to get a system from test and development to full production, it must have an ATO, meaning the system owner must approve the risk-based assessment and security controls that are running on top of the system.

For many system owners, obtaining an ATO can take a year and cost hundreds of thousands of dollars, and while it is supposed to be good for a year, without continuous monitoring, it really is good for about five minutes once it goes online.

“It’s also quite common for the specific steps to obtain an ATO to differ from person-to-person, which ultimately encourages everyone to take many unnecessary extra steps ‘just in case,’ further prolonging the process without adding benefit,” wrote Nick Sinai, a former federal deputy chief technology officer and now a venture partner at Insight Venture Partners and an adjunct faculty member at Harvard, in a Medium post in July 2017. “Further, the weight of the process strongly discourages ever changing a system, which would trigger the ATO process anew; this stagnation leaves most government legacy systems at great risk for vulnerabilities, not to mention lost innovation and productivity.”

But what if an agency could get an ATO in 30 days or five days, or one day? That idea was a dream industry experts and federal executives have talked about for years.

Today, that dream is much closer to reality.

When a need meets the desire, amazing things can happen, especially in government.

The General Services Administration’s Technology Transformation Service had a backlog of 30 systems that needed new ATOs. Aidan Feldman, an innovation specialist at GSA’s 18F organization, said there was no way TTS could clear out that backlog if every ATO took six-to-18 months.

So 18F did what it is known for, taking a big hairy problem and breaking it down into consumable pieces.

Feldman said 18F created an ATO sprint team that streamlined the process and reduced time to complete an ATO from six-to-18 months to 30 days.

“The real key to this ATO sprint team was getting everyone in the same room. We all work remotely so in this case it was a virtual room,” Feldman said in an interview with Federal News Radio. “If we have more conversational back and forth in real time, it increased the understanding on both sides, and greatly reduced the overall time to complete.”

18F reduced the backlog of systems that needed new ATOs in 18 months, and maybe more importantly, created a repeatable process.


The dream is even more real over at the National Geospatial-Intelligence Agency. Just over a year ago, Jason Hess, the one-time cloud security architect and now vice president of global security at JPMorgan Chase, excited the federal IT community by talking about getting an ATO in a day.

While NGA hasn’t met that goal, Matt Conner, NGA’s chief information security officer, said after the Aug. 1 Meritalk Cyber brainstorm event in Washington, D.C., that the agency has realized an ATO in as little as three, five and seven days.

“We are continuing to build the telemetry necessary, the business rules, the promotion path for code committed to our dev/ops pipeline and to promote that as quickly as possible to operational,” Conner said in an interview. “We still haven’t realized the one-day ATO, but it’s out there.”

Conner said the NGA is so excited about the potential of reducing the ATO process even further that there is discussion about an instant approval.

“We are continuing to shore up our continuous monitoring and telemetry capabilities for new capabilities that are developed so that we can really, really quickly authorize something or ATO something and move it directly into step 6 of the Risk Management Framework and continue to monitor changes to that baseline in operations,” he said. “The ATO-in-a-day solutions have always applied to our dev/ops environment. So these are capabilities developed on a platform, in a handful of languages that we prescribe, with a handful of orchestration services, according to a handful of profiles that we’ve defined. It sounds like a lot of limits, but it’s not. I would consider them much more guardrails than limits.”

Long-time cyber challenge

The Office of Management and Budget and others have recognized over the years that the ATO process was broken. Back in 2017, OMB said it was running a pilot program to consider other approaches to shorten the authority to operate (ATO) life cycle and may potentially look at a “phased ATO.”

It’s unclear what happened to those pilots around a phased approach to an ATO as OMB never publicly discussed those results or findings.

The attempt to fix the ATO process has been an ongoing project for OMB.

If you go back to the 2013 annual FISMA guidance, OMB told agencies they had four years to get to continuous monitoring of systems, which would change the ATO process from an infrequent event to one that happens every time there is a change to the system.

Yet despite these policy changes, the ATO process remains arduous and costly. Many agencies have moved to approve systems more regularly, in most cases annually.

The one thing clear about what NGA and GSA accomplished is that the ATO in a day or in 30 days works only because the organizations set specific limits. Conner said big monolithic systems that continue to use the waterfall approach to development will never enter the quick turnaround security conversation.

For 18F, the limits came from almost every system running on the cloud.gov platform, meaning a specific set of security controls that came with the platform-as-a-service were easily agreed upon and checked off the list.

The sprint team also used standardized ATO tools for architecture diagrams, vulnerability scanning and logging, according to Feldman’s blog post from July 19.

Feldman said one problem with the ATO process is that communication was slow and usually happened over email. 18F set up a dashboard to make progress easy to track, and created virtual teams that got together to hash out problems or challenges.

“Teams coming in may not have compliance experience so sending them to FISMA or other large daunting paperwork about the ATO process is not going to be the easiest way to get into it. So, we standardized and documented our process, including a checklist of what is expected going in,” Feldman said. “Similarly for those systems, if every system coming in to the assessment is running on different infrastructure, then, for the assessors, there is no consistency and they don’t know what to expect. It’s the assessor’s job to understand what’s going on in the system and so it takes longer if they have to relearn a new technology every time. We found the more that we are able to inherit between systems and share the best practices of how you configure it, and then the shared language around how you explain how the system is working, having all that more consistent helped the process on both sides.”
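To illustrate the inheritance idea Feldman describes, here is a minimal, hypothetical sketch. The control identifiers are drawn from the NIST SP 800-53 catalog, but the platform and system sets are invented for illustration; a real cloud.gov assessment would work from the full FedRAMP baselines rather than this toy list.

# Simplified, hypothetical illustration of control inheritance on a
# shared platform such as cloud.gov. The sets below are made up; a real
# assessment uses the NIST SP 800-53 catalog and FedRAMP baselines.

PLATFORM_CONTROLS = {"AC-2", "AU-2", "SC-7", "SC-13", "CP-9"}   # covered by the platform
SYSTEM_BASELINE   = {"AC-2", "AU-2", "SC-7", "SC-13", "CP-9",
                     "AC-6", "RA-5", "SI-2"}                     # what the ATO must address

def controls_left_for_app_team(baseline: set[str], inherited: set[str]) -> set[str]:
    """Controls the application team still has to document and test itself."""
    return baseline - inherited

remaining = controls_left_for_app_team(SYSTEM_BASELINE, PLATFORM_CONTROLS)
print(sorted(remaining))  # ['AC-6', 'RA-5', 'SI-2'] -- a much shorter list for the assessors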

NGA sets the guardrails

Over at NGA, Conner described a similar experience.

“These are code or algorithms or layers to our geospatial information system applications,” he said. “The process is really applying a lot of business logic to the telemetry we can gather and measure. We are looking at code quality, code dependency, static and dynamic testing, we are looking at targeted profiles that we’ve built that apply a set of controls to a workload and it’s all cloud based and platform based.”

Conner said NGA has been in touch with 18F about the improvements to the ATO process.

“I’ve always likened the ATO in a day process to a speed pass on the local toll roads. You can go real fast with your speed pass after you have gone online and registered your car and registered your transponder, tied it to a bank account and affixed it to the inside of your windshield in a certain place, then you can go real fast. If you don’t do any of those things, you don’t go real fast and you stop and give people money,” he said. “It’s the same thing. We have a set of design patterns; we’ve got a set of orchestration services; we’ve got a set of compute environments so if you are in that space and you want to play by our standards inside these guardrails, we will accelerate you as fast as we can. If you want to do something bespoke, you will have a different process.”
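As a rough illustration of what applying “business logic to the telemetry” in a promotion gate might look like, here is a hypothetical sketch; the check names and thresholds are invented for illustration and are not NGA’s actual rules.

# Hypothetical promotion gate for a dev/ops pipeline, loosely modeled on
# the kinds of checks Conner describes (code quality, dependencies, static
# and dynamic testing). Field names and thresholds are invented.

from dataclasses import dataclass

@dataclass
class BuildTelemetry:
    static_scan_highs: int      # high-severity static analysis findings
    dynamic_scan_highs: int     # high-severity dynamic test findings
    outdated_dependencies: int  # dependencies with known vulnerabilities
    test_coverage: float        # fraction of code covered by tests

def ready_to_promote(t: BuildTelemetry) -> bool:
    """Return True only if every guardrail the profile defines is met."""
    return (t.static_scan_highs == 0
            and t.dynamic_scan_highs == 0
            and t.outdated_dependencies == 0
            and t.test_coverage >= 0.80)

build = BuildTelemetry(static_scan_highs=0, dynamic_scan_highs=0,
                       outdated_dependencies=0, test_coverage=0.85)
print(ready_to_promote(build))  # True -> eligible for fast-track authorization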

Feldman said 18F’s process is not magic or anything special — it’s just a matter of getting people in the same room as willing participants, documenting the process and your learnings, and setting the same expectations on all sides. That will go a long way, he said.

“I don’t think it will get down to an hour or anything like that. The ATO process is there to make sure you are doing your due diligence security-wise and that can’t go away completely, and I don’t think it should go away completely,” he said. “What I do hope to see is a reduced effort for the teams and the assessors to complete the ATO. That comes with better tooling, better documentation and better tracking of these projects as they go through so we can get ahead of problems as they come up.”



20 projects vying for the remaining $55M in the Technology Modernization Fund


You can’t quite say the Technology Modernization Fund Board has $55 million burning a hole in its pocket. But the chunk of change remaining in the TMF for fiscal 2018 is heading out the door over the next two months.

Alan Thomas, the commissioner of the Federal Acquisition Service at the General Services Administration and a member of the board, said last week another “slew of proposals” are going through the review process.

Now what Thomas didn’t say, unfortunately, was how many and from which agencies.

For that information, we put on our reporter’s hat and opened up our “reporter’s notebook” to do some digging.

You see, GSA and the Advanced Technology Academic Research Center (ATARC) hosted a TMF Summit on July 26 and decided to close it to the press. Now I understand there was some discussion and maybe even some disagreement about whether it was a good idea to close it to the media. Selfishly, let me say unequivocally, it was a bad idea.

The reasons are two-fold: First, the panel of TMF Board members and the industry panel weren’t going to say anything different in front of a mixed crowd of industry and government employees whether or not the press was there. Second, with more than 150 people in the room, of course details of the discussion would come out.

By the way, the Office of Management and Budget already is in a little bit of hot water for its struggles with communications about the fund with its congressional overseers, so the reluctance to “use” the press to explain what’s going on is more bewildering.

But I digress, back to the news.

20 projects seek Technology Modernization funds

After talking to several attendees of the summit, we learned Federal Chief Information Officer Suzette Kent and the board members said there are more than 20 proposals now vying for the remaining $55 million. But sources say the board offered no more details on what those projects were or which agencies submitted them. This is a big change from the first round where the board received only nine proposals from seven agencies.

Thomas, speaking at the AFFIRM event which was on the record and open to the press, admitted that most of the first set of proposals seeking extra funding fell far short of the board’s expectations.

“The review process was more rigorous than the typical agency program managers were used to,” he said. “We hope we raised their game. All proposals must be signed off by the agency’s CIO and CFO.  If it doesn’t look sharp, we aren’t hesitant to call the CFO and CIO and say is this what you really mean. The message is don’t just do it for us, but do it for all of your proposals.”

An industry source, who attended the TMF Summit and requested anonymity in order to speak about the “not for attribution” event, said Maria Roat, a TMF Board member and the CIO of the Small Business Administration, told attendees the board said “no” to several proposals and told those agencies to find money elsewhere.

The source said Roat said agencies struggled in the initial proposals to define why they chose a particular technology and to tie their major milestones to a realistic and clearly defined timeframe.

Charles Worthington, a board member and the CTO at the Veterans Affairs Department, said the panel was surprised it didn’t see more shared services or at least multiple agency proposals.

Thomas reiterated what Kent, board members and the TMF program management office have been saying for some time: The board wants projects that could be emulated across the government and that could be “flagship” or “lighthouse” projects where others easily can see the benefits.

One industry source, who also went to the summit and requested anonymity for the same reasons, said Kent said the three agencies that won funding in June — the departments of Agriculture, Energy and Housing and Urban Development — came with a clear vision and requirement, and received bonus points if they had done a pilot, or demonstrated a track record of success and were moving into scale and implementation to better ensure success.

Energy pushing for culture change

Interesting insights, but far from earth-shattering details that haven’t been talked about before.

In fact, Max Everett, the Energy CIO, said in June at an AFCEA Energy event that the focus on moving the entire agency to cloud email and the extra money were part of the reasons for developing a business case proposal, but the overarching decision came down to addressing the fundamental way the agency manages technology.

“One of the other things that is really important about TMF is building a business case, having an holistic picture that you can present to leaders about where you are going and why,” he said. “It’s not just about we need a new thing or it’s time. It was lined up to all of our mission functions and priorities, and it told a story. That was really important because for us as a very federated department with dispersed capabilities all over the nation, this was about some other things. It was about collaboration, common baselines on security, enhanced mobile user experience. It will encourage us toward a true life cycle model for managing our IT.”

He said the cloud migration project represents “significant culture change” for the department.

And it’s that concept that attracted the board to fund Energy’s proposal, the board members said at the summit.

Sources say the board members highlighted the need to break down silos when it comes to IT, and said the second-order effects of culture change were highly attractive.

Kent said, as part of the requirement to get the funding, the board tasked DoE with creating a playbook so other agencies could follow its model.

The first source said Kent highlighted the fact that Energy already had done the planning and needed money to accelerate it.

Everett said that planning started when he arrived at Energy more than a year ago.

“If there is no other value to cloud email, just being able to talk about a number that is associated with that service of a mailbox has some value by itself,” he said. “We spent a lot of time looking at what the costs are for all of these disparate systems across the department, understanding the costs and the experience we had with moving parts of the department into cloud email. We actually had a good idea of what those costs were, and we sought other agencies who had that experience as well. Then we started to push people to give us real numbers on what it cost to run their on-premise emails. That had been a little of a challenge, to be frank.”

He said too often Energy organizations didn’t account for all the assorted costs of energy, backups, storage, hardware, labor and other things.

Paying back the extra funding is the only option

A big issue for both government and industry at the summit was how HUD, Energy and USDA, and eventually others, would pay back the TMF loan.

The second source said they got a sense that agencies were complaining a lot about the requirement to pay back the loan, but Kent said OMB made it clear to agencies that once they were in for a dime, they were in for the whole dollar.

The sources said Kent made it clear that even if an agency fails, they still have to pay the loan back.

“They are being strict on that requirement,” the source said.

The first industry source said Kent also left no doubt that agencies cannot ask for more money in their budgets to pay back the loan.

The source said Kent said Congress expects that the modernization efforts will run more efficiently and cost less money, and has clear expectations about seeing a real dollar return on investment.

Kent said the board will work with teams about meeting payback expectations.

To that end, the second source also said the board is tying all projects to a reporting dashboard, detailing deliverables and milestones.

The source said Kent made it clear that once the project is accepted, the CFO, CIO and program manager must report back to the board on a regular basis that they are all working in the same direction. What is particularly important, said board member and Social Security Administration CIO Rajive Mathur, is that the CFO needs to understand the terms of repaying the loan.

The source added Kent underscored the fact that if the agency fails to meet its project deadlines, the board will stop the funding stream and the CIO and CFO still will be responsible for paying back the loan.

As for the new projects that Thomas referred to as in the queue, the first source said Kent emphasized that the board wants to see more “product based” proposals that are focused on commercial technology.

The source said they hadn’t heard board members talk so definitively before about the desire for agencies to bring in commercial innovations.

What’s ironic about all of these details is that getting multiple stories in the press about the board’s oversight and the rigor of the process would help alleviate lawmakers’ concerns about the program and help the Senate reinstate TMF funding for fiscal 2019. Instead, concerns about getting “clearance” and “approved talking points” continued the veil of secrecy around the fund that put OMB in this hot water in the first place.



Death knell for using LPTA for services and other acquisition highlights in the NDAA

Every year the Defense Authorization bill is filled with golden nuggets of little-noticed provisions that make a big impact on the federal acquisition community.

This year they range from a strong focus on making it easier for agencies to buy commercial items, to mixed messages around the use of other transaction agreements, to ever-increasing concerns about too many bid protests.

This is by no means a comprehensive list, but here are several provisions that federal and contractor employees interested in acquisition should know about.

House and Senate conferees agreed to the bill on July 23, clearing the way to passage. The House passed the NDAA on July 26, and the Senate is expected to take up the bill next week.

DoD-only provisions

Sections 816 and 817 make minor but important updates to acquisition law.

Under 816, Congress modifies the limitations of single source task and delivery order contracts, replacing “reasonably perform the work” with “efficiently perform the work.”

Roger Waldron, the president of the Coalition for Government Procurement, said in a blog post that the change raises concerns and risks because it could diminish the benefits of having multiple companies bid on task order contracts.

Under 817, lawmakers want agencies to use their preliminary cost analyses as the rationale for developing multi-year contracts.

The conferees didn’t heed the General Services Administration’s request to raise the micro-purchase threshold for all e-commerce buys to $25,000 from $10,000. But they did bring DoD up to par with the rest of the government by raising its micro-purchase limit to $10,000 from $5,000.

Bid protests continue to be a big concern for lawmakers and DoD. So much so, in fact, that they added Section 822 to the bill to require DoD to study the frequency and effects of bid protests involving the same contract award or proposed award that have been filed at both the Government Accountability Office and the Court of Federal Claims. The report would include the length of time protests take, the appeals to the federal court, whether the protester was an incumbent and whether they were a large or small business. The Pentagon also would have to establish a data collection system to better track and analyze bid protest trends in the future.

There has been no hotter topic in the procurement market over the last year than supply chain risk management. To that end, lawmakers, in Section 881, permanently extended the 2011 provision in the NDAA that requires DoD to manage its supply chain risks. It also clarifies that the secretary’s risk determination applies throughout DoD. The provision further clarifies that the secretary’s determination cannot be protested before the GAO or Court of Federal Claims. Additionally, DoD should notify other components or federal agencies responsible for similar procurements facing supply chain risks.

Governmentwide acquisition changes

House and Senate lawmakers agreed to several significant provisions that will impact every agency.

Lawmakers took on the dreaded lowest-price technically acceptable (LPTA) approach by requiring a new Federal Acquisition Regulation rule, within four months of the NDAA becoming law, that would limit the use of this method for buying products and services. The bill details six areas the new FAR rule must address, including requiring the agency to clearly describe the minimum requirements, performance objectives and standards it will use to judge bids, and the determination that any proposed technical approach would require no, or minimal, subjective judgment by the agency’s source selection authority.

Additionally, the conferees say in order to balance effective oversight with reasonable costs, they want the GAO to develop a methodological approach that will provide insight into the extent to which LPTA source selection criteria are used by agencies, without requiring a review of each individual instance in which such criteria are used.

Lawmakers also told agencies to limit the use of LPTA when buying IT, cybersecurity, system engineering and technical, audit readiness, knowledge-based and a host of other services.

Rich Beutel, a former Hill staffer who was the lead staff member on the Federal IT Acquisition Reform Act (FITARA) and is now president of Cyrrus Analytics LLC, said the NDAA is trying to address congressional interest in reversing DoD’s “habit of ignoring commercial innovations in such areas as cloud migration, because LPTA simply does not work well as a procurement approach when contracting for such complex technical services.”

Lawmakers revisited their direction around the upcoming e-commerce portal GSA is developing in section 838. The conferees are requiring the portal to meet the definition of full and open competition under the law — meaning there are at least two suppliers that offer similar products.

At the same time, legislators emphasized to GSA that the portal must be done in a manner that:

  • Enhances competition
  • Expedites procurement
  • Ensures a reasonable price for commercial products
  • Is implemented with multiple contracts with multiple portal providers
  • Requires the portal providers to safeguard the data and not use the data for pricing, marketing or for any competitive advantage

Related to, but separate from, the e-commerce portal is section 878, which would require the Office of Federal Procurement Policy to finalize the definition of the term procurement administrative lead time (PALT) and produce a plan to measure and publicly report PALT data for contracts and task orders greater than the simplified acquisition threshold. The measurement will focus on the time from when an agency releases an initial solicitation to the time of award. This data collection is in response to the increased use of other transaction agreements and similar approaches to reduce the time to get capabilities to the mission.

One provision GSA is excited about is section 876, which aims to increase competition at the task order level. Lawmakers are providing exceptions so that certain multiple-award contracts, including those under the Federal Supply Schedule, can acquire services without price necessarily being an evaluation factor.

Alan Thomas, the commissioner of GSA’s Federal Acquisition Service, said July 26 at an AFFIRM breakfast that this change was something the agency had been pressing for because of its desire for price competition to happen at the task order level. GSA took similar approaches and had success with its governmentwide multiple-award contracts such as OASIS, the Human Capital and Training Solutions (HCaTS) contract and Alliant 2. GSA Administrator Emily Murphy said one of her priorities is to ensure there is more competition at the task order level, and this change in the law is the first major step toward that goal.

DoD and OTAs

Lawmakers are definitely sending a mixed message when it comes to the use of other transaction agreements (OTAs). On one hand, House and Senate Armed Services committee members have been pushing DoD to use more OTAs by loosening the rules around this approach over the last few years.

But in the 2019 NDAA, lawmakers are starting to express some initial concerns. First, in section 215, conferees capped at 80 percent the funds the Air Force can spend on its Air Operations Center 10.2 OTA. The committees want a report on the Air Force’s acquisition and development approach, as well as the costs of the development and how the entire service can use what the center is developing.

Additionally, in section 873, lawmakers want DoD’s service acquisition chiefs to collect and analyze data around OTAs, and then provide a report to Congress addressing who is using this approach, how often, for what purpose, how much money they are spending and any successes or challenges.

The driving force behind the use of OTAs has been the Defense Innovation Unit Experimental (DIUx) and lawmakers now are concerned about the value this organization is bringing to DoD.

Lawmakers want a report by May 1 that details how DIUx is integrating with other research and development efforts across the military and its services as well as measuring the organization’s effectiveness, detailing the number, type and impact of its transactions and initiatives. Specifically, legislators want more details on how many non-traditional and traditional Defense contractors are working directly with DIUx and “the number of innovations delivered into the hands of the warfighter.”

Read more of the Reporter’s Notebook


Trump to nominate retired Marine to be VA’s new CIO

It’s been 18 months since the Veterans Affairs Department had a permanent chief information officer, and the department has had only two permanent technology executives since 2009, each lasting an average of about three years.

In the meantime, VA has faced a host of challenges ranging from continued cybersecurity shortfalls to delays and problems in developing an electronic health record that is interoperable with the Defense Department to programs and projects falling behind or not meeting expectations.

This is why President Donald Trump’s decision to nominate James Gfrerer to be the next assistant secretary for information and technology and CIO at VA is significant.


If confirmed by the Senate, Gfrerer would join VA from his current position as an executive director with Ernst & Young, where he worked in the firm’s cybersecurity practice. He also joins a long list of VA CIOs — both permanent and interim — who are veterans. Gfrerer served in the Marine Corps for more than 20 years and was a Defense Department detailee to the Department of State, where he led interagency portfolios in counterterrorism and cybersecurity.

“Having permanent leadership in place to oversee these projects and the VA’s various information and technology systems will be critical as Congress works with the VA to address concerns and make improvements to bring VA into the 21st century,” said Sen. Johnny Isakson (R-Ga.), chairman of the Veterans Affairs Committee, in a statement.

Gfrerer seems to be another in a long line of qualified CIOs for VA, having earned a bachelor’s degree in computer science from the U.S. Naval Academy and a Master of Science in national resource strategy from the National Defense University.

He would replace Laverne Council, who left in January 2017, as the last permanent VA CIO. Over the last 18 months, Scott Blackburn and Camilo Sandoval have been acting CIOs or executives performing the duties of the CIO. Blackburn left VA in April and Sandoval has been acting since then.

The big question is whether his experience at E&Y will translate into being able to manage a $4.4 billion IT budget, which may grow to $5.5 billion in fiscal 2019 if Congress approves the White House’s request, at an agency with a workforce of more than 377,000 spread across the country serving more than 20 million veterans.

The budget increase would include “$381 million for development projects such as modernization of legacy systems; development of a Digital Health Platform, and a new financial management system,” states VA’s budget request to Congress.

According to the Federal IT Dashboard, Gfrerer faces an uphill challenge with VA’s IT projects: only 41 percent are on budget and 79 percent are on schedule. The good news is 95 percent of all projects are using an iterative or agile approach to development, and 81 percent of those projects are on schedule.

More good news for Gfrerer is how much the agency is spending on legacy systems. The VA CIO’s office reported in January that the department spends about 60 percent of its budget on operations and maintenance. The governmentwide average is closer to 80 percent.

“VA now plans to move critical functions from outdated and unsustainable platforms to more modern systems that operate at lower maintenance costs. We expect this change to save millions of dollars that can be reinvested in projects that directly enhance services for all veterans,” the report states. “A significant part of this reinvestment focuses on cloud and IT infrastructure improvements. Moving these functions to new platforms requires careful, thoughtful planning, and we accomplished much of that necessary planning in 2017.”

Among Gfrerer’s biggest challenges is moving from strategy to implementation for the electronic health record. In May, VA awarded Cerner Corp. a 10-year, $10 billion contract to implement the same EHR system that the Defense Department is deploying and to abandon its own, existing Veterans Information Systems and Technology Architecture (VistA).

In June, VA executives told the House Veterans Affairs Committee that Cerner will build and provide project management, planning and pre-initial operating capabilities under the first task order. It will conduct facility assessments at three sites in the Pacific Northwest under the second task order and provide an EHR hosting solution under the third, acting VA Secretary Peter O’Rourke said in his written testimony.

VA will begin deployment at those three sites in October, with the goal of fully implementing the system by March 2020.

To date, VA has designated 260 full-time employees to the department’s new Electronic Health Record Modernization Program Office. VA will add more staff over time as the agency implements the new system to more sites, said John Windom, program executive officer for VA’s EHR modernization office.

Gfrerer will have to make quick work of the learning curve and begin managing VA’s technology transformation across several fronts.

Read more of the Reporter’s Notebook


Is EPA weakening the authority, power of its CIO?

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Editor’s Note: This story was updated on July 24 with new details from EPA.

After two laws, an executive order and countless reminders that federal agency chief information officers MUST report directly to agency secretaries or deputy secretaries, the Environmental Protection Agency is regressing.

Multiple sources in and out of EPA say agency leaders are reorganizing the entire administrative and management offices and pushing the CIO deeper into the organization, under the assistant administrator in the Office of Administration and Resources Management (OARM).

The decision to reorganize OARM comes as EPA in late June named Vaughn Noga as its new CIO and principal deputy assistant administrator in the Office of Environmental Information, replacing Steve Fine, according to an internal announcement obtained by Federal News Radio.

Fine took a new position starting Aug. 5 as a senior advisor in the Office of Air and Radiation in Ann Arbor, Michigan.

“Vaughn currently serves as the director, Office of Administration, Office of Administration and Resources Management, where he is responsible for a diverse portfolio, including facilities and property management, physical and personnel security, and safety and health,” said the internal email from EPA Chief of Staff Ryan Jackson. “I want to thank Dr. Steve Fine for serving as the PDAA in OEI and leading the organization through transition and improvements in cybersecurity, and hardware and software advances agencywide.”

Fine had been acting CIO since Ann Dunkin left in January 2017 at the end of the Obama administration. But EPA hasn’t had a Senate-confirmed assistant administrator and CIO in the five years since Malcolm Jackson left the position. The Senate never confirmed Dunkin because of an unrelated hold.

What the memo doesn’t say — and, sources say, has not been communicated in writing — is that Noga may be a PDAA in name only.

Sources say EPA’s plan to merge OEI and OARM would push the CIO, which had been a Senate-confirmed assistant administrator position for more than two decades, down to a deputy position in the resources management office.

The CIO would answer to the principal deputy of OARM, currently Donna Vizian, or the assistant administrator, should one ever be named and confirmed. Vizian has been acting since before the Obama administration ended.

“Moving the IT organization down a level will only decrease the visibility and the needs of IT by the agency leadership,” said one former EPA official, who requested anonymity in order to speak about sensitive discussions. “It is a surprising move since the presidential cyber executive order made the deputy administrator responsible for cyber. This is a step backwards for IT and EPA.  This model was used back in the late 1990s before organizations realized IT needed to be part of the leadership team for them to be successful.”

EPA taking a step back from technology

An EPA spokesman on Tuesday confirmed in an email to Federal News Radio that the agency is merging the Office of Environmental Information and the Office of Administration and Resources Management after consulting with the Office of Management and Budget.

“The combined organization – the Office of Mission Support – will be the national program manager for information management, information technology, acquisition, grants, human resources, real property and security. Its workforce will be about 1,000 people,” the spokesman said. “Given the breadth of responsibility and the size of the organization, the agency decided to name the deputy assistant administrator for Environmental Information – the senior career position over IT/IM – to be the CIO. The proposed placement would allow the CIO to focus full-time on IT and information management (IM) activities — continuing to give those issues the attention they deserve. It also ensures continuity in the organization and a strategic approach by an IT professional. The CIO will continue to have access to the senior leadership of the agency and will continue to participate in agencywide IT/IM budget formulation and strategic planning.”

Another former EPA official, who also requested anonymity because they didn’t receive permission to speak to the press from their current company, said the merger is concerning because it seems to drop the CIO down a notch in the organization.

“Depending on how they do it, the CIO may not have a voice at the table with deputy administrator or administrator,” the former official said. “If you have an administrator who is concerned about technology, then maybe he or she gives the CIO a seat at the table. But it’s not organic like it would be if the CIO reported directly to the administrator or deputy administrator.”

The former official said it seems that technology is taking a back seat at EPA.

“Technology grew up in each of those mission offices separately over time, and that created inefficiencies,” the official said. “So if the CIO is not with other assistant administrators then technology will be an afterthought and not be used strategically. Think about it this way: if the assistant administrator for air is talking about technology, but does not understand where technology is going or where OMB wants it to go, and the CIO is not sitting there to talk about where the agency needs to go, then you run the risk of creating siloed solutions.”

Dunkin said in an email to Federal News Radio that, given how complicated technology is and will continue to be, the CIO needs both the positional power and the direct access that come from reporting to the head of the department in order to influence others who report to that executive.

“I’ve heard many people try to tell me otherwise but it is clear to me that reporting relationships and access matter. The CIO has to have a seat at the table with the rest of the department or agency head’s staff. They must be seen as a peer,” Dunkin said. “So, the leadership of the EPA can make all the claims they want that the CIO will be invited to the administrator’s staff and have access to the administrator, but if the CIO is a deputy assistant administrator, they won’t have the relationships or leverage to get results. This is further complicated within the EPA where every assistant administrator and regional administrator is a political appointee. Turning the CIO into a career position clearly removes them from the group.”

She added the CIO is held accountable by OMB and Congress to meet IT mandates, so changing the reporting structure will reverse a decade of progress.

“Meeting the mandates wasn’t easy when I was there but my peers knew that I had the backing of the administrator and the deputy administrator along with the law. If the CIO becomes a deputy assistant administrator, not only does the CIO lose positional authority and their position as a peer appointee, but the administration sends a clear message that they don’t care about the law that empowers CIOs,” Dunkin said. “That means the CIO loses the ability to exercise the authorities of the Federal IT Acquisition Reform Act (FITARA) or [President Trump’s] executive order. Staff will know that the administrator doesn’t care about complying. The new EPA CIO will be operating with both hands tied behind his back as a career deputy assistant administrator. So, the bottom line is that this is not good for EPA continuing to transform IT. The Pruitt team has starved OEI of resources and now the CIO is losing their authority.”

OMB not happy either

Multiple emails to the Office of Management and Budget asking about how the administration is tracking agency progress in meeting the goals of the May executive order on CIO authorities were not returned.

The latest FITARA scorecard from the House Oversight and Government Reform Committee says EPA’s CIO reports directly to the administrator or deputy administrator. But unless Noga or someone else comes forward to tell the committee that access has been shut down or clearly diminished, the charade of reporting will continue.

And that’s why this move is so worrisome for many former EPA officials. There is nothing in writing. Sources say congressional oversight committees declined to approve the reorganization, but agency leaders are going forward anyway.

Sources say OMB isn’t happy with EPA’s actions either.

The second former EPA executive said these changes are being driven by Henry Darwin, EPA’s chief of operations. Darwin’s background is in lean management principles, which he applied as the state of Arizona’s chief of operations, where he oversaw all 35 state agencies and worked to stand up the state government’s first intentional management system based on lean principles.

Lean principles focus on the customer by creating more value in each process and reducing any process that doesn’t add value.

As for Noga, he walks into a tough situation. Dunkin had begun to reorganize the CIO’s office, but Fine and others weren’t happy with that direction and had begun dismantling it.

The second former EPA source said Noga likely will reverse course again and continue with the concepts of agile project development and an open architecture.

“I know that Vaughn has quite a bit of experience in the technology world, having been head of operations and as chief technology officer. He also was instrumental in helping move EPA to the cloud with Microsoft Office 365,” the source said. “Cloud will be one of a few priorities. He’s well positioned to step into that role and knows the people and the organization, and having been outside of OEI for the last few years and consuming their services, he has a viewpoint of being both inside and outside. I think he’s well positioned to step into that role and identify where they should be focusing.”

Noga will discuss his priorities this week in Dallas, Texas during EPA’s annual IT meeting.

The former official said it may be two years until EPA really understands the impact of the change.

“Because the CIO didn’t have a seat at the table when decisions are made, EPA will build out the technology and then there will be audits, and only then you will know if this change was a problem,” the source said. “Former Administrator Scott Pruitt was supportive of this change. With current acting Administrator Andrew Wheeler being a former EPA official, he’s at least grounded in the mission of the agency. But I don’t think he will stop this OEI and OARM reorganization. It’s too far down the path.”

Read more of the Reporter’s Notebook

