The National Geospatial-Intelligence Agency is in talks with Capitol Hill to find legal ways to achieve a unique barter agreement between the government and industry: swapping potentially years’ worth of data locked in the agency’s archives for expertise and new computational techniques from the private sector.
Robert Cardillo, NGA’s director, said such an arrangement is purely notional at this point, but that what he termed a “public-private partnership” could help the agency solve some of its current challenges in the areas of artificial intelligence and employing advanced algorithms.
“The proposition is, we have labeled data sets that are decades old that we know have value for those that are pursuing artificial intelligence, computer vision, algorithmic development to automate some of the interpretation that was done strictly by humans in my era of being an analyst,” he told an intelligence conference hosted by Georgetown University on Thursday. “And so that partnership is one that we’re discussing with the Hill now to make sure we can do it fairly and openly.”
Cardillo sees such a partnership as a logical next step in NGA’s growing involvement with commercial companies, many of which it already relies on to supply some of the data it uses in its intelligence and combat support missions.
“As we interact with this growing community that’s seeking to take advantage of the internet of things, big data, et cetera to create consumer insight and increase sales, I see a way that we can take what we have much of — our expertise, our experience and our data — and maybe level it with something that we don’t have as much as we’d like, which is the high-end computer science, the cutting-edge algorithmic development, the leading researchers on the academic front,” he said. “But we have something of value as well, so maybe we can make a trade: our historical intellectual property for their current cutting-edge property.”
Beyond the risk of running afoul of traditional notions of fair and open competition in government contracting, Cardillo said the agency is also weighing the potential for abuse if it were to open up data sets that are unavailable to the general public to only a handful of selected firms.
“We want to make sure we’re doing this in a transparent way so that people can know what is being exchanged, and make sure we have the proper protections in place against any abuse,” he said. “Those questions are being asked of my team right now. The data that we’re talking about is data that the U.S. taxpayer has already invested in. We need to be careful about how we do it, but that’s why we’re having the conversation.”
There have been several recent instances in which NGA has opened up data sets that, in a previous era, would have remained walled off from the rest of the world and accessible only to government customers. But those cases have tended to involve making the data publicly available, not trading it in a value-for-value exchange.
For instance, starting in 2015, the agency, at the request of the White House, embarked on a project to build a first-of-its-kind set of elevation data for the Arctic, using imagery provided by the private firm DigitalGlobe and analytical work contributed by the University of Illinois, the University of Minnesota and Oak Ridge National Laboratory. At the project’s conclusion, NGA made the resulting data sets available on its public website.
“The reason that’s valuable is that if you’re a hydrographer and you want to understand the impact of a change in ice flow, or if you want to understand safety of navigation through sea lanes that weren’t previously open, you have to have that baseline data in order to run the algorithm that you’ve developed,” Cardillo said. “It’s advanced our scientific understanding of what’s happening in the Arctic. We ended up sharing the data, not trading it in that case, but I think we could do that in other areas — food security, disaster relief — of course, we’d be interested in applying it to advantage our military as well.”