One of the few remaining challenges around information sharing is not culture or policy, but technology.
Few would dispute that agencies have taken significant steps to make data securely available to their colleagues and even, to a limited extent, to industry. But the problem continues to be how agencies can use and manage the huge amount of information the government collects, especially when it comes to terrorism and homeland security.

“The day after — should we ever be attacked — you will say it was somewhere, you just couldn’t either find it or worse yet connect it,” said David Shedd, the deputy director of the Defense Intelligence Agency, Thursday at a conference on information sharing sponsored by the Center for Strategic and International Studies in Washington. “It’s just borne out of the enormity of the data that is out there. As a veteran of what has been termed intelligence failures and occasionally an intelligence success, I can tell you that will be viewed as failure.”
Shedd said the problem is how analysts can process all that data, with the right algorithms and the right technology, to find the needles in the haystack.
“The problem for that analyst today is you can’t possibly in a 24 hour day, if they were to work 24 hours a day, get through all that data even in their area of responsibility,” he said.
Possible solutions to the data flood?
Despite this enormous challenge, there are some potential solutions starting to emerge.
Kshemendra Paul, the program manager for the Information Sharing Environment, said data standards and tagging are key to making the data searchable.
He said efforts such as the National Information Exchange Model (NIEM) and the National Suspicious Activity Reporting initiative are two examples where this is already happening.
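The point of an exchange model like NIEM is that once every producer tags the same fields the same way, one query works across all of their data. The sketch below illustrates that idea; the namespace URI and element names are simplified stand-ins for illustration, not the official NIEM schema.

```python
import xml.etree.ElementTree as ET

# Stand-in namespace for a NIEM-style exchange (illustrative, not official).
NS = {"nc": "http://example.org/niem-core-stand-in"}

record = """
<Report xmlns:nc="http://example.org/niem-core-stand-in">
  <nc:Activity>
    <nc:ActivityCategoryText>SuspiciousActivity</nc:ActivityCategoryText>
    <nc:ActivityDate>2012-03-22</nc:ActivityDate>
  </nc:Activity>
</Report>
"""

def find_activities(xml_text, category):
    """Return the dates of activities matching a category.

    Because producers share one tagging standard, this single query
    can run unchanged against any agency's records.
    """
    root = ET.fromstring(xml_text)
    dates = []
    for activity in root.findall("nc:Activity", NS):
        cat = activity.findtext("nc:ActivityCategoryText", namespaces=NS)
        if cat == category:
            dates.append(activity.findtext("nc:ActivityDate", namespaces=NS))
    return dates

print(find_activities(record, "SuspiciousActivity"))  # ['2012-03-22']
```

Without the shared standard, each agency's field names would differ and the query would have to be rewritten per source — which is the searchability problem Paul describes.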
Paul said 18 agencies are using NIEM in one way or another. And this week, Paul said the Centers for Disease Control and Prevention is hosting officials from the U.S., Canada and Mexico to discuss how to implement NIEM around specific public health and law enforcement areas.
As for the Suspicious Activity Reporting (SAR) standard, Paul said more than 200,000 law enforcement officers have been trained and another 200,000 will receive training later this year.
And now, Paul said, the government is expanding the SAR initiative to offer training to fire and emergency medical services personnel, 911 operators, and critical infrastructure owners and operators.
Better technology needed
DIA’s Shedd added technology also must play a big role.
He said analysts can’t just pull the data they need, or think they need, all the time; the systems must push the data to them based on a set of standards and rules. “We can do far better than we are doing today in having a push model of how information reaches a profile of the analytic side of the intelligence community that is profiled around data sets that will help that analyst do his or her job better as a result of it being pushed to them,” Shedd said. “You must develop ways in which information is not only pulled — you will never get away from the pull model — but you have to go concretely and definitively toward a much heavier push model in terms of your ability to bring data sets together to begin to inform the picture that a human eye can then look at.”
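The push model Shedd describes can be sketched as a simple publish-subscribe pattern: each analyst registers a profile of the data sets relevant to their area of responsibility, and matching items are routed to them as they arrive, rather than waiting to be searched for. All names and tags below are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class AnalystProfile:
    """An analyst's standing interest profile (hypothetical)."""
    name: str
    topics: set                              # data-set tags of interest
    inbox: list = field(default_factory=list)

class PushBroker:
    """Routes incoming tagged items to matching analyst profiles."""
    def __init__(self):
        self.profiles = []

    def register(self, profile):
        self.profiles.append(profile)

    def ingest(self, item):
        # Push: the system matches the item's tags against each profile,
        # instead of each analyst pulling from the whole data store.
        for p in self.profiles:
            if p.topics & item["tags"]:      # any overlap in tags
                p.inbox.append(item)

broker = PushBroker()
alice = AnalystProfile("alice", {"border-security", "narcotics"})
broker.register(alice)

broker.ingest({"id": 1, "tags": {"narcotics", "finance"}})
broker.ingest({"id": 2, "tags": {"cyber"}})  # outside Alice's profile

print([item["id"] for item in alice.inbox])  # [1]
```

The pull model still exists — Alice can search the store directly — but the broker narrows the flood to her area of responsibility before a human eye ever has to look at it.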
Paul and members of the intelligence community are working on a new national strategy for secure information sharing that likely will address the data overload challenge. A draft is in the works, and a final document could come out in the next three to six months.
The national strategy also will highlight expectations in the post-WikiLeaks environment.
The intelligence community already has made some changes and is in the middle of making others, said retired Air Force Lt. Gen. James Clapper, the director of national intelligence.
He said among the Office of the Director of National Intelligence’s top six priorities is the need to share and safeguard information. Clapper said WikiLeaks has caused the IC to do more in terms of auditing, monitoring and controlling movable media.
“We have to do more to both tag data and ensure we can properly identify people,” he said, “so, if we are sharing information, we are assured that they have the bona fides and that they are authorized to receive the information.”
He said greater identity management and improved labeling, cataloging and tagging of data actually improves sharing across the intelligence community.
“If you can be sure the information you are sharing is actually going to an authorized recipient, that actually is an inducement to do more sharing,” Clapper said. “We will, of course, as we always do, install all the appropriate IT mousetraps to prevent a recurrence of WikiLeaks, but in the end our system is based on personal trust.”
President Barack Obama signed an executive order last October to codify many of the changes made in the wake of the WikiLeaks disclosures. It also created new offices to oversee the move to secure sharing.
IC defending against insider threats
Clapper said the IC has varying degrees of capabilities to do the auditing and monitoring. He said the intelligence community will invest in new technologies to address the insider threat. “We need to develop a national insider threat policy,” he said. “Inherently, we’ve always had a responsibility for detecting insider threats. What WikiLeaks has done is heightened our sensitivity to that. In an IT context, an insider threat is quite profound and I think that is why everyone is being sensitized about being alert to detecting insider threats. There isn’t a silver bullet here. You have to have that and you have to have auditing and monitoring.”
ODNI is developing a system to tag data and another to monitor employees. Clapper said both are works in progress.
Clapper said he expects the systems to be part of the IC’s new IT architecture, which will promote integration and efficiency and make it easier to share data more broadly within the intelligence community.
Clapper said once employees tag data, the IC can label, account for and catalog it, which will make it possible to establish a community of interest more quickly and efficiently than the IC can today.
“If you know what the data is in question, you know where it is and you know with whom it can be shared and then you can account for it when it is, you are in a much better posture both in terms of security and from a sharing standpoint,” he said.
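Clapper's description — know what the data is, know who may receive it, and account for each release — maps onto a simple attribute-based check. The sketch below is a minimal illustration of that posture; the labels, levels and names are all invented, not an actual IC scheme.

```python
# Every share attempt is recorded, whether released or denied,
# which provides the auditing and accounting Clapper describes.
audit_log = []

def authorized(recipient, item):
    """A recipient may receive an item only if cleared for its level
    and read into every compartment tagged on it (illustrative rule)."""
    return (recipient["clearance"] >= item["level"]
            and item["compartments"] <= recipient["compartments"])

def share(item, recipient):
    if not authorized(recipient, item):
        audit_log.append(("denied", item["id"], recipient["name"]))
        return False
    audit_log.append(("released", item["id"], recipient["name"]))
    return True

doc = {"id": "rpt-17", "level": 3, "compartments": {"alpha"}}
analyst = {"name": "bob", "clearance": 3, "compartments": {"alpha", "bravo"}}
outsider = {"name": "eve", "clearance": 2, "compartments": set()}

print(share(doc, analyst))   # True  -- cleared and in the right compartment
print(share(doc, outsider))  # False -- denied, and the denial is logged
```

The point Clapper makes is visible here: because the check is cheap and every release is accounted for, a holder of data has less reason to withhold it — stronger controls become an inducement to share more, not less.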
Clapper said the IC has made some progress, but it will take about five years to establish such a system.
Beyond securing and safeguarding data, Clapper said the continued integration of the IC, the development of standards (especially around IT), ensuring privacy, and embracing a common operating model and shared services are among his top priorities.
In the end, Clapper said, the goal is to ensure the data meets the needs of the analysts while the sources are kept secure.