Intelligence community sees ‘most promise’ in AI’s capabilities for analysts

The intelligence community said it sees artificial intelligence as an opportunity for its analysts to tap into more data and information than ever before.

The intelligence community, like nearly everyone else, has big plans to move into artificial intelligence and automation, and among the IC’s priorities for the future, AI may hold the greatest potential.

“This area holds some of the most promise that we have across the intelligence community, it really does,” Lt. Gen. John Bansemer, assistant director of national intelligence for partner engagement within the Office of the Director of National Intelligence, said Wednesday at the Intelligence and National Security Summit in National Harbor, Maryland. “It’s also an area that we’re probably collaborating as closely with the [undersecretary of defense for intelligence] and the department.”

The Pentagon has already designated artificial intelligence and machine learning as a key component of its 2018 national defense strategy. DoD solidified that interest earlier this summer when it stood up a new Joint Artificial Intelligence Center (JAIC), housed within its CIO shop.

But other agencies within the intelligence community want to get into the AI game as well. These organizations see the potential to change nearly every aspect of the intelligence business, from the way agencies continuously monitor employees for insider threats to how analysts inform and develop their own decisions.

“The holy grail down the road for us will be being able to bring that data together in an environment where you can look at and fuse both classified and publicly available information in one spot, because that’s real power,” Neil Wiley, director for analysis at the Defense Intelligence Agency, said.

For now, at least, the IC envisions an opportunity where AI can help supplement — not completely replace — the work that many analysts do daily.

“It can be a very powerful sidecar to our scarcest resource, which is really good analysts,” Dawn Meyerriecks, director of science and technology for the CIA, said. “We don’t look at this as it’s going to suddenly make analytic talent obsolete. It takes our best people and it cues up for them the things that are going to fundamentally impact their judgments.”

As multiple IC officials said over the course of the two-day intelligence summit, they don’t envision a scenario where new machine-learning capabilities push many people out of their jobs.

“I don’t think that we’re talking about creating completely autonomous capabilities here,” Bansemer said. “There’s still going to be a person, an analyst, in the loop. When we had earlier discussions with our analysts, a lot thought that we were trying to work them out of a job. I’m not worried about that for at least the next 10-to-20 years.”

The potential for automation and AI means intelligence analysts will have access to a larger body of information to tackle a problem or consider a scenario. Under the current, manual process, analysts can consider only as much data as they can physically handle and process.

“Our through-put is extraordinarily small, and we only really consider a fraction of the potentially relevant information out there,” Wiley said.

Automation could give intelligence analysts a larger body of data to consider, which would open new possibilities from both an analytical and an ethical standpoint, he added.

“The intelligence community has learned a lot of lessons, sometimes painfully, about what characterizes an intelligence assessment,” Wiley said. “What are the hallmarks, ethically, of an intelligence assessment? They’re laid out in the intelligence community directives. We have to be clear about the level of confidence we had. We have to be clear about how we arrived at our judgments, what sources were available to us, what we liked and what we didn’t like, what assumptions we made.”

The IC will need to make the same considerations for future machine-generated assessments.

“Just because a machine generated it, does not get us off the hook for the ethical standards required of an intelligence problem,” Wiley said.

Like most other federal organizations, the intelligence community is still searching for a specific path forward on AI.

“The weak link is really our own vision for how we can implement it,” Lt. Gen. Darsie Rogers, deputy director for combat support at the Defense Threat Reduction Agency, said. “What processes do we have that artificial intelligence can help with? It will grow rapidly, exponentially, once more and more folks in uniform and folks in the department get a sense for what it means to have a data-driven decision. What kind of data do you need? What form do you need it in? What is your process for getting that data into a place that a machine can operate on it?”
