Intelligence agencies leaning more heavily on open source data

Analysts showing value of searching public information

By Jason Miller
Executive Editor
FederalNewsRadio

In the seven years since Sept. 11, many in the intelligence community have realized the value of open source data for finding out about terrorists and other organizations.

Randall Fort, assistant secretary of the State Department’s bureau of intelligence and research, says open source is slowly becoming the first choice of the policy and intelligence community.

But Fort warns the flood of data that analysts face is quickly becoming overwhelming.

“How do we consume, validate and use all this data?” asks Fort Thursday at the Office of the Director of National Intelligence’s Open Source Conference in Washington. “Not enough attention is being paid to the volume of data, especially open source data.”

Fort says technology will help deal with the massive amounts of data.

But others believe it is a combination of new skills, new methodologies and the ever-necessary culture change.

Frank Cilluffo, director of the Homeland Security Policy Institute at The George Washington University, says open source “must be demystified.”

“It is an enabler to other collection disciplines,” he says. “The greatest role open source can play is to decipher trends and analysis.”

Jennifer Sims, a professor at Georgetown University, says open source provides a “decision advantage” over adversaries, and in many respects, is better than classified data.

And for that reason, some open source analysis cannot be made public, she says.

Sims says there are four categories of open source data:

  • Baseline knowledge, which looks at patterns of behavior, what the trends mean and how they will affect the United States
  • Open data that is hard to find and highly sensitive to the policymaker
  • Counterintelligence, which includes information that was once hidden but now is open and could change quickly
  • Information deliberately revealed by an adversary, where the analyst must understand why the data was revealed and what it means

Sims says one big challenge is the over-classification of open source data.

“Decision makers believe that the higher the classification of the information, the more important it is and they will pay more attention to it,” she says.

Fort says State tends to make most data unclassified so it can be distributed more broadly.

“The relationship between the producer and collector needs to be closer,” Cilluffo says. “We are still five years out before seasoned analytic folk can understand and use the information.”


On the Web:

ODNI – ODNI Announces Establishment of Open Source Center

ODNI – Open Source Center

(Copyright 2008 by FederalNewsRadio.com. All Rights Reserved.)