Intelligence community embraces Web 2.0

Tim Finin, 1:00pm 24 February 2007

A Computerworld story, Top Secret: DIA embraces Web 2.0, discusses how the Defense Intelligence Agency is embracing new Web-based collaboration and integration tools. (spotted on SmartMobs)

“The U.S. Department of Defense’s lead intelligence agency is using wikis, blogs, RSS feeds and enterprise “mashups” to help its analysts collaborate better when sifting through data used to support military operations. The Defense Intelligence Agency (DIA) is seeing “mushrooming” use of these various Web 2.0 technologies that are becoming critical to accomplishing missions that require intelligence sharing among analysts, said Lewis Shepherd, chief of DIA’s Requirements and Research Group at the Pentagon.”

One of the recent technology successes within the US intelligence community is Intellipedia, a set of wikis available on classified networks run by the US Government.

“DIA first launched a wiki it dubbed Intellipedia in 2004 on the Defense Department’s Joint Worldwide Intelligence Communications System (JWICS), a top-secret network that links all the government’s intelligence agencies.”

The intelligence community is also using Web 2.0 techniques to integrate information and services in real time.

“DIA last year began a project to create a data access layer in its architecture using a service-oriented architecture to pull together human intelligence (data gathered by people) and publicly available data gathered from the Internet and other sources into a single environment for analysis, Shepherd added. Analysis of data in this new environment will be done in part by using Web 2.0 applications, such as “mashups,” that collect RSS feeds, Google maps and data from the DIA network that users can access with a lightweight AJAX front end, he added. “Web 2.0 mashup fans on the Internet would be very much at home in the burgeoning environment of top-secret mashups, which use in some cases Google Earth and in some cases other geospatial, temporal or other display characteristics and top-secret data,” Shepherd said.”
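To give a feel for what such a mashup involves, here is a minimal sketch of the back half of one: pull an RSS feed, keep the items that carry GeoRSS coordinates, and emit records a map front end could plot. This is my own toy illustration, not DIA's system; the feed URL is a placeholder I made up, though the GeoRSS namespace is the standard one.

```python
# Minimal mashup sketch: fetch an RSS feed, keep items with GeoRSS
# coordinates, and produce records a Google Maps / AJAX front end
# could plot. The feed URL is a hypothetical placeholder.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.org/reports.rss"  # hypothetical feed
GEORSS = "{http://www.georss.org/georss}"     # standard GeoRSS namespace

def fetch_geo_items(url):
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    items = []
    for item in root.iter("item"):
        point = item.findtext(GEORSS + "point")
        if point is None:
            continue  # skip items without coordinates
        lat, lon = (float(v) for v in point.split())
        items.append({
            "title": item.findtext("title"),
            "link": item.findtext("link"),
            "lat": lat,
            "lon": lon,
        })
    return items

if __name__ == "__main__":
    for record in fetch_geo_items(FEED_URL):
        print(record)
```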

These are good examples of the movement within the intelligence and law enforcement communities from a “need to know” environment toward a “need to share” one. Traditional access control policies are often based on the concept of “need to know” and are typified by predefined and often rigid specifications of which principals and roles are pre-authorized to access what information. This can and does lead to systems that discourage the sharing of information by requiring principals to be known in advance, limiting interoperability, ignoring context, and responding poorly to novel and unexpected situations. One of the recommendations of the 9/11 Commission was to find ways to move from this traditional perspective toward one that privileges the “need to share”.
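To make the contrast concrete, here is a toy sketch, entirely my own invention, of the difference between a fixed pre-authorization table and a decision that weighs the requester's attributes and the current situation instead:

```python
# Toy contrast between "need to know" and "need to share" checks.
# All names, roles, and the context test are invented for illustration.

# Need to know: a fixed table of principals pre-authorized per resource.
ACL = {"threat-report-17": {"analyst_alice", "analyst_bob"}}

def need_to_know(principal, resource):
    return principal in ACL.get(resource, set())

# Need to share: no pre-enumerated principals; the decision weighs
# attributes of the requester against the current situation.
def need_to_share(attrs, resource, context):
    cleared = attrs["clearance"] >= context["classification"]
    relevant = context["active_incident"] and resource in context["incident_resources"]
    return cleared and relevant

ctx = {
    "classification": 2,
    "active_incident": True,
    "incident_resources": {"threat-report-17"},
}

# A coalition analyst unknown to the ACL is denied under need-to-know...
print(need_to_know("coalition_carol", "threat-report-17"))       # False
# ...but granted under need-to-share, given clearance and context.
print(need_to_share({"clearance": 3}, "threat-report-17", ctx))  # True
```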

While it may be easy to define what “need to share” means in terms of very high level organizational policy, it will be challenging to work out what new technical approaches and systems are needed to support it. In addition to wikis, blogs, feeds and web services, I suspect that other new ideas will help here. The Semantic Web offers a good approach to publishing, sharing and integrating data. Computational policies can address the contextual sharing of information. NLP and information extraction are important for acquiring information from open sources. Social network analysis, trust models, and reputation systems will also probably play key roles. Finally, machine learning will often be the underlying approach that makes all of these components work well and integrate.
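As one small example of the Semantic Web piece, the sketch below (using the rdflib Python library, with namespaces and facts fabricated for illustration) shows how assertions published by two sources merge into a single graph, simply because they use shared URIs, and can then be queried together:

```python
# Sketch of Semantic Web data integration with rdflib (pip install rdflib).
# The example.org namespace and the facts themselves are made up.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, FOAF

INTEL = Namespace("http://example.org/intel#")

g = Graph()

# Assertions from two hypothetical sources merge into one graph
# because they use the same URI for the entity they describe.
subject = URIRef("http://example.org/intel#person42")
g.add((subject, RDF.type, FOAF.Person))
g.add((subject, FOAF.name, Literal("J. Doe")))          # from source A
g.add((subject, INTEL.mentionedIn, INTEL["report17"]))  # from source B

# A single SPARQL query now spans both sources' contributions.
q = """
SELECT ?name ?report WHERE {
    ?p a foaf:Person ; foaf:name ?name ; intel:mentionedIn ?report .
}
"""
for name, report in g.query(q, initNs={"foaf": FOAF, "intel": INTEL}):
    print(name, report)
```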

