AAAI Symposium on Open Government Knowledge, 4-6 Nov 2011, Arlington VA

November 2nd, 2011

If you are in the DC area this weekend and are interested in using Semantic Web technologies, you should come to the AAAI 2011 Fall Symposium on Open Government Knowledge: AI Opportunities and Challenges. It runs from Friday to Sunday midday at the Westin Arlington Gateway in Arlington, Virginia.

Join us to meet thought leaders from government and business working on US open government data, and to discuss the challenges. The symposium features a governmental day on Friday (Nov 4), with speakers on open gov data activities at NIH/NCI and NASA, and an R&D day on Saturday (Nov 5), with speakers from industry, including Google and Microsoft, as well as international researchers.

This symposium will explore how AI technologies such as the Semantic Web, information extraction, statistical analysis and machine learning, can be used to make the valuable knowledge embedded in open government data more explicit, accessible and reusable.

See the OGK website for complete details.

Reuters Calais to support Semantic Web Linked Data in next release

November 14th, 2008

Thomson Reuters announced on their blog (Life in the Linked Data Cloud: Calais Release 4) that the next release of the Calais web-based information extraction services will support linked data.

“In that release we’ll go beyond the ability to extract semantic data from your content. We will link that extracted semantic data to datasets from dozens of other information sources, from Wikipedia to Freebase to the CIA World Fact Book. In short – instead of being limited to the contents of the document you’re processing, you’ll be able to develop solutions that leverage a large and rapidly growing information asset: the Linked Data Cloud.”

The new capabilities will be available in release 4, which is expected out on 9 January 2009.

The change is based on Calais returning dereferenceable URIs for the entities it finds. Accessing those URIs will produce RDF with links to corresponding entities in DBpedia, Freebase and other sources of Semantic Web data. It will be very interesting to see how well their system does at mapping document entities (e.g., “secretary Rice”) to entities in the LOD cloud, such as the corresponding DBpedia resource. Accessing such a URI with a request for content type application/rdf+xml returns RDF assertions extracted by DBpedia from Wikipedia.
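The content-negotiation step described above can be sketched with a few lines of Python. The entity URI below is a made-up illustrative example, not an actual Calais identifier; the point is only the Accept header, which tells the server to return RDF/XML rather than an HTML page.

```python
import urllib.request

# Hypothetical entity URI, for illustration only; in practice Calais
# would return its own dereferenceable URI for each extracted entity.
entity_uri = "http://dbpedia.org/resource/Example"

# Content negotiation: request the machine-readable RDF/XML
# representation of the entity instead of the default HTML view.
req = urllib.request.Request(
    entity_uri,
    headers={"Accept": "application/rdf+xml"},
)

# The request object now carries the media type we asked for;
# urlopen(req) would dereference it and return the RDF document.
print(req.get_header("Accept"))  # → application/rdf+xml
```

A client would then parse the returned RDF to follow owl:sameAs or similar links into DBpedia, Freebase and the rest of the LOD cloud.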

JWS special issue on The Web of Data

November 11th, 2008

Axel Polleres and David Huynh are editing a special issue of the Journal of Web Semantics on The Web of Data that will appear in Summer 2009. Submitted papers are due by January 21, 2009. See the special issue call for papers for details.

Parallax: a better interface for Freebase

August 14th, 2008

David Huynh completed his PhD at MIT CSAIL last year and joined MetaWeb a few months ago, where he has been working on new and better interfaces to explore the data encoded in their Freebase system. He recently released Parallax as a prototype browsing interface for Freebase. Here is a video that shows the interface in action.

Freebase Parallax: A new way to browse and explore data from David Huynh on Vimeo.

Freebase is “an open database of the world’s information” that is constructed by a Wiki-like collaborative community. In many ways it is like the Semantic Web model, with two big differences: (1) the data is stored centrally rather than distributed across the Web and (2) the representation system is not based on RDF but rather uses a custom-built object-oriented data representation language.
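The representational difference can be sketched in a few lines of Python. This is not Freebase's actual API or schema; the identifiers and property names below are invented for illustration. The same facts are encoded two ways: as a set of atomic RDF-style triples, and as a centrally stored collection of typed objects in the Freebase style.

```python
# RDF-style: a graph is a set of (subject, predicate, object) triples,
# with each element globally identified so statements can link across
# the Web. The "ex:" names here are illustrative stand-ins for URIs.
rdf_triples = {
    ("ex:David_Huynh", "ex:worksFor", "ex:Metaweb"),
    ("ex:Metaweb", "ex:product", "ex:Freebase"),
}

# Freebase-style: one central store of objects, each with a type and
# properties whose values may reference other objects by id. The ids
# and property names are hypothetical, not real Freebase schema.
freebase_objects = {
    "/en/david_huynh": {"type": "/people/person", "employer": "/en/metaweb"},
    "/en/metaweb": {"type": "/business/company", "product": "/en/freebase"},
}

# Both encode the same facts; the triple form decomposes everything
# into independent statements, while the object form groups all of an
# entity's properties together under one record.
subjects = sorted({s for s, p, o in rdf_triples})
print(subjects)  # → ['ex:David_Huynh', 'ex:Metaweb']
```

The design trade-off the post hints at follows from this: triples decompose and distribute naturally, while the object form favors a single curated store with per-entity records.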

Freebase is a great resource. Much of its data is extracted from Wikipedia, so its content has a large overlap with DBpedia. But it is also relatively easy to upload additional information in various structured forms, and many have done so, resulting in extended coverage.

Freebase is clearly a system in the Web of Data space, alongside the Linking Open Data effort, and having both should offer a way for us all to explore the consequences of their differing design decisions.