UMBC ebiquity
BigOWLIM reasons over billions of RDF triples

Tim Finin, 1:00pm 28 May 2006

Sirma’s Ontotext Lab announced BigOWLIM, a new high-performance storage and inference layer for the Sesame RDF database. They demonstrated that it can handle more than a billion triples by loading the Lehigh LUBM benchmark and correctly answering the evaluation queries. Of course, this took a while: loading the data and building the model, materialized via forward chaining, took over 70 hours, and the resulting model comprised over 1.8 billion triples.

While their OWLIM system does reasoning and query processing in memory, BigOWLIM stores the model in binary files and uses them to answer queries and perform inference.
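To make the architecture a bit more concrete, here is a rough Java sketch of how a Sesame storage-and-inference layer is typically driven: a SAIL is wrapped in a repository, data is loaded (with inferred triples materialized by forward chaining), and queries are evaluated over the result. This is not the OWLIM API itself; it uses Sesame's stock in-memory store and RDFS inferencer as a stand-in, and the data file name and LUBM-style query are only illustrative.

import java.io.File;

import org.openrdf.query.BindingSet;
import org.openrdf.query.QueryLanguage;
import org.openrdf.query.TupleQueryResult;
import org.openrdf.repository.Repository;
import org.openrdf.repository.RepositoryConnection;
import org.openrdf.repository.sail.SailRepository;
import org.openrdf.rio.RDFFormat;
import org.openrdf.sail.inferencer.fc.ForwardChainingRDFSInferencer;
import org.openrdf.sail.memory.MemoryStore;

public class SesameSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in SAIL stack: an in-memory store with RDFS forward chaining.
        // A BigOWLIM deployment would plug Ontotext's SAIL in here instead.
        Repository repo = new SailRepository(
                new ForwardChainingRDFSInferencer(new MemoryStore()));
        repo.initialize();

        RepositoryConnection con = repo.getConnection();
        try {
            // Load a (hypothetical) LUBM data file; inferred triples are
            // materialized by the forward-chaining inferencer at load time.
            con.add(new File("University0_0.owl"), "http://example.org/", RDFFormat.RDFXML);

            // A simple LUBM-style query: list all undergraduate students.
            String query =
                "PREFIX ub: <http://www.lehigh.edu/~zhp2/2004/0401/univ-bench.owl#>\n" +
                "SELECT ?x WHERE { ?x a ub:UndergraduateStudent }";
            TupleQueryResult result =
                con.prepareTupleQuery(QueryLanguage.SPARQL, query).evaluate();
            while (result.hasNext()) {
                BindingSet row = result.next();
                System.out.println(row.getValue("x"));
            }
            result.close();
        } finally {
            con.close();
        }
    }
}

In a BigOWLIM deployment, Ontotext's own SAIL would replace the in-memory stack shown above, keeping the materialized model in binary files on disk rather than in RAM.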

There is a presentation from the WWW2006 Developers Day. Evaluation copies of the beta version of BigOWLIM are available on request, and a free version of the in-memory OWLIM system is available for download.

