Sirma’s Ontotext Lab announced BigOWLIM, a new high-performance storage and inference layer for the Sesame RDF database. They demonstrated that it can handle more than a billion triples by loading the Lehigh University Benchmark (LUBM) dataset and correctly answering the evaluation queries. Of course, this took a while — over 70 hours to load the data and build the model, materialized via forward chaining, which comprised over 1.8 billion triples.
There is a presentation from the WWW2006 Developers Day. Evaluation copies of the BigOWLIM beta are available on request, and a free version of the in-memory OWLIM system is available for download.