Please use this identifier to cite or link to this item: https://hdl.handle.net/11108/326
Title: 
Keeping linked open data caches up-to-date by predicting the life-time of RDF triples

Authors: 
Nishioka, Chifumi
Scherp, Ansgar
Year of Publication: 
2017
Citation: 
In: WI '17: Proceedings of the International Conference on Web Intelligence, Leipzig, Germany, August 23-26, 2017. ACM, New York, NY, pp. 73-80. ISBN 978-1-4503-4951-2.
Abstract: 
Many Linked Open Data (LOD) applications require fresh copies of RDF data in their local repositories. Since RDF documents change constantly and those changes are not automatically propagated to the LOD applications, the RDF documents must be revisited regularly to refresh the local copies and keep them up-to-date. For this purpose, crawling strategies determine which RDF documents should be fetched preferentially. Traditional crawling strategies rely only on how an RDF document has been modified in the past. In contrast, we predict at the triple level whether a change will occur in the future. We use the weekly snapshots of the DyLDO dataset as well as the monthly snapshots of the Wikidata dataset. First, we conduct an in-depth analysis of the lifetime of triples in RDF documents. Through this analysis, we identify which triples are stable and which are ephemeral. We introduce different features based on the triples and apply a simple but effective linear regression model. Second, we propose a novel crawling strategy based on the linear regression model. We evaluate the strategy in two experimental setups: we vary the amount of available bandwidth, and we iteratively observe the quality of the local copies over time. The results demonstrate that the novel crawling strategy outperforms the state of the art in both setups.
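
For illustration, the following Python sketch shows how triple-level change prediction with a linear regression model could drive such a crawling strategy. The feature set, training data, and ranking heuristic below are hypothetical assumptions made for this sketch, not the authors' exact formulation from the paper.

    # Illustrative sketch only: features, data, and ranking are assumed,
    # not taken from the paper.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical per-triple features derived from past snapshots, e.g.:
    #   x1 = fraction of past snapshots in which the triple was present
    #   x2 = number of past insertions/deletions of the triple
    #   x3 = age in snapshots since the triple was first observed
    X_train = np.array([
        [1.0, 0, 10],   # stable triple: always present, never changed
        [0.3, 4, 10],   # ephemeral triple: frequently appears/disappears
        [0.8, 1, 6],
        [0.2, 5, 8],
    ])
    # Target: 1 if the triple changed in the next snapshot, else 0.
    y_train = np.array([0, 1, 0, 1])

    model = LinearRegression().fit(X_train, y_train)

    def predict_change_score(triple_features):
        """Predicted likelihood that a triple changes before the next crawl."""
        return float(model.predict(np.asarray(triple_features).reshape(1, -1))[0])

    def rank_documents(doc_to_triples, bandwidth):
        """Pick the documents whose triples are most likely to change,
        subject to a fixed bandwidth budget (documents per crawl round)."""
        scored = {
            doc: sum(predict_change_score(f) for f in feats)
            for doc, feats in doc_to_triples.items()
        }
        return sorted(scored, key=scored.get, reverse=True)[:bandwidth]

In each crawl round, rank_documents would return the documents to refresh first; the paper evaluates this kind of prediction-driven strategy against history-based baselines under varying bandwidth budgets.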
Persistent Identifier of the first edition: 

Files in This Item:
There are no files associated with this item.

Items in ZBWPub are protected by copyright, with all rights reserved, unless otherwise indicated.