Host a Triple Pattern Fragments server with HDT datasets

1. Triple Pattern Fragments
2. Set up a Triple Pattern Fragments Server with HDT datasets

Acknowledgements: If you use our tools in your research, please acknowledge them by citing the relevant papers.


1. Triple Pattern Fragments

Linked Data Fragments (LDF) aims to improve the scalability and availability of SPARQL endpoints by minimizing server-side processing and moving query intelligence to the client. One such interface is the Triple Pattern Fragment (TPF), in which the server answers only simple triple pattern (TP) queries and returns results incrementally through pagination. Each page (referred to as a fragment) includes an estimate of the total number of matches and hypermedia controls, so that clients can perform query planning, retrieve all fragments, and join sub-query results locally. As a result, server load is minimized and large data collections can be exposed with high availability. Given that HDT provides fast, low-cost TP resolution, TPF has traditionally been used in combination with HDT.
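
As a concrete illustration, a client retrieves one fragment per HTTP request. The exact URL template is advertised by the server itself through its hypermedia controls; the request below merely sketches the shape used by the reference Node.js implementation, with a hypothetical endpoint (http://localhost:5000) and datasource name (mydataset):

    # Fetch the first page of all triples matching the pattern: ?s rdf:type ?o
    # (the predicate parameter carries the URL-encoded full URI of rdf:type)
    curl -H "Accept: text/turtle" \
      "http://localhost:5000/mydataset?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type"

The response contains the triples of that page, metadata with the estimated total number of matches, and hypermedia links to the next page, which is all a TPF client needs to plan and execute full SPARQL queries locally.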


2. Set up a Triple Pattern Fragments Server with HDT datasets

Setting up a TPF server with HDT is rather straightforward.

  1. Download a TPF server implementation (the Node.js implementation is recommended).
  2. Place your HDT datasets on the server, open the server's JSON configuration file (e.g. see the configuration instructions for Node.js) and modify the settings to point to your HDT file (a complete example configuration is sketched after this list), e.g.:
    "settings": { "file": "/home/user/myfile.hdt" }
    
  3. Run the server and enjoy querying with any TPF client! E.g., the following runs the Node.js server on port 5000 with 4 worker processes:
    ldf-server config.json 5000 4
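
For reference, a minimal but complete config.json might look as follows. This is a sketch assuming the classic JSON configuration format of the Node.js Server.js implementation; the server title, datasource name (mydataset), and its title are illustrative placeholders:

    {
      "title": "My TPF server",
      "datasources": {
        "mydataset": {
          "title": "My HDT dataset",
          "type": "HdtDatasource",
          "settings": { "file": "/home/user/myfile.hdt" }
        }
      }
    }

With this configuration, the command above would serve the dataset at http://localhost:5000/mydataset, where any TPF client can query it.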