Instructions
You can download a binary from the latest release page.
RemoteHDT is implemented in Rust and is compiled using cargo. The command `cargo run`
can be used to compile and run the code locally.
TBD
The `examples` folder contains several example files.
wget https://raw.githubusercontent.com/weso/RemoteHDT/master/resources/1-lubm.ttl
remote-hdt --rdf 1-lubm.ttl --serialize
This project contains an exploration of ways to replicate HDT using Zarr.
- We have to be able to import data from RDF dumps
- Then we have to load the data into the application
- Lastly, the loaded data should be serialized into Zarr
This project could be divided into two main crates:
- RemoteHDT --> The HDT fork using ZARR
- rdf-rs --> utilities for importing RDF dumps using Rust
.
├── *.zarr      # Resulting Zarr project
├── rdf-rs      # Crate for importing the RDF dumps into the system
├── examples
├── src
│   ├── zarr    # All the Zarr utilities
│   └── main.rs # Main application for creating the Zarr project
└── ...
- X axis --> subjects
- Y axis --> predicates
- Z axis --> objects
> **Caveat:** unique values should be stored on each of the axes.
For each triple (s, p, o), the cell at coordinates (X, Y, Z) = (s, p, o) is set to 1; all other cells are set to -1.
- Support several systems of reference; namely, SPO, POS, OSP...
- Explore the Linked Data Fragments project
- Streaming + Filtering = Larger than RAM?
- Quality attributes: synchronization, size of the original dump...
- HDTCat --> Larger-than-RAM HDT, while RemoteHDT --> Remote HDTCat?
- Serverless Linked Data Fragments?
- Benchmarking
- Store the HashSet inside the Zarr directory (somewhere)
- Work on the quality attributes and features that we are good at
- LUBM benchmarks
- Create a Shape for the LUBM benchmarks.