RDFox v3.1.0 was released on 7 July 2020. Along with the addition of SWRL support and a slew of small improvements and fixes, the release introduces an exciting new feature to improve support for applications that require reasoning under the closed-world assumption: Datalog Constraints.
Datalog Constraints leverage RDFox’s unique incremental reasoning capabilities to bring the expressiveness of RDFox’s rule language to the problem of constraining data store content. In so doing, they provide application developers with a means to ensure that their data stores remain valid and focused on their application domain, without having to write any external code.
RDFox exploits incremental reasoning algorithms to ensure that materialisations are up-to-date before a transaction is committed; that is, the implicit facts in each data store are exactly those which logically follow from applying the store’s rules to its explicit facts. Using incremental and materialised reasoning in this way ensures that implicit facts can be queried with the same jaw-dropping speed as explicit facts.
Building on this, RDFox’s new constraint validation feature is implemented as a commit-time check, performed after the incremental reasoning step, that the transaction will not introduce any instances of a special constraint violation class into the data store’s default graph. Any transaction which fails this check is rejected with an explanatory message that can be determined by the author of the constraint.
The good news for anyone who already knows RDFox Datalog is that they already know how to write Datalog Constraints. The good news for everyone else is that RDFox Datalog is easy to learn, as we’ll now see.
Before we examine Datalog Constraints more closely it is useful to quickly describe RDFox Datalog. A general Datalog rule has the structure:
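In RDFox Datalog, the head is written first, the head and body are separated by the :- symbol, and the rule is terminated with a full stop:

```datalog
<HEAD> :- <BODY> .
```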
where <HEAD> and <BODY> are lists of triple patterns separated by commas. A triple pattern is a triple in which variables may occur in any (or all) of the subject, property or object positions. For example, the triple pattern

[?person, a, foaf:Person]

matches all triples that have rdf:type in the property position and foaf:Person in the object position. The ?person variable will be bound to whichever name appears in the subject position of each matching triple.
When all of the triple patterns in a rule’s body match a subset of the data in a data store, RDFox adds in the facts specified by substituting the values bound by the rule body into the triple patterns in the rule head. For example, the following simple Datalog rule makes the relationship :marriedTo symmetric by ensuring that, wherever we have a statement to say that person A is married to person B, we should also have a statement that person B is married to person A.
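Assuming a suitable prefix declaration for :, such a rule might look as follows (the variable names are illustrative):

```datalog
[?personB, :marriedTo, ?personA] :-
    [?personA, :marriedTo, ?personB] .
```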
RDFox Datalog includes several extensions, such as filtering, negation and aggregation, that provide additional data analysis capabilities. A full description of these extensions is beyond the scope of this article; however, the examples below show several of them in action.
A Datalog Constraint is nothing more than an RDFox Datalog rule that derives instances of the constraint violation class <http://oxfordsemantic.tech/RDFox#ConstraintViolation> (or just rdfox:ConstraintViolation where the rdfox: prefix has been defined) into the default graph. Deriving instances of this class into the default graph requires including a triple pattern of the form:
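Written out, that pattern is:

```datalog
[??, a, rdfox:ConstraintViolation]
```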
in the rule head where ?? indicates that it does not matter what appears in the subject position. We’ll discuss the choice of what to substitute for ?? in the first example below.
Let’s imagine we’re setting up our data store to hold test scores for a class of students. The maximum score for the test is 100 and we have decided to use the relation :testScore to record each student’s score. To enforce the constraint, we need a rule body that matches scores greater than 100. The following achieves this with a single triple pattern and a FILTER literal:
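A sketch of such a body (the variable names are illustrative):

```datalog
[?student, :testScore, ?score],
FILTER(?score > 100)
```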
We now need to combine this with a rule head containing our template triple pattern from above giving:
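Combining the two gives a rule along these lines:

```datalog
[??, a, rdfox:ConstraintViolation] :-
    [?student, :testScore, ?score],
    FILTER(?score > 100) .
```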
This is not yet a valid RDFox rule because of the ?? near the start of the first line, which we must now replace. No matter what we replace it with, be it one of the variables from our rule body or even a constant, the resulting constraint will prevent test scores of more than 100 from being committed to the data store. So why not take one of those easy options? The answer has to do with the usefulness of the error message users will see when they try to add scores greater than 100.
When a transaction commit fails due to the presence of constraint violations, RDFox will include details of up to ten of those violations in the error message it returns. By ensuring that we use a different individual to represent each separate violation, we gain the benefit of more helpful messages when the constraint is violated.
With this in mind, we’ll use the rdfox:SKOLEM built-in tuple table to create a new individual to represent the constraint violation and bind it to the variable ?v. We can then save useful values from each violation instance by associating them with the ?v variable so that they will be included in any error message.
The completed constraint is:
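A version of the constraint along the lines described above; note that the :student and :score property names, the example prefix IRI and the first rdfox:SKOLEM argument are illustrative choices, not mandated by RDFox:

```datalog
PREFIX rdfox: <http://oxfordsemantic.tech/RDFox#>
PREFIX : <https://example.com/>

[?v, a, rdfox:ConstraintViolation],
[?v, :message, "Maximum test score is 100."],
[?v, :student, ?student],
[?v, :score, ?score] :-
    [?student, :testScore, ?score],
    FILTER(?score > 100),
    rdfox:SKOLEM("Maximum test score is 100.", ?student, ?score, ?v) .
```

Because rdfox:SKOLEM is called with both ?student and ?score, each distinct violating pair produces a distinct violation individual, and the head attaches the student, the score and the human-readable message to it.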
With this constraint in place, importing the following Turtle:
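For instance (the student names and scores here are invented for illustration):

```turtle
@prefix : <https://example.com/> .

:alice   :testScore 105 .
:bob     :testScore 98 .
:charlie :testScore 120 .
```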
results in the error message:
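The exact wording of the message depends on the RDFox version, so the following is only indicative of its shape:

```text
The transaction could not be committed because it would have introduced
the following constraint violations:

:message = "Maximum test score is 100."
    :student = :alice
    :score = 105

:message = "Maximum test score is 100."
    :student = :charlie
    :score = 120
```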
This clearly tells us that among the data we tried to commit were two violations of the constraint “Maximum test score is 100.”, each showing the student and the outsized score it relates to. Mission accomplished!
In this example we’re setting up a data store to contain a mailing list using the foaf vocabulary. We want to ensure that we have at least one foaf:mbox property for every foaf:Person in the data store. The foaf:mbox property records a person’s email address. Instances of foaf:Person without such a property are just polluting our data store given its intended purpose.
Again, we need a rule body that will match subgraphs which violate the constraint. In this case, we need to use negation to ensure that the rule body matches only where foaf:mbox is missing. We load the following prefixes and rule:
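A sketch of such a rule; as before, the :message and :person property names, the example prefix IRI and the rdfox:SKOLEM arguments are illustrative:

```datalog
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX rdfox: <http://oxfordsemantic.tech/RDFox#>
PREFIX : <https://example.com/>

[?v, a, rdfox:ConstraintViolation],
[?v, :message, "Every foaf:Person must have a foaf:mbox property."],
[?v, :person, ?person] :-
    [?person, a, foaf:Person],
    NOT EXISTS ?mbox IN ([?person, foaf:mbox, ?mbox]),
    rdfox:SKOLEM("Missing mbox", ?person, ?v) .
```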
Now when we try to import the following Turtle:
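For example (the names are invented):

```turtle
@prefix : <https://example.com/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

:alice a foaf:Person .

:bob a foaf:Person ;
    foaf:mbox <mailto:bob@example.com> .
```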
we receive the error message:
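As before, the exact wording depends on the RDFox version; indicatively:

```text
The transaction could not be committed because it would have introduced
the following constraint violation:

:message = "Every foaf:Person must have a foaf:mbox property."
    :person = :alice
```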
Here we see that our Datalog Constraint really is pinpointing the problematic part of the data in the transaction: there is no violation relating to Bob’s node, as it has a foaf:mbox property.
The previous examples showed how to add constraints that apply locally, either to an individual relation or class, but it’s also possible to write constraints over entire collections using aggregation. To illustrate this, we’ll show how we ensure that a small sushi bar with 5 seats can’t be overbooked.
In the data store for our booking application, we have a different identifier for each hour-long slot. Customers make their reservations by creating a link from themselves to their desired slot using the :hasBooked relation. Our occupancy constraint is enforced by the following rule, which uses an AGGREGATE literal to count the number of bookings in each slot:
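A sketch of such a rule; the :hasBooked relation comes from the description above, while the example prefix IRI, the message wording and the use of STR to convert values to strings are illustrative:

```datalog
PREFIX rdfox: <http://oxfordsemantic.tech/RDFox#>
PREFIX : <https://example.com/>

[?v, a, rdfox:ConstraintViolation] :-
    AGGREGATE([?customer, :hasBooked, ?slot]
              ON ?slot BIND COUNT(?customer) AS ?bookings),
    FILTER(?bookings > 5),
    BIND(CONCAT("Slot ", STR(?slot), " has ", STR(?bookings),
                " bookings but only 5 seats.") AS ?v) .
```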
Unlike the rules shown in the earlier examples, this rule does not use the rdfox:SKOLEM tuple table to create the violation individual; instead, it uses CONCAT to compute a natural-language description of the violation and uses that description as the violation instance itself. Since the property common to all constraint violations ([?v, a, rdfox:ConstraintViolation]) is filtered out of the properties printed in error messages, the messages returned as a result of this constraint will contain just the human-readable description itself.
With the above rule loaded, we’re ready to start taking bookings. First, a party of five books the eight o’clock slot at the restaurant in a single transaction, which is accepted:
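For example (the customer and slot identifiers are invented):

```turtle
@prefix : <https://example.com/> .

:ann   :hasBooked :slot8pm .
:ben   :hasBooked :slot8pm .
:carol :hasBooked :slot8pm .
:dan   :hasBooked :slot8pm .
:eve   :hasBooked :slot8pm .
```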
When a sixth person tries to join the party, however:
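Sketching this with invented identifiers:

```turtle
@prefix : <https://example.com/> .

:frank :hasBooked :slot8pm .
```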
the system gives a clear, human-readable message describing the problem:
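With this style of constraint the message consists essentially of the computed description; indicatively (the slot IRI and exact framing depend on the data and RDFox version):

```text
Slot https://example.com/slot8pm has 6 bookings but only 5 seats.
```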
This demonstrates the versatility of Datalog Constraints when it comes to providing helpful feedback.
Datalog Constraints bring the full power of RDFox’s best-in-class reasoning capabilities to bear on the problem of constraining data store content. We look forward to seeing what the growing community of developers working with RDFox will build with them!