
I want to solve the famous zebra puzzle by A. Einstein with the help of semantic web tools for Python, preferably owlready2.

The starting point is two OWL files linked in https://github.com/RDFLib/OWL-RL/issues/3. The first one (XML syntax) works as expected. The second one (different author, N3/Turtle syntax) fails to load, or only loads "partially", in owlready2. For easier comprehension of my steps, I documented them in this notebook: https://github.com/cknoll/demo-material/blob/main/expertise_system/einstein-zebra-puzzle-owlready-solution-attempt.ipynb.

Edit 1: For better reference, I include the essential part of the notebook:

import os
import owlready2 as owl2

data_path = "ontology_data"
path2 = os.path.join(data_path, "zebra.n3.txt")       # original turtle syntax
path2x = os.path.join(data_path, "zebra.n3.txt.xml")  # xml version created with ontospy

# try the turtle version
onto = owl2.get_ontology(path2).load()
list(onto.classes())  # -> empty list -> loading seems to have failed

# now try the xml version
# create a new world to start from a clean state
owl2.default_world = owl2.World()
onto = owl2.get_ontology(path2x).load()

list(onto.classes())     # -> expected result
list(onto.properties())  # -> expected result
onto.hasPet              # -> expected result

# unexpected/wrong:
list(onto.individuals())            # -> empty list
list(onto.livesIn.get_relations())  # -> KeyError
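Side note: instead of ontospy, the turtle-to-xml conversion could presumably also be done with rdflib (a minimal sketch; parse() and serialize() are standard rdflib calls, the file names are the ones from above):

import rdflib

# read the turtle file and write it back as RDF/XML
g = rdflib.Graph()
g.parse(path2, format="turtle")
g.serialize(destination=path2 + ".xml", format="xml")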

Conclusion: I can load the XML version of this ontology, but I can neither confirm that the concepts are defined as owl:oneOf objects nor find where assertions like


:Norwegian :livesIn :House1 .


:Norwegian :livesIn [ :isNextTo [ :hasColor :Blue ] ] .

ended up.
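For illustration, this is roughly how I would expect to check whether such an assertion arrived in the loaded ontology (a sketch using the standard owlready2 search_one call and direct property access; the individual name Norwegian is taken from the turtle source):

# look up the individual and inspect its livesIn values
norwegian = onto.search_one(iri="*Norwegian")
if norwegian is not None:
    print(norwegian.livesIn)  # should contain onto.House1 if the assertion was loaded
else:
    print("individual :Norwegian was not loaded at all")

With the xml version loaded as above, this presumably cannot find the individual, consistent with onto.individuals() being empty.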

I would be grateful for some hints.

(I think that, once the ontology is correctly represented, the solution could be obtained via something like sync_reasoner_pellet(infer_property_values=True, infer_data_property_values=True); see the sketch below.)
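A minimal sketch of what I have in mind (assuming the individuals and property assertions have been loaded completely; sync_reasoner_pellet is the standard owlready2 call mentioned above):

# run Pellet and let it infer (data) property values
with onto:
    owl2.sync_reasoner_pellet(infer_property_values=True,
                              infer_data_property_values=True)

# afterwards, the inferred relations should show up, e.g. via
# list(onto.livesIn.get_relations())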

  • Although links are helpful, all information that is necessary to understand the question should be embedded in the question. – trincot Oct 24 '20 at 08:01
  • Just load the file and use a reasoner like Pellet. What else do you want to know? – UninformedUser Oct 24 '20 at 08:03
  • That is not my point. It is a matter of policy on Stack Overflow. Good questions will remain here for many years to come, but links break. – trincot Oct 24 '20 at 08:09
  • My comment was for the OP, not for you. They just have to load the ontology file in owlready and run the reasoner. There is nothing else they can do - either the inferences will show up or not. – UninformedUser Oct 24 '20 at 10:03
  • @trincot I feared that it would be too much source code, but now I have condensed it to the relevant part. I also formulated the question more precisely. I hope it is better now. – cknoll Oct 24 '20 at 10:18
  • @UninformedUser Due to a mistake of mine I did not realize that the first ontology file actually works quite well: Pellet gives the expected result. However, the second file seems to load only partially (individuals are missing) and therefore Pellet cannot infer anything useful. I have edited the question to reflect this aspect. – cknoll Oct 24 '20 at 10:21
  • I see, I remember the same issues with the parser - or at least I did not understand the different behavior. Could you try the following, please: load the 2nd file (the one which doesn't work) into Protégé and write it back to a file. This ensures that OWL 2 declaration axioms are used for the entities. At least that was the only difference the last time we had issues. – UninformedUser Oct 24 '20 at 10:30
  • @UninformedUser This seems to help. `onto.individuals()` is not empty anymore. However, `list(onto.livesIn.get_relations())` still gives a KeyError, and Pellet runs for about 100 s without a meaningful result (more roles). Maybe something is just wrong with that ontology, independently of the dialect. Additionally, it seems like the xml file which was produced by `ontospy` has some flaws together with owlready (in Protégé it works fine). Nevertheless, I would prefer a command line tool for dialect conversion. – cknoll Oct 24 '20 at 11:04

0 Answers