docs/usage/09_further_resources.md
You can find more details in the related papers for each algorithm:

Concept Learning:

- **TDL** → [Tree-based OWL Class Expression Learner for Large Graphs](https://dl.acm.org/doi/10.1007/978-3-032-06066-2_29)
- **Drill** → [Neuro-Symbolic Class Expression Learning](https://www.ijcai.org/proceedings/2023/0403.pdf)
- **EvoLearner** → [EvoLearner: Learning Description Logics with Evolutionary Algorithms](https://dl.acm.org/doi/abs/10.1145/3485447.3511925)
- **NCES2** → [Neural Class Expression Synthesis in ALCHIQ(D)](https://papers.dice-research.org/2023/ECML_NCES2/NCES2_public.pdf)
- **ROCES** → [Robust Class Expression Synthesis in Description Logics via Iterative Sampling](https://www.ijcai.org/proceedings/2024/0479.pdf)
- **NCES** → [Neural Class Expression Synthesis](https://link.springer.com/chapter/10.1007/978-3-031-33455-9_13)
- **NERO** → [Learning Permutation-Invariant Embeddings for Description Logic Concepts](https://link.springer.com/chapter/10.1007/978-3-031-30047-9_9)
- **CLIP** → [Learning Concept Lengths Accelerates Concept Learning in ALC](https://link.springer.com/chapter/10.1007/978-3-031-06981-9_14)
- **CELOE** → [Class Expression Learning for Ontology Engineering](https://www.sciencedirect.com/science/article/abs/pii/S1570826811000023)
- **OCEL** → A limited version of CELOE
- **SPELL** → [SAT-Based PAC Learning of Description Logic Concepts](https://www.ijcai.org/proceedings/2023/0373.pdf)
- **ALCSAT** → [SAT-Based Bounded Fitting for the Description Logic $\mathcal{ALC}$](https://arxiv.org/pdf/2507.21752)

Sampling:

- **OntoSample** → [Accelerating Concept Learning via Sampling](https://dl.acm.org/doi/10.1145/3583780.3615158)
address="Cham",
publisher={Springer Nature Switzerland}
}

# NERO

@InProceedings{10.1007/978-3-031-30047-9_9,
author="Demir, Caglar
and Ngonga Ngomo, Axel-Cyrille",
editor="Cr{\'e}milleux, Bruno
and Hess, Sibylle
and Nijssen, Siegfried",
title="Learning Permutation-Invariant Embeddings for Description Logic Concepts",
booktitle="Advances in Intelligent Data Analysis XXI",
year="2023",
publisher="Springer Nature Switzerland",
address="Cham",
pages="103--115",
abstract="Concept learning deals with learning description logic concepts from a background knowledge and input examples. The goal is to learn a concept that covers all positive examples, while not covering any negative examples. This non-trivial task is often formulated as a search problem within an infinite quasi-ordered concept space. Although state-of-the-art models have been successfully applied to tackle this problem, their large-scale applications have been severely hindered due to their excessive exploration incurring impractical runtimes. Here, we propose a remedy for this limitation. We reformulate the learning problem as a multi-label classification problem and propose a neural embedding model (NERO) that learns permutation-invariant embeddings for sets of examples tailored towards predicting $F_1$ scores of pre-selected description logic concepts. By ranking such concepts in descending order of predicted scores, a possible goal concept can be detected within few retrieval operations, i.e., no excessive exploration. Importantly, top-ranked concepts can be used to start the search procedure of state-of-the-art symbolic models in multiple advantageous regions of a concept space, rather than starting it in the most general concept $\top$. Our experiments on 5 benchmark datasets with 770 learning problems firmly suggest that NERO significantly (p-value $<1\%$) outperforms the state-of-the-art models in terms of $F_1$ score, the number of explored concepts, and the total runtime. We provide an open-source implementation of our approach (https://github.com/dice-group/Nero).",
isbn="978-3-031-30047-9"
}

# OWLAPY

@misc{baci2025owlapypythonicframeworkowl,
title={OWLAPY: A Pythonic Framework for OWL Ontology Engineering},
author={Alkid Baci and Luke Friedrichs and Caglar Demir and Axel-Cyrille Ngonga Ngomo},