Transform nodeset2 files to OWL
This is the first step of the OPCUA to Semantic Web transformation.
The added tool parses the existing OPCUA companion specifications
and provides an RDF/OWL-based representation. This can later be used by tools
to validate and translate OPCUA machines into a common data language.

Related Epic IndustryFusion#514
Related User Stories IndustryFusion#555, IndustryFusion#571

Signed-off-by: marcel <wagmarcel@web.de>
wagmarcel committed Aug 19, 2024
1 parent 80381f5 commit dbd6684
Showing 28 changed files with 251,773 additions and 1 deletion.
5 changes: 4 additions & 1 deletion .github/workflows/build.yaml
@@ -23,4 +23,7 @@ jobs:
cd semantic-model/dataservice && make setup && make lint test
- name: Build datamodel tools
run: |
cd semantic-model/datamodel/tools && make setup && make lint test
cd semantic-model/datamodel/tools && make setup && make lint test
- name: Build opcua tools
run: |
cd semantic-model/opcua && make setup && make lint test
3 changes: 3 additions & 0 deletions semantic-model/opcua/.flake8
@@ -0,0 +1,3 @@
[flake8]
max-line-length = 120
ignore = E722, W504
37 changes: 37 additions & 0 deletions semantic-model/opcua/Makefile
@@ -0,0 +1,37 @@
#
# Copyright (c) 2024 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

PYTHON := python3
LINTER := python3 -m flake8
PIP := pip
HELM_DIR := ../../helm/charts/shacl
NAMESPACE := iff
TOLINT := nodeset2owl.py lib/nodesetparser.py lib/utils.py
PYTEST := python3 -m pytest


lint: requirements-dev.txt
	$(LINTER) ${TOLINT}

setup: requirements.txt setup-dev
	$(PIP) install -r requirements.txt

setup-dev: requirements-dev.txt
	$(PIP) install -r requirements-dev.txt

test:
	${PYTEST} tests
	(cd tests/nodeset2owl; bash ./test.bash)
73 changes: 73 additions & 0 deletions semantic-model/opcua/README.md
@@ -0,0 +1,73 @@
# Tools to translate from OPC/UA Information Model to Semantic Web standards

## nodeset2owl.py

This script translates OPCUA nodeset files to OWL (in ttl format).

```console
usage: nodeset2owl.py [-h] [-i [INPUTS [INPUTS ...]]] [-o OUTPUT] [-n NAMESPACE] [-v VERSIONIRI] [-b BASEONTOLOGY] [-u OPCUANAMESPACE] -p PREFIX [-t TYPESXSD] nodeset2

parse nodeset and create RDF-graph <nodeset2.xml>

positional arguments:
nodeset2 Path to the nodeset2 file

optional arguments:
-h, --help show this help message and exit
-i [INPUTS [INPUTS ...]], --inputs [INPUTS [INPUTS ...]]
<Required> add dependent nodesets as ttl
-o OUTPUT, --output OUTPUT
Resulting file.
-n NAMESPACE, --namespace NAMESPACE
Overwriting namespace of target ontology, e.g. http://opcfoundation.org/UA/Pumps/
-v VERSIONIRI, --versionIRI VERSIONIRI
VersionIRI of output ontology, e.g. http://example.com/v0.1/UA/
-b BASEONTOLOGY, --baseOntology BASEONTOLOGY
Ontology containing the base terms, e.g. https://industryfusion.github.io/contexts/ontology/v0/base/
-u OPCUANAMESPACE, --opcuaNamespace OPCUANAMESPACE
OPCUA Core namespace, e.g. http://opcfoundation.org/UA/
-p PREFIX, --prefix PREFIX
Prefix for added ontology, e.g. "pumps"
-t TYPESXSD, --typesxsd TYPESXSD
Schema for value definitions, e.g. Opc.Ua.Types.xsd
```

### Create Default Specs
For local testing

bash ./translate_default_specs_local.bash

### Examples

Create core.ttl:

python3 nodeset2owl.py ${CORE_NODESET} -i ${BASE_ONTOLOGY} -v http://example.com/v0.1/UA/ -p opcua -o core.ttl


Create devices.ttl:

python3 nodeset2owl.py ${DI_NODESET} -i ${BASE_ONTOLOGY} core.ttl -v http://example.com/v0.1/DI/ -p devices -o devices.ttl

Create ia.ttl:

python3 nodeset2owl.py ${IA_NODESET} -i ${BASE_ONTOLOGY} core.ttl devices.ttl -v http://example.com/v0.1/IA/ -p ia -o ia.ttl

Create machinery.ttl:

python3 nodeset2owl.py ${MACHINERY_NODESET} -i ${BASE_ONTOLOGY} core.ttl devices.ttl -v http://example.com/v0.1/Machinery/ -p machinery -o machinery.ttl


Create pumps.ttl:

python3 nodeset2owl.py ${PUMPS_NODESET} -i ${BASE_ONTOLOGY} core.ttl devices.ttl machinery.ttl -v http://example.com/v0.1/Pumps/ -p pumps -o pumps.ttl

create pumpexample.ttl:

python3 nodeset2owl.py ${PUMP_EXAMPLE_NODESET} -i ${BASE_ONTOLOGY} core.ttl devices.ttl machinery.ttl pumps.ttl -n http://yourorganisation.org/InstanceExample/ -v http://example.com/v0.1/pumpexample/ -p pumpexample -o pumpexample.ttl



## extractType.py

Coming soon

164 changes: 164 additions & 0 deletions semantic-model/opcua/README_validation.md
@@ -0,0 +1,164 @@
# Semantic Data of OPCUA

In the following, the mapping of the OPCUA information model to NGSI-LD is described.
This is the information model we use as reference:
```
ParentObject (NodeId: ns=2;i=99)
├── ArrayObject_01 (NodeId: ns=2;i=100)
├── ArrayObject_02 (NodeId: ns=2;i=200)
├── ChildObjectX (NodeId: ns=2;i=101)
├── ChildObjectY (NodeId: ns=2;i=102)
├── DataVariableX (NodeId: ns=2;i=201, datatype: string)
│ └── PropertyX (NodeId: ns=2;i=301, value=0)
├── DataVariableY (NodeId: ns=2;i=202, datatype: integer)
│ └── PropertyY (NodeId: ns=2;i=302, value=true)
├── DataVariableZ (NodeId: ns=2;i=202, datatype: boolean)
├── PropertyZ (NodeId: ns=2;i=103, value="test")
```
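The node hierarchy above can be sketched as a small tree structure. The following is a minimal, hypothetical model for illustration only; the actual tool uses its own parser classes (see lib/nodesetparser.py), and the class and field names here are assumptions, not the tool's API:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UANode:
    """Illustrative node: a browse name, a NodeId, and optional datatype/value."""
    browse_name: str
    node_id: str                      # e.g. "ns=2;i=99"
    datatype: Optional[str] = None    # set for DataVariables only
    value: object = None              # set for Properties only
    children: List["UANode"] = field(default_factory=list)

# A subset of the reference model from the tree above
parent = UANode("ParentObject", "ns=2;i=99", children=[
    UANode("ChildObjectX", "ns=2;i=101"),
    UANode("DataVariableX", "ns=2;i=201", datatype="string", children=[
        UANode("PropertyX", "ns=2;i=301", value=0),
    ]),
    UANode("PropertyZ", "ns=2;i=103", value="test"),
])

def browse_paths(node: UANode, prefix: str = "") -> List[str]:
    """Flatten the tree into browse paths like 'ParentObject/DataVariableX'."""
    path = f"{prefix}{node.browse_name}"
    paths = [path]
    for child in node.children:
        paths += browse_paths(child, path + "/")
    return paths
```

Flattening into browse paths is what makes the later array-placeholder discussion concrete: members such as `ArrayObject_01` only differ in the last path segment.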

## Conversion of OPCUA Objects to NGSI-LD:
The id and type are assumed to be `urn:mainid:nodei99` and `ParentObjectType`:

```
{
  "id": "urn:mainid:nodei99",
  "type": "ParentObjectType",
  "hasChildObjectX": {
    "type": "Relationship",
    "object": "urn:mainid:sub:nodei101"
  },
  "hasChildObjectY": {
    "type": "Relationship",
    "object": "urn:mainid:sub:nodei102"
  }
}
```
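The mapping above can be sketched in a few lines of Python. This is a hedged illustration of the conversion rule, not the tool's actual implementation; the `has` prefix and the urn layout are taken from the examples in this document, and the function name is hypothetical:

```python
def object_to_ngsild(main_urn, node_id, type_name, child_objects):
    """Build an NGSI-LD entity whose child objects become Relationships.

    child_objects: list of (browse_name, child_node_id) tuples.
    """
    entity = {
        "id": f"{main_urn}:node{node_id}",
        "type": type_name,
    }
    for browse_name, child_id in child_objects:
        # Each child object becomes a "has<BrowseName>" Relationship
        entity[f"has{browse_name}"] = {
            "type": "Relationship",
            "object": f"{main_urn}:sub:node{child_id}",
        }
    return entity

entity = object_to_ngsild(
    "urn:mainid", "i99", "ParentObjectType",
    [("ChildObjectX", "i101"), ("ChildObjectY", "i102")],
)
```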

## Conversion of OPCUA Data Variables to NGSI-LD:


```
{
  "id": "urn:mainid:nodei99",
  "type": "ParentObjectType",
  "hasDataVariableX": {
    "type": "Property",
    "value": "string"
  },
  "hasDataVariableY": {
    "type": "Property",
    "value": 0
  },
  "hasDataVariableZ": {
    "type": "Property",
    "value": false
  }
}
```

## Conversion of OPCUA Properties
Properties of Objects can only be distinguished from DataVariables by the metadata provided in entities.ttl (see below):

```
{
  "id": "urn:mainid:nodei99",
  "type": "ParentObjectType",
  "hasDataVariableX": {
    "type": "Property",
    "value": "string",
    "hasPropertyX": 0
  },
  "hasDataVariableY": {
    "type": "Property",
    "value": 0,
    "hasPropertyY": true
  },
  "hasDataVariableZ": {
    "type": "Property",
    "value": false
  },
  "hasPropertyZ": "test"
}
```
## Semantic conversion of OPCUA Object-arrays

Objects which are part of an array are typically defined with a template `<>` definition, e.g. `object_<no>` means that there could be `object_1`, `object_2`, ... browse paths.
The problem with this is that the naming convention is not exactly specified, so `object_#1`, `object#2`, ... or `object_01`, `object_02`, ... are also possible. Moreover, this makes it difficult to write logic which addresses all elements of an array, because one would need to guess the names with a pattern. Therefore, we treat this case differently. An NGSI-LD instance of such an object looks as follows:

```
{
  "id": "urn:mainid:nodei99",
  "type": "ParentObjectType",
  "ArrayObject": [
    {
      "type": "Relationship",
      "object": "urn:mainid:sub:nodei100",
      "datasetId": "urn:iff:datasetId:ArrayObject_01"
    },
    {
      "type": "Relationship",
      "object": "urn:mainid:sub:nodei200",
      "datasetId": "urn:iff:datasetId:ArrayObject_02"
    }
  ]
}
```

Note that you can now access all objects at once, e.g. with a SHACL expression, but you can still select one specific object by using the respective `datasetId` reference or the `id` of the specific object.
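The datasetId convention can be sketched as follows. Both helper functions are hypothetical and only illustrate the conventions shown in the JSON above (collecting placeholder-matched members into one multi-Relationship attribute, then selecting one member by `datasetId`):

```python
def array_to_ngsild(main_urn, placeholder, members):
    """members: list of (browse_name, node_id); placeholder e.g. 'ArrayObject'."""
    return {
        placeholder: [
            {
                "type": "Relationship",
                "object": f"{main_urn}:sub:node{node_id}",
                # datasetId keeps the original browse name addressable
                "datasetId": f"urn:iff:datasetId:{browse_name}",
            }
            for browse_name, node_id in members
        ]
    }

def select(entity, placeholder, dataset_id):
    """Pick one member of the array by its datasetId."""
    return next(r for r in entity[placeholder] if r["datasetId"] == dataset_id)

entity = array_to_ngsild(
    "urn:mainid", "ArrayObject",
    [("ArrayObject_01", "i100"), ("ArrayObject_02", "i200")],
)
```

With this shape, logic that addresses "all array members" iterates over one attribute instead of pattern-matching browse names.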

## Input to the semantic conversion
Besides the companion specifications, two files are needed:
1. A `nodeset2` file which contains the Type Definitions of the machine.
2. A `nodeset2` file which contains a snapshot of the machine instance.

As a result, the following files are created:

## `instances.jsonld`

Contains all entities with respective Properties and Relationships of the machine instances.

## `shacl.ttl`

Contains all SHACL rules which could be derived automatically from the Type Definitions `nodeset2` file.

## `entities.ttl`

Contains all generic semantic information related to the entities which could be derived for Relationships and Properties. It also includes the type hierarchy.


```
uaentity:hasDataVariableX a owl:NamedIndividual,
        owl:ObjectProperty ;
    rdfs:domain uaentity:ParentObjectType ;
    rdfs:range ngsi-ld:Property ;
    base:hasOPCUADatatype opcua:String .

uaentity:hasChildObjectX a owl:NamedIndividual,
        owl:ObjectProperty,
        base:SubComponentRelationship ;
    rdfs:domain uaentity:RootObjectType ;
    rdfs:range ngsi-ld:Relationship .

uaentity:hasArrayObject a owl:NamedIndividual,
        owl:ObjectProperty,
        base:SubComponentRelationship ;
    rdfs:domain uaentity:RootObjectType ;
    rdfs:range ngsi-ld:Relationship ;
    base:isPlaceHolder true .
```

## `knowledge.ttl`

Contains all information about additional classes, e.g. classes describing enumerations.

# Validation

## Offline Validation

For offline validation, one can apply:

pyshacl -s shacl.ttl -e entities.ttl -df json-ld instances.jsonld

To demonstrate the need for entities.ttl, try the validation without it. Depending on the models, you will see additional errors, because the SHACL validator cannot use the knowledge that one type is a subtype of another.