Diplomatik is a lightweight, extensible service that adapts a single set of APIs to query a variety of data sources. Its APIs resemble SQL and behave as if you were querying the underlying database or data warehouse directly, while Diplomatik takes care of data source compatibility and security concerns. For example, the same APIs can be used to query a Postgres database or a BigQuery data warehouse.
Learn more here: https://fretz12.github.io/diplomatik
SQL is a powerful language used to query and transform data. Applications that allow the end user to explore or manage data typically hide it layers deep in the backend. Exposing SQL directly to the end user would put more flexibility and power in the user's hands. However, exposing SQL syntax directly via APIs presents some problems:
- Different data sources can have different SQL syntax or DSLs
- The API caller (e.g. a front-end developer) may have no background in writing SQL, let alone data-source-specific syntax quirks
- Security issues, such as SQL injection attacks
Diplomatik is a middleware service that solves the above problems with a single set of SQL-like APIs and plugs into many data sources. It is focused on being developer-oriented, API-first, and extensible for custom scenarios.
- Single set of APIs to query multiple data sources
- Flexibility and power of SQL expressed in API data models
- Seamless handling of security, multi-tenancy, and permission concerns
- Extensible: build your own SQL-based APIs
- Powerful SQL-based filters, functions, expressions, and deep nesting
- Docker
- Docker Compose
- jq
Please follow the installation instructions for the tools above.
git clone https://github.com/fretz12/diplomatik.git
cd diplomatik
Execute the provided Docker Compose file to launch the Diplomatik service and a demo Postgres database locally. The Postgres database will serve as the target data source for the queries that follow.
docker compose -f deployment/docker/demo/docker-compose.yml up
Let's start querying the database with Diplomatik's APIs!
First, we'll create a table:
curl 'http://127.0.0.1:8000/queries/write' \
-H 'Content-Type: application/json' \
-H 'accept: application/json' \
--data-raw '
{
"query_type": "data_source_management",
"data_source_config": {
"source_type": "postgres"
},
"query_id": "my_unique_id",
"command": {
"command_type": "create_table",
"table_name": "my_new_table",
"column_definitions": [
{
"column_name": "id",
"data_type": "auto_increment_int",
"extra_definitions": "PRIMARY KEY"
},
{
"column_name": "str_column",
"data_type": "string",
"byte_count": 255
},
{
"column_name": "int_column",
"data_type": "int32",
"extra_definitions": "NOT NULL"
}
]
}
}
'
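For intuition, the request above corresponds roughly to the following Postgres DDL. This is a sketch, assuming `auto_increment_int` maps to `SERIAL` and `string` with a `byte_count` maps to `VARCHAR`:

```sql
-- Approximate Postgres equivalent of the create_table command above
CREATE TABLE my_new_table (
    id SERIAL PRIMARY KEY,
    str_column VARCHAR(255),
    int_column INT NOT NULL
);
```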
Next, let's add some data:
curl 'http://127.0.0.1:8000/queries/write' \
-H 'Content-Type: application/json' \
-H 'accept: application/json' \
--data-raw '
{
"query_type": "insert",
"data_source_config": {
"source_type": "postgres"
},
"query_id": "my_unique_id",
"insert_type": "values",
"table": {
"table_name": "my_new_table"
},
"columns": [
{
"field_type": "column",
"column_name": "str_column"
},
{
"field_type": "column",
"column_name": "int_column"
}
],
"values": [
[
"A",
10
],
[
"B",
20
],
[
"C",
30
]
]
}
'
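This insert request corresponds roughly to the following SQL:

```sql
-- Approximate Postgres equivalent of the insert request above
INSERT INTO my_new_table (str_column, int_column)
VALUES ('A', 10), ('B', 20), ('C', 30);
```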
With the data in the table, let's read it back:
curl -X POST 'http://127.0.0.1:8000/queries/read' \
-H 'Content-Type: application/json' \
-H 'accept: application/json' \
--data-raw '
{
"query_type": "search",
"data_source_config": {
"source_type": "postgres"
},
"query_id": "my_unique_id",
"source_extraction": {
"query_type": "select",
"extraction_type": "select",
"source_formation": {
"formation_type": "single_table",
"table": {
"table_name": "my_new_table"
}
},
"fields": [
{
"field_type": "column",
"column_name": "str_column"
},
{
"field_type": "column",
"column_name": "int_column"
}
]
}
}
' | jq
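This search request corresponds roughly to a plain projection over the table:

```sql
-- Approximate SQL for the search request above
SELECT str_column, int_column
FROM my_new_table;
```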
And we can query it with some basic filtering and math transformations:
curl -X POST 'http://127.0.0.1:8000/queries/read' \
-H 'Content-Type: application/json' \
-H 'accept: application/json' \
--data-raw '
{
"query_type": "search",
"data_source_config": {
"source_type": "postgres"
},
"query_id": "my_unique_id",
"source_extraction": {
"query_type": "select",
"extraction_type": "select",
"source_formation": {
"formation_type": "single_table",
"table": {
"table_name": "my_new_table"
}
},
"fields": [
{
"field_type": "column",
"column_name": "str_column"
},
{
"field_type": "value",
"expression": "__field_placeholder__ * 2 + 10",
"fields": [
{
"field_type": "column",
"column_name": "int_column"
}
],
"alias": "my_transformed_column"
}
],
"filter": {
"filter_type": "or",
"filters": [
{
"filter_type": "equals",
"lhs": {
"field_type": "column",
"column_name": "str_column"
},
"rhs": {
"field_type": "value",
"expression": "A"
}
},
{
"filter_type": "equals",
"lhs": {
"field_type": "column",
"column_name": "int_column"
},
"rhs": {
"field_type": "value",
"expression": "30"
}
}
]
}
}
}
' | jq
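Putting the pieces together — the `value` field with its `__field_placeholder__` expression, the alias, and the `or` filter — the request corresponds roughly to this SQL:

```sql
-- Approximate SQL for the filtered, transformed search above
SELECT str_column,
       int_column * 2 + 10 AS my_transformed_column
FROM my_new_table
WHERE str_column = 'A' OR int_column = 30;
```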
That's it! For more examples, head over to the documentation: https://fretz12.github.io/diplomatik