Commit 548cfb2: Update README
sdaros committed Mar 26, 2018 (parent: d6856ac)
1 changed file: README.md (136 additions, 120 deletions)
The Workflow Accelerator Connector is a RESTful web service which acts as a proxy between Signavio's Workflow Accelerator and an SQL database.

![Overview](https://cip.li/res/wfa-connector-overview.png)

In order to use the connector with Signavio's Workflow Accelerator, the connector must first be running on a server that is accessible from the public internet. After the connector is running and the database has been provisioned, a Workflow Accelerator administrator can [add](https://docs.signavio.com/userguide/workflow/en/integration/connectors.html#configuring-a-connector) the connector to a workspace under the _Services & Connector_ menu entry. A process owner can then use the connector in a process to populate a drop-down field dynamically using data from the database. More information about this can be found in the [Workflow Accelerator documentation](https://docs.signavio.com/userguide/workflow/en/integration/connectors.html).

## Features

- Insert and update table rows using a standard RESTful API
- Supports multiple SQL databases (Microsoft SQL Server, SQLite3, MySQL, PostgreSQL)

## Deployment

The following examples demonstrate how to deploy the workflow connector web service in the cloud or on premise.

### In the cloud (using Heroku) ###

The following [screencast](https://drive.google.com/file/d/1V8Kizoka-5L-56SpqTRxshCBBDyerB7v/view?usp=sharing) shows how to use Heroku to install, configure and deploy the workflow connector.

### On premise (bare metal) ###

Deploying the workflow connector on premise is as simple as copying the executable file, and related configuration files, to a directory on the server where it should run. The following sections demonstrate how this can be done in an Ubuntu Linux environment.

#### Installation ####

The connector can be downloaded from the [release page](TODO) for Linux and Windows platforms. Alternatively, the executable can be generated by compiling the source code as shown below.

##### Install from source #####

1. Download and install Go from your distribution's package manager (for Ubuntu: `apt-get install golang`) and make sure you are using version >= 1.9.
2. Set the `$GOPATH` environment variable and add `$GOPATH/bin` to your `$PATH`. Here we assume that `$GOPATH` points to `~/go`.

```sh
mkdir ~/go
echo "export GOPATH=$HOME/go" >> ~/.bashrc
echo "export PATH=\$PATH:\$GOPATH/bin" >> ~/.bashrc
source ~/.bashrc
```

3. Download and install the workflow-connector using the `go get` command on the command line.

```sh
go get -v github.com/signavio/workflow-connector
```
4. Download the `dep` utility in order to install the project's dependencies locally.

```sh
# install the dep utility
go get github.com/golang/dep/cmd/dep
# install the project's dependencies
dep ensure -vendor-only
```
5. Compile the source code into an architecture-specific executable. Adjust your `$GOARCH` and `$GOOS` variables as needed.

```sh
# export the required environment variables
export CGO_ENABLED=1 GOARCH=amd64 GOOS=linux
# compile the code and generate the `workflow-connector` binary
go build -o workflow-connector cmd/wfadb/main.go
```

### Containerized (using docker)
6. The executable is now located in `$GOPATH/src/github.com/signavio/workflow-connector/workflow-connector`. At this point you can copy the executable to a directory in your `$PATH`.

```sh
# Here we assume ~/bin/ is in your $PATH
cp $GOPATH/src/github.com/signavio/workflow-connector/workflow-connector ~/bin/
```

#### Configuration ####

##### config.yaml #####
All program and environment specific configuration settings (like database connection information, username, password, etc.) should be saved in a `config.yaml` file that is located in the same directory as the executable, or in one of the following directories:

| **Linux** |
| --- |
| /etc/ |
| ~/.config/workflow-connector |
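As an illustrative sketch, a minimal `config.yaml` might contain the following. The key names are assumptions inferred from the environment variables used later in this README (`PORT`, `DATABASE_URL`, `DATABASE_DRIVER`); use the shipped `config/config.yaml.example` as the authoritative template.

```sh
# Hypothetical minimal config.yaml, written as a heredoc for illustration.
# Key names are assumptions; consult config/config.yaml.example.
cat > /tmp/config.yaml <<'EOF'
port: ":8080"
database:
  driver: sqlite3
  url: test.db
EOF
```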

All configuration settings in `config.yaml` can also be specified as environment variables. For example, you can specify the database connection URL by exporting the environment variable `DATABASE_URL=sqlserver://john:84mj29rSgHz@172.17.8.2?database=test`. This means that nested fields in the yaml file are delimited with a '_' (underscore) character when used in an environment variable. Configuration settings declared via environment variables take precedence over the settings in your `config.yaml` file.
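For example, a nested YAML setting maps to an underscore-delimited environment variable as follows (the YAML key names in the comment are assumptions based on the variable names used elsewhere in this README):

```sh
# A nested YAML setting such as
#
#   database:
#     url: sqlserver://john:84mj29rSgHz@172.17.8.2?database=test
#
# becomes the underscore-delimited environment variable:
export DATABASE_URL='sqlserver://john:84mj29rSgHz@172.17.8.2?database=test'
# environment variables take precedence over config.yaml
```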

##### descriptor.json #####

The workflow connector also needs to know the schema of the data it will receive from the database. This is stored in the connector descriptor file `descriptor.json` and an example is provided in this repository. You can also refer to the [workflow documentation](https://docs.signavio.com/userguide/workflow/en/integration/connectors.html#connector-descriptor) for more information.
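As a sketch, a minimal descriptor might look like the following. The field names (`typeDescriptors`, `attributes`, and so on) are based on the Signavio connector descriptor documentation, but the example itself is illustrative — use the `descriptor.json` shipped in this repository as your starting point.

```sh
# Write a hypothetical minimal descriptor for an `equipment` table.
# Field names follow the Signavio connector descriptor format; verify
# against the example descriptor.json shipped in this repository.
cat > /tmp/descriptor.json <<'EOF'
{
  "key": "workflow-connector",
  "name": "Workflow Connector",
  "typeDescriptors": [
    {
      "key": "equipment",
      "name": "Equipment",
      "attributes": [
        { "key": "id", "name": "ID", "type": "text" },
        { "key": "name", "name": "Name", "type": "text" }
      ]
    }
  ]
}
EOF
```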

##### HTTP basic auth #####

The web service will only respond to clients using HTTP basic auth. TLS can be enabled by setting `tls.enabled = true` and providing valid TLS certificates in the `config.yaml` file. The username for HTTP basic auth is stored as plain text in `config.yaml`, but the password is stored salted and hashed using [argon2](https://passlib.readthedocs.io/en/stable/lib/passlib.hash.argon2.html). You can use the following commands to generate an argon2 password hash.

1. Install passlib using Python's `pip`

```sh
pip install passlib
```

2. Use the Python shell on the command line to generate an argon2 password hash

```python
from passlib.hash import argon2
argon2.hash("password")
```
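The generated hash, together with the plain-text username, then goes into `config.yaml`. The key names in this sketch (`auth.username`, `auth.password-hash`, `tls.cert-file`, `tls.key-file`) are assumptions — only `tls.enabled` is mentioned above — so check the example configuration shipped in this repository for the real schema.

```sh
# Hypothetical config.yaml fragment enabling TLS and HTTP basic auth.
# Key names other than tls.enabled are assumptions; consult the
# config.yaml example shipped in this repository.
cat >> /tmp/config.yaml <<'EOF'
tls:
  enabled: true
  cert-file: ./server.crt
  key-file: ./server.key
auth:
  username: wfauser
  password-hash: "$argon2i$v=19$m=102400,t=2,p=8$..."
EOF
```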

#### Testing ####

You can test the deployment with a local sqlite database to make sure that the REST API is behaving properly. The following sections demonstrate how this can be done.

##### Prerequisites #####

1. Download and install sqlite

```sh
apt-get install sqlite3
```

2. Create the database file in the config directory we used earlier.
```sh
touch ~/.config/workflow-connector/test.db
```

##### Populate the database #####

For testing purposes, we will create a table called `equipment` and populate it with data. The table will end up looking like this:

| id | name | acquisitionCost | purchaseDate |
| --- | --- | --- | --- |
| 1 | Stainless Steel Cooling Spiral | 119.00 | 2017-09-07 12:00:00 |
| … | … | … | … |
| 4 | Masch Tun (50L) | 199.99 | 2016-09-04 11:00:00 |


We will write the necessary sql statements to a temporary file and then import the file into a sqlite database.

```sh
echo "\
CREATE TABLE IF NOT EXISTS equipment ( \
id integer not null primary key, \
VALUES \
" > /tmp/test.db
```

```sh
cd ~/.config/workflow-connector
sqlite3 test.db < /tmp/test.db
```

```sh
sqlite> SELECT * FROM equipment;
sqlite> .quit
```


##### Run the workflow connector #####

Before running the `workflow-connector` command, either edit the `config.yaml` file to include the database connection parameters and other settings, or export these settings as environment variables.

```sh
# Export environment variables
#
export PORT=:8080 DATABASE_URL=test.db DATABASE_DRIVER=sqlite3
#
# Run the connector
~/bin/workflow-connector
Listening on :8080

```

##### Exercise the REST API #####

Now we can test the functionality of the connector's REST API either in a new terminal, or using the following [postman collection](TODO). All HTTP requests here are sent using HTTP basic auth with the default username (`wfauser`) and password (`Foobar`) combination.
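The value of the `Authorization` header is just the base64 encoding of `username:password`:

```sh
# HTTP basic auth sends base64("username:password") in the
# Authorization header
echo -n "wfauser:Foobar" | base64
# → d2ZhdXNlcjpGb29iYXI=
```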

Go ahead and fetch the product with id 1 by sending an `HTTP GET` request to the connector using the `curl` command (you can `apt-get install curl` if `curl` is not yet installed):

```sh
curl --verbose --header "Authorization: Basic $(echo -n "wfauser:Foobar" | base64)" --request GET http://localhost:8080/equipment/1
# Response:
## Headers
> GET /equipment/1 HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.55.1
> Accept: */*
> Authorization: Basic d2ZhdXNlcjpGb29iYXI=
>
< HTTP/1.1 200 OK
< Content-Type: application/json
< Date: Fri, 23 Mar 2018 21:33:47 GMT
< Content-Length: 595
<
## Data
{
"cost": {
"amount": 119,
"currency": "EUR"
},
"equipmentMaintenance": [],
"equipmentWarranty": [],
"id": "1",
"name": "Stainless Steel Cooling Spiral",
"purchaseDate": "2017-09-07T12:00:00Z"
}
$
```

You will see that the web service returned the product with id 1 as `application/json`.

###### Insert a new product in the database ######

You can create a new product by sending an `HTTP POST` to the web service

```sh
curl --verbose --header "Authorization: Basic $(echo -n "wfauser:Foobar" | base64)" --request POST --data 'name=Malt+mill+550&acquisitionCost=1270&purchaseDate=2016-09-04+11:00:00' http://localhost:8080/equipment

# Response:
## Headers
> POST /equipment HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.56.1
> Accept: */*
> Authorization: Basic d2ZhdXNlcjpGb29iYXI=
> Content-Type: application/x-www-form-urlencoded
> Content-Length: 45
>
* upload completely sent off: 45 out of 45 bytes
< HTTP/1.1 200 OK
< Content-Type: application/json
< Date: Fri, 23 Mar 2018 21:33:47 GMT
< Content-Length: 2
<
## Data
{}
$
```

Let's verify that the new product was inserted into the database

```sh
curl --header "Authorization: Basic $(echo -n "wfauser:Foobar" | base64)" --request GET http://localhost:8080/equipment/5
# Response:
{
"cost": {
"amount": 1270,
"currency": "EUR"
},
"equipmentMaintenance": [],
"equipmentWarranty": [],
"id": "5",
"name": "Malt mill 550",
"purchaseDate": "2016-09-04T11:00:00Z"
}
$
```

###### Updating an existing product ######

By sending an `HTTP PUT` to the web service you can change existing entries. Let's go ahead and adjust the name of the malt mill we just added.

```sh
curl --verbose --header "Authorization: Basic $(echo -n "wfauser:Foobar" | base64)" --request PUT --data 'name=Malt+mill+400' http://localhost:8080/equipment/5

# Response:
## Headers
> PUT /equipment/5 HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.56.1
> Accept: */*
> Authorization: Basic d2ZhdXNlcjpGb29iYXI=
> Content-Type: application/x-www-form-urlencoded
> Content-Length: 45
>
* upload completely sent off: 45 out of 45 bytes
< HTTP/1.1 200 OK
< Content-Type: application/json
< Date: Fri, 23 Mar 2018 21:33:47 GMT
< Content-Length: 2
<
## Data
{
"cost": {
"amount": 1270,
"currency": "EUR"
},
"equipmentMaintenance": [],
"equipmentWarranty": [],
"id": "5",
"name": "Malt mill 400",
"purchaseDate": "2016-09-04T11:00:00Z"
}
$
```

###### Deleting an existing product ######

TODO: deletion is not supported at the moment.
