# Prettification of non-code files #158

Merged 4 commits on Jun 5, 2020.
2 changes: 1 addition & 1 deletion .drone.yml
@@ -56,7 +56,7 @@ steps:
     commands:
       - CGO_ENABLED=0 go test -v ./...

-  - name: check spelling
+  - name: ci-checks
     image: nixos/nix
     commands:
       - ./ci-checks.sh
66 changes: 34 additions & 32 deletions README.md
@@ -1,42 +1,44 @@
 # Tinkerbell

 [![Build Status](https://cloud.drone.io/api/badges/tinkerbell/tink/status.svg)](https://cloud.drone.io/tinkerbell/tink)

-Tinkerbell is a bare metal provisioning and workflow engine comprised of three major components: a DHCP server ([boots](https://github.com/packethost/boots)), a workflow engine (tinkerbell, this repository), and a metadata service ([hegel](https://github.com/packethost/hegel)). The workflow engine is comprised of a server and a CLI, which communicate over gRPC. The CLI is used to create a workflow along with its building blocks, templates and targeted hardware.
+Tinkerbell is a bare metal provisioning and workflow engine comprised of three major components: a DHCP server ([boots](https://github.com/packethost/boots)), a workflow engine (tinkerbell, this repository), and a metadata service ([hegel](https://github.com/packethost/hegel)).
+The workflow engine is comprised of a server and a CLI, which communicate over gRPC.
+The CLI is used to create a workflow along with its building blocks: templates and targeted hardware.

 # Packet Workflow

-A Packet Workflow is an open-source microservice that’s responsible for handling flexible, bare metal
-provisioning workflows, that is...
+A Packet Workflow is an open-source microservice that’s responsible for handling flexible, bare metal provisioning workflows, that is...
+
 - standalone and does not need the Packet API to function
 - contains `Boots`, `Tinkerbell`, `Osie`, and workers
 - can bootstrap any remote worker using `Boots + Osie`
 - can run any set of actions as Docker container runtimes
 - receive, manipulate, and save runtime data

 ## Content
+
 - [Setup](docs/setup.md)
 - [Components](docs/components.md)
   - [Boots](docs/components.md#boots)
   - [Osie](docs/components.md#osie)
   - [Tinkerbell](docs/components.md#tinkerbell)
   - [Hegel](docs/components.md#hegel)
   - [Database](docs/components.md#database)
   - [Image Registry](docs/components.md#registry)
   - [Elasticsearch](docs/components.md#elastic)
   - [Fluent Bit](docs/components.md#fluent-bit)
   - [Kibana](docs/components.md#kibana)
 - [Architecture](docs/architecture.md)
 - [Say "Hello-World!" with a Workflow](docs/hello-world.md)
 - [Concepts](docs/concepts.md)
   - [Template](docs/concepts.md#template)
   - [Provisioner](docs/concepts.md#provisioner)
   - [Worker](docs/concepts.md#worker)
   - [Ephemeral Data](docs/concepts.md#ephemeral-data)
 - [Writing a Workflow](docs/writing-workflow.md)
 - [Tinkerbell CLI Reference](docs/cli/README.md)
 - [Troubleshooting](docs/troubleshoot.md)

 ## Website
1 change: 1 addition & 0 deletions ci-checks.sh
@@ -2,3 +2,4 @@
 #!nix-shell -i bash

 codespell -q 3 -I .codespell-whitelist *
+prettier --check '**/*.json' '**/*.md' '**/*.yml'
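The CI image is `nixos/nix`, so these tools come from the build environment. A rough sketch of running the same checks locally before pushing — it assumes `codespell` and `prettier` are installed (e.g. via `nix-shell` or `npm install -g prettier`), and simply skips a tool that is missing:

```shell
#!/bin/sh
# Sketch: mirror ci-checks.sh locally; each tool is skipped if not installed.
run_local_checks() {
  if command -v codespell >/dev/null 2>&1; then
    codespell -q 3 -I .codespell-whitelist *
  fi
  if command -v prettier >/dev/null 2>&1; then
    # --check reports unformatted files; use --write to fix them in place.
    prettier --check '**/*.json' '**/*.md' '**/*.yml'
  fi
  echo "local checks finished"
}
run_local_checks
```

Note that `prettier --check` exits non-zero when any file needs reformatting, which is what makes the CI step fail.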
12 changes: 6 additions & 6 deletions deploy/docker-compose.yml
@@ -1,4 +1,4 @@
-version: '2.1'
+version: "2.1"
 services:
   certs:
     build: tls

@@ -53,11 +53,11 @@ services:
       - 5432:5432
     depends_on:
       fluentbit:
         condition: service_started
     healthcheck:
       test: ["CMD-SHELL", "pg_isready -U tinkerbell"]
       interval: 1s
       timeout: 1s
       retries: 30
     logging:
       driver: fluentd

@@ -137,9 +137,9 @@ services:
       cacher:
         condition: service_started
     logging:
       driver: fluentd
       options:
         tag: boots
     ports:
       - $TINKERBELL_HOST_IP:80:80/tcp
       - 67:67/udp
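The healthcheck above polls Postgres with `pg_isready` once per second and gives up after 30 retries. A rough shell equivalent of that loop, for illustration only — Compose implements this internally, and the one-second wait between probes is omitted here so the sketch stays self-contained:

```shell
#!/bin/sh
# Sketch: approximate the compose healthcheck by retrying pg_isready up to 30 times.
wait_for_postgres() {
  attempts=0
  until pg_isready -U tinkerbell >/dev/null 2>&1; do
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 30 ]; then
      echo "postgres not ready after 30 attempts"
      return 1
    fi
    # compose waits `interval: 1s` between probes; omitted in this sketch
  done
  echo "postgres ready"
}
wait_for_postgres || true
```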
11 changes: 2 additions & 9 deletions deploy/tls/ca-config.json
@@ -6,18 +6,11 @@
     "profiles": {
       "server": {
         "expiry": "8760h",
-        "usages": [
-          "signing",
-          "key encipherment",
-          "server auth"
-        ]
+        "usages": ["signing", "key encipherment", "server auth"]
       },
       "signing": {
         "expiry": "8760h",
-        "usages": [
-          "signing",
-          "key encipherment"
-        ]
+        "usages": ["signing", "key encipherment"]
       }
     }
   }
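This file follows the cfssl signing-config format: each profile carries an `expiry` (one year, `8760h`) and the key `usages` stamped into issued certificates. A hedged sketch of how such a config is typically consumed — the `ca.pem`, `ca-key.pem`, and `server-csr.json` names are placeholders, not files from this repository:

```shell
#!/bin/sh
# Hypothetical use of ca-config.json with cfssl (skipped when cfssl is absent).
generate_server_cert() {
  if ! command -v cfssl >/dev/null 2>&1; then
    echo "cfssl not installed; skipping"
    return 0
  fi
  # Issue a server cert signed by the CA, using the "server" profile above.
  cfssl gencert \
    -ca ca.pem -ca-key ca-key.pem \
    -config ca-config.json -profile server \
    server-csr.json | cfssljson -bare server \
    && echo "wrote server.pem and server-key.pem" \
    || echo "cfssl run failed (inputs missing?)"
}
generate_server_cert
```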
3 changes: 1 addition & 2 deletions docs/architecture.md
@@ -1,4 +1,3 @@
 # Architecture

-
-![](img/workflow-architecture.png)
+![](img/workflow-architecture.png)
12 changes: 6 additions & 6 deletions docs/cli/README.md
@@ -6,7 +6,9 @@ Command line interface for Packet Workflow.

 ### Synopsis

-Command line interface for Packet Workflow. The CLI allows you to update the hardware details with respect to a worker machine. It also enables you to create a template which is eventually used to create a workflow.
+Command line interface for Packet Workflow.
+The CLI allows you to update the hardware details with respect to a worker machine.
+It also enables you to create a template which is eventually used to create a workflow.

 ### Operations

@@ -25,8 +27,6 @@ Command line interface for Packet Workflow. The CLI allows you to update the har

 ### See Also

 - [tink hardware](hardware.md) - Hardware (worker) data operations
 - [tink template](template.md) - Template operations
 - [tink workflow](workflow.md) - Workflow operations
-
-
6 changes: 3 additions & 3 deletions docs/cli/hardware.md
@@ -5,6 +5,7 @@ Hardware (worker) data operations.

 ### Synopsis

 Hardware operations:
+
 ```shell
 all     Get all known hardware for facility
 id      Get hardware by id

@@ -23,6 +24,5 @@ Hardware operations:

 ### See Also

 - [tink template](template.md) - Template operations
 - [tink workflow](workflow.md) - Workflow operations
-
57 changes: 31 additions & 26 deletions docs/cli/template.md
Original file line number Diff line number Diff line change
@@ -5,8 +5,9 @@ Template operations.

 ### Synopsis

 Template operations:
+
 ```shell
 create   create a workflow template
 delete   delete a template
 get      get a template
 list     list all saved templates

@@ -21,31 +22,35 @@ Template operations:

 ### Examples

-- The following command creates a workflow template using the `sample.tmpl` file and save it as `sample`. It returns a UUID for the newly created template.
-  ```shell
-  $ tink template create -n <template-name> -p <path-to-template>
-  $ tink template create -n sample -p /tmp/sample.tmpl
-  ```
-
-- List all the templates
-  ```shell
-  $ tink template list
-  ```
-
-- Update the name of an existing template
-  ```shell
-  $ tink template update <template-uuid> -n <new-name>
-  $ tink template update edb80a56-b1f2-4502-abf9-17326324192b -n new-sample-template
-  ```
-
-- Update an existing template and keep the same name
-  ```shell
-  $ tink template update <template-uuid> -p <path-to-new-template-file>
-  $ tink template update edb80a56-b1f2-4502-abf9-17326324192b -p /tmp/new-sample-template.tmpl
-  ```
+- The following command creates a workflow template using the `sample.tmpl` file and saves it as `sample`.
+  It returns a UUID for the newly created template.
+
+  ```shell
+  $ tink template create -n <template-name> -p <path-to-template>
+  $ tink template create -n sample -p /tmp/sample.tmpl
+  ```
+
+- List all the templates
+
+  ```shell
+  $ tink template list
+  ```
+
+- Update the name of an existing template
+
+  ```shell
+  $ tink template update <template-uuid> -n <new-name>
+  $ tink template update edb80a56-b1f2-4502-abf9-17326324192b -n new-sample-template
+  ```
+
+- Update an existing template and keep the same name
+
+  ```shell
+  $ tink template update <template-uuid> -p <path-to-new-template-file>
+  $ tink template update edb80a56-b1f2-4502-abf9-17326324192b -p /tmp/new-sample-template.tmpl
+  ```

 ### See Also

 - [tink hardware](hardware.md) - Hardware (worker) data operations
 - [tink workflow](workflow.md) - Workflow operations
26 changes: 15 additions & 11 deletions docs/cli/workflow.md
Original file line number Diff line number Diff line change
@@ -5,6 +5,7 @@ Workflow operations.

 ### Synopsis

 Workflow operations:
+
 ```shell
 create   create a workflow
 data     get workflow data

@@ -23,17 +24,20 @@ Workflow operations:

 ### Examples

-- Create a workflow using a template and hardware devices
-  ```shell
-  $ tink workflow create -t <template-uuid> -r <hardware_input_in_json_format>
-  $ tink workflow create -t edb80a56-b1f2-4502-abf9-17326324192b -r {"device_1": "mac/IP"}
-  ```
-#### Note:
-1. The key used in the above command which is "device_1" should be in sync with "worker" field in the template. Click [here](../concepts.md) to check the template structure.
-2. These keys can only contain letter, numbers and underscore.
+- Create a workflow using a template and hardware devices
+
+  ```shell
+  $ tink workflow create -t <template-uuid> -r <hardware_input_in_json_format>
+  $ tink workflow create -t edb80a56-b1f2-4502-abf9-17326324192b -r {"device_1": "mac/IP"}
+  ```
+
+#### Note:
+
+1. The key used in the above command ("device_1") should match the "worker" field in the template.
+   Click [here](../concepts.md) to check the template structure.
+2. These keys can only contain letters, numbers, and underscores.
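The naming rule in point 2 amounts to a simple pattern check. A sketch in POSIX shell — the regex below is one interpretation of "letters, numbers, and underscores":

```shell
#!/bin/sh
# Validate a workflow device key: letters, digits, and underscores only.
is_valid_key() {
  printf '%s' "$1" | grep -Eq '^[A-Za-z0-9_]+$'
}
is_valid_key "device_1" && echo "device_1: ok"
is_valid_key "device-1" || echo "device-1: rejected"
```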

 ### See Also

 - [tink hardware](hardware.md) - Hardware (worker) data operations
 - [tink template](template.md) - Template operations
26 changes: 18 additions & 8 deletions docs/components.md
@@ -2,37 +2,47 @@

 ### Boots

-Handles DHCP requests, hands out IPs, and serves up iPXE. It also uses the Tinkerbell client to pull and push hardware data. `boots` will only respond to a predefined set of MAC addresses so it can be deployed in an existing network without interfering with existing DHCP infrastructure.
+Handles DHCP requests, hands out IPs, and serves up iPXE.
+It also uses the Tinkerbell client to pull and push hardware data.
+`boots` will only respond to a predefined set of MAC addresses so it can be deployed in an existing network without interfering with existing DHCP infrastructure.

 ### Osie

 Installs operating systems and handles deprovisioning.

 ### Tinkerbell

-Service responsible for processing workflows. It is comprised of a server and a CLI, which communicate over gRPC. The CLI is used to create a workflow along with its building blocks, i.e., a template hardware devices.
+Service responsible for processing workflows.
+It is comprised of a server and a CLI, which communicate over gRPC.
+The CLI is used to create a workflow along with its building blocks, i.e., a template and hardware devices.

 ### Hegel

-Metadata service used by Tinkerbell and Osie during provisioning. It collects data from Tinkerbell and transforms it into a JSON format to be consumed as metadata.
+Metadata service used by Tinkerbell and Osie during provisioning.
+It collects data from Tinkerbell and transforms it into a JSON format to be consumed as metadata.

 ### Database

-We use [PostgreSQL](https://www.postgresql.org/), also known as Postgres, as our data store. Postgres is a free and open-source relational database management system that emphasizes extensibility and technical standards compliance. It is designed to handle a range of workloads, from single machines to data warehouses or Web services with many concurrent users.
+We use [PostgreSQL](https://www.postgresql.org/), also known as Postgres, as our data store.
+Postgres is a free and open-source relational database management system that emphasizes extensibility and technical standards compliance.
+It is designed to handle a range of workloads, from single machines to data warehouses or Web services with many concurrent users.

 ### Image Repository

-Depending on your use case, you can choose to use [Quay](https://quay.io/) or [DockerHub](https://hub.docker.com/) as the registry to store component images. You can use the same registry to store all of the action images used for a workflow.
+Depending on your use case, you can choose to use [Quay](https://quay.io/) or [DockerHub](https://hub.docker.com/) as the registry to store component images.
+You can use the same registry to store all of the action images used for a workflow.

 On the other hand, if you want to keep things local, you can also set up a secure private Docker registry to hold all your images locally.

 ### Fluent Bit

-[Fluent Bit](https://fluentbit.io/) is an open source and multi-platform Log Processor and Forwarder which allows you to collect data/logs from different sources, unify and send them to multiple destinations. The components write their logs to `stdout` and these logs are then collected by Fluent Bit and pushed to Elasticsearch.
+[Fluent Bit](https://fluentbit.io/) is an open source and multi-platform Log Processor and Forwarder which allows you to collect data/logs from different sources, unify them, and send them to multiple destinations.
+The components write their logs to `stdout`, and these logs are then collected by Fluent Bit and pushed to Elasticsearch.

 ### Elasticsearch

-[Elasticsearch](https://www.elastic.co/) is a distributed, open source search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured. Fluent Bit collects the logs from each component and pushes them into Elasticsearch for storage and analysis purposes.
+[Elasticsearch](https://www.elastic.co/) is a distributed, open source search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured.
+Fluent Bit collects the logs from each component and pushes them into Elasticsearch for storage and analysis purposes.

 ### Kibana