Pipeline: filters: tensorflow: style #1700

Merged
65 changes: 38 additions & 27 deletions pipeline/filters/tensorflow.md
@@ -1,56 +1,73 @@
# Tensorflow

The _Tensorflow_ filter plugin allows running machine learning inference tasks on the records of data coming from input plugins or stream processors. This filter uses [Tensorflow Lite](https://www.tensorflow.org/lite/) as the inference engine, and requires the Tensorflow Lite shared library to be present during build and at runtime.

Tensorflow Lite is a lightweight open source deep learning framework used for mobile and IoT applications. Tensorflow Lite only handles inference, not training. It loads pre-trained models (`.tflite` files) that have been converted into the Tensorflow Lite format (`FlatBuffer`). You can read more about [converting Tensorflow models](https://www.tensorflow.org/lite/convert).
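
If you start from a regular Tensorflow or Keras model, it has to be converted to the `.tflite` FlatBuffer format before this filter can load it. The following is a minimal sketch of that conversion using the `tf.lite.TFLiteConverter` API; the `saved_model/` directory and the output path are placeholder assumptions, not values used by Fluent Bit itself:

```python
import tensorflow as tf

# Load a trained model exported in the SavedModel format (placeholder path).
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")
tflite_model = converter.convert()

# Write the FlatBuffer that the filter will later load through `model_file`.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `model.tflite` file is what the `model_file` parameter should point to.
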
The Tensorflow plugin for Fluent Bit has the following limitations:

- Currently supports single-input models
- Uses Tensorflow 2.3 header files

## Configuration parameters

The plugin supports the following configuration parameters:

| Key | Description | Default |
| :--- | :--- | :--- |
| `input_field` | Specify the name of the field in the record to apply inference on. | _none_ |
| `model_file` | Path to the model file (`.tflite`) to be loaded by Tensorflow Lite. | _none_ |
| `include_input_fields` | Include all input fields in the filter's output. | `True` |
| `normalization_value` | Divide input values by `normalization_value`. | _none_ |

## Creating a Tensorflow Lite shared library

To create a Tensorflow Lite shared library:

1. Clone the [Tensorflow repository](https://github.com/tensorflow/tensorflow).
1. Install the [Bazel](https://bazel.build/) package manager.
1. Run the following command to create the shared library:

   ```bash
   bazel build -c opt //tensorflow/lite/c:tensorflowlite_c # see https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/c
   ```

   The script creates the shared library `bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so`.
1. Copy the library to a location such as `/usr/lib` that can be used by Fluent Bit.

## Building Fluent Bit with Tensorflow filter plugin

The Tensorflow filter plugin is disabled by default and must be enabled when building Fluent Bit. The plugin also needs the Tensorflow Lite header files to compile, so you must pass the path of the Tensorflow source code on your machine to the [build script](https://github.com/fluent/fluent-bit#build-from-scratch):

```bash
cmake -DFLB_FILTER_TENSORFLOW=On -DTensorflow_DIR=<AddressOfTensorflowSourceCode> ...
```

### Command line

If the Tensorflow plugin initializes correctly, it reports successful creation of the interpreter and prints a summary of the model's input and output types and dimensions.

The command:

```bash
bin/fluent-bit -i mqtt -p 'tag=mqtt.data' -F tensorflow -m '*' -p 'input_field=image' -p 'model_file=/home/user/model.tflite' -p 'include_input_fields=false' -p 'normalization_value=255' -o stdout
```

produces an output like:

```text
[2020/08/04 20:00:00] [ info] Tensorflow Lite interpreter created!
[2020/08/04 20:00:00] [ info] [tensorflow] ===== input #1 =====
[2020/08/04 20:00:00] [ info] [tensorflow] type: FLOAT32 dimensions: {1, 224, 224, 3}
[2020/08/04 20:00:00] [ info] [tensorflow] ===== output #1 ====
[2020/08/04 20:00:00] [ info] [tensorflow] type: FLOAT32 dimensions: {1, 2}
```
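
As a point of reference for the command above, the record the filter runs inference on arrives through the MQTT input. The sketch below publishes one such record with an `image` field matching `input_field=image`; it is illustrative only, and the hostname, port, topic, and payload shape are assumptions (a real payload must match the model's input dimensions, here `{1, 224, 224, 3}`):

```python
import json
import paho.mqtt.publish as publish

# A record whose "image" field matches `input_field=image`. The values are
# placeholder pixels standing in for a flattened 224x224x3 image.
record = {"image": [0.0] * (224 * 224 * 3)}

# Fluent Bit's MQTT input acts as the listener; host, port, and topic are assumptions.
publish.single("fluent-bit/data", json.dumps(record), hostname="127.0.0.1", port=1883)
```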

### Configuration file

```python
[SERVICE]
Flush 1
Daemon Off

# The input, filter, and output sections mirror the command-line example above.
[INPUT]
    Name mqtt
    Tag  mqtt.data

[FILTER]
    Name                 tensorflow
    Match                *
    input_field          image
    model_file           /home/user/model.tflite
    include_input_fields false
    normalization_value  255

[OUTPUT]
    Name  stdout
    Match *
```

2 changes: 2 additions & 0 deletions vale-styles/FluentBit/Spelling-exceptions.txt
@@ -10,6 +10,7 @@
autoscaler
autoscaling
backoff
backpressure
Bazel
BitBake
Blackhole
blocklist

@@ -175,6 +176,7 @@
Tanzu
Telegraf
templated
temporality
Tensorflow
Terraform
Thanos
Timeshift