refine readme (#168)
* readme

* readme

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md
CoinCheung authored Jul 17, 2021
1 parent 0c3f452 commit 44b2307
Showing 2 changed files with 31 additions and 23 deletions.
20 changes: 13 additions & 7 deletions README.md
@@ -16,13 +16,22 @@ mIOUs on cocostuff val2017 set:
| bisenetv1 | 31.49 | 31.42 | 32.46 | 32.55 | [download](https://github.com/CoinCheung/BiSeNet/releases/download/0.0.0/model_final_v1_coco_new.pth) |
| bisenetv2 | 30.49 | 30.55 | 31.81 | 31.73 | [download](https://github.com/CoinCheung/BiSeNet/releases/download/0.0.0/model_final_v2_coco.pth) |

> Where **ss** means single scale evaluation, **ssc** means single scale crop evaluation, **msf** means multi-scale evaluation with flip augment, and **mscf** means multi-scale crop evaluation with flip evaluation. The eval scales and crop size of multi-scales evaluation can be found in [configs](./configs/).
Tips:
1. **ss** means single scale evaluation, **ssc** means single scale crop evaluation, **msf** means multi-scale evaluation with flip augment, and **mscf** means multi-scale crop evaluation with flip augment. The evaluation scales and crop sizes used for multi-scale evaluation can be found in [configs](./configs/); a rough sketch of the msf scheme is shown after these tips.

> The fps is tested in different way from the paper. For more information, please see [here](./tensorrt).
2. The fps is measured in a different way from the paper. For more information, please see [here](./tensorrt).

> For cocostuff dataset: The authors of the paper `bisenetv2` used the "old split" of 9k train set and 1k val set, while I used the "new split" of 118k train set and 5k val set. Thus the above results on cocostuff does not match the paper. The authors of bisenetv1 did not report their results on cocostuff, so here I simply provide a "make it work" result. Following the tradition of object detection, I used "1x"(90k) and "2x"(180k) schedule to train bisenetv1(1x) and bisenetv2(2x) respectively. Maybe you can have a better result by picking up hyper-parameters more carefully.
3. For the cocostuff dataset: the authors of the `bisenetv2` paper used the "old split" of a 9k train set and a 1k val set, while I used the "new split" of a 118k train set and a 5k val set, so the above cocostuff results do not match the paper. The authors of bisenetv1 did not report results on cocostuff, so here I simply provide a "make it work" result. Following the tradition of object detection, I used the "1x" (90k iterations) schedule to train bisenetv1 and the "2x" (180k iterations) schedule to train bisenetv2. You may get better results by tuning the hyper-parameters more carefully.

Note that the model has a big variance, which means that the results of training for many times would vary within a relatively big margin. For example, if you train bisenetv2 for many times, you will observe that the result of **ss** evaluation of bisenetv2 varies between 73.1-75.1.
4. The models have a large variance, which means that repeated training runs can produce results that differ by a relatively large margin. For example, if you train bisenetv2 several times, you will observe that its **ss** evaluation result varies between 73.1 and 75.1.
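
For readers unfamiliar with these evaluation modes, the snippet below is a rough, hypothetical sketch of what the **msf** (multi-scale + flip) scheme amounts to. The scale values are only illustrative, and the repository's own evaluation code (`tools/evaluate.py` together with the settings in [configs](./configs/)) is the reference, not this sketch.

```
# Minimal sketch of multi-scale + flip ("msf") inference. It assumes `net` maps an
# NCHW float tensor to per-class logits with the same spatial layout; the scales
# below are illustrative only.
import torch
import torch.nn.functional as F

@torch.no_grad()
def msf_predict(net, im, scales=(0.5, 0.75, 1.0, 1.25, 1.5), flip=True):
    N, C, H, W = im.shape
    prob = 0.
    for s in scales:
        size = (int(round(H * s)), int(round(W * s)))
        x = F.interpolate(im, size=size, mode='bilinear', align_corners=False)
        views = [x, torch.flip(x, dims=(3,))] if flip else [x]
        for i, v in enumerate(views):
            logits = net(v)
            if i == 1:  # undo the horizontal flip before merging
                logits = torch.flip(logits, dims=(3,))
            logits = F.interpolate(logits, size=(H, W), mode='bilinear', align_corners=False)
            prob = prob + torch.softmax(logits, dim=1)  # accumulate class probabilities
    return prob.argmax(dim=1)  # per-pixel class ids, shape (N, H, W)
```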


## deploy trained models
1. tensorrt
You can go to [tensorrt](./tensorrt) for details.

2. ncnn
You can go to [ncnn](./ncnn) for details.


## platform
@@ -132,9 +141,6 @@ You can also evaluate a trained model like this:
$ python tools/evaluate.py --config configs/bisenetv1_city.py --weight-path /path/to/your/weight.pth
```

## Infer with tensorrt
You can go to [tensorrt](./tensorrt) For details.


### Be aware that this is the refactored version of the original codebase. You can go to the `old` directory for the original implementation if you need it, though I believe you will not.

34 changes: 18 additions & 16 deletions ncnn/README.md
@@ -13,55 +13,57 @@ Though this demo runs on x86 platform, you can also use it on mobile platforms.

### Install ncnn

1. dependencies
```
$ python -m pip install onnx-simplifier
```
#### 1. dependencies
```
$ python -m pip install onnx-simplifier
```

2. build ncnn
Just following the ncnn official tutoral: [build-for-linux](https://github.com/Tencent/ncnn/wiki/how-to-build#build-for-linux) to install ncnn:
#### 2. build ncnn
Just follow the official ncnn tutorial, [build-for-linux](https://github.com/Tencent/ncnn/wiki/how-to-build#build-for-linux), to install ncnn:

1) dependencies
**step 1:** install dependencies
```
# apt install build-essential git libprotobuf-dev protobuf-compiler
```

2) (optional) install vulkan
**step 2:** (optional) install vulkan

3) install opencv from source
**step 3:** install opencv from source

4) build
**step 4:** build
I am using commit `9391fae741a1fb8d58cdfdc92878a5e9800f8567` and have not tested newer commits.
```
## I am using commit 9391fae741a1fb8d58cdfdc92878a5e9800f8567, and I have not tested over newer commits
$ git clone https://github.com/Tencent/ncnn.git
$ $cd ncnn
$ cd ncnn
$ git checkout 9391fae741a1fb8d58cdfdc92878a5e9800f8567   # optionally pin the commit noted above
$ git submodule update --init
$ mkdir -p build
$ cd build
$ cmake -DCMAKE_TOOLCHAIN_FILE=../toolchains/host.gcc.toolchain.cmake ..
$ make -j
$ make install
```

### convert model, build and run the demo
### Convert model, build and run the demo

1. convert pytorch model to ncnn model via onnx
#### 1. convert pytorch model to ncnn model via onnx
```
$ cd BiSeNet/
$ python tools/export_onnx.py --aux-mode eval --config configs/bisenetv2_city.py --weight-path /path/to/your/model.pth --outpath ./model_v2.onnx
$ python -m onnxsim model_v2.onnx model_v2_sim.onnx
$ /path/to/ncnn/build/tools/onnx/onnx2ncnn model_v2_sim.onnx model_v2_sim.param model_v2_sim.bin
$ mkdir -p ncnn/models
$ mv model_v2_sim.param ncnn/models
$ mv model_v2_sim.bin ncnn/models
```
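
As an optional extra step (not part of the original instructions), you can sanity-check the simplified ONNX file with onnxruntime before converting it to ncnn. The fallback input shape of 1x3x512x1024 is only a guess for dynamic dimensions and may not match your export; the model is assumed to produce a single output tensor of per-pixel scores or class ids.

```
# Hypothetical sanity check: run the simplified ONNX model once on random input.
# Requires `python -m pip install onnxruntime`.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession('model_v2_sim.onnx', providers=['CPUExecutionProvider'])
inp = sess.get_inputs()[0]
# Replace any dynamic dimensions with a guessed 1x3x512x1024 shape.
shape = [d if isinstance(d, int) else s for d, s in zip(inp.shape, (1, 3, 512, 1024))]
dummy = np.random.randn(*shape).astype(np.float32)
out = sess.run(None, {inp.name: dummy})[0]
print('input:', inp.name, shape, '-> output shape:', out.shape)
```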

2. compile demo code
#### 2. compile demo code
```
$ mkdir -p ncnn/build
$ cd ncnn/build
$ cmake .. -DNCNN_ROOT=/path/to/ncnn/build/install
$ make
```

3. run demo
#### 3. run demo
```
$ ./segment
```
