Backbone | Method | pretrain | Crop Size | Lr Schd | mIoU | mIoU (ms+flip) | #params | FLOPs | config | model | log |
---|---|---|---|---|---|---|---|---|---|---|---|
CSWin-T | UPerNet | ImageNet-1K | 512x512 | 160K | 49.3 | 50.7 | 60M | 959G | config | model | log |
CSWin-S | UPerNet | ImageNet-1K | 512x512 | 160K | 50.4 | 51.5 | 65M | 1027G | config | model | log |
CSWin-B | UPerNet | ImageNet-1K | 512x512 | 160K | 51.1 | 52.2 | 109M | 1222G | config | model | log |
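For a quick qualitative check of a trained model, the standard mmsegmentation image demo can be pointed at one of the configs above. This is only a sketch: it assumes mmsegmentation's `demo/image_demo.py` script with its `--device`/`--palette` options, and `<IMAGE_PATH>`/`<CHECKPOINT_PATH>` are placeholders for your own files.

```bash
# single-image inference with a trained CSWin-T + UPerNet checkpoint;
# --palette ade20k selects the ADE20K color map for the visualization
python demo/image_demo.py <IMAGE_PATH> \
    configs/cswin/upernet_cswin_tiny.py <CHECKPOINT_PATH> \
    --device cuda:0 --palette ade20k
```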
- Install the Swin-Transformer-Semantic-Segmentation repository and the required packages.

```bash
git clone https://github.com/SwinTransformer/Swin-Transformer-Semantic-Segmentation
bash install_req.sh
```
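A quick way to confirm the installation succeeded, assuming `install_req.sh` installs `mmcv-full` and `mmsegmentation` (the exact packages depend on that script):

```bash
# both imports should succeed and print version strings if the setup worked
python -c "import mmcv, mmseg; print(mmcv.__version__, mmseg.__version__)"
```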
- Copy the CSWin configs and backbone file into the corresponding mmsegmentation folders (a note on registering the backbone follows the commands below).

```bash
cp -r configs/cswin <MMSEG_PATH>/configs/
cp configs/_base_/upernet_cswin.py <MMSEG_PATH>/configs/_base_/models/
cp backbone/cswin_transformer.py <MMSEG_PATH>/mmseg/models/backbones/
cp mmcv_custom/checkpoint.py <MMSEG_PATH>/mmcv_custom/
```
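Depending on the mmsegmentation version, the copied backbone may also need to be imported in the backbones package so that the registry can resolve the CSWin backbone named in the configs. This is an assumption, not part of the original instructions; check the class name inside `cswin_transformer.py` before applying it.

```bash
# assumption: the backbone class is named CSWin and mmseg only registers
# backbones that are imported in mmseg/models/backbones/__init__.py
echo "from .cswin_transformer import CSWin" >> <MMSEG_PATH>/mmseg/models/backbones/__init__.py
```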
- Install apex for mixed-precision training.

```bash
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./
```
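A short check that the build picked up the CUDA extensions; this assumes the standard apex layout in which the fused kernels are exposed through the `amp_C` extension module:

```bash
# if the --cpp_ext/--cuda_ext build succeeded, both imports should work
python -c "from apex import amp; import amp_C; print('apex OK')"
```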
- Follow the guide in mmsegmentation to prepare the ADE20K dataset (a download sketch is shown below).
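As a sketch of that step, the commands below fetch the scene-parsing subset into the layout mmsegmentation expects (`data/ade/ADEChallengeData2016/{images,annotations}/{training,validation}`); the download URL is the one listed in the mmsegmentation dataset guide, so treat it as an assumption if that guide has changed.

```bash
# download and unpack ADE20K (ADEChallengeData2016) under the mmseg data root
mkdir -p <MMSEG_PATH>/data/ade
cd <MMSEG_PATH>/data/ade
wget http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip
unzip ADEChallengeData2016.zip
```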
Training command format:

```bash
tools/dist_train.sh <CONFIG_PATH> <NUM_GPUS> --options model.pretrained=<PRETRAIN_MODEL_PATH>
```

For example, to train UPerNet with a CSWin-T backbone on 8 GPUs:

```bash
bash tools/dist_train.sh \
    configs/cswin/upernet_cswin_tiny.py 8 \
    --options model.pretrained=<PRETRAIN_MODEL_PATH>
```
Pretrained backbone models can be found on the main page, and more config files are available under `configs/cswin`.
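If fewer GPUs are available, the same `--options` override mechanism can adjust the per-GPU batch size. This is only an illustration: `data.samples_per_gpu` is the generic mmsegmentation config key, and the appropriate value (and any learning-rate scaling) depends on your setup.

```bash
# sketch: train on 4 GPUs with an explicit per-GPU batch size override
bash tools/dist_train.sh \
    configs/cswin/upernet_cswin_tiny.py 4 \
    --options model.pretrained=<PRETRAIN_MODEL_PATH> data.samples_per_gpu=4
```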
Evaluation command format (single-scale, and multi-scale + flip):

```bash
tools/dist_test.sh <CONFIG_PATH> <CHECKPOINT_PATH> <NUM_GPUS> --eval mIoU
tools/dist_test.sh <CONFIG_PATH> <CHECKPOINT_PATH> <NUM_GPUS> --eval mIoU --aug-test
```
For example, to evaluate UPerNet with a CSWin-T backbone:

```bash
bash tools/dist_test.sh configs/cswin/upernet_cswin_tiny.py \
    <CHECKPOINT_PATH> 8 --eval mIoU
```
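To reproduce the mIoU (ms+flip) column in the table above, the same command applies with `--aug-test` appended, following the multi-scale format shown earlier:

```bash
# multi-scale + flip evaluation for CSWin-T + UPerNet
bash tools/dist_test.sh configs/cswin/upernet_cswin_tiny.py \
    <CHECKPOINT_PATH> 8 --eval mIoU --aug-test
```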
This code is built on the mmsegmentation library, the timm library, and the Swin Transformer repository.