
TreeFilter-Torch

By Lin Song, Yanwei Li, Zeming Li, Gang Yu, Hongbin Sun, Jian Sun, Nanning Zheng.

This project provides a CUDA implementation of "Learnable Tree Filter for Structure-preserving Feature Transform" (NeurIPS 2019) on PyTorch. Multiple semantic segmentation experiments on PASCAL VOC 2012 and Cityscapes are reproduced to verify the effectiveness of the tree filtering module. Because the experiments in the paper were conducted with an internal framework, this project reimplements them on PyTorch and reports detailed comparisons below. In addition, many thanks to TorchSeg.

Prerequisites

  • PyTorch 1.2
    • sudo pip3 install torch torchvision
  • Easydict
    • sudo pip3 install easydict
  • Apex
    • https://nvidia.github.io/apex/index.html
  • Ninja
    • sudo apt-get install ninja-build
  • tqdm
    • sudo pip3 install tqdm
  • Boost (optional, required only for the Prim and Kruskal algorithms)
    • sudo apt-get install libboost-dev

Installation

Building from source

  • git clone https://github.com/StevenGrove/TreeFilter-Torch
  • cd TreeFilter-Torch/furnace/kernels/lib_tree_filter
  • sudo python3 setup.py build develop

This project implements three well-known minimum spanning tree algorithms, i.e., Boruvka, Kruskal, and Prim. The default algorithm is Boruvka, chosen for its linear computational complexity on the planar grid graph of an image. The algorithm can be changed in the source file "lib_tree_filter/src/mst/mst.cu".
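
After building the extension, the tree filtering module can be applied to a feature map roughly as sketched below. This is a minimal sketch, assuming that lib_tree_filter exposes MinimumSpanningTree and TreeFilter2D modules under furnace/kernels/lib_tree_filter; the import path, class names, and call signatures are assumptions, so please check the actual sources in that directory before use.

```python
# A minimal sketch, NOT the verified API: the import path, class names, and
# signatures below are assumptions based on the layout of
# furnace/kernels/lib_tree_filter (check the sources before use).
import torch
from kernels.lib_tree_filter.modules.tree_filter import (
    MinimumSpanningTree,
    TreeFilter2D,
)

# Guidance features (used to build the tree) and features to be filtered,
# both in (N, C, H, W) layout on the GPU.
guide = torch.randn(2, 64, 64, 64).cuda()
feature = torch.randn(2, 256, 64, 64).cuda()

# Build a minimum spanning tree over the 4-connected pixel graph; edge weights
# come from a pairwise distance on the guidance features.
mst = MinimumSpanningTree(TreeFilter2D.norm2_distance)
tree = mst(guide)

# Filter the features along the tree (the structure-preserving transform).
tree_filter = TreeFilter2D(groups=1)
output = tree_filter(feature, guide, tree)
print(output.shape)  # expected: torch.Size([2, 256, 64, 64])
```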

Pretrained Model

Performance and Benchmarks

Notes

  • FCN-32d: FCN with a decoder whose maximum stride is 32;
  • Extra: global average pooling + ResBlock;
  • TF: learnable tree filtering module;
  • SS: single-scale inference;
  • MSF: multi-scale + flip inference (a minimal sketch follows this list).
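
To make the MSF protocol concrete, here is a minimal sketch of multi-scale + flip inference for a generic segmentation model; the scale set and the function are illustrative assumptions, not the evaluation script used for the numbers below (see the TorchSeg eval tools for that).

```python
# A minimal sketch of multi-scale + flip (MSF) inference; the scale set is an
# illustrative assumption, not the configuration used for the reported results.
import torch
import torch.nn.functional as F

@torch.no_grad()
def predict_msf(model, image, scales=(0.75, 1.0, 1.25), flip=True):
    """Average class probabilities over rescaled and horizontally flipped inputs."""
    n, _, h, w = image.shape
    prob_sum = 0.0
    for s in scales:
        scaled = F.interpolate(image, scale_factor=s, mode='bilinear',
                               align_corners=False)
        inputs = [scaled]
        if flip:
            inputs.append(torch.flip(scaled, dims=[3]))   # horizontal flip
        for idx, x in enumerate(inputs):
            logits = model(x)                             # (N, classes, h', w')
            if idx == 1:
                logits = torch.flip(logits, dims=[3])     # undo the flip
            logits = F.interpolate(logits, size=(h, w), mode='bilinear',
                                   align_corners=False)
            prob_sum = prob_sum + F.softmax(logits, dim=1)
    return prob_sum.argmax(dim=1)                         # (N, H, W) label map
```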

PASCAL VOC 2012 val set

| Methods | Backbone | mIoU (ss) | Acc (ss) | mIoU (msf) | Acc (msf) | Model |
| --- | --- | --- | --- | --- | --- | --- |
| FCN-32d | R50_v1c | 71.82% | 93.62% | 73.96% | 94.14% | GoogleDrive |
| FCN-32d+TF | R50_v1c | 76.31% | 94.57% | 77.80% | 94.96% | GoogleDrive |
| FCN-32d | R101_v1c | 74.53% | 94.29% | 76.08% | 94.63% | GoogleDrive |
| FCN-32d+TF | R101_v1c | 77.82% | 94.92% | 79.22% | 95.22% | GoogleDrive |
| FCN-32d+Extra | R101_v1c | 78.04% | 95.01% | 79.69% | 95.41% | GoogleDrive |
| FCN-32d+Extra+TF | R101_v1c | 79.81% | 95.38% | 80.97% | 95.67% | GoogleDrive |
| FCN-32d+Extra+TF* | R101_v1c | 80.32% | 95.66% | 82.28% | 96.01% | GoogleDrive |

* further fine-tuned on the original train set

Cityscapes val set

| Methods | Backbone | mIoU (ss) | Acc (ss) | mIoU (msf) | Acc (msf) | Model |
| --- | --- | --- | --- | --- | --- | --- |
| FCN-32d+Extra | R101_v1c | 78.29% | 96.09% | 79.40% | 96.27% | GoogleDrive |
| FCN-32d+Extra+TF | R101_v1c | 79.58% | 96.31% | 80.85% | 96.46% | GoogleDrive |

Usage

As in the original TorchSeg, distributed training is recommended for both single-machine and multi-machine setups; a generic sketch of the underlying setup is shown below.
For detailed usage, please refer to the Training and Inference sections of TorchSeg.
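
The training scripts rely on the TorchSeg engine, which configures the process group and data parallelism internally. The snippet below is only a rough, generic illustration of the PyTorch distributed setup involved (one process per GPU, launched e.g. with torch.distributed.launch); it is not the project's actual entry point.

```python
# A rough, generic illustration of distributed (DDP) setup, NOT the project's
# training script; the TorchSeg engine performs the equivalent steps internally.
import argparse
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

parser = argparse.ArgumentParser()
parser.add_argument('--local_rank', type=int, default=0)  # set by the launcher
args = parser.parse_args()

dist.init_process_group(backend='nccl', init_method='env://')
torch.cuda.set_device(args.local_rank)

model = torch.nn.Conv2d(3, 21, 1).cuda()      # placeholder for the real network
model = DDP(model, device_ids=[args.local_rank])
# ... build the distributed sampler, optimizer, and training loop as in TorchSeg.
```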

To do

  • Experiments on ADE20K
  • Visualization of tree filter
  • Additional tasks
    • Object detection
    • Instance segmentation
    • Optical flow

Citation

Please cite the learnable tree filter in your publications if it helps your research.

@inproceedings{song2019learnable,
    title = {Learnable Tree Filter for Structure-preserving Feature Transform},
    author = {Song, Lin and Li, Yanwei and Li, Zeming and Yu, Gang and Sun, Hongbin and Sun, Jian and Zheng, Nanning},
    booktitle = {Advances in Neural Information Processing Systems},
    year = {2019}
}

Please cite this project in your publications if it helps your research.

@misc{treefilter-torch,
    author = {Song, Lin},
    title = {TreeFilter-Torch},
    howpublished = {\url{https://github.com/StevenGrove/TreeFilter-Torch}},
    year = {2019}
}
