MIS: Multi-granularity Interaction Simulation for Unsupervised Interactive Segmentation

This is an official implementation for our ICCV'23 paper "Multi-granularity Interaction Simulation for Unsupervised Interactive Segmentation".

TODO

◻ Upload pre-trained weights

◻ Installation instructions for Windows

Installation

Environment

Step 1: set up the Python environment

# clone this repository
git clone https://github.com/lkhl/MIS
cd MIS

# create and activate the conda environment
conda create -n mis python=3.9
conda activate mis

# install the Python dependencies
pip install -r requirements.txt
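
As a quick optional sanity check, you can verify that the environment resolves correctly. The line below assumes that PyTorch is pulled in by requirements.txt, which this RITM/SimpleClick-based codebase relies on; the exact versions depend on your requirements file.

# optional sanity check (assumes torch is installed via requirements.txt)
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"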

Step 2: install other dependencies

  • CMake
sudo apt-get install cmake
  • Eigen

If Eigen (3.4.0 is suggested) is already installed on your machine, you can skip this step.

# download the source code
wget https://gitlab.com/libeigen/eigen/-/archive/3.4.0/eigen-3.4.0.tar
tar -xf eigen-3.4.0.tar
cd eigen-3.4.0

# install eigen
mkdir build
cd build
cmake ..
sudo make install
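
To verify the installation, you can check that the headers are where the build expects them. The path below assumes the default install prefix (/usr/local); adjust it if you configured CMake with a different prefix.

# optional: confirm the Eigen headers were installed (default prefix assumed)
ls /usr/local/include/eigen3/Eigen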

Step 3: build C++ extensions

cd mis/ops
bash install.sh

Dataset

Please follow RITM to prepare the GrabCut, Berkeley, SBD, and DAVIS datasets.
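
For reference, a typical layout looks like the sketch below. This is only an illustrative assumption; the authoritative paths and preparation steps are those described by RITM.

# assumed layout for illustration only; follow RITM for the actual preparation
datasets/
├── GrabCut/
├── Berkeley/
├── SBD/
└── DAVIS/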

Usage

Evaluate

python evaluate_model.py NoBRS \
	--checkpoint /path/to/checkpoint \
	--datasets GrabCut,Berkeley,SBD,DAVIS
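
To evaluate on a subset of the benchmarks, list only the datasets you need, for example:

python evaluate_model.py NoBRS \
	--checkpoint /path/to/checkpoint \
	--datasets GrabCut,Berkeley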

The results and the pre-trained model are as follows. NoC@85 and NoC@90 denote the average number of clicks required to reach 85% and 90% IoU, respectively (lower is better).

Model    | GrabCut         | Berkeley        | SBD             | DAVIS
         | NoC@85 | NoC@90 | NoC@85 | NoC@90 | NoC@85 | NoC@90 | NoC@85 | NoC@90
ViT-Base | 1.94   | 2.32   | 3.09   | 4.58   | 6.91   | 9.51   | 6.33   | 8.44

Demo

We provide a Gradio-based demo that shows the merging process and the interactive segmentation results. It can be launched with

python app.py

Training

Step 1: preprocessing

python preprocess.py -d /path/to/SBD

optional arguments:
  -h, --help            show help message and exit
  --data-root DATA_ROOT, -d DATA_ROOT
                        Root directory for the SBD dataset
  --out-dir OUT_DIR, -o OUT_DIR
                        Output directory for the preprocessed data
  --model-size {small,base,large,giant}, -m {small,base,large,giant}
                        Model size of the ViT
  --patch-size {8,14,16}, -p {8,14,16}
                        Patch size of the ViT
  --n-featurizing-workers N_FEATURIZING_WORKERS
                        Number of workers for featurizing. Set to 0 to disable parallel processing
  --n-merging-workers N_MERGING_WORKERS
                        Number of workers for merging. Set to 0 to disable parallel processing

The processed data will be saved in ./data/proposals/sbd by default.
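
For example, to preprocess SBD with a ViT-Base backbone (patch size 16), write to the default output directory, and parallelize both stages, a command along the following lines can be used (the dataset path and the worker counts are placeholders to adapt to your machine):

python preprocess.py \
	-d /path/to/SBD \
	-o ./data/proposals/sbd \
	-m base \
	-p 16 \
	--n-featurizing-workers 4 \
	--n-merging-workers 4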

Step 2: training

Use the following command to train a model based on SimpleClick with randomly sampled proposals.

python train.py models/mis_simpleclick_base448_sbd.py

Acknowledgements

This repository is built upon RITM and SimpleClick, and the project page uses the Nerfies template. We thank the authors of these open-source repositories for their efforts, as well as the ACs and reviewers for their work on our paper.

Citing

If you find this repository helpful, please consider citing our paper.

@article{li2023multi,
  title={Multi-granularity interaction simulation for unsupervised interactive segmentation},
  author={Li, Kehan and Zhao, Yian and Wang, Zhennan and Cheng, Zesen and Jin, Peng and Ji, Xiangyang and Yuan, Li and Liu, Chang and Chen, Jie},
  journal={arXiv preprint arXiv:2303.13399},
  year={2023}
}
