MambaMIL: Enhancing Long Sequence Modeling with Sequence Reordering in Computational Pathology


NEWS

2024-05-14: Our paper received early acceptance at MICCAI 2024!

Abstract

Multiple Instance Learning (MIL) has emerged as a dominant paradigm to extract discriminative feature representations within Whole Slide Images (WSIs) in computational pathology. Despite driving notable progress, existing MIL approaches suffer from limitations in facilitating comprehensive and efficient interactions among instances, as well as challenges related to time-consuming computations and overfitting. In this paper, we incorporate the Selective Scan Space State Sequential Model (Mamba) in Multiple Instance Learning (MIL) for long sequence modeling with linear complexity, termed as MambaMIL. By inheriting the capability of vanilla Mamba, MambaMIL demonstrates the ability to comprehensively understand and perceive long sequences of instances. Furthermore, we propose the Sequence Reordering Mamba (SR-Mamba) aware of the order and distribution of instances, which exploits the inherent valuable information embedded within the long sequences. With the SR-Mamba as the core component, MambaMIL can effectively capture more discriminative features and mitigate the challenges associated with overfitting and high computational overhead. Extensive experiments on two public challenging tasks across nine diverse datasets demonstrate that our proposed framework performs favorably against state-of-the-art MIL methods.
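
For intuition, the sequence reordering idea can be sketched in a few lines of PyTorch. This is a simplified, illustrative version with an assumed stride parameter and feature shapes, not the exact SR-Mamba implementation:

import torch

def reorder(bag: torch.Tensor, stride: int) -> torch.Tensor:
    # Illustrative stride-based reordering of an instance sequence.
    # bag: [N, C] feature bag (N instances, C feature dims).
    # Gathering every `stride`-th instance first pulls apart patches
    # that were adjacent in the original scan order.
    n = bag.size(0)
    idx = torch.cat([torch.arange(s, n, stride) for s in range(stride)])
    return bag[idx]

bag = torch.randn(10, 1024)         # 10 patches, 1024-dim features
reordered = reorder(bag, stride=3)  # new order: 0, 3, 6, 9, 1, 4, 7, 2, 5, 8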

NOTES

2024-04-12: We will update the arXiv version next month with subsequent revisions of the paper.

2024-04-13: We released the MambaMIL model code. The full training code is coming soon.

2024-04-24: We released the full version of MambaMIL, including models and training scripts.

Installation

  • Environment: CUDA 11.8 / Python 3.10
  • Create a virtual environment
> conda create -n mambamil python=3.10 -y
> conda activate mambamil
  • Install PyTorch 2.0.1
> pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118
> pip install packaging
  • Install causal-conv1d
> pip install causal-conv1d==1.1.1
  • Install Mamba
> git clone git@github.com:isyangshu/MambaMIL.git
> cd MambaMIL/mamba
> pip install .
  • Other requirements
> pip install scikit-survival==0.22.2
> pip install pandas==2.2.1
> pip install tensorboardx
> pip install h5py
> pip install wandb
> pip install tensorboard
> pip install lifelines
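
After installation, a quick sanity check is to run a dummy forward pass on the GPU. This assumes the bundled mamba directory installs the mamba_ssm package with the same interface as the upstream Mamba repository:

import torch
from mamba_ssm import Mamba  # installed from the bundled mamba directory

# Dummy forward pass to verify that the CUDA kernels load correctly.
model = Mamba(d_model=512, d_state=16, d_conv=4, expand=2).to("cuda")
x = torch.randn(1, 100, 512, device="cuda")  # (batch, sequence length, dim)
y = model(x)
print(y.shape)  # expected: torch.Size([1, 100, 512])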

Repository Details

  • splits: Splits for reproduction.
  • train_scripts: Training scripts for cancer subtyping and survival prediction.

How to Train

Prepare your data

  1. Download diagnostic WSIs from TCGA and BRACS.
  2. Use the WSI processing tool provided by CLAM to extract ResNet-50 and PLIP pretrained features for each 512 $\times$ 512 patch (20x magnification), and save them as one .pt file per WSI. This yields a single pt_files folder storing the .pt files for all WSIs of a study.

The final structure of the datasets should be as follows:

DATA_ROOT_DIR/
    └── pt_files/
        ├── resnet50/
        │   ├── slide_1.pt
        │   ├── slide_2.pt
        │   └── ...
        ├── plip/
        │   ├── slide_1.pt
        │   ├── slide_2.pt
        │   └── ...
        └── others/
            ├── slide_1.pt
            ├── slide_2.pt
            └── ...
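
Each .pt file holds the feature bag of one slide. As a rough sketch (the tensor shape is an assumption based on CLAM-style extraction, e.g. 1024-dim features for ResNet-50), loading a bag looks like:

import torch

# Load one slide's feature bag: typically a [num_patches, feature_dim]
# float tensor; the exact dimensions depend on the extraction backbone.
features = torch.load("DATA_ROOT_DIR/pt_files/resnet50/slide_1.pt")
print(features.shape)  # e.g. torch.Size([N, 1024]) for N patches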

Survival Prediction

We provide a training script for survival prediction, ALL_512_surivial_k_fold.sh.

Below are the supported models and datasets:

model_names='max_mil mean_mil att_mil trans_mil s4_mil mamba_mil'
backbones="resnet50 plip"
cancers='BLCA BRCA COADREAD KIRC KIRP LUAD STAD UCEC'

Run the following command for training:

sh ./train_scripts/ALL_512_surivial_k_fold.sh
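
Survival models are commonly evaluated with the concordance index, which is why scikit-survival is among the requirements. A minimal evaluation sketch with placeholder predictions (the labels and risk scores below are made up for illustration):

import numpy as np
from sksurv.metrics import concordance_index_censored

# Placeholder outputs: a higher risk score should mean shorter survival.
event = np.array([True, False, True, True])  # True = death observed, False = censored
time = np.array([12.0, 30.5, 8.2, 24.0])     # follow-up time (e.g. months)
risk = np.array([0.8, 0.1, 0.9, 0.4])        # model-predicted risk scores

cindex = concordance_index_censored(event, time, risk)[0]
print(f"C-index: {cindex:.3f}")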

Cancer Subtyping

We provide training scripts for TCGA NSCLC cancer subtyping (LUAD_LUSC_512_subtyping.sh) and BReAst Carcinoma Subtyping (BRACS.sh).

Below are the supported models:

model_names='max_mil mean_mil att_mil trans_mil s4_mil mamba_mil'
backbones="resnet50 plip"

Run the following command to train TCGA NSCLC cancer subtyping:

sh ./train_scripts/LUAD_LUSC_512_subtyping.sh

Run the following command to train BRACS subtyping:

sh ./train_scripts/BRACS.sh
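
For reference, the non-Mamba baselines in model_names are standard MIL pooling heads. A minimal sketch of an ABMIL-style attention head, akin to the att_mil baseline (the hidden size and class count are illustrative, not the repository's settings):

import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    # Simplified attention-based MIL pooling over one bag of instances.
    def __init__(self, in_dim: int = 1024, hidden: int = 256, n_classes: int = 2):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )
        self.classifier = nn.Linear(in_dim, n_classes)

    def forward(self, bag: torch.Tensor) -> torch.Tensor:
        # bag: [N, C] instance features for one slide.
        weights = torch.softmax(self.attn(bag), dim=0)  # [N, 1] attention weights
        slide_feat = (weights * bag).sum(dim=0)         # [C] pooled slide feature
        return self.classifier(slide_feat)              # [n_classes] logits

logits = AttentionMIL()(torch.randn(500, 1024))  # one bag of 500 patches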

Acknowledgements

Huge thanks to the authors of the open-source projects this work builds on, including Mamba and CLAM.

License & Citation

If you find our work useful in your research, please consider citing our paper:

@article{yang2024mambamil,
  title={MambaMIL: Enhancing Long Sequence Modeling with Sequence Reordering in Computational Pathology},
  author={Yang, Shu and Wang, Yihui and Chen, Hao},
  journal={arXiv preprint arXiv:2403.06800},
  year={2024}
}

This code is available for non-commercial academic purposes. If you have any questions, feel free to email Shu YANG and Yihui WANG.
