dev-sungman/Propagate-Yourself-Pytorch

Propagate Yourself

TODO:

  • Fix pixcontrast modules

Environment Settings

  • Clone the repository and install the required packages:

    git clone https://github.com/Sungman-Cho/Propagate-Yourself-Pytorch.git
    source install_packages.sh

Unsupervised Training (ImageNet-1K)

  • PixPro training

    python train.py --multiprocessing-distributed --batch_size=512 --loss=pixpro
  • PixContrast training

    python train.py --multiprocessing-distributed --batch_size=512 --loss=pixcontrast
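For intuition about what `--loss=pixpro` optimizes: PixPro maximizes the cosine similarity between each pixel's propagated feature (from the online encoder) and the momentum encoder's feature at spatially matching pixels in the other view. A minimal, hypothetical sketch of such a consistency loss — the function name, tensor shapes, and the precomputed positive-pair mask are assumptions for illustration, not this repo's actual API:

```python
import torch
import torch.nn.functional as F

def pixpro_consistency_loss(prop_feat, momentum_feat, pos_mask):
    """Pixel-propagation consistency loss (sketch, not the repo's code).

    prop_feat:     [B, C, N] propagated pixel features from the online encoder
    momentum_feat: [B, C, N] pixel features from the momentum encoder
    pos_mask:      [B, N, N] 1.0 where a pixel pair across the two views is a
                   positive match (normalized spatial distance below a threshold)
    """
    p = F.normalize(prop_feat, dim=1)
    m = F.normalize(momentum_feat, dim=1)
    sim = torch.einsum("bcn,bcm->bnm", p, m)      # pairwise cosine similarity
    n_pos = pos_mask.sum(dim=(1, 2)).clamp(min=1.0)
    # maximize the average similarity over positive pairs -> minimize its negative
    return -((sim * pos_mask).sum(dim=(1, 2)) / n_pos).mean()
```

PixContrast additionally treats non-matching pixels as negatives in an InfoNCE-style loss, which this sketch omits.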

Transfer learning

Before downstream training

  • Change your working directory to downstream.

  • Convert a trained PixPro model to detectron2's format:

    python convert-pretrain-to-detectron2.py '$your_checkpoint.pth.tar' pixpro.pkl
  • Convert a trained PixContrast model to detectron2's format:

    python convert-pretrain-to-detectron2.py '$your_checkpoint.pth.tar' pixcontrast.pkl
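For reference, conversion scripts of this kind typically extract the online encoder's backbone weights from the PyTorch checkpoint, rename them to detectron2's ResNet layout, and repack them as a pickle. The key renaming below follows MoCo's public conversion script; the checkpoint prefix (`module.encoder_q.`) and exact keys in this repo may differ:

```python
import pickle
import sys

def detectron2_key(k):
    """Map a torchvision-style ResNet key to detectron2's naming (sketch)."""
    if "layer" not in k:
        k = "stem." + k                      # conv1/bn1 belong to the stem
    for t in (1, 2, 3, 4):
        k = k.replace(f"layer{t}", f"res{t + 1}")
    for t in (1, 2, 3):
        k = k.replace(f"bn{t}", f"conv{t}.norm")
    k = k.replace("downsample.0", "shortcut")
    k = k.replace("downsample.1", "shortcut.norm")
    return k

def convert(src, dst, prefix="module.encoder_q."):
    import torch  # imported lazily; only needed to read the checkpoint
    state = torch.load(src, map_location="cpu")["state_dict"]
    model = {detectron2_key(k[len(prefix):]): v.numpy()
             for k, v in state.items() if k.startswith(prefix)}
    with open(dst, "wb") as f:
        pickle.dump({"model": model, "matching_heuristics": True}, f)
```

Usage would be e.g. `convert("checkpoint.pth.tar", "pixpro.pkl")`; `matching_heuristics` tells detectron2 to fuzzy-match the converted keys when loading.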

VOC

  • Training schedule: 24K iterations

  • Image size: [480, 800] during training, 800 at inference

  • Backbone: R50-C4

  • Training

    # baseline training
    source train_voc_base.sh
    
    # pixpro training
    source train_voc_pixpro.sh
    
    # pixcontrast training
    source train_voc_pixcontrast.sh
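
The shell scripts above presumably wrap detectron2 configs. Under the settings listed (24K iterations, multi-scale [480, 800] training with 800 at test time, R50-C4 backbone), a representative detectron2 config would look roughly like the following — file names and solver steps are assumptions, so check the scripts for the actual values:

```yaml
_BASE_: "Base-RCNN-C4.yaml"
MODEL:
  WEIGHTS: "pixpro.pkl"        # the converted checkpoint from the step above
  MASK_ON: False
  RESNETS:
    DEPTH: 50
INPUT:
  MIN_SIZE_TRAIN: (480, 512, 544, 576, 608, 640, 672, 704, 736, 768, 800)
  MIN_SIZE_TEST: 800
DATASETS:
  TRAIN: ("voc_2007_trainval", "voc_2012_trainval")
  TEST: ("voc_2007_test",)
SOLVER:
  STEPS: (18000, 22000)
  MAX_ITER: 24000
```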

COCO

  • Follows detectron2's 1× training schedule

  • Backbone: R50-C4

  • Training

    # baseline training
    source train_coco_base.sh
    
    # pixpro training
    source train_coco_pixpro.sh
    
    # pixcontrast training
    source train_coco_pixcontrast.sh
