SupContrast.paddle

A PaddlePaddle implementation of SupContrast: Supervised Contrastive Learning

This repo covers a reference implementation of the following papers in PaddlePaddle 2.x, using CIFAR as an illustrative example:
(1) Supervised Contrastive Learning. Paper
(2) A Simple Framework for Contrastive Learning of Visual Representations. Paper

Loss Function

The loss function SupConLoss in supcon.py takes L2-normalized features and labels as input and returns the loss. If labels is None or not passed, the loss degenerates to the unsupervised SimCLR loss.

Usage:

from supcon import SupConLoss

# define loss with a temperature `temp`
criterion = SupConLoss(temperature=temp)

# features: [bsz * n_views, f_dim]
# `n_views` is the number of crops from each image
# features should be L2-normalized along the f_dim dimension
features = ...
# labels: [bsz]
labels = ...

# SupContrast
loss = criterion(features, labels)
# or SimCLR
loss = criterion(features)
...
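
For context, here is a minimal sketch of how the inputs to the loss might be prepared, assuming two crops per image (n_views = 2) and the [bsz * n_views, f_dim] layout noted above; `encoder`, `images_view1`, `images_view2`, and `labels` are illustrative placeholders, not objects defined by this repo:

import paddle
import paddle.nn.functional as F
from supcon import SupConLoss

criterion = SupConLoss(temperature=0.1)

# encode the two augmented crops of each image (`encoder` is a placeholder backbone)
feat1 = encoder(images_view1)   # [bsz, f_dim]
feat2 = encoder(images_view2)   # [bsz, f_dim]

# L2-normalize along the feature dimension, as recommended above
feat1 = F.normalize(feat1, axis=1)
feat2 = F.normalize(feat2, axis=1)

# assemble the views into the [bsz * n_views, f_dim] layout
features = paddle.concat([feat1, feat2], axis=0)

# supervised contrastive loss; drop `labels` for SimCLR
loss = criterion(features, labels)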

Comparison

Results on CIFAR-10:

| Method          | Arch     | Setting      | Loss          | Paper Acc(%) | Our Acc(%)   | abs. improv. |
|-----------------|----------|--------------|---------------|--------------|--------------|--------------|
| SupCrossEntropy | ResNet50 | Supervised   | Cross Entropy | 95.0         | 96.9 (-*)    | 1.9          |
| SupContrast     | ResNet50 | Supervised   | Contrastive   | 96.0         | 97.3 (96.8*) | 1.3          |
| SimCLR          | ResNet50 | Unsupervised | Contrastive   | 93.6         | -            | -            |

*Accuracy without cutout.

Running

Use CUDA_VISIBLE_DEVICES to select which GPU(s) to run on, e.g. CUDA_VISIBLE_DEVICES=0 for a single card.

We release 3 model checkpoints; please download them from cowtransfer with code 461254:

./logs
|-- resnet50-ce-final/final             # SupCrossEntropy (Acc: 96.9)
|-- resnet50-supcon-final/final         # SupContrast Pretrained
|-- resnet50-linear-final/final         # SupContrast Linear Fine-tuned (Acc: 97.3)

(0) Data Preparation

cd data
wget https://dataset.bj.bcebos.com/cifar/cifar-10-python.tar.gz

(1) Standard Cross-Entropy

  • Train:
python main_ce.py -y config/resnet50_ce.yml
  • Test:

Set continue_from in config/resnet50_ce.yml to the checkpoint path, then run:

python main_ce.py -y config/resnet50_ce.yml --test

This reports the test accuracy, which should be 96.9% with the released checkpoint.

(2) Supervised Contrastive Learning

  • Train:

Pretraining stage:

python main_supcon.py -y config/resnet50_supcon.yml

Linear evaluation stage: set from_supcon in config/resnet50_linear.yml to the pretrained checkpoint path, then run:

python main_ce.py -y config/resnet50_linear.yml
  • Test:

Set continue_from in config/resnet50_linear.yml to the checkpoint path, then run:

python main_ce.py -y config/resnet50_linear.yml --test

This reports the test accuracy, which should be 97.3% with the released checkpoint.

Details

  • see config/ for configuration details

Differences

  • Compared to the original batch size of 6144, we use a smaller batch size of 128, which allows training on a single GPU card.
  • We use gradient clipping to avoid gradient explosion and to make training with this small batch size more stable (a minimal sketch is shown after this list).
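
As a rough illustration (not the exact settings in config/), gradient clipping in PaddlePaddle 2.x can be attached to the optimizer via its grad_clip argument; the optimizer choice, learning rate, and clip norm below are assumptions for the sketch:

import paddle

# clip the global gradient norm (the 1.0 threshold is illustrative, not from config/)
clip = paddle.nn.ClipGradByGlobalNorm(clip_norm=1.0)

# `model` is a placeholder for the network being trained
optimizer = paddle.optimizer.Momentum(
    learning_rate=0.05,                 # illustrative value; see config/ for the real settings
    momentum=0.9,
    parameters=model.parameters(),
    grad_clip=clip,                     # gradients are clipped before each update
)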

Data Augmentation

Reference

@Article{khosla2020supervised,
    title   = {Supervised Contrastive Learning},
    author  = {Prannay Khosla and Piotr Teterwak and Chen Wang and Aaron Sarna and Yonglong Tian and Phillip Isola and Aaron Maschinot and Ce Liu and Dilip Krishnan},
    journal = {arXiv preprint arXiv:2004.11362},
    year    = {2020},
}
