
# MC-SSL: Towards Multi-Concept Self-Supervised Learning

This repository contains the official PyTorch self-supervised pretraining, finetuning, and evaluation code for [MC-SSL: Towards Multi-Concept Self-Supervised Learning](https://arxiv.org/abs/2111.15340).

## Main Architecture

## Visualization of Self-Supervised Clustering

## Self-Supervised Pre-Training

```
python -m torch.distributed.launch --nproc_per_node=8 --use_env main_MCSSL.py \
    --batch_size 64 --epochs 800 --data_location 'path/to/imageNet/trainingimgs'
```

| Architecture | # Parameters | Finetuning Accuracy | Download |
| --- | --- | --- | --- |
| ViT-S/16 | 22M | 82.4% | checkpoint |
| ViT-B/16 | 85M | 84.0% | checkpoint |
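
The released weights can be loaded into a standard ViT backbone for evaluation. Below is a minimal sketch, assuming a `timm` backbone and a checkpoint whose weights sit under a `model` key; the filename is hypothetical, and the actual checkpoint layout should be confirmed after download.

```python
# A minimal sketch (not part of this repository) of loading one of the
# released checkpoints into a standard ViT-S/16 backbone via timm.
import torch
import timm

model = timm.create_model('vit_small_patch16_224', num_classes=1000)
state = torch.load('mcssl_vits16.pth', map_location='cpu')  # hypothetical filename
state_dict = state.get('model', state)  # unwrap if weights are nested (assumption)
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f'missing keys: {len(missing)}, unexpected keys: {len(unexpected)}')
model.eval()

# Classify a dummy batch to confirm the weights loaded end to end.
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.argmax(dim=-1))
```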

## Finetuning

We rely on the finetuning strategy of [DeiT](https://github.com/facebookresearch/deit), as sketched below.
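
As a rough illustration of that workflow, the sketch below finetunes a loaded backbone with a DeiT-style recipe (AdamW plus a cosine schedule). It is an assumption about the setup, not the repository's actual finetuning script; the checkpoint name, hyperparameters, and stand-in data loader are all illustrative.

```python
# A rough DeiT-style finetuning sketch: pretrained backbone, AdamW,
# cosine learning-rate schedule. Hyperparameters are illustrative only.
import torch
import timm
from torch.utils.data import DataLoader, TensorDataset

model = timm.create_model('vit_small_patch16_224', num_classes=1000)
state = torch.load('mcssl_vits16.pth', map_location='cpu')  # hypothetical filename
model.load_state_dict(state.get('model', state), strict=False)

# Stand-in data; replace with a real ImageNet DataLoader in practice.
train_loader = DataLoader(
    TensorDataset(torch.randn(8, 3, 224, 224), torch.randint(0, 1000, (8,))),
    batch_size=4)

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4, weight_decay=0.05)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)
criterion = torch.nn.CrossEntropyLoss()

model.train()
for epoch in range(100):
    for images, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
    scheduler.step()
```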

## Acknowledgement

This repository is built on top of the [SiT](https://github.com/Sara-Ahmed/SiT) and [DINO](https://github.com/facebookresearch/dino) repositories.

## Citation

If you use this code for a paper, please cite:

```bibtex
@article{atito2021mc,
  title={MC-SSL0.0: towards multi-concept self-supervised learning},
  author={Atito, Sara and Awais, Muhammad and Farooq, Ammarah and Feng, Zhenhua and Kittler, Josef},
  journal={arXiv preprint arXiv:2111.15340},
  year={2021}
}
```
