# Choco-SGD

This repository provides code for communication-efficient decentralized machine learning training, covering both deep learning models (compatible with PyTorch) and traditional convex machine learning models.

We provide code for the main experiments reported in the papers cited below.

Please refer to the folders `convex_code` and `dl_code` for more details.
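As a rough illustration of the algorithm's structure (not the repository's actual implementation), one CHOCO-SGD iteration consists of a local gradient step, a compressed exchange of iterate differences, and a gossip step on the publicly shared estimates. The sketch below simulates this in NumPy on a toy quadratic problem; the ring topology, top-k compressor, and all hyperparameters are illustrative choices:

```python
import numpy as np

# Toy setup: n nodes on a ring, each minimizing a local quadratic
# f_i(x) = 0.5 * ||x - a_i||^2, so the consensus optimum is mean(a_i).
rng = np.random.default_rng(0)
n, d = 8, 20
targets = rng.normal(size=(n, d))
opt = targets.mean(axis=0)

# Symmetric, doubly stochastic ring mixing matrix.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def top_k(v, k):
    """Top-k sparsification: keep the k largest-magnitude entries."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

x = np.zeros((n, d))      # local iterates x_i
x_hat = np.zeros((n, d))  # publicly known estimates of each node's iterate
eta, gamma, k = 0.05, 0.1, 4   # illustrative stepsizes and sparsity level

for _ in range(3000):
    x = x - eta * (x - targets)                 # local gradient step
    # Each node transmits only a compressed difference q_i ...
    q = np.stack([top_k(x[i] - x_hat[i], k) for i in range(n)])
    x_hat = x_hat + q                           # ... which updates the shared estimates
    x = x + gamma * (W @ x_hat - x_hat)         # gossip step on the estimates

# Distance of the network average from the consensus optimum.
print(round(float(np.linalg.norm(x.mean(axis=0) - opt)), 6))
```

With the constant stepsize used here, the network average of the iterates converges to the optimum while individual nodes retain a small residual disagreement; the papers use decreasing stepsizes to obtain exact convergence guarantees.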

## References

If you use the code, please cite the following papers:

@inproceedings{koloskova2019choco,
  title = {Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication},
  author = {Anastasia Koloskova and Sebastian U. Stich and Martin Jaggi},
  booktitle = {ICML 2019 - Proceedings of the 36th International Conference on Machine Learning},
  url = {http://proceedings.mlr.press/v97/koloskova19a.html},
  publisher = {PMLR},
  volume = {97},
  pages = {3479--3487},
  year = {2019}
}

and

@inproceedings{koloskova2020decentralized,
  title={Decentralized Deep Learning with Arbitrary Communication Compression},
  author={Anastasia Koloskova* and Tao Lin* and Sebastian U Stich and Martin Jaggi},
  booktitle={ICLR 2020 - International Conference on Learning Representations},
  year={2020},
  url={https://openreview.net/forum?id=SkgGCkrKvH}
}