# Federated Learning with Local and Global Representations

PyTorch implementation of federated learning with local and global representations.

Correspondence to:

## Paper

Think Locally, Act Globally: Federated Learning with Local and Global Representations
Paul Pu Liang*, Terrance Liu*, Liu Ziyin, Ruslan Salakhutdinov, and Louis-Philippe Morency
NeurIPS 2019 Workshop on Federated Learning (distinguished student paper award). (*equal contribution)

If you find this repository useful, please cite our paper:

```
@article{liang2020think,
  title={Think locally, act globally: Federated learning with local and global representations},
  author={Liang, Paul Pu and Liu, Terrance and Ziyin, Liu and Salakhutdinov, Ruslan and Morency, Louis-Philippe},
  journal={arXiv preprint arXiv:2001.01523},
  year={2020}
}
```

## Installation

First check that the requirements are satisfied:

* Python 3.6
* torch 1.2.0
* torchvision 0.4.0
* numpy 1.18.1
* sklearn 0.20.0
* matplotlib 3.1.2
* Pillow 4.1.1
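
If any of these are missing, they can typically be installed with pip using the standard PyPI package names (note that `sklearn` is published as `scikit-learn`); the pinned versions below mirror the list above and may need to be relaxed for your platform:

```
pip install torch==1.2.0 torchvision==0.4.0 numpy==1.18.1 scikit-learn==0.20.0 matplotlib==3.1.2 Pillow==4.1.1
```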

The next step is to clone the repository:

```
git clone https://github.com/pliang279/LG-FedAvg.git
```

## Data

We run FedAvg and LG-FedAvg experiments on MNIST and CIFAR-10. See our paper for a description of how we process and partition the data for federated learning experiments.
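
For intuition, the non-IID setting assigns each user a small number of label shards, controlled by `--shard_per_user` in the commands below. The following is a rough, illustrative sketch of shard-based partitioning, not the repository's exact preprocessing code; `shard_partition` and its arguments are made-up names:

```python
import numpy as np

def shard_partition(labels, num_users=100, shard_per_user=2, seed=0):
    """Illustrative non-IID split: sort sample indices by label, cut them into
    num_users * shard_per_user shards, and give each user shard_per_user shards."""
    rng = np.random.default_rng(seed)
    idxs = np.argsort(labels)                        # group sample indices by class
    shards = np.array_split(idxs, num_users * shard_per_user)
    order = rng.permutation(len(shards))             # randomly assign shards to users
    return {
        u: np.concatenate([shards[s] for s in order[u * shard_per_user:(u + 1) * shard_per_user]])
        for u in range(num_users)
    }
```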

## FedAvg

Results can be reproduced by running the following:

MNIST

```
python main_fed.py --dataset mnist --model mlp --num_classes 10 --epochs 1000 --lr 0.05 --num_users 100 --shard_per_user 2 --frac 0.1 --local_ep 1 --local_bs 10 --results_save run1
```

CIFAR10

```
python main_fed.py --dataset cifar10 --model cnn --num_classes 10 --epochs 2000 --lr 0.1 --num_users 100 --shard_per_user 2 --frac 0.1 --local_ep 1 --local_bs 50 --results_save run1
```
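
For reference, the server-side aggregation step in FedAvg is a parameter-wise average of the participating clients' model weights. Below is a minimal, illustrative sketch of that step; the names `fed_avg` and `client_state_dicts` are ours, not the repository's API:

```python
import copy

def fed_avg(client_state_dicts):
    """Uniformly average a list of PyTorch state_dicts, parameter by parameter."""
    avg_state = copy.deepcopy(client_state_dicts[0])
    for key in avg_state:
        for state in client_state_dicts[1:]:
            avg_state[key] = avg_state[key] + state[key]
        avg_state[key] = avg_state[key] / len(client_state_dicts)
    return avg_state
```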

## LG-FedAvg

Results can be reproduced by first running the above commands for FedAvg and then running the following:

MNIST

```
python main_lg.py --dataset mnist --model mlp --num_classes 10 --epochs 200 --lr 0.05 --num_users 100 --shard_per_user 2 --frac 0.1 --local_ep 1 --local_bs 10 --num_layers_keep 3 --results_save run1 --load_fed best_400.pt
```

CIFAR10

```
python main_lg.py --dataset cifar10 --model cnn --num_classes 10 --epochs 200 --lr 0.1 --num_users 100 --shard_per_user 2 --frac 0.1 --local_ep 1 --local_bs 50 --num_layers_keep 2 --results_save run1 --load_fed best_1200.pt
```
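
In LG-FedAvg, each client keeps its lower representation-learning layers local, and only the remaining upper layers are averaged across clients; how many layers stay local is governed by `--num_layers_keep`. Below is a minimal sketch of that aggregation, assuming `global_keys` is the list of state_dict keys belonging to the shared layers; the helper and argument names are illustrative, not the repository's:

```python
import copy
import torch

def lg_fed_avg(client_state_dicts, global_keys):
    """Average only the globally shared parameters; local layers stay per-client."""
    avg_global = {
        key: torch.stack([sd[key].float() for sd in client_state_dicts]).mean(dim=0)
        for key in global_keys
    }
    updated = []
    for sd in client_state_dicts:
        new_sd = copy.deepcopy(sd)   # keep this client's local layers as-is
        new_sd.update(avg_global)    # overwrite only the shared layers
        updated.append(new_sd)
    return updated
```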

## MTL

Results can be reproduced by running the following:

MNIST

```
python main_mtl.py --dataset mnist --model mlp --num_classes 10 --epochs 1000 --lr 0.05 --num_users 100 --shard_per_user 2 --frac 0.1 --local_ep 1 --local_bs 10 --num_layers_keep 5 --results_save run1
```

CIFAR10

```
python main_mtl.py --dataset cifar10 --model cnn --num_classes 10 --epochs 2000 --lr 0.1 --num_users 100 --shard_per_user 2 --frac 0.1 --local_ep 1 --local_bs 50 --num_layers_keep 5 --results_save run1
```

If you use this code, please cite our paper:

```
@article{liang2019_federated,
  title={Think Locally, Act Globally: Federated Learning with Local and Global Representations},
  author={Paul Pu Liang and Terrance Liu and Ziyin Liu and Ruslan Salakhutdinov and Louis-Philippe Morency},
  journal={ArXiv},
  year={2019},
  volume={abs/2001.01523}
}
```

## Acknowledgements

This codebase was adapted from https://github.com/shaoxiongji/federated-learning.