This repository contains the official source code for the paper [Masked Autoencoders are Parameter-Efficient Federated Continual Learners](https://arxiv.org/abs/2411.01916). If you find this work useful, please consider citing:
```bibtex
@article{he2024pmae,
  title   = {Masked Autoencoders are Parameter-Efficient Federated Continual Learners},
  author  = {Yuchen He and Xiangfeng Wang},
  year    = {2024},
  journal = {arXiv preprint arXiv:2411.01916}
}
```
To ensure smooth execution of the code, we recommend setting up a dedicated environment using conda:

- Create a new conda environment:

  ```bash
  conda create -n pMAE python==3.9.18
  ```

- Activate the environment:

  ```bash
  conda activate pMAE
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
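After installation, an optional sanity check (not part of the repository) can confirm that PyTorch was installed correctly and can see your GPU before launching any runs:

```python
# Optional sanity check: verify the PyTorch install and GPU visibility.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```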
The ImageNet-R and CUB-200 datasets can be downloaded from the links provided in LAMDA-PILOT. Please specify your dataset folder in `src/utils/conf.py`.
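As a hypothetical illustration only (the actual variable or function names in `src/utils/conf.py` may differ), the dataset location might be configured like this:

```python
# Hypothetical sketch of the dataset path setting in src/utils/conf.py;
# check the actual names used in that file.
def data_path() -> str:
    # parent folder that contains the ImageNet-R and CUB-200 datasets
    return "/data/datasets/"
```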
Please download the pre-trained MAE models from the Releases page and put them in the folder specified in `src/utils/conf.py`.
The frozen pre-trained encoders for the Sup-based MAE and the iBOT-based MAE are obtained from `vision_transformer` and `ibot`, respectively.
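For intuition only, here is a minimal sketch of loading such a checkpoint into a ViT-B/16 backbone and freezing it; the checkpoint filename is hypothetical, and the repo's actual loading code may differ:

```python
# Minimal sketch (not the repo's actual code): load a pre-trained MAE encoder
# checkpoint into a timm ViT-B/16 and freeze it, since the pre-trained encoder
# stays frozen while only lightweight components are trained.
import timm
import torch

encoder = timm.create_model("vit_base_patch16_224", pretrained=False, num_classes=0)
ckpt = torch.load("pretrained/sup_mae_vit_base.pth", map_location="cpu")  # hypothetical filename
state_dict = ckpt.get("model", ckpt)  # MAE-style checkpoints often nest weights under "model"
encoder.load_state_dict(state_dict, strict=False)  # ignore keys that don't match the encoder

for p in encoder.parameters():
    p.requires_grad = False  # freeze the backbone
```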
Set the `[DATASET]` and `[MODEL]` options using the filenames of the `.json` files in the `configs` folder. If the selected model includes pMAE, set `[METHOD]` to `pmae`; otherwise, set it to `fedavg`.
```bash
python src/main_fcl.py --dataset [DATASET] --model [MODEL] --method [METHOD] --device 0
```

For example:

```bash
python src/main_fcl.py --dataset cub_T20_beta5e-1 --model sup_pmae --method pmae --device 0
python src/main_fcl.py --dataset cub_T20_beta5e-1 --model sup_coda_prompt --method fedavg --device 0
python src/main_fcl.py --dataset cub_T20_beta5e-1 --model sup_coda_prompt_w_pmae --method pmae --device 0
```
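To queue several runs back to back, a small convenience sketch (not part of the repository) could invoke the entry point via `subprocess`:

```python
# Hypothetical convenience script (not part of the repo): run several
# dataset/model/method combinations sequentially.
import subprocess

RUNS = [
    ("cub_T20_beta5e-1", "sup_pmae", "pmae"),
    ("cub_T20_beta5e-1", "sup_coda_prompt", "fedavg"),
    ("cub_T20_beta5e-1", "sup_coda_prompt_w_pmae", "pmae"),
]

for dataset, model, method in RUNS:
    subprocess.run(
        ["python", "src/main_fcl.py",
         "--dataset", dataset, "--model", model,
         "--method", method, "--device", "0"],
        check=True,  # stop the queue if any run fails
    )
```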
Run the `results_processor.py` script after completing a specific experiment:

```bash
python results_processor.py --dataset [DATASET] --model [MODEL]
```

For example:

```bash
python results_processor.py --dataset cub_T20_beta5e-1 --model sup_pmae
python results_processor.py --dataset cub_T20_beta5e-1 --model sup_coda_prompt
python results_processor.py --dataset cub_T20_beta5e-1 --model sup_coda_prompt_w_pmae
```
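The same batching idea applies here; a hypothetical loop over the models above:

```python
# Hypothetical convenience loop (not part of the repo): process results for
# each completed experiment on cub_T20_beta5e-1.
import subprocess

for model in ("sup_pmae", "sup_coda_prompt", "sup_coda_prompt_w_pmae"):
    subprocess.run(
        ["python", "results_processor.py",
         "--dataset", "cub_T20_beta5e-1", "--model", model],
        check=True,  # raise if the script exits with a non-zero status
    )
```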
This repo is heavily based on LAMDA-PILOT, MarsFL, and mae. Many thanks to their authors.