[NeurIPS2024] Tune your restoration model with one 3090 GPU!

Parameter Efficient Adaptation for Image Restoration with Heterogeneous Mixture-of-Experts

Hang Guo, Tao Dai, Yuanchao Bai, Bin Chen, Xudong Ren, Zexuan Zhu, Shu-Tao Xia

Abstract: Designing single-task image restoration models for specific degradations has seen great success in recent years. To achieve generalized image restoration, all-in-one methods have recently been proposed and shown potential for handling multiple restoration tasks with a single model. Despite the promising results, the existing all-in-one paradigm still suffers from high computational costs as well as limited generalization to unseen degradations. In this work, we introduce an alternative solution to improve the generalization of image restoration models. Drawing inspiration from recent advancements in Parameter Efficient Transfer Learning (PETL), we aim to tune only a small number of parameters to adapt pre-trained restoration models to various tasks. However, current PETL methods fail to generalize across varied restoration tasks due to their homogeneous representation nature. To this end, we propose AdaptIR, a Mixture-of-Experts (MoE) with an orthogonal multi-branch design that captures local-spatial, global-spatial, and channel representation bases, followed by adaptive base combination to obtain heterogeneous representations for different degradations. Extensive experiments demonstrate that our AdaptIR achieves stable performance on single-degradation tasks and excels on hybrid-degradation tasks, fine-tuning only 0.6% of parameters for 8 hours.
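The adaptive base combination described above can be illustrated with a minimal, hypothetical sketch: outputs from three orthogonal branches (standing in for the local-spatial, global-spatial, and channel bases) are mixed with softmax gating weights, MoE-style. Scalars stand in for feature maps here, and the function name and shapes are illustrative only, not the paper's actual implementation.

```python
import math

def combine_branches(branch_outputs, gate_logits):
    """Mix branch outputs with softmax gating weights, as in a
    Mixture-of-Experts. Scalars stand in for feature tensors."""
    # numerically stable softmax over the gating logits
    m = max(gate_logits)
    exps = [math.exp(g - m) for g in gate_logits]
    z = sum(exps)
    weights = [e / z for e in exps]
    # weighted sum of the (here: local, global, channel) branch outputs
    return sum(w * o for w, o in zip(weights, branch_outputs))
```

With equal logits each branch contributes equally; a strongly dominant logit lets one branch's representation dominate, which is how a degradation-dependent gate can produce heterogeneous representations per task.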

⭐If this work is helpful for you, please help star this repo. Thanks!🤗

📑 Contents

  • 🆕 News
  • ☑️ TODO
  • 🥇 Results
  • Datasets & Models Preparation
  • 🥰 Citation
  • License
  • Acknowledgement
  • Contact

🆕 News

  • 2023-12-12: arXiv paper available.
  • 2023-12-16: This repo is released.
  • 2024-09-28: 😊 Our AdaptIR was accepted by NeurIPS 2024!
  • 2024-10-19: 🔈 The code is available now, enjoy yourself!

☑️ TODO

  • arXiv version
  • Release code
  • More detailed introductions of README file
  • Further improvements

🥇 Results

We achieve state-of-the-art adaptation performance on various downstream image restoration tasks. Detailed results can be found in the paper.

  • Evaluation on Second-order Degradation (LR4 & Noise30)
  • Evaluation on Classic SR
  • Evaluation on Denoise & DerainL
  • Evaluation on Heavy Rain Streak Removal
  • Evaluation on Low-light Image Enhancement
  • Evaluation on Model Scalability

Datasets & Models Preparation

Datasets

Since this work involves various restoration tasks, you may collect the training and testing datasets you need from existing repos, such as BasicSR, Restormer, and PromptIR.

Pre-trained weights

  • IPT pre-trained models: download IPT_pretrain via the link in the IPT repo.

  • EDT pre-trained models: download SRx2x3x4_EDTB_ImageNet200K.pth via the link in the EDT repo.

🥰 Citation

Please cite us if our work is useful for your research.

@article{guo2023adaptir,
  title={AdaptIR: Parameter Efficient Multi-task Adaptation for Pre-trained Image Restoration Models},
  author={Guo, Hang and Dai, Tao and Bai, Yuanchao and Chen, Bin and Xia, Shu-Tao and Zhu, Zexuan},
  journal={arXiv preprint arXiv:2312.08881},
  year={2023}
}

License

This project is released under the Apache 2.0 license.

Acknowledgement

This code is based on AirNet, IPT, and EDT. Thanks for their awesome work.

Contact

If you have any questions, feel free to reach me at cshguo@gmail.com.
