This repository is for the RDUNet model proposed in the following paper:
Javier Gurrola-Ramos, Oscar Dalmau and Teresa E. Alarcón, "A Residual Dense U-Net Neural Network for Image Denoising", IEEE Access, vol. 9, pp. 31742-31754, 2021, doi: 10.1109/ACCESS.2021.3061062.
If you use this work in your research, please cite our paper:
@article{gurrola2021residual,
title={A Residual Dense U-Net Neural Network for Image Denoising},
author={Gurrola-Ramos, Javier and Dalmau, Oscar and Alarcón, Teresa E},
journal={IEEE Access},
volume={9},
pages={31742--31754},
year={2021},
publisher={IEEE},
doi={10.1109/ACCESS.2021.3061062}
}
Link to download the pretrained models.
- Python 3.6
- PyTorch 1.5.1
- pytorch-msssim 0.2.0
- ptflops 0.6.3
- tqdm 4.48.2
- scikit-image 0.17.2
- yaml 0.2.5
- MATLAB (to create testing datasets)
For training, we used the DIV2K dataset. Download the dataset and place the high-resolution image folders in the './Dataset' folder. You can modify the train_files.txt and val_files.txt files to load only part of the dataset.
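Training crops random patches from the high-resolution images (64 × 64 by default). The helper below is a minimal sketch of that patch extraction; the function name and NumPy-based cropping are illustrative assumptions, not the repository's actual data loader:

```python
import random

import numpy as np

def extract_patches(image, patch_size=64, n_patches=4, rng=None):
    """Randomly crop `n_patches` square patches from an H x W x C image.

    Hypothetical helper; the repository's data pipeline may differ.
    """
    rng = rng or random.Random(0)
    h, w = image.shape[:2]
    patches = []
    for _ in range(n_patches):
        top = rng.randint(0, h - patch_size)
        left = rng.randint(0, w - patch_size)
        patches.append(image[top:top + patch_size, left:left + patch_size])
    return patches
```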
Default parameters used in the paper are set in the config.yaml file:
patch size: 64
batch size: 16
learning rate: 1.e-4
weight decay: 1.e-5
scheduler gamma: 0.5
scheduler step: 3
epochs: 21
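With these defaults, a step scheduler with gamma 0.5 and step 3 halves the learning rate every three epochs. A small sketch of the resulting schedule, assuming PyTorch StepLR semantics:

```python
def lr_at_epoch(epoch, base_lr=1e-4, gamma=0.5, step=3):
    """Learning rate at a given epoch under StepLR-style decay."""
    return base_lr * gamma ** (epoch // step)

# Over 21 epochs the rate decays from 1e-4 at epoch 0
# to 1e-4 * 0.5**6 at epoch 20.
```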
Additionally, you can choose the device, the number of workers of the data loader, and enable multiple GPU use.
To train the model use the following command:
python main_train.py
Place the pretrained models in the './Pretrained' folder. Modify the config.yaml file according to the model you want to use: set model channels: 3 for the color model or model channels: 1 for the grayscale model.
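For illustration, this setting can be read with PyYAML (the key name comes from the config snippet above; the loading code itself is a sketch, not the repository's):

```python
import yaml

# Minimal stand-in for the relevant line of config.yaml.
config_text = """
model channels: 3
"""

config = yaml.safe_load(config_text)
channels = config["model channels"]  # 3 for the color model, 1 for grayscale
```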
Test datasets need to be prepared using the MATLAB codes in the './Datasets' folder according to the desired noise level. To test the RDUNet model, we use the Set12, CBSD68, Kodak24, and Urban100 datasets.
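The MATLAB scripts corrupt the clean images with additive Gaussian noise at the chosen level. An equivalent operation in Python would look like the sketch below (a hypothetical helper, not the repository's MATLAB code):

```python
import numpy as np

def add_gaussian_noise(image, sigma, seed=0):
    """Corrupt a uint8 image with zero-mean Gaussian noise of standard
    deviation `sigma` on the 0-255 scale, clipping back to the valid range."""
    rng = np.random.default_rng(seed)
    noisy = image.astype(np.float64) + rng.normal(0.0, sigma, size=image.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)
```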
To test the model use the following command:
python main_test.py
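Denoising quality is typically reported as PSNR. A minimal NumPy implementation, equivalent in spirit to scikit-image's `peak_signal_noise_ratio`, is:

```python
import numpy as np

def psnr(reference, test, data_range=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)
```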
Results reported in the paper.
If you have any questions about the code or the paper, please contact francisco.gurrola@cimat.mx .