An implementation of SCGAN in Torch. A TensorFlow version of this project is also available.
Original paper: https://arxiv.org/abs/2210.07594
*Example results: hazy input image vs. dehazed output (example images omitted here).*
- Linux or OSX
- NVIDIA GPU + CUDA CuDNN (CPU mode and CUDA without CuDNN may work with minimal modification, but untested)
- For macOS users, you need the Linux/GNU commands `gfind` and `gwc`, which can be installed with `brew install findutils coreutils` (a quick check is sketched after this list).
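
On macOS you can confirm the GNU tools are on your PATH before running the scripts; this is just a convenience check under the assumptions above, not part of the original setup:

```bash
# Convenience check (assumption): confirm the GNU tools from findutils/coreutils are available.
for cmd in gfind gwc; do
  command -v "$cmd" >/dev/null 2>&1 || echo "Missing $cmd: run 'brew install findutils coreutils'"
done
```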
- Install torch and dependencies from https://github.com/torch/distro
- Install the Torch packages `nngraph`, `class`, and `display` (a quick load check is sketched right after this list):
```bash
luarocks install nngraph
luarocks install class
luarocks install https://raw.githubusercontent.com/szym/display/master/display-scm-0.rockspec
```
- Clone this repo:
```bash
git clone https://github.com/junyanz/CycleGAN
cd CycleGAN
```
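
As a quick sanity check (an assumption, not part of the original instructions), you can verify that each package installed above loads inside `th` by running a tiny throwaway script:

```bash
# Sanity-check sketch (assumption): try to require each package installed above and report the result.
cat > /tmp/check_packages.lua <<'EOF'
for _, pkg in ipairs({'nngraph', 'class', 'display'}) do
  local ok, err = pcall(require, pkg)
  print(pkg, ok and 'OK' or ('FAILED: ' .. tostring(err)))
end
EOF
th /tmp/check_packages.lua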
First, download the dataset
- Unpaired dataset: we built this dataset ourselves; it consists entirely of real hazy images collected from the web (see the folder-layout sketch after this list).
  - 10000 images: Address: Baidu cloud disk, extraction code: zvh6
  - 1000 images: Address: Baidu cloud disk, extraction code: 47v9
- Paired dataset: we added the haze ourselves, based on the image depth.
  - Address: Baidu cloud disk, extraction code: 63xf
- Now generate dehazed images with the pre-trained model:
```bash
DATA_ROOT=./datasets/test name=dehaze_pretrained model=one_direction_test phase=test loadSize=256 fineSize=256 resize_or_crop="scale_width" th test.lua
```
The test results will be saved to `./results/dehaze_pretrained/latest_test/index.html`.
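
The data loader follows the CycleGAN convention of unpaired image folders. A minimal sketch of the assumed layout (the `trainA`/`trainB`/`testA`/`testB` names are an assumption carried over from CycleGAN; adjust them if the downloaded archives are organized differently):

```bash
# Assumed CycleGAN-style layout: trainA/testA hold hazy images, trainB/testB hold clear images.
mkdir -p ./datasets/haze2dehaze/{trainA,trainB,testA,testB}
ls ./datasets/haze2dehaze
```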
- Download a dataset (e.g. `haze2dehaze`):
```bash
bash ./datasets/download_dataset.sh haze2dehaze
```
- Train a model (see the option-style sketch after this list):
```bash
DATA_ROOT=./datasets/haze2dehaze name=haze2dehaze_model th train.lua
```
- (CPU only) The same training command can be run without a GPU or CuDNN. Setting the environment variables `gpu=0 cudnn=0` forces CPU-only mode:
```bash
DATA_ROOT=./datasets/haze2dehaze name=haze2dehaze_model gpu=0 cudnn=0 th train.lua
```
- (Optional) Start the display server to view results as the model trains (see the Display UI section below for more details):
```bash
th -ldisplay.start 8000 0.0.0.0
```
- Finally, test the model:
```bash
DATA_ROOT=./datasets/haze2dehaze name=haze2dehaze_model phase=test th test.lua
```
The test results will be saved to an HTML file here: `./results/haze2dehaze_model/latest_test/index.html`.
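
All of the scripts above are configured through environment variables rather than command-line flags. As a hedged sketch, the option names already shown in this README (`loadSize`, `fineSize`, `resize_or_crop`) can be combined on a training run in the same way; the particular values below are assumptions, not recommended settings:

```bash
# Sketch only: option names are taken from the commands in this README; the values are assumptions.
DATA_ROOT=./datasets/haze2dehaze name=haze2dehaze_model \
  loadSize=256 fineSize=256 resize_or_crop="scale_width" th train.lua
```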
Download the pre-trained models with the provided download script. The model will be saved to `./checkpoints/model_name/latest_net_G.t7`.
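
The downloaded generator is a standard Torch `.t7` checkpoint, so it can be inspected with `th`. A minimal sketch, assuming the packages from the installation step are available and using `model_name` as a placeholder for the actual model directory:

```bash
# Minimal sketch (assumption): load and print the downloaded generator network.
# Models trained with CuDNN may additionally need: require 'cudnn'
cat > /tmp/inspect_netG.lua <<'EOF'
require 'nn'
require 'nngraph'
local netG = torch.load('./checkpoints/model_name/latest_net_G.t7')
print(netG)
EOF
th /tmp/inspect_netG.lua
```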
Optionally, to display images during training and testing, use the `display` package.
- Install it with:
```bash
luarocks install https://raw.githubusercontent.com/szym/display/master/display-scm-0.rockspec
```
- Then start the server with:
```bash
th -ldisplay.start
```
- Open http://localhost:8000 in your browser. By default, the server listens only on localhost; pass `0.0.0.0` to allow external connections on any interface:
```bash
th -ldisplay.start 8000 0.0.0.0
```
Then open `http://(hostname):(port)/` in your browser to load the remote desktop.
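
With the server running, images can be pushed to the browser from Lua via the `display` package. A small sketch following the szym/display API (the exact call and option names are an assumption; check the package README if your version differs):

```bash
# Sketch (assumption): push a random 3x256x256 image tensor to the running display server.
cat > /tmp/display_test.lua <<'EOF'
local display = require 'display'
display.image(torch.rand(3, 256, 256), {title = 'display sanity check'})
EOF
th /tmp/display_test.lua
```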
If you use this code for your research, please cite our paper:
```
@article{zhang2022see,
  title={See Blue Sky: Deep Image Dehaze Using Paired and Unpaired Training Images},
  author={Zhang, Xiaoyan and Tang, Gaoyang and Zhu, Yingying and Tian, Qi},
  journal={arXiv preprint arXiv:2210.07594},
  year={2022}
}
```
Code borrows from CycleGAN.