Chufeng Xiao, Deng Yu, Xiaoguang Han, Youyi Zheng, Hongbo Fu
[Paper] [Project Page] [Dataset] [Supplemental Material] [Video]
Accepted by SIGGRAPH Asia 2021
Clone this repository and install the required dependencies (Anaconda recommended):
git clone https://github.com/chufengxiao/SketchHairSalon.git
cd SketchHairSalon
pip install -r requirements.txt
The main dependencies are listed below (an optional conda setup sketch follows the list):
- python == 3.6.12
- torch == 1.0.0
- numpy == 1.19.1
- scipy == 1.5.3
- mayavi == 4.7.2
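Since Anaconda is recommended, here is a minimal, optional sketch of setting up an isolated environment with the listed Python version before installing the requirements. The environment name `shs-env` is only a placeholder, not part of the repository:

# Hypothetical environment name "shs-env"; any name works
conda create -n shs-env python=3.6.12
conda activate shs-env
pip install -r requirements.txt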
Please download our pretrained models and put the `S2M`, `S2I_unbraid`, and `S2I_braid` folders into the `checkpoints` folder.
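As a reference for the expected layout, here is a small shell sketch of moving the downloaded folders into place, assuming they were extracted into the repository root (adjust the source paths to wherever you saved the download):

# Assuming S2M, S2I_unbraid, and S2I_braid were extracted into the current directory
mkdir -p checkpoints
mv S2M S2I_unbraid S2I_braid checkpoints/
# Resulting layout: checkpoints/S2M, checkpoints/S2I_unbraid, checkpoints/S2I_braid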
If you only want to quickly test some results using the pretrained models, you can directly run the commands below to test the samples in the `test_img` folder. The testing consists of two stages and produces the final results in the `results` folder.
## A full pipeline with S2M and S2I
python full_test.py unbraid # for unbraided hairstyles
python full_test.py braid # for braided hairstyles
## Sketch2Matte
python S2M_test.py unbraid # for unbraided hairstyles
python S2M_test.py braid # for braided hairstyles
## Sketch2Image
python S2I_test.py unbraid # for unbraided hairstyles
python S2I_test.py braid # for braided hairstyles
You can download our dataset and put it into the `dataset` folder for further training and testing. By downloading and using the dataset, you agree to the following terms:
- The dataset is available for non-commercial research purposes only.
- All images in the dataset were obtained from the Internet and are not our property. We are not responsible for the content or the meaning of these images.
- All mattes in the dataset are automatically generated from the images by CGA-matting.
- All sketches in the dataset were traced by users we hired. (Note that for hair sketches, each stroke is assigned a random color to distinguish it from the others; the strokes are color-coded to depict hair appearance during training and testing.)
- You agree not to reproduce, duplicate, copy, sell, trade, resell, or exploit for any commercial purpose any portion of the images or any portion of the derived data.
- We reserve the right to terminate your access to the SketchHairSalon dataset at any time.
You can fine-tune and test each stage of our networks (Sketch2Matte and Sketch2Image) using the scripts:
# Run one of the commands below for the corresponding purpose
## For fine-tuning (the checkpoints and result files produced during training will be saved in the 'checkpoints' folder):
sh scripts/train_S2M.sh # Sketch2Matte
sh scripts/train_S2I_unbraid.sh # Sketch2Image (unbraid)
sh scripts/train_S2I_braid.sh # Sketch2Image (braid)
## For testing (The results will be saved in the 'results' folder):
sh scripts/test_S2M.sh # Sketch2Matte
sh scripts/test_S2I_unbraid.sh # Sketch2Image (unbraid)
sh scripts/test_S2I_braid.sh # Sketch2Image (braid)
If you want to train the models from scratch, remove the `--continue_train` and `--epoch` options from the training scripts, for example:
python train.py --dataroot ./dataset/ --name S2M --model pix2pix --netG unet_at --dataset_mode matte --use_aug --batch_size 10 --save_epoch_freq 50 --epoch_count 1 --n_epochs 200 --n_epochs_decay 0 --display_freq 10 --save_latest_freq 40000 --print_freq 100 --no_flip --gpu_ids 0
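Conversely, since the code builds on pix2pix (see the acknowledgement below), fine-tuning from the pretrained models presumably just adds the pix2pix-style `--continue_train` and `--epoch` options back to the same command. The sketch below assumes the `latest` checkpoint label; check the scripts in the `scripts` folder for the flags actually used:

# Hedged fine-tuning variant of the Sketch2Matte command above;
# '--epoch latest' is an assumption -- see scripts/train_S2M.sh for the authoritative flags
python train.py --dataroot ./dataset/ --name S2M --model pix2pix --netG unet_at --dataset_mode matte --use_aug --batch_size 10 --save_epoch_freq 50 --epoch_count 1 --n_epochs 200 --n_epochs_decay 0 --display_freq 10 --save_latest_freq 40000 --print_freq 100 --no_flip --gpu_ids 0 --continue_train --epoch latest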
You can test the two auto-completion modules by running the commands below for unbraided and braided hairstyles, respectively:
cd autocompletion
python unbraid_completion.py
python braid_completion.py
This code is developed based on pix2pix and DANet.
@article{xiao2021sketchhairsalon,
title={SketchHairSalon: Deep Sketch-based Hair Image Synthesis},
author={Chufeng Xiao and Deng Yu and Xiaoguang Han and Youyi Zheng and Hongbo Fu},
journal = {ACM Transactions on Graphics (Proceedings of ACM SIGGRAPH Asia 2021)},
volume={40},
number={6},
pages={1--16},
year={2021},
publisher={ACM New York, NY, USA}
}