This repository provides the code to reproduce the results presented in "In the light of feature distributions: Moment matching for Neural Style Transfer".
For more information, please see the project website, and make sure to check out our Medium blog post here.
If you have any questions, please let me know.
Running neural style transfer with Central Moment Discrepancy is as easy as running:
python main.py --c_img ./path/to/content.jpg --s_img ./path/to/style.jpg
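Conceptually, the style loss is the Central Moment Discrepancy (CMD) between the feature distributions of the stylized image and the style image. The snippet below is a minimal, illustrative PyTorch sketch of CMD for two feature samples; it is not the repository's implementation, and the moment order k_max and the assumption that features are scaled to [0, 1] are placeholders:

```python
import torch

def central_moment_discrepancy(x, y, k_max=5):
    """Illustrative CMD between two feature samples.

    x, y: tensors of shape (n_samples, n_features), assumed to lie in [0, 1]
    (e.g. after a sigmoid), so the usual 1/|b - a|**k normalisation factors are 1.
    """
    mean_x, mean_y = x.mean(dim=0), y.mean(dim=0)
    # First-order term: distance between the sample means.
    cmd = torch.norm(mean_x - mean_y, p=2)
    # Center the samples once, then compare central moments of order 2..k_max.
    cx, cy = x - mean_x, y - mean_y
    for k in range(2, k_max + 1):
        cmd = cmd + torch.norm((cx ** k).mean(dim=0) - (cy ** k).mean(dim=0), p=2)
    return cmd
```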
The following command line arguments can be adjusted to your needs (an example invocation follows the list):
--c_img The content image that is being stylized.
--s_img The style image.
--epsilon Iterative optimization is stopped once the delta of the moving-average loss falls below this value.
--max_iter Maximum number of iterations if epsilon is not reached.
--alpha Convex interpolation weight between style and content loss (should be set high, > 0.9, since we start with the content image as the target).
--lr Learning rate of the Adam optimizer.
--im_size Output image size. Can be either a single integer (keeps the aspect ratio) or a tuple.
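For illustration, a full invocation combining these flags might look as follows; the flag values shown here are placeholders, not the repository's defaults:
python main.py --c_img ./path/to/content.jpg --s_img ./path/to/style.jpg --max_iter 500 --epsilon 1e-4 --alpha 0.95 --lr 1.0 --im_size 512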
If you use this code, please cite our paper:
@article{kalischek2021light,
title={In the light of feature distributions: moment matching for Neural Style Transfer},
author={Nikolai Kalischek and Jan Dirk Wegner and Konrad Schindler},
year={2021},
eprint={2103.07208},
archivePrefix={arXiv},
primaryClass={cs.CV}
}