# Image Synthesis via Semantic Composition [Project Page]
by Yi Wang, Lu Qi, Ying-Cong Chen, Xiangyu Zhang, Jiaya Jia.
This repository provides the implementation of our semantic image synthesis method from the ICCV 2021 paper, "Image Synthesis via Semantic Composition".
## Installation

```bash
git clone https://github.com/dvlab-research/SCGAN.git
cd SCGAN/code
```

To use this code, please install PyTorch 1.0 and Python 3+. Other dependencies can be installed with

```bash
pip install -r requirements.txt
```
## Usage

Please refer to SPADE for detailed execution.
- Download the pretrained models, then put the folder containing the model weights in `./checkpoints`.
- Produce images with the pretrained models:

```bash
python test.py --gpu_ids 0,1,2,3 --dataset_mode [dataset] --config config/scgan_[dataset]_test.yml --fid --gt [gt_path] --visual_n 1
```
For example:

```bash
python test.py --gpu_ids 0,1,2,3 --dataset_mode celeba --config config/scgan_celeba-test.yml --fid --gt /data/datasets/celeba --visual_n 1
```
- Visual results are stored in `./results/scgan_[dataset]/` by default.
## Pretrained Models

| Dataset | Download link | Visual results |
| --- | --- | --- |
| CelebAMask-HQ | Baidu Disk (Code: face) | - |
| ADE20K | Baidu Disk (Code: n021) | Code: wu7b |
| COCO | Baidu Disk (Code: ss4b) | Code: i4dw |
## Training

Use `train.sh` to train new models, or specify training options in `config/[config_file].yml`.
## Method Notes

Our proposed dynamic computation units (spatial conditional convolution and normalization) extend conditionally parameterized convolutions [1]. We generalize the scalar condition into a spatial one and apply the same technique to normalization.
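For intuition, here is a minimal PyTorch sketch of these two units. It is an illustration of the idea only, not this repository's implementation: the class names, the softmax routing, the number of experts, and the choice of blending expert outputs per pixel (rather than mixing kernels before convolving) are all assumptions made for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialCondConv2d(nn.Module):
    """Sketch only. CondConv [1] mixes K expert kernels with per-example
    scalar routing weights; here the routing weights are predicted per
    pixel from a condition map, and the K expert outputs are blended
    spatially (an approximation of per-pixel kernel mixing)."""

    def __init__(self, in_ch, out_ch, cond_ch, num_experts=4, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        # K expert convolutions applied to the same input.
        self.experts = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size, padding=pad)
            for _ in range(num_experts)
        )
        # Router: a K-way soft assignment at every spatial location.
        self.router = nn.Conv2d(cond_ch, num_experts, kernel_size=1)

    def forward(self, x, cond):
        # cond: (B, cond_ch, H, W), e.g. features of the semantic layout.
        cond = F.interpolate(cond, size=x.shape[2:], mode='nearest')
        weights = torch.softmax(self.router(cond), dim=1)              # (B, K, H, W)
        outs = torch.stack([conv(x) for conv in self.experts], dim=1)  # (B, K, C, H, W)
        # Per-pixel mixture of the expert responses.
        return (weights.unsqueeze(2) * outs).sum(dim=1)                # (B, C, H, W)

class SpatialCondNorm(nn.Module):
    """Sketch only: normalize, then modulate with per-pixel scale and
    shift predicted from the condition map (SPADE-like)."""

    def __init__(self, num_ch, cond_ch):
        super().__init__()
        self.norm = nn.InstanceNorm2d(num_ch, affine=False)
        self.gamma = nn.Conv2d(cond_ch, num_ch, kernel_size=3, padding=1)
        self.beta = nn.Conv2d(cond_ch, num_ch, kernel_size=3, padding=1)

    def forward(self, x, cond):
        cond = F.interpolate(cond, size=x.shape[2:], mode='nearest')
        return self.norm(x) * (1 + self.gamma(cond)) + self.beta(cond)

# Toy usage with a hypothetical feature map and layout features.
x = torch.randn(2, 64, 32, 32)
seg_feat = torch.randn(2, 16, 32, 32)
y = SpatialCondConv2d(64, 64, cond_ch=16)(x, seg_feat)
y = SpatialCondNorm(64, cond_ch=16)(y, seg_feat)
print(y.shape)  # torch.Size([2, 64, 32, 32])
```

Blending the K expert outputs with per-pixel softmax weights is the spatial analogue of CondConv's per-example kernel mixing: the scalar routing weight becomes a weight map predicted from the condition.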
## Citation

If our research is useful for you, please consider citing:

```bibtex
@inproceedings{wang2021image,
  title={Image Synthesis via Semantic Composition},
  author={Wang, Yi and Qi, Lu and Chen, Ying-Cong and Zhang, Xiangyu and Jia, Jiaya},
  booktitle={ICCV},
  year={2021}
}
```
## Acknowledgements

This code is built upon SPADE, Imaginaire, and PyTorch-FID.
## Reference

[1] Brandon Yang, Gabriel Bender, Quoc V. Le, and Jiquan Ngiam. CondConv: Conditionally parameterized convolutions for efficient inference. In NeurIPS, 2019.
## Contact

Please send email to [email protected].