TOWARDS BUILDING A GROUP-BASED UNSUPERVISED REPRESENTATION DISENTANGLEMENT FRAMEWORK
Tao Yang, Xuanchi Ren, Yuwang Wang, Wenjun Zeng, Nanning Zheng, Pengju Ren
arXiv preprint arXiv:2102.10303
ICLR 2022
🔲 Release code
In this repo, building on the group-based definition of disentanglement and inspired by the n-th dihedral group, we first propose a theoretical framework towards achieving unsupervised representation disentanglement. We then propose a model, built on existing VAE-based methods, to tackle the unsupervised learning problem of the framework. The overview of our method is as follows:
- The Isomorphism Loss in our paper is implemented in "model/share.py" (see the illustrative sketch below).
- The model constraint in our paper is implemented in the function "forward" in "model/XXX_VAE.py".
Build a Docker environment using "./dockerfile".
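For intuition only, the snippet below is a minimal, hypothetical sketch of a cyclic group action on a latent dimension together with a commutativity-style consistency term. It is not the implementation in "model/share.py"; the names "cyclic_shift" and "commutativity_loss", the period, and the decoder-space comparison are all our own illustrative assumptions.

```python
# Hypothetical sketch (not the repo's actual Isomorphism Loss).
# A cyclic group action on one latent dimension is modeled as an additive
# shift that wraps around a fixed period; the consistency term checks that
# two such actions commute when compared in decoder (image) space.
import math
import torch

def cyclic_shift(z, dim, step, period=2 * math.pi):
    """Act on latent codes z by cyclically shifting dimension `dim`."""
    z = z.clone()
    z[:, dim] = torch.remainder(z[:, dim] + step, period)
    return z

def commutativity_loss(decoder, z, dim_a, dim_b, step=0.5):
    """Encourage the two actions to commute: decode(g_a g_b z) ~ decode(g_b g_a z)."""
    x_ab = decoder(cyclic_shift(cyclic_shift(z, dim_a, step), dim_b, step))
    x_ba = decoder(cyclic_shift(cyclic_shift(z, dim_b, step), dim_a, step))
    return torch.mean((x_ab - x_ba) ** 2)
```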
The VAE-based models:
The evaluation datasets:
Download them to "./dataset_folder/"
The Arguments
usage: main.py [-h] --config_num CONFIG_NUM [--eval]

optional arguments:
  -h, --help            show this help message and exit
  --config_num CONFIG_NUM
                        the number of settings of hyperparameters and random seeds
  --eval                eval model or not (default: False)
The hyperparameter and random seed settings are numbered in './config.csv'. Every row in './config.csv' corresponds to one setting.
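As an illustration only, a numbered setting could be read back from './config.csv' as sketched below; "load_setting" is a hypothetical helper, and the actual column names and row indexing used by main.py may differ.

```python
# Hypothetical helper for inspecting a numbered setting; the actual column
# names and row indexing used by main.py may differ.
import csv

def load_setting(config_num, path="./config.csv"):
    """Return the config.csv row corresponding to the given setting number."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return rows[config_num]

# e.g. print(load_setting(1000)) to see the hyperparameters and random seed.
```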
Train the model under the 1000th setting:
python main.py --config_num 1000
Evaluate the model we trained above:
python main.py --config_num 1000 --eval
Quantitative results on dSprites: BetaVAE score, DCI, MIG, and FactorVAE score (figures).

NOTE: Groupified VAEs achieve a better performance mean with lower variance.
Results of AnnealVAE on dSprites: C_max = 10 and C_max = 20, Original vs. Groupified (figures).
Results on Cars3D and Shapes3D: Original vs. Groupified (figures).
Cyclic latent space: Original vs. Groupified (figures).
AnnealVAE on dSprites: KL divergence of each dimension and latent traversal results (figures).
@inproceedings{Tao2022groupified,
  title     = {Towards Building A Group-based Unsupervised Representation Disentanglement Framework},
  author    = {Yang, Tao and Ren, Xuanchi and Wang, Yuwang and Zeng, Wenjun and Zheng, Nanning and Ren, Pengju},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year      = {2022}
}