
GIFS

This repository is the official implementation for GIFS introduced in the paper:

GIFS: Neural Implicit Function for General Shape Representation
Jianglong Ye, Yuntao Chen, Naiyan Wang, Xiaolong Wang
CVPR, 2022

Project Page / ArXiv / Video

Visualization

Multi-layer ball · Bus with seats inside

Table of Contents

Environment Setup
Demo
Data Preparation
Run
Citation
Acknowledgement

Environment Setup

PyTorch with CUDA support is required; please follow the official installation guide.

(Our code is tested with Python 3.9, torch 1.8.0, CUDA 11.1, and an RTX 3090.)

In addition, the following libraries are used in this project:

configargparse
trimesh
tqdm
numba
scipy # for data preparation
point_cloud_utils # for data preparation

wandb # (optional, for training)

Installation command:

pip install configargparse trimesh tqdm numba scipy point_cloud_utils wandb
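As a quick sanity check (this snippet is ours, not part of the repository), you can verify that PyTorch sees the GPU and that the libraries above import cleanly:

# Quick environment check (illustrative, not part of the repository).
import importlib
import torch

print(f"torch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")

# wandb is optional and only needed for training.
for name in ["configargparse", "trimesh", "tqdm", "numba", "scipy", "point_cloud_utils", "wandb"]:
    try:
        module = importlib.import_module(name)
        print(f"{name}: OK ({getattr(module, '__version__', 'unknown version')})")
    except ImportError as exc:
        print(f"{name}: MISSING ({exc})")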

Demo

Download our pretrained model from here and put it in the PROJECT_ROOT/experiments/demo/checkpoints directory. There is no need to unpack the tar file; the full path should be PROJECT_ROOT/experiments/demo/checkpoints/checkpoint_44h:0m:56s_158456.50156092644.tar. On some platforms, the colon (:) in the filename may be replaced by another symbol, such as a space ( ); please rename it back.
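If your platform did mangle the colons, a short helper like the sketch below can restore the expected filename (it only relies on the checkpoint path quoted above and will not work on filesystems that forbid colons):

# Restore the expected checkpoint filename (illustrative sketch).
from pathlib import Path

ckpt_dir = Path("experiments/demo/checkpoints")  # run from PROJECT_ROOT
expected = ckpt_dir / "checkpoint_44h:0m:56s_158456.50156092644.tar"

if expected.exists():
    print("Checkpoint filename is already correct.")
else:
    # Look for a single .tar file whose colons were replaced (e.g. by spaces).
    candidates = list(ckpt_dir.glob("checkpoint_*.tar"))
    if len(candidates) == 1:
        candidates[0].rename(expected)
        print(f"Renamed {candidates[0].name} -> {expected.name}")
    else:
        print("Could not identify the checkpoint automatically; please rename it by hand.")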

Run demo with the following command:

python generate.py --config configs/demo.txt

Some meshes will be generated in the PROJECT_ROOT/experiments/demo/evaluation/generation/ directory. You can inspect them in MeshLab. Note that we do not force the normals of neighbouring faces to point in the same direction, so you may need to enable the back face -> double option in MeshLab for better visualization.
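Since trimesh is already a dependency, you can also inspect the generated meshes programmatically. The sketch below assumes the outputs are standard mesh files (e.g. .obj/.off/.ply) in the generation directory; calling fix_normals() is optional and only useful if you want trimesh to make neighbouring face normals consistent:

# Inspect generated meshes with trimesh (illustrative sketch).
from pathlib import Path
import trimesh

gen_dir = Path("experiments/demo/evaluation/generation")  # run from PROJECT_ROOT

for mesh_path in sorted(gen_dir.rglob("*")):
    if mesh_path.suffix.lower() not in {".obj", ".off", ".ply"}:
        continue
    mesh = trimesh.load(mesh_path, force="mesh")
    mesh.fix_normals()  # optional: make neighbouring face normals consistent
    print(f"{mesh_path.name}: {len(mesh.vertices)} vertices, {len(mesh.faces)} faces")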

Data Preparation

To reproduce our experiments, please download the raw ShapeNetCore v1 dataset and unzip it into the PROJECT_ROOT/datasets/shapenet/data directory.
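Before preprocessing, it can be worth confirming that the dataset landed where the configs expect it. The snippet below is only an illustrative check; it assumes the usual ShapeNetCore v1 layout of one folder per synset, each containing one folder per model:

# Verify the extracted ShapeNet location (illustrative sketch).
from pathlib import Path

data_dir = Path("datasets/shapenet/data")  # run from PROJECT_ROOT
assert data_dir.is_dir(), f"Expected ShapeNetCore v1 at {data_dir}"

synsets = [p for p in data_dir.iterdir() if p.is_dir()]
n_models = sum(1 for s in synsets for m in s.iterdir() if m.is_dir())
print(f"Found {len(synsets)} synset folders containing {n_models} models.")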

Compile the Label Generation Code

Our GT label generation code depends on CGAL, which can be installed on Ubuntu with the following command (or by other means):

apt install libcgal-dev

Compile the label generation code with the following commands:

cd PROJECT_ROOT/dataprocessing/intersection_detection/
mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make

Alternatively, we provide a precompiled (statically linked) binary for Ubuntu, which can be downloaded from here. After downloading, put the binary into PROJECT_ROOT/dataprocessing/intersection_detection/build/ and make it executable:

chmod +x intersection

Data Processing

Run the following command to process the data:

export PYTHONPATH=.
python dataprocessing/create_split.py --config configs/shapenet_cars.txt
python dataprocessing/preprocess.py --config configs/shapenet_cars.txt

Run

We provide a pretrained model here. Please put it into the directory expected by the config file you are using.

Training

Run the following command to train the model:

python ddp_train.py --config configs/shapenet_cars.txt

Generation

Run the following command to generate meshes:

python generate.py --config configs/shapenet_cars.txt

Citation

@inproceedings{ye2022gifs,
  title={GIFS: Neural Implicit Function for General Shape Representation},
  author={Ye, Jianglong and Chen, Yuntao and Wang, Naiyan and Wang, Xiaolong},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2022}
}

Acknowledgement

The code is based on NDF; thanks for the great work!
