- Download the preprocessed DTU training data `dtu_training.rar`. Also download `Depth_raw.zip` if you would like to evaluate depth accuracy; otherwise no depth data is needed for training.
- Extract `Cameras/` and `Rectified/` from the downloaded `dtu_training.rar`, and optionally extract `Depths/` from `Depth_raw.zip`. Link the folders to `DTU/`, which should have the following structure:
```
DTU
├── Cameras
└── Rectified
```
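The extract-and-link steps above can be sketched as a shell snippet. The archive locations are assumptions (the current directory); adjust the paths to wherever you downloaded the files:

```shell
# Sketch only -- assumes dtu_training.rar and Depth_raw.zip sit in the current directory.
mkdir -p DTU
if [ -f dtu_training.rar ]; then
    unrar x dtu_training.rar        # extracts Cameras/ and Rectified/
fi
if [ -f Depth_raw.zip ]; then
    unzip -q Depth_raw.zip          # optional: extracts Depths/ for depth evaluation
fi
# Link the extracted folders into DTU/ (absolute paths keep the symlinks valid
# regardless of where training is launched from).
for d in Cameras Rectified Depths; do
    if [ -d "$d" ]; then
        ln -sfn "$(pwd)/$d" "DTU/$d"
    fi
done
```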
- Please refer to RegNeRF for instructions on downloading the DTU dataset.
- The folder structure:
```
DTURaw/
├── Calibration
├── idrmasks
└── Rectified
```
- Please refer to AttnRend for instructions on downloading the RealEstate10K dataset.
- The folder structure (`data_download` contains video frames, and `RealEstate10K` contains camera poses):
```
realestate_full
├── data_download
│   ├── test
│   └── train
└── RealEstate10K
    ├── test
    └── train
```
- The full RealEstate10K dataset is very large and can be challenging to download; we therefore use a subset provided by AttnRend for the ablation experiments in our paper.
- The folder structure of the subset:
```
realestate_subset
├── data_download
│   └── realestate
│       ├── test
│       └── train
└── poses
    └── realestate
        ├── test.mat
        └── train.mat
```
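A quick way to confirm the folders were arranged correctly is a small path check. The helper below is a sketch (its name is ours, not part of the codebase); the expected entries are copied from the subset tree above:

```python
from pathlib import Path

def missing_entries(root, required):
    """Return the required paths (relative to root) that do not exist yet."""
    root = Path(root)
    return [p for p in required if not (root / p).exists()]

# Expected layout of the RealEstate10K subset, copied from the tree above.
SUBSET_LAYOUT = [
    "data_download/realestate/test",
    "data_download/realestate/train",
    "poses/realestate/test.mat",
    "poses/realestate/train.mat",
]

if __name__ == "__main__":
    for p in missing_entries("realestate_subset", SUBSET_LAYOUT):
        print("missing:", p)
```

The same helper works for the other datasets on this page by swapping in their root folder and expected entries.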
- Please refer to IBRNet for instructions on downloading the mixed training datasets.
- Download the LLFF test data with:
```shell
gdown https://drive.google.com/uc?id=16VnMcF1KJYxN9QId6TClMsZRahHNMW5g
unzip nerf_llff_data.zip
```
- The folder structure:
```
mixdata1/
├── google_scanned_objects
├── ibrnet_collected_1
├── ibrnet_collected_2
├── nerf_llff_data
├── nerf_synthetic
├── RealEstate10K-subset
├── real_iconic_noface
└── spaces_dataset
```
- Download the dataset with:
```shell
wget http://storage.googleapis.com/gresearch/refraw360/360_v2.zip
```
- The folder structure:
```
mipnerf360/
├── bicycle
├── bonsai
├── counter
├── flowers.txt
├── garden
├── kitchen
├── room
├── stump
└── treehill.txt
```
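Note that two of the entries above are plain `.txt` files rather than scene folders. A small sketch for checking which scenes actually unpacked as folders (the helper name is ours, not from the codebase):

```python
from pathlib import Path

def list_scenes(root):
    """Split a dataset root into scene folders and placeholder .txt entries."""
    root = Path(root)
    scenes = sorted(d.name for d in root.iterdir() if d.is_dir())
    placeholders = sorted(f.stem for f in root.glob("*.txt"))
    return scenes, placeholders

if __name__ == "__main__":
    scenes, placeholders = list_scenes("mipnerf360")
    print("scene folders:", scenes)
    print("placeholders :", placeholders)
```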