- Download the beta 23 version of matconvnet
- Unpack the downloaded file into a directory of your choosing; call the path to this directory
- Compile matconvnet with GPU support by following the instructions on the original matconvnet website (see the sketch below)
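As a rough illustration, compiling with GPU support from inside MATLAB typically looks like the following; the 'cudaRoot' path is an assumption and should point at your own CUDA installation.

    % Minimal sketch: compile matconvnet with GPU support.
    % Run from the matconvnet root directory; cudaRoot is an example path.
    addpath matlab
    vl_compilenn('enableGpu', true, 'cudaRoot', '/usr/local/cuda') ;
    run matlab/vl_setupnn ;  % add matconvnet to the MATLAB path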
Vis-Nir:
- Download the original dataset
- Extract the dataset to 'datasets/Vis-Nir/data'
- Using Matlab, run the 'CreateTrainingData' and 'CreateTestData' scripts inside the 'datasets/Vis-Nir' directory (see the sketch below)
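For example, from the repository root, a sketch of the Vis-Nir preparation:

    % Sketch: prepare the Vis-Nir keypoint data from the repository root.
    cd datasets/Vis-Nir
    CreateTrainingData
    CreateTestData
    cd ../..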
Vis-Nir_grid:
- After downloading the original Vis-Nir dataset for the keypoint configuration (see above), run 'CreateTrainingDataWithLabels' inside the 'datasets/Vis-Nir_grid' directory (see the sketch below)
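A sketch of this step, assuming the same repository-root layout as above:

    % Sketch: build the grid-configuration data (reuses the Vis-Nir download).
    cd datasets/Vis-Nir_grid
    CreateTrainingDataWithLabels
    cd ../..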
vedai:
- Download the original dataset; only the two 512x512 parts are needed
- Extract the dataset to 'datasets/vedai/data'
- Some irrelevant images have polygons overlaid on the original images; delete them
- Run 'CreateTrainingDataWithLabels' inside the 'datasets/vedai' directory (see the sketch below)
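A sketch of the vedai preparation; it assumes the overlaid images were already deleted by hand:

    % Sketch: prepare the vedai data (run after removing the overlaid images).
    cd datasets/vedai
    CreateTrainingDataWithLabels
    cd ../..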
cuhk:
- Download the original dataset; only the cuhk dataset is needed, using the cropped sketches and cropped photos from both the training and test splits (188 images overall)
- Extract all the images and sketches to 'datasets/cuhk/data'
- Run 'CreateTrainingDataWithLabels' inside the 'datasets/cuhk' directory (see the sketch below)
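A sketch of the cuhk preparation step:

    % Sketch: prepare the cuhk sketch-photo data.
    cd datasets/cuhk
    CreateTrainingDataWithLabels
    cd ../..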
We supply CSV files referencing all train and test pairs for research on platforms other than Matlab.
To train any of the models specified in the paper, edit the 'TrainModels.m' script as you wish.
To replicate the training described in the paper, use the following settings (sketched below):
- Use the default learning parameters
- Train for 40 epochs in the L2 configuration and 100 in the softmax configuration
- Use a learning rate of 0.01 and a weight decay of 0.0005
- A hard-mining factor of 0.8 is optimal
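As an illustration only, these settings might be expressed along the following lines; the option names below are hypothetical and must be mapped to the actual variables used in 'TrainModels.m'.

    % Hypothetical option names -- map them to the actual variables in
    % TrainModels.m; the values are the ones quoted above.
    opts.numEpochs        = 100 ;    % 40 for the L2 configuration
    opts.learningRate     = 0.01 ;
    opts.weightDecay      = 0.0005 ;
    opts.hardMiningFactor = 0.8 ;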
We supply our trained models and results for evaluation. The evaluation code is inside the 'Eval' directory, which contains two scripts (see the sketch after this list):
- EvaluateFar_Grid - a general evaluation script for all the grid-like datasets, i.e. cuhk, vedai and Vis-Nir_grid
- EvaluateFar_Vis_Nir - a dedicated script for the Vis-Nir dataset in its keypoint setup
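A sketch of running the evaluation from the repository root:

    % Sketch: run the supplied evaluation scripts.
    cd Eval
    EvaluateFar_Grid      % grid-like datasets: cuhk, vedai, Vis-Nir_grid
    EvaluateFar_Vis_Nir   % Vis-Nir dataset in its keypoint setup
    cd ..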
Both the training and the evaluation code require a GPU and the GPU-compiled matconvnet described above.