Test-Time Adaptation for Keypoint-Based Spacecraft Pose Estimation Based on Predicted-View Synthesis
Official PyTorch implementation of "Test-Time Adaptation for Keypoint-Based Spacecraft Pose Estimation Based on Predicted-View Synthesis" by Juan Ignacio Bravo Pérez-Villar, Álvaro García-Martín, Jesús Bescós, and Juan C. SanMiguel (IEEE Transactions on Aerospace and Electronic Systems).
This work proposes a test-time adaptation approach that leverages the temporal redundancy between images acquired during close proximity operations. Our approach involves extracting features from sequential spacecraft images, estimating their poses, and then using this information to synthesise a reconstructed view. We establish a self-supervised learning objective by comparing the synthesised view with the actual one. During training, we supervise both pose estimation and image synthesis, while at test-time, we optimise the self-supervised objective. Additionally, we introduce a regularisation loss to prevent solutions that are not consistent with the keypoint structure of the spacecraft.
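The adaptation loop can be summarised as in the sketch below. This is a minimal sketch only: the module names (encoder, decoder, keypoint_regulariser), the loss weight and the learning rate are placeholders, not this repository's actual interfaces, which are defined in the training and test scripts.

import torch
import torch.nn.functional as F

def adapt_on_sequence(encoder, decoder, sequence_loader, keypoint_regulariser,
                      lambda_reg=0.1, lr=1e-5):
    """Test-time adaptation sketch: optimise the self-supervised
    view-synthesis objective over consecutive frames of a test sequence."""
    params = list(encoder.parameters()) + list(decoder.parameters())
    optimizer = torch.optim.Adam(params, lr=lr)
    for prev_img, curr_img in sequence_loader:                 # consecutive frames t-1, t
        feats_prev, pose_prev = encoder(prev_img)              # features + pose of frame t-1
        feats_curr, pose_curr = encoder(curr_img)              # features + pose of frame t
        synth_curr = decoder(feats_prev, pose_prev, pose_curr) # synthesise the view at frame t
        loss_photo = F.l1_loss(synth_curr, curr_img)           # self-supervised synthesis loss
        loss_reg = keypoint_regulariser(feats_curr, pose_curr) # keypoint-structure regularisation
        loss = loss_photo + lambda_reg * loss_reg
        optimizer.zero_grad()
        loss.backward()                                        # adapt the network at test time
        optimizer.step()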
This section contains the instructions to execute the code. The repository has been tested on a system with:
- Ubuntu 18.04
- CUDA 11.2
- Conda 4.8.3
To clone the repository, type in your terminal:
git clone https://github.com/JotaBravo/spacecraft-tta.git
After installing conda, go to the spacecraft-tta folder and type in your terminal:
conda env create -f env.yml
conda activate spacecraft-tta
This work employs the SHIRT dataset. Place the contents of the dataset under the dataset folder. The folder should look like this:
dataset
    roe1
        synthetic
        lightbox
    roe2
        synthetic
        lightbox
    camera.json
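Before running anything, you can optionally sanity-check the layout (a small sketch assuming the paths listed above, relative to the repository root):

import os

expected = [
    "dataset/roe1/synthetic",
    "dataset/roe1/lightbox",
    "dataset/roe2/synthetic",
    "dataset/roe2/lightbox",
    "dataset/camera.json",
]
for path in expected:
    assert os.path.exists(path), f"Missing: {path}"
print("Dataset layout looks correct.")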
Download our keypoints and copy them under the roe1 and roe2 folders.
dataset/roe1/kpts.mat
dataset/roe2/kpts.mat
NOTE: to reproduce our results, the keypoints provided by us are needed. However, we encourage using the keypoints from the SPEED+ team, which can be found at: https://github.com/tpark94/speedplusbaseline
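For reference, the keypoint files are standard MATLAB .mat files and can be inspected with SciPy (a minimal sketch; the name of the variable stored inside the file is not assumed here, so print the keys to find it):

from scipy.io import loadmat

kpts = loadmat("dataset/roe1/kpts.mat")
print(kpts.keys())   # lists the variable names stored in the file
# The array stored under the keypoint variable holds the 3D keypoint
# coordinates of the spacecraft model used to generate the heatmaps.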
You can download our heatmaps from our Google Drive. Extract them and place them under their corresponding folders:
dataset/roe1/synthetic/kptsmap
dataset/roe2/synthetic/kptsmap
Or simply run:
python generate_heatmaps.py
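For context, keypoint heatmaps of this kind are typically rendered as one Gaussian per projected keypoint. The sketch below illustrates the general technique only; the image size, sigma and keypoint values are illustrative, not the exact settings used by generate_heatmaps.py:

import numpy as np

def gaussian_heatmap(height, width, center, sigma=2.0):
    """Render a 2D Gaussian centred on a projected keypoint (x, y) in pixels."""
    ys, xs = np.mgrid[0:height, 0:width]
    return np.exp(-((xs - center[0]) ** 2 + (ys - center[1]) ** 2) / (2.0 * sigma ** 2))

# One channel per keypoint, stacked into an array of shape (num_keypoints, H, W)
keypoints_2d = [(120.5, 80.0), (200.0, 150.3)]   # example projected keypoints
heatmaps = np.stack([gaussian_heatmap(240, 240, kp) for kp in keypoints_2d])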
You can download the checkpoints from this Google Drive folder.
To run the experiments, execute:
python test.py <config_path> <encoder_path> <decoder_path>
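For example (the file names below are illustrative; use the config file and checkpoint paths that match your setup):

python test.py configs/roe1_synthetic.json weights/encoder.pth weights/decoder.pth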
To train from scratch, execute:
source scripts/launch_training.sh
Optionally (recommended), you can download our SPEED+ pretrained weights for the pretraining stage and place them under the weights folder. Otherwise, you can comment out the corresponding lines in pretrain.py.
If you find our work or code useful, please cite:
@article{perez2024test,
title={Test-Time Adaptation for Keypoint-Based Spacecraft Pose Estimation Based on Predicted-View Synthesis},
author={P{\'e}rez-Villar, Juan Ignacio Bravo and Garc{\'\i}a-Mart{\'\i}n, {\'A}lvaro and Besc{\'o}s, Jes{\'u}s and SanMiguel, Juan C},
journal={IEEE Transactions on Aerospace and Electronic Systems},
year={2024},
publisher={IEEE}
}
This work is supported by Comunidad Autónoma de Madrid (Spain) under Grant IND2020/TIC-17515.