OPSTL: Unveiling the Hidden Realm: Self-supervised Skeleton-based Action Recognition in Occluded Environments

🛠️ 👷 🚀

🔥 We will release code in the future. 🔥

(Figure: occluded skeleton sequences (left) vs. imputed skeleton sequences (right))

Update

  • 2023.09.14: Initialized repository.

TODO List

  • Code release.

Abstract

To integrate action recognition methods into autonomous robotic systems, it is crucial to consider adverse situations involving target occlusions. Despite its practical relevance, this scenario is rarely addressed in existing self-supervised skeleton-based action recognition methods. To empower robots with the capacity to handle occlusion, we propose a simple and effective method. We first pre-train on occluded skeleton sequences, then apply k-means clustering (KMeans) to the sequence embeddings to group semantically similar samples. Next, we use K-nearest neighbors (KNN) to fill in missing skeleton data based on the closest sample neighbors. Imputing incomplete skeleton sequences into relatively complete ones before feeding them to existing skeleton-based self-supervised models yields significant benefits. Meanwhile, building on the state-of-the-art Partial Spatio-Temporal Learning (PSTL), we introduce an Occluded Partial Spatio-Temporal Learning (OPSTL) framework. This enhancement employs Adaptive Spatial Masking (ASM) to make better use of high-quality, intact skeletons. The effectiveness of our imputation method is verified on the challenging occluded versions of the NTURGB+D 60 and NTURGB+D 120 datasets.
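Until the code is released, the cluster-then-impute step described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes per-sequence embeddings from a pre-trained encoder, zeroed-out occluded joints, and a boolean observation mask, and it fills each sample's missing joints from its nearest same-cluster neighbor in embedding space (all names and shapes are hypothetical).

```python
# Hypothetical sketch of the KMeans + nearest-neighbor imputation pipeline.
# Assumed inputs (not from the paper's released code):
#   embeddings: (N, D)       per-sequence features from a pre-trained encoder
#   sequences:  (N, T, J, C) skeleton data, occluded joints zeroed out
#   masks:      (N, T, J)    boolean, True where a joint was observed
import numpy as np
from sklearn.cluster import KMeans

def impute_skeletons(embeddings, sequences, masks, n_clusters=3):
    # Group semantically similar sequences by clustering their embeddings.
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(embeddings)
    imputed = sequences.copy()
    for i in range(len(sequences)):
        # Candidate donors: other samples assigned to the same cluster.
        peers = np.where(labels == labels[i])[0]
        peers = peers[peers != i]
        if len(peers) == 0:
            continue  # no neighbor available; leave the sample as-is
        # Nearest neighbor among the same-cluster peers in embedding space.
        dists = np.linalg.norm(embeddings[peers] - embeddings[i], axis=1)
        nn = peers[np.argmin(dists)]
        # Copy the neighbor's joints into the occluded positions only.
        missing = ~masks[i]
        imputed[i][missing] = sequences[nn][missing]
    return imputed
```

Restricting the donor search to the sample's own cluster keeps the borrowed joints semantically consistent with the occluded action, which is the motivation the abstract gives for clustering before imputation.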

Method

(Overview)



Acknowledgement

  • The framework of our code is based on PSTL.
  • The encoder is based on ST-GCN.

Contact

Feel free to contact me if you have any questions. Please send an email to [email protected]