
EgoPose

[EgoPose demo GIF]

This repo contains the official implementation of our paper:

Ego-Pose Estimation and Forecasting as Real-Time PD Control
Ye Yuan, Kris Kitani
ICCV 2019
[website] [paper] [video]

Installation

Dataset

  • Download the dataset from Google Drive in the form of a single zip or split zips (or BaiduYun link, password: ynui) and place the unzipped dataset folder inside the repo as "EgoPose/datasets". Please see the README.txt inside the folder for details about the dataset.

Environment

  • Supported OS: macOS, Linux
  • Packages:
  • Additional setup:
    • On Linux, set the following environment variable to greatly improve multi-threaded sampling performance (see the sketch after this list):
      export OMP_NUM_THREADS=1
  • Note: All scripts should be run from the root of this repo.
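
A minimal sketch of setting the same variable from Python instead of the shell, assuming it is done at the very top of the entry script before any sampling workers are spawned (the file name is illustrative, not part of the repo):

    # set_omp.py -- hypothetical helper; the repo itself expects the shell export
    import os

    # OMP_NUM_THREADS must be set before libraries that read it (e.g. NumPy, PyTorch)
    # initialize their thread pools, so do this before any other imports.
    os.environ.setdefault("OMP_NUM_THREADS", "1")

    import torch  # imported only after the environment variable is in place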

Pretrained Models

  • Download our pretrained models from this link (or BaiduYun link, password: kieq) and place the unzipped results folder inside the repo as "EgoPose/results".
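
Before running the demos below, a quick sanity check can confirm that the dataset and pretrained models were unpacked to the expected locations. This is a hypothetical helper, not part of the repo; the folder names come from the instructions above:

    # check_setup.py -- hypothetical sanity check, run from the repo root
    from pathlib import Path

    # "datasets" and "results" are the folder names given in the setup steps above
    for folder in ("datasets", "results"):
        status = "found" if Path(folder).is_dir() else "MISSING"
        print(f"{folder}: {status}")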

Quick Demo

Ego-Pose Estimation

  • To visualize the results for MoCap data:
    python ego_pose/eval_pose.py --egomimic-cfg subject_03 --statereg-cfg subject_03 --mode vis
    Here we use the config file for subject_03. Note that in the visualization, the red humanoid represents the ground truth (GT). A sketch for batch-running this evaluation over several subjects follows this list.

  • To visualize the results for in-the-wild data:
    python ego_pose/eval_pose_wild.py --egomimic-cfg cross_01 --statereg-cfg cross_01 --data wild_01 --mode vis
    Here we use the config file for the cross-subject model (cross_01) and test it on in-the-wild data (wild_01).

  • Keyboard shortcuts for the visualizer: keymap.md
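
To run the same evaluation over several subjects in one go, a small wrapper can loop over config names. This is a sketch that assumes the other subject configs follow the subject_XX naming of subject_03; check the names against the downloaded results folder before use:

    # batch_eval_pose.py -- hypothetical wrapper, run from the repo root
    import subprocess

    # config names are assumptions; adjust to the configs actually present
    subjects = ["subject_01", "subject_02", "subject_03"]

    for cfg in subjects:
        cmd = [
            "python", "ego_pose/eval_pose.py",
            "--egomimic-cfg", cfg,
            "--statereg-cfg", cfg,
            "--mode", "vis",
        ]
        print("Running:", " ".join(cmd))
        subprocess.run(cmd, check=True)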

Ego-Pose Forecasting

  • To visualize the results for MoCap data:
    python ego_pose/eval_forecast.py --egoforecast-cfg subject_03 --mode vis

  • To visualize the results for in-the-wild data:
    python ego_pose/eval_forecast_wild.py --egoforecast-cfg cross_01 --data wild_01 --mode vis

Training and Testing

  • If you are interested in training and testing with our code, please see train_and_test.md.

Citation

If you find our work useful in your research, please consider citing our paper Ego-Pose Estimation and Forecasting as Real-Time PD Control:

@inproceedings{yuan2019ego,
  title={Ego-Pose Estimation and Forecasting as Real-Time PD Control},
  author={Yuan, Ye and Kitani, Kris},
  booktitle={Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  year={2019},
  pages={10082--10092}
}

License

The software in this repo is freely available for non-commercial use. Please see the license for further details.
