About the data format for 3D and 2D joints #22
Thanks for the questions. In our code, we use a unified data format for 2D/3D joints across different datasets. For 3D joints, the format is [x, y, z, a], where the fourth element a is a binary flag indicating the availability of the joint. In most cases (such as the 3DPW dataset with SMPL-style 3D joints), a is always 1. We keep this format for future adaptation to other 3D joint/mesh topologies. For 2D joints, the format is [x, y, v], where v is the visibility flag.
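To make the format concrete, here is a hypothetical Python sketch of the two layouts and how an availability/visibility mask would typically be used; all joint values below are made up for illustration and are not taken from any dataset:

```python
# 3D joints: [x, y, z, a] per joint, where a is a binary availability flag.
joints_3d = [
    [0.12, -0.45, 0.30, 1.0],  # available joint
    [0.05,  0.10, 0.22, 1.0],  # available joint
    [0.00,  0.00, 0.00, 0.0],  # unavailable joint (a == 0)
]

# 2D joints: [x, y, v] per joint, where v is the visibility flag.
joints_2d = [
    [210.0, 315.0, 1.0],
    [198.0, 290.0, 0.0],       # annotated but not visible in the image
]

# A loss computation would typically mask out unavailable/invisible joints:
available = [j for j in joints_3d if j[3] > 0]
visible = [j for j in joints_2d if j[2] > 0]
print(len(available), len(visible))  # -> 2 1
```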
Thanks so much for your quick reply! I found that you already reorganized the original 3DPW dataset into your own version consisting of .tsv files. The original 3DPW dataset has a train/validation/test split, but in your customized 3DPW dataset I only see train.img.tsv and test.img.tsv.
Moreover, I also compared the original 3DPW dataset with your customized one, and I found a difference: these two vectors have not only different dimensions but also very different values. The same situation occurs for the other samples I checked. Thank you so much, and I hope to see your answer soon!
We don't use the validation set in our experiments, so there is no validation tsv file. When I look at the first sample in our tsv file, I couldn't find the example you provided. If you find the vector somewhere in our training pipeline (such as in the model outputs), it would be easier to explain. Please note that, following the literature, we apply some coordinate normalizations for better training (see https://github.com/microsoft/MeshTransformer/blob/main/metro/datasets/human_mesh_tsv.py#L123). During inference, we undo the normalization in order to visualize the joints on the image (see https://github.com/microsoft/MeshTransformer/blob/main/metro/utils/renderer.py#L276). These operations are commonly used in the literature; we mainly follow GraphCMR for the data pre-processing.
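As an illustration of this kind of normalize/undo round trip, here is a minimal sketch in the GraphCMR style; the crop resolution of 224 and the exact formula are assumptions for illustration, not a copy of the repo's code:

```python
IMG_RES = 224  # assumed square crop resolution (an assumption, not the repo's constant)

def normalize_2d(joints_2d, img_res=IMG_RES):
    """Map pixel coordinates in [0, img_res] to [-1, 1]; keep the visibility flag."""
    return [[2.0 * x / img_res - 1.0,
             2.0 * y / img_res - 1.0,
             v] for x, y, v in joints_2d]

def denormalize_2d(joints_2d, img_res=IMG_RES):
    """Undo the normalization to draw the joints back on the image crop."""
    return [[(x + 1.0) * img_res / 2.0,
             (y + 1.0) * img_res / 2.0,
             v] for x, y, v in joints_2d]

pts = [[112.0, 56.0, 1.0]]
norm = normalize_2d(pts)     # -> [[0.0, -0.5, 1.0]]
back = denormalize_2d(norm)  # round-trips to [[112.0, 56.0, 1.0]]
print(norm, back)
```

Values normalized this way only match the stored tsv values if the tsv itself holds normalized coordinates, which is exactly the ambiguity discussed in this thread.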
Oh, I am sorry for the mistake. The data I got must be:
By the way, the data for 3D joints is:
Could you please elaborate on this difference and share your code for creating the 3DPW tsv files?
I am not very sure about the format you pointed out here. I think there are three possible reasons for this problem.
As I recall, I actually tried to normalize all these factors before. It has been a while, but I remember our 3D joints should look similar to the original 3DPW ones you posted here. Unfortunately, the tsv generation code is missing: it was written a year ago, and I can't find it now. I need some time to dig it out, and I may need to rewrite it when I have time. I will add it to the repo then. Thanks a lot for your interest in our work :)
You mean you normalized both of them? If that is right, why do you still normalize joint_2d and joint_3d in your dataloader code?
We did normalization in the dataloader. Please stay tuned for the tsv generation code!
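As a rough illustration of the sort of normalization a dataloader might also apply to 3D joints, here is a hedged sketch of root-relative centering; the root-joint index and the choice of pelvis-centering are assumptions for illustration, not the repo's exact code:

```python
PELVIS_IDX = 0  # assumed root-joint index (hypothetical, not the repo's convention)

def root_relative(joints_3d, root_idx=PELVIS_IDX):
    """Subtract the root joint from x/y/z; keep the availability flag unchanged."""
    rx, ry, rz, _ = joints_3d[root_idx]
    return [[x - rx, y - ry, z - rz, a] for x, y, z, a in joints_3d]

joints = [
    [0.10, 0.20, 0.30, 1.0],  # pelvis (root)
    [0.40, 0.60, 0.10, 1.0],
]
rel = root_relative(joints)
print(rel[0])  # the root joint becomes the origin: [0.0, 0.0, 0.0, 1.0]
```

A transform like this, applied at load time, is one reason stored tsv values and values seen inside the training pipeline can differ.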
As far as I can tell, you only provided the tsv generation code for the pseudo-3D dataset (https://github.com/microsoft/MeshTransformer/blob/main/metro/tools/tsv_demo.py); there is no tsv code for the 3DPW dataset. If you can provide it soon, it would be highly appreciated! Thank you so much!
The 3DPW tsv example code can be found at https://github.com/microsoft/MeshTransformer/blob/main/metro/tools/tsv_demo_3dpw.py
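For anyone sketching their own tsv generation before reading tsv_demo_3dpw.py, a minimal label writer might look like the following; the image-key plus JSON-encoded-label column layout is an assumption for illustration, not the repo's exact schema:

```python
import json

def write_label_tsv(path, samples):
    """Write one row per sample: <image_key>\t<json-encoded annotation dict>."""
    with open(path, "w") as f:
        for key, ann in samples:
            f.write(key + "\t" + json.dumps(ann) + "\n")

# Hypothetical sample: the image key and joint values are made up.
samples = [
    ("courtyard_arguing_00/image_00000.jpg",
     {"joints_3d": [[0.1, 0.2, 0.3, 1.0]],   # [x, y, z, a] format
      "joints_2d": [[210.0, 315.0, 1.0]]}),  # [x, y, v] format
]
write_label_tsv("train.label.tsv", samples)

# Reading a row back splits on the tab separator:
with open("train.label.tsv") as f:
    key, ann = f.readline().rstrip("\n").split("\t")
print(key)
```

The actual repo pairs such a label file with an image tsv (e.g. train.img.tsv) holding the encoded image data; this sketch covers only the annotation side.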
Can you provide the code to generate the TSV file of the human36m dataset? Thank you very much. |
Hi, wjingdan
Hi all,
Hello @kevinlin311tw,
Hi authors,
I found that the original 3DPW dataset has:
However, your customized 3DPW dataset has:
Therefore, my question is "what is the fourth element of your customized 3D joints"?
Thank you!