# PointNet2_Grasping_Data_Part

This repository contains the dataset part of our ICRA 2020 paper:

PointNet++ Grasping: Learning An End-to-end Spatial Grasp Generation Algorithm from Sparse Point Clouds

Video: https://www.youtube.com/watch?v=AfU7npscnZ0

Dataset:

Link: https://pan.baidu.com/s/1_prfq4A_Dg9kREqpc3Cikw (password: j7zn)


The code here visualizes the performance of single-object grasp planning. It builds on Dex-Net (https://berkeleyautomation.github.io/dex-net/).

Installation follows the same procedure as Dex-Net (https://berkeleyautomation.github.io/dex-net/install/install.html).

You can run show_labels.py to reproduce the following cases:

Case 1: Show all the major grasps


Case 2: Show the supplementary grasps

(The blue grasp denotes the major grasp and the rest are supplementary grasps)

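For a quick look at what this kind of color-coded visualization involves, here is a minimal sketch that does not depend on the Dex-Net toolchain. The file names (`object_points.npy`, `grasp_contacts.npy`), array shapes, and the two-contact-point grasp representation are illustrative assumptions only, not the repository's actual data format or the implementation of show_labels.py.

```python
# Minimal sketch (not show_labels.py): draw a major grasp in blue and
# supplementary grasps in red on top of a sparse object point cloud.
# All file names, shapes, and the contact-pair grasp encoding are assumed.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical inputs:
#   points: (N, 3) object point cloud
#   grasps: (M, 2, 3) grasps, each stored as a pair of 3D contact points
points = np.load("object_points.npy")
grasps = np.load("grasp_contacts.npy")
major_idx = 0  # assume the first grasp in the array is the major grasp

fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.scatter(points[:, 0], points[:, 1], points[:, 2], s=1, c="gray")

# Draw each grasp as a line segment between its two contact points.
for i, (c1, c2) in enumerate(grasps):
    color = "blue" if i == major_idx else "red"  # blue = major grasp
    ax.plot([c1[0], c2[0]], [c1[1], c2[1]], [c1[2], c2[2]], c=color)

ax.set_title("Major grasp (blue) vs. supplementary grasps (red)")
plt.show()
```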