Detect objects in Lidar point-cloud data from the Waymo Open Dataset. Perform fusion between Lidar and camera detections and track objects using an Extended Kalman Filter. Implement data association and track management for the fusion solution.
The project consists of two main parts:
- Object detection: Lidar point clouds are extracted from the Waymo dataset, visualized, converted to a bird's-eye-view (BEV) representation and passed to a pre-trained neural network (FPN ResNet). A second network (Darknet) is integrated and standard evaluation metrics are calculated to compare the two detectors.
- Object tracking: An Extended Kalman Filter (EKF) tracks objects from both Lidar and camera detections, with data association based on the Single Nearest Neighbour method, track scoring and track management (initialization, confirmation, deletion), gating, and field-of-view evaluation (a minimal EKF and gating sketch follows this list).
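The EKF and the gating step used during association can be summarized in a few lines. This is a minimal sketch assuming a 6D constant-velocity state [x, y, z, vx, vy, vz] and a linear measurement model; the names `SimpleEKF` and `mahalanobis_gate` are illustrative, not the project's actual classes.

```python
import numpy as np

class SimpleEKF:
    """Minimal EKF sketch with a constant-velocity (CV) motion model."""

    def __init__(self, dt=0.1, q=3.0):
        self.dt = dt    # time step between frames
        self.q = q      # process-noise spectral density

    def F(self):
        # CV system matrix: position is advanced by velocity * dt
        F = np.eye(6)
        F[0, 3] = F[1, 4] = F[2, 5] = self.dt
        return F

    def Q(self):
        # Discretized process-noise covariance for the CV model
        dt, q = self.dt, self.q
        q1, q2, q3 = q * dt, q * dt**2 / 2, q * dt**3 / 3
        Q = np.zeros((6, 6))
        for i in range(3):
            Q[i, i] = q3
            Q[i, i + 3] = Q[i + 3, i] = q2
            Q[i + 3, i + 3] = q1
        return Q

    def predict(self, x, P):
        F = self.F()
        return F @ x, F @ P @ F.T + self.Q()

    def update(self, x, P, z, H, R):
        # z: measurement, H: measurement Jacobian, R: measurement noise
        gamma = z - H @ x                        # residual
        S = H @ P @ H.T + R                      # residual covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ gamma
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

def mahalanobis_gate(gamma, S, chi2_thresh=7.815):
    # Gating: only allow an association if the squared Mahalanobis distance
    # of the residual is below a chi-square threshold (~95% for 3 DOF here).
    return float(gamma.T @ np.linalg.inv(S) @ gamma) < chi2_thresh
```

Single Nearest Neighbour association then assigns each measurement to the track with the smallest gated Mahalanobis distance, and track management raises or lowers a track score after each cycle to decide on confirmation or deletion.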
Answers to the project questions are in writeup.md
Final tracking output, using both Lidar and camera detections, the EKF and track management:
Height and intensity channels extracted from the Lidar range image:
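A sketch of how the range and intensity channels can be pulled out of a Waymo range image and stacked into an 8-bit image; the protobuf access follows the `waymo_open_dataset` format, while the function name `range_image_to_8bit` and the percentile-based contrast adjustment are illustrative choices.

```python
import zlib
import numpy as np
from waymo_open_dataset import dataset_pb2

def range_image_to_8bit(frame, lidar_name=dataset_pb2.LaserName.TOP):
    # Find the requested lidar and decompress its first-return range image.
    laser = [l for l in frame.lasers if l.name == lidar_name][0]
    ri = dataset_pb2.MatrixFloat()
    ri.ParseFromString(zlib.decompress(laser.ri_return1.range_image_compressed))
    ri = np.array(ri.data).reshape(ri.shape.dims)

    # Channel 0 holds range, channel 1 intensity; negative values mean "no return".
    ri[ri < 0] = 0.0
    ri_range, ri_intensity = ri[:, :, 0], ri[:, :, 1]

    # Scale the range channel to 8 bit for visualization.
    img_range = (ri_range / (np.amax(ri_range) + 1e-6) * 255).astype(np.uint8)

    # Contrast-adjust intensity with percentiles so outliers do not dominate.
    lo, hi = np.percentile(ri_intensity, 1), np.percentile(ri_intensity, 99)
    img_intensity = (np.clip((ri_intensity - lo) / (hi - lo + 1e-6), 0, 1) * 255).astype(np.uint8)

    # Stack range over intensity into a single image.
    return np.vstack((img_range, img_intensity))
```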
Point cloud visualization using Open3D
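A minimal Open3D snippet for this kind of visualization; `show_pcl` is an illustrative name, not necessarily the project's function.

```python
import numpy as np
import open3d as o3d

def show_pcl(pcl):
    # pcl: (N, 3+) array of lidar points; only x, y, z are used here.
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(np.asarray(pcl)[:, :3])
    o3d.visualization.draw_geometries([pcd])
```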
3-channel BEV map containing density (red), height (green) and intensity (blue) data.
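A sketch of how such a BEV map can be built from an (N, 4) point cloud with x, y, z and intensity columns; the resolution, detection ranges and normalization choices below are illustrative defaults, not necessarily the ones used in the project.

```python
import numpy as np

def pcl_to_bev(pcl, bev_height=608, bev_width=608,
               x_range=(0, 50), y_range=(-25, 25), z_range=(-1, 3)):
    # Keep only points inside the detection area.
    mask = ((pcl[:, 0] >= x_range[0]) & (pcl[:, 0] < x_range[1]) &
            (pcl[:, 1] >= y_range[0]) & (pcl[:, 1] < y_range[1]) &
            (pcl[:, 2] >= z_range[0]) & (pcl[:, 2] < z_range[1]))
    pts = pcl[mask]

    # Discretize x/y coordinates into BEV pixel indices.
    res_x = (x_range[1] - x_range[0]) / bev_height
    res_y = (y_range[1] - y_range[0]) / bev_width
    ix = np.clip(((pts[:, 0] - x_range[0]) / res_x).astype(int), 0, bev_height - 1)
    iy = np.clip(((pts[:, 1] - y_range[0]) / res_y).astype(int), 0, bev_width - 1)

    bev = np.zeros((3, bev_height, bev_width))
    for x, y, z, inten in zip(ix, iy, pts[:, 2], np.clip(pts[:, 3], 0, 1)):
        bev[0, x, y] += 1                                   # density  (red)
        bev[1, x, y] = max(bev[1, x, y],                    # height   (green)
                           (z - z_range[0]) / (z_range[1] - z_range[0]))
        bev[2, x, y] = max(bev[2, x, y], inten)             # intensity (blue)

    # Log-scale the density channel so crowded cells do not saturate.
    bev[0] = np.minimum(1.0, np.log(bev[0] + 1) / np.log(64))
    return bev
```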
Integrated pre-trained FPN ResNet model for object detection
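Running a pre-trained detector on the BEV tensor then reduces to standard PyTorch inference. This is a generic sketch, not the actual FPN ResNet / SFA3D API, and the model-specific decoding step is deliberately omitted.

```python
import numpy as np
import torch

def run_detector(bev_map, model, device="cpu"):
    # bev_map: (3, H, W) BEV array as built above; `model` is assumed to be an
    # already-loaded, pre-trained PyTorch detector (e.g. FPN ResNet).
    model.eval()
    with torch.no_grad():
        inp = torch.from_numpy(np.asarray(bev_map, dtype=np.float32)).unsqueeze(0).to(device)
        outputs = model(inp)
    # Model-specific post-processing (sigmoid, NMS, box decoding) turns the raw
    # head outputs into [class, x, y, z, h, w, l, yaw] detections; that step is
    # omitted here because it depends on the chosen network.
    return outputs
```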
Precision and recall metrics calculated for both the Darknet and FPN ResNet detectors
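The metrics themselves reduce to counting matches. A short sketch, assuming detections have already been matched to ground-truth labels (a detection typically counts as a true positive if its overlap with a label exceeds an IoU threshold such as 0.5); the counting scheme here is illustrative.

```python
def precision_recall(true_positives, false_negatives, false_positives):
    # Standard detection metrics from counts accumulated over all frames.
    precision = true_positives / (true_positives + false_positives + 1e-9)
    recall = true_positives / (true_positives + false_negatives + 1e-9)
    return precision, recall
```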