
Commit

solve conflicts
Signed-off-by: ismetatabay <[email protected]>
ismetatabay committed Nov 17, 2023
1 parent 2685955 commit 9dbce8f
Showing 23 changed files with 34 additions and 70 deletions.

This file was deleted.

This file was deleted.

@@ -1,2 +1,8 @@
nav:
- index.md
- Starting with TIER IV's CalibrationTools: calibration-tools
- Extrinsic manual calibration: extrinsic-manual-calibration
- Lidar-lidar calibration: lidar-lidar-calibration
- Ground plane-lidar calibration: ground-lidar-calibration
- Intrinsic camera calibration: intrinsic-camera-calibration
- Lidar-camera calibration: lidar-camera-calibration
File renamed without changes.
File renamed without changes.
@@ -2,22 +2,35 @@

## Overview

Autoware expects to have multiple sensors attached to the vehicle as input to perception, localization, and planning stack. These sensors must be calibrated correctly and their positions must be defined using either urdf files (as in [sample_sensor_kit](https://github.com/autowarefoundation/sample_sensor_kit_launch/tree/main/sample_sensor_kit_description)) or as tf launch files.
Autoware expects multiple sensors to be attached to the vehicle as input to its perception, localization, and planning stacks.
Autoware uses fusion techniques to combine information from multiple sensors.
For this to work effectively,
all sensors must be calibrated properly to align their coordinate systems, and their positions must be defined using either URDF files
(as in [sample_sensor_kit](https://github.com/autowarefoundation/sample_sensor_kit_launch/tree/main/sample_sensor_kit_description))
or as TF launch files.
In this documentation,
we will explain how to use TIER IV's [CalibrationTools](https://github.com/tier4/CalibrationTools) repository for the calibration process.
Please see the
[Starting with TIER IV's CalibrationTools page](./calibration-tools) for installation and usage instructions.
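
As a sketch of what such an extrinsic definition encodes, the following illustrative Python builds the homogeneous transform that a URDF or TF entry represents. The frame offsets and yaw value are hypothetical, not taken from a real sensor kit:

```python
import math

def extrinsic_to_matrix(x, y, z, yaw):
    # Build the 4x4 homogeneous transform that a URDF/TF entry encodes.
    # Only yaw is shown for brevity; roll and pitch compose the same way.
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Hypothetical mounting: a lidar 2 m above base_link, yawed 90 degrees.
T = extrinsic_to_matrix(0.0, 0.0, 2.0, math.pi / 2)
```

Calibration is what produces accurate values for these six parameters; the transform itself is then published statically at runtime.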

## Camera calibration
If you want to explore other calibration packages and methods, you can check out the following packages.

### Intrinsic Calibration
## Other packages you can check out

### Camera calibration

#### Intrinsic Calibration

- Navigation2 provides a [good tutorial for intrinsic camera calibration](https://navigation.ros.org/tutorials/docs/camera_calibration.html).
- [AutoCore](https://autocore.ai/) provides a [lightweight tool](https://github.com/autocore-ai/calibration_tools/tree/main/camera_intrinsic_calib).
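
Intrinsic calibration estimates the camera matrix (focal lengths and principal point) along with distortion coefficients. As a minimal sketch of what those parameters mean, the following Python projects a 3D point with an undistorted pinhole model; all values are made up for illustration:

```python
def project(point, fx, fy, cx, cy):
    # Pinhole projection of a camera-frame 3D point to pixel coordinates,
    # using the intrinsics (fx, fy, cx, cy) that calibration estimates.
    # Lens distortion is omitted for brevity.
    X, Y, Z = point
    return (fx * X / Z + cx, fy * Y / Z + cy)

# Hypothetical intrinsics for a 1280x720 camera:
u, v = project((1.0, 0.5, 5.0), fx=800.0, fy=800.0, cx=640.0, cy=360.0)
```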

## Lidar-lidar calibration
### Lidar-lidar calibration

### Lidar-Lidar Calibration tool from Autocore
#### Lidar-Lidar Calibration tool from AutoCore

[LL-Calib on GitHub](https://github.com/autocore-ai/calibration_tools/tree/main/lidar-lidar-calib), provided by [AutoCore](https://autocore.ai/), is a lightweight toolkit for online/offline 3D LiDAR-to-LiDAR calibration. It is based on local mapping and the GICP method to derive the transformation between the main and sub lidar. Information on how to use the tool, troubleshooting tips, and example rosbags can be found at the link above.
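
LL-Calib's actual pipeline (local mapping plus GICP) is considerably more involved; as an illustrative sketch of the underlying idea only, the pure-Python snippet below recovers the translation between two lidar frames from corresponding points by aligning centroids, which is the rotation-free core of an ICP/GICP step. All point values are fabricated:

```python
def estimate_translation(main_pts, sub_pts):
    # Estimate the translation of the sub lidar relative to the main lidar
    # by aligning the centroids of corresponding point sets.
    n = len(main_pts)
    cm = [sum(p[i] for p in main_pts) / n for i in range(3)]
    cs = [sum(p[i] for p in sub_pts) / n for i in range(3)]
    return tuple(cm[i] - cs[i] for i in range(3))

# The sub lidar sees the same points shifted by (-1.5, 0.0, -0.2):
main = [(0.0, 0.0, 0.0), (2.0, 1.0, 0.5), (4.0, -1.0, 0.3)]
sub = [(x - 1.5, y, z - 0.2) for x, y, z in main]
t = estimate_translation(main, sub)
```

A real tool additionally estimates rotation and iterates correspondence matching; this sketch only shows why overlapping geometry between the two lidars makes the problem solvable.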

## Lidar-camera calibration
### Lidar-camera calibration

Developed by MathWorks, the Lidar Camera Calibrator app enables you to interactively estimate the rigid transformation between a lidar sensor and a camera.

@@ -32,7 +45,7 @@ Developed by [AutoCore](https://autocore.ai/), an easy-to-use lightweight toolki

<https://github.com/autocore-ai/calibration_tools/tree/main/lidar-cam-calib-related>

## Lidar-IMU calibration
### Lidar-IMU calibration

Developed by [APRIL Lab](https://github.com/APRIL-ZJU) at Zhejiang University in China, the LI-Calib calibration tool is a toolkit for calibrating the 6DoF rigid transformation and the time offset between a 3D LiDAR and an IMU, based on continuous-time batch optimization.
IMU-based cost and LiDAR point-to-surfel (surfel = surface element) distance are minimized jointly, which renders the calibration problem well-constrained in general scenarios.
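
To make the point-to-surfel distance concrete, here is an illustrative computation (not LI-Calib's actual code): a surfel is a small planar patch described by a center point and a unit normal, and the residual being minimized is the signed distance of a lidar point from that plane:

```python
def point_to_surfel_distance(point, center, normal):
    # Signed distance from a lidar point to a surfel, i.e. the projection
    # of (point - center) onto the surfel's unit normal.
    return sum(n * (p - c) for p, c, n in zip(point, center, normal))

# A point 0.3 m above a horizontal ground surfel at the origin:
d = point_to_surfel_distance((1.0, 2.0, 0.3), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

In the calibration, these residuals are summed over many point-surfel pairs and minimized jointly with the IMU-based cost over the extrinsics and time offset.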
@@ -17,6 +17,12 @@ CalibrationTools.

We need a sample bag file for the lidar-lidar calibration process
which includes raw lidar topics.
We also recommend using an outlier-filtered
point cloud for mapping: the vehicle's own points are
cropped out of this point cloud,
so they do not end up in the map. When you start
recording the bag,
do not move the vehicle for the first 5 seconds for better mapping performance.
The following shows an example of a bag file used for this calibration:

??? note "ROS 2 Bag example of our calibration process for tutorial_vehicle"
@@ -31,7 +37,7 @@ The following shows an example of a bag file used for this calibration:
End: Sep 5 2023 11:25:43.808 (1693902343.808)
Messages: 2256
Topic information: Topic: /sensing/lidar/front/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
Topic: /sensing/lidar/top/outlier_filtered/pointcloud | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
```

## Mapping-based lidar-lidar calibration
@@ -352,7 +358,7 @@ extrinsic_ground_plane_calibrator/
Then play ROS 2 bag file:

```bash
ros2 bag play <rosbag_path> --clock -l -r 0.2 \
ros2 bag play <rosbag_path> --clock -r 0.2 \
--remap /tf:=/null/tf /tf_static:=/null/tf_static # if tf is recorded
```

