diff --git a/.github/workflows/deploy-docs.yaml b/.github/workflows/deploy-docs.yaml index 5f2d09d4148..b48d70dbacb 100644 --- a/.github/workflows/deploy-docs.yaml +++ b/.github/workflows/deploy-docs.yaml @@ -22,7 +22,7 @@ jobs: prevent-no-label-execution: uses: autowarefoundation/autoware-github-actions/.github/workflows/prevent-no-label-execution.yaml@v1 with: - label: deploy-docs + label: tag:deploy-docs deploy-docs: needs: prevent-no-label-execution diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/.pages index f38a00ca8fc..b3a74158c47 100644 --- a/docs/how-to-guides/integrating-autoware/creating-maps/.pages +++ b/docs/how-to-guides/integrating-autoware/creating-maps/.pages @@ -1,3 +1,6 @@ nav: - index.md - Open-source SLAM algorithms: open-source-slam + - Converting UTM map to MGRS map: converting-utm-to-mgrs-map + - Pointcloud map downsampling: pointcloud-map-downsampling + - Creating a vector map: creating-vector-map diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/converting-utm-to-mgrs-map/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/converting-utm-to-mgrs-map/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/converting-utm-to-mgrs-map/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/converting-utm-to-mgrs-map/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/converting-utm-to-mgrs-map/index.md new file mode 100644 index 00000000000..8a5694f6979 --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/converting-utm-to-mgrs-map/index.md @@ -0,0 +1,110 @@ +# Converting UTM maps to MGRS map format + +## Overview + +If you want to use MGRS (Military Grid Reference System) format in Autoware, +you need to convert UTM (Universal Transverse Mercator) map to MGRS format. 
+In order to do that, we will use [UTM to MGRS pointcloud converter](https://github.com/leo-drive/pc_utm_to_mgrs_converter) ROS 2 package provided by Leo Drive. + +## Installation + +### Dependencies + +- ROS 2 +- PCL-conversions +- [GeographicLib](https://geographiclib.sourceforge.io/C++/doc/install.html) + +To install dependencies: + +```bash +sudo apt install ros-humble-pcl-conversions \ + geographiclib-tools +``` + +### Building + +```bash + cd /src + git clone https://github.com/leo-drive/pc_utm_to_mgrs_converter.git + cd .. + colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release +``` + +### Usage + +After the installation of converter tool, +we need to define northing, +easting and ellipsoid height of local UTM map origin in `pc_utm_to_mgrs_converter.param.yaml`. +For example, you can use latitude, +longitude and altitude values in the navsatfix message from your GNSS/INS sensor. + +??? note "Sample ROS 2 topic echo from navsatfix message" + + ```sh + header: + stamp: + sec: 1694612439 + nanosec: 400000000 + frame_id: GNSS_INS/gnss_ins_link + status: + status: 0 + service: 1 + latitude: 41.0216110801253 + longitude: 28.887096461148346 + altitude: 74.28264078891529 + position_covariance: + - 0.0014575386885553598 + - 0.0 + - 0.0 + - 0.0 + - 0.004014162812381983 + - 0.0 + - 0.0 + - 0.0 + - 0.0039727711118757725 + position_covariance_type: 2 + ``` + +After that, you need to convert latitude and longitude values to northing and easting values. +You can use any converter on the internet for converting latitude longitude values to UTM. 
+(i.e., [UTMconverter](https://www.latlong.net/lat-long-utm.html)) + +Now, we are ready to update `pc_utm_to_mgrs_converter.param.yaml`, +example for our navsatfix message: + +```diff +/**: + ros__parameters: + # Northing of local origin +- Northing: 4520550.0 ++ Northing: 4542871.33 + + # Easting of local origin +- Easting: 698891.0 ++ Easting: 658659.84 + + # Elipsoid Height of local origin +- ElipsoidHeight: 47.62 ++ ElipsoidHeight: 74.28 +``` + +Lastly, we will update input and pointcloud the map path in `pc_utm_to_mgrs_converter.launch.xml`: + +```diff +... +- ++ +- ++ +... +``` + +After the setting of the package, we will launch pc_utm_to_mgrs_converter: + +```bash +ros2 launch pc_utm_to_mgrs_converter pc_utm_to_mgrs_converter.launch.xml +``` + +The conversion process will be started, +you should see `Saved data points saved to ` message on your terminal. +MGRS format pointcloud map should be saved on your output map directory. diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/.pages new file mode 100644 index 00000000000..2782f96a2e9 --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/.pages @@ -0,0 +1,8 @@ +nav: + - index.md + - Lanelet2: lanelet2 + - Crosswalk: crosswalk + - Stop line: stop-line + - Traffic light: traffic-light + - Speed bump: speed-bump + - Detection area: detection-area diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/images/crosswalk-test.png 
b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/images/crosswalk-test.png new file mode 100644 index 00000000000..83065c11aa6 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/images/crosswalk-test.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/index.md new file mode 100644 index 00000000000..d7c0f24875c --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/crosswalk/index.md @@ -0,0 +1,74 @@ +# Crosswalk attribute + +Behavior velocity planner's [crosswalk module](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_velocity_crosswalk_module/) plans velocity +to stop or decelerate for pedestrians approaching or walking on a crosswalk. +In order to operate that, we will add crosswalk attribute to our lanelet2 map. + +## Creating a crosswalk attribute + +In order to create a crosswalk on your map, please follow these steps: + +1. Click `Abstraction` button on top panel. +2. Select `Crosswalk` from the panel. +3. Click and draw crosswalk on your pointcloud map. + +You can see these steps in the crosswalk creating demonstration video: + +![type:video](https://youtube.com/embed/J6WrL8dkFhI) + +### Testing created crosswalk with planning simulator + +After the completing of creating the map, we need to save it. +To that please click `File` --> `Export Lanelet2Maps` then download. + +After the download is finished, +we need to put lanelet2 map and pointcloud map on the same location. 
+The directory structure should be like this: + +```diff ++ / ++ ├─ pointcloud_map.pcd ++ └─ lanelet2_map.osm +``` + +If your .osm or .pcd map file's name is different from these names, +you need to update autoware.launch.xml: + +```diff + +- ++ +- ++ +``` + +Now we are ready to launch the planning simulator: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:= vehicle_model:= sensor_model:= +``` + +Example for tutorial_vehicle: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:=$HOME/Files/autoware_map/tutorial_map/ vehicle_model:=tutorial_vehicle sensor_model:=tutorial_vehicle_sensor_kit vehicle_id:=tutorial_vehicle +``` + +1. Click `2D Pose Estimate` button on rviz or press `P` and give a pose for initialization. +2. Click `2D Goal Pose` button on rviz or press `G` and give a pose for goal point. +3. We need to add pedestrians to crosswalk, so activate interactive pedestrians from `Tool Properties` panel on rviz. +4. After that, please press `Shift`, then click right click button for inserting pedestrians. +5. You can control inserted pedestrian via dragging right click. + +Crosswalk markers on rviz: + +
+<figure markdown>
+  ![crosswalk-test](images/crosswalk-test.png){ align=center }
+  <figcaption>Crosswalk test on the created map.</figcaption>
+</figure>
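Before moving on, it can help to confirm that the exported file actually contains the crosswalk you drew. The snippet below is a minimal sanity check with standard shell tools; it assumes the default `lanelet2_map.osm` export name and the usual Lanelet2 OSM tag layout with one tag per line — adjust both to your own export.

```bash
# Count tags whose value is "crosswalk" in the exported map.
# Assumption: default export name lanelet2_map.osm, one tag per line.
count_subtype() { grep -c "v=\"$1\"" "$2"; }

if [ -f lanelet2_map.osm ]; then
  echo "crosswalk tags: $(count_subtype crosswalk lanelet2_map.osm)"
else
  echo "lanelet2_map.osm not found (adjust the file name to your export)"
fi
```

A count of zero usually means the crosswalk was not exported, so the planner will have nothing to stop for.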
+ +You can check your crosswalk elements in the planning simulator as this demonstration video: + +![type:video](https://youtube.com/embed/hhwBku_1qmA) diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/images/detection-area-test.png b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/images/detection-area-test.png new file mode 100644 index 00000000000..cb38440b943 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/images/detection-area-test.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/index.md new file mode 100644 index 00000000000..a708004db1e --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/detection-area/index.md @@ -0,0 +1,76 @@ +# Detection area element + +Behavior velocity planner's [detection area](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_velocity_detection_area_module/) plans velocity +when if pointcloud is detected in a detection area defined on a map, the stop planning will be executed at the predetermined point. +In order to operate that, we will add a detection area element to our lanelet2 map. + +## Creating a detection area element + +In order to create a detection area on your map, please follow these steps: + +1. 
Click the `Lanelet2Maps` button on the top panel.
2. Select `Detection Area` from the panel.
3. Please select the lanelet to which the stop line will be added.
4. Click and insert `Detection Area` on your pointcloud map.
5. You can change the dimensions of the detection area by clicking and dragging the points on its corners. For more information, you can check the demonstration video.

You can see these steps in the detection area creating demonstration video:

![type:video](https://youtube.com/embed/RUJvXok-ncQ)

### Testing the created detection area with planning simulator

After completing the map, we need to save it.
To do that, please click `File` --> `Export Lanelet2Maps`, then download.

After the download is finished,
we need to put the lanelet2 map and pointcloud map in the same location.
The directory structure should be like this:

```diff
+ <YOUR-MAP-FOLDER>/
+ ├─ pointcloud_map.pcd
+ └─ lanelet2_map.osm
```

If your .osm or .pcd map file's name is different from these names,
you need to update autoware.launch.xml:

```diff
- <arg name="lanelet2_map_file" default="lanelet2_map.osm" description="lanelet2 map file name"/>
+ <arg name="lanelet2_map_file" default="<YOUR-LANELET-MAP-NAME>.osm" description="lanelet2 map file name"/>
- <arg name="pointcloud_map_file" default="pointcloud_map.pcd" description="pointcloud map file name"/>
+ <arg name="pointcloud_map_file" default="<YOUR-POINTCLOUD-MAP-NAME>.pcd" description="pointcloud map file name"/>
```

Now we are ready to launch the planning simulator:

```bash
ros2 launch autoware_launch planning_simulator.launch.xml map_path:=<YOUR-MAP-PATH> vehicle_model:=<YOUR-VEHICLE-MODEL> sensor_model:=<YOUR-SENSOR-KIT>
```

Example for tutorial_vehicle:

```bash
ros2 launch autoware_launch planning_simulator.launch.xml map_path:=$HOME/Files/autoware_map/tutorial_map/ vehicle_model:=tutorial_vehicle sensor_model:=tutorial_vehicle_sensor_kit vehicle_id:=tutorial_vehicle
```

1. Click `2D Pose Estimate` button on rviz or press `P` and give a pose for initialization.
2. Click `2D Goal Pose` button on rviz or press `G` and give a pose for goal point.
3. We need to add pedestrians to the detection area, so activate interactive pedestrians from the `Tool Properties` panel on rviz.
4. After that, please press `Shift`, then right-click to insert pedestrians.
5. You can move an inserted pedestrian by dragging with the right mouse button.
So, you should place a pedestrian inside the detection area for testing.
+
+Detection area markers on rviz:
+
+<figure markdown>
+  ![detection-area-test](images/detection-area-test.png){ align=center }
+  <figcaption>Detection area test on the created map.</figcaption>
+</figure>
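If the detection area never triggers a stop, it is worth verifying that the regulatory element made it into the exported file. A small sketch, assuming the default `lanelet2_map.osm` export name and the standard Lanelet2 OSM formatting (both assumptions — adapt to your own export):

```bash
# List the lines that define detection_area elements, with line numbers.
# Assumption: default export name lanelet2_map.osm.
if [ -f lanelet2_map.osm ]; then
  grep -n 'v="detection_area"' lanelet2_map.osm || echo "no detection_area elements found"
else
  echo "lanelet2_map.osm not found (adjust the file name to your export)"
fi
```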
+ +You can check your detection area elements in the planning simulator as this demonstration video: + +![type:video](https://youtube.com/embed/zjfPnRIz8Xk) diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/index.md new file mode 100644 index 00000000000..d233b33b00d --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/index.md @@ -0,0 +1,29 @@ +# Creating a vector map + +## Overview + +In this section, we will explain how to create Lanelet2 maps with TIER IV's [Vector Map Builder tool](https://tools.tier4.jp/feature/vector_map_builder_ll2/). + +There are alternative tools such as +Unity-based app [MapToolbox](https://github.com/autocore-ai/MapToolbox) and +Java-based app [JOSM](https://josm.openstreetmap.de/) that you may use for creating a Lanelet2 map. +We will be using TIER IV's Vector Map Builder in the tutorial +since it works on a browser without installation of extra dependency applications. + +## Vector Map Builder + +You need a TIER IV account for using Vector Map Builder tool. +If it is the first time to use the tool, +[create a TIER IV account](https://docs.web.auto/en/user-manuals/tier-iv-account/quick-start) +in order to use [Vector Map Builder tool](https://tools.tier4.jp/feature/vector_map_builder_ll2/). +For more information about this tool, +please check the [official guide](https://docs.web.auto/en/user-manuals/vector-map-builder/introduction). + +You can follow these pages for creating a Lanelet2 map and understanding its regulatory elements. 
+ +- [Lanelet2](./lanelet2) +- [Crosswalk](./crosswalk) +- [Stop Line](./stop-line) +- [Traffic Light](./traffic-light) +- [Speed Bump](./speed-bump) +- [Detection Area](./detection-area) diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/images/planning-simulator-map-test.png b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/images/planning-simulator-map-test.png new file mode 100644 index 00000000000..b7a547ff826 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/images/planning-simulator-map-test.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/images/pointcloud-map.png b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/images/pointcloud-map.png new file mode 100644 index 00000000000..e6f8dc61644 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/images/pointcloud-map.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/index.md new file mode 100644 index 00000000000..2a5c81cf5fd --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/lanelet2/index.md @@ -0,0 +1,128 @@ +# Creating a Lanelet + +At this page, we will explain how to create a simple lanelet on your point cloud map. 
+If you don't have a point cloud map yet,
+please check
+and follow the steps on the [LIO-SAM mapping page](../../open-source-slam/lio-sam)
+to learn how to create a point cloud map for Autoware.
+
+## Creating a Lanelet2 map
+
+First, we need to import our pointcloud map into the Vector Map Builder tool:
+
+1. Click `File`.
+2. Then, click `Import PCD`.
+3. Click `Browse` and select your .pcd file.
+
+The point cloud will be displayed in Vector Map Builder after the upload completes:
+
+<figure markdown>
+  ![pointcloud-map](images/pointcloud-map.png){ align=center }
+  <figcaption>Uploaded pointcloud map file on Vector Map Builder</figcaption>
+</figure>
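If the import fails or the cloud looks empty, inspecting the PCD header can help: the first few lines of a `.pcd` file declare its fields and point count. A quick sketch, assuming the file is named `pointcloud_map.pcd` (adjust to your own file):

```bash
# Print the PCD header (everything up to the DATA line) and the declared
# point count. Assumption: the file is named pointcloud_map.pcd.
PCD=pointcloud_map.pcd
if [ -f "$PCD" ]; then
  sed -n '1,/^DATA/p' "$PCD"
  awk '/^POINTS/ { print "declared points:", $2; exit }' "$PCD"
else
  echo "$PCD not found (adjust the file name)"
fi
```

A point count of zero, or a missing `DATA` line, indicates a broken export from your mapping pipeline.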
+
+Now, we are ready to create a lanelet2 map on our pointcloud map:
+
+1. Click `Create`.
+2. Then, click `Create Lanelet2Maps`.
+3. Fill in your map name.
+4. Fill in your MGRS zone. (For tutorial_vehicle, MGRS grid zone: 35T - MGRS 100,000-meter square: PF)
+5. Click `Create`.
+
+### Creating a simple lanelet
+
+In order to create a simple lanelet on your map, please follow these steps:
+
+1. Click `Lanelet2Maps` on the bar.
+2. Enable Lanelet mode by selecting `Lanelet`.
+3. Then, you can click on the pointcloud map to create a lanelet.
+4. When your lanelet is finished, you can disable `Lanelet`.
+5. If you want to change your lanelet width, click `lanelet` --> `Change Lanelet Width`, then enter the new lanelet width.
+
+Video Demonstration:
+
+![type:video](https://youtube.com/embed/183PHi84AeU)
+
+### Join two lanelets
+
+In order to join two lanelets, please follow these steps:
+
+1. Create two distinct lanelets.
+2. Select one lanelet, then press `Shift` and select the other lanelet.
+3. Now you can see the `Join Lanelets` button; just press it.
+4. These lanelets will be joined.
+
+Video Demonstration:
+
+![type:video](https://youtube.com/embed/_tHilFUKDQc)
+
+### Join multiple lanelets
+
+In order to add (join) two or more lanelets to another lanelet, please follow these steps:
+
+1. Create multiple lanelets.
+2. Join the first two lanelets as in the steps before.
+3. Check the endpoint IDs of the first lanelet.
+4. Then, change these IDs to the third lanelet's start points. (Change them by selecting the linestrings of the lanelet.)
+5. You will see two next lanes of the first lanelet appear.
+
+Video Demonstration:
+
+![type:video](https://youtube.com/embed/l5ZnL0Cjmnk)
+
+### Change the speed limit of a lanelet
+
+In order to change the speed limit of a lanelet, please follow these steps:
+
+1. Select the lanelet whose speed limit will be changed.
+2. 
+ +### Test lanelets with planning simulator + +After the completing of creating lanelets, we need to save it. +To that please click `File` --> `Export Lanelet2Maps` then download. + +After the download is finished, +we need to put lanelet2 map and pointcloud map on the same location. +The directory structure should be like this: + +```diff +/ + ├─ pointcloud_map.pcd + └─ lanelet2_map.osm +``` + +If your .osm or .pcd map file's name is different from these names, +you need to update autoware.launch.xml: + +```diff + +- ++ +- ++ +``` + +Now we are ready to launch the planning simulator: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:= vehicle_model:= sensor_model:= +``` + +Example for tutorial_vehicle: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:=$HOME/Files/autoware_map/tutorial_map/ vehicle_model:=tutorial_vehicle sensor_model:=tutorial_vehicle_sensor_kit vehicle_id:=tutorial_vehicle +``` + +1. Click `2D Pose Estimate` button on rviz or press `P` and give a pose for initialization. +2. Click `2D Goal Pose` button on rviz or press `G` and give a pose for goal point. + +
+<figure markdown>
+  ![planning-simulator-test](images/planning-simulator-map-test.png){ align=center }
+  <figcaption>Testing our created vector map with planning simulator</figcaption>
+</figure>
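A failed planning-simulator launch is often just a wrong `map_path`. The snippet below is a quick pre-launch check that both map files are where the launch file expects them; the tutorial path is an assumption taken from the example above — substitute your own map directory.

```bash
# Verify that both map files exist in the directory passed as map_path.
# Assumption: the tutorial_vehicle map path from the example above.
MAP_DIR="${MAP_DIR:-$HOME/Files/autoware_map/tutorial_map}"
for f in pointcloud_map.pcd lanelet2_map.osm; do
  if [ -f "$MAP_DIR/$f" ]; then
    echo "ok: $f"
  else
    echo "missing: $f"
  fi
done
```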
diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/images/speed-bump-test.png b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/images/speed-bump-test.png new file mode 100644 index 00000000000..11894f768b8 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/images/speed-bump-test.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/index.md new file mode 100644 index 00000000000..8e36ae1171d --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/speed-bump/index.md @@ -0,0 +1,82 @@ +# Speed bump + +Behavior velocity planner's [speed bump module](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_velocity_crosswalk_module/) plans velocity +to slow down before speed bump for comfortable and safety driving. +In order to operate that, we will add speed bumps to our lanelet2 map. + +## Creating a speed bump element + +In order to create a speed bump on your pointcloud map, please follow these steps: + +1. Select `Linestring` from Lanelet2Maps section. +2. Click and draw polygon for speed bump. +3. Then please disable `Linestring` from Lanelet2Maps section. +4. CLick `Change to Polygon` from the `Action` panel. +5. Please select this Polygon and enter `speed_bump` as the type. +6. 
Then, please click the lanelet to which the speed bump will be added.
+7. Select `Create General Regulatory Element`.
+8. Go to this element, and enter `speed_bump` as the subtype.
+9. Click `Add refers` and type the ID of the speed bump polygon you created.
+
+You can see these steps in the speed bump creating demonstration video:
+
+![type:video](https://youtube.com/embed/EenccStyZVg)
+
+### Testing the created speed bump element with planning simulator
+
+After completing the map, we need to save it.
+To do that, please click `File` --> `Export Lanelet2Maps`, then download.
+
+After the download is finished,
+we need to put the lanelet2 map and pointcloud map in the same location.
+The directory structure should be like this:
+
+```diff
++ <YOUR-MAP-FOLDER>/
++ ├─ pointcloud_map.pcd
++ └─ lanelet2_map.osm
+```
+
+If your .osm or .pcd map file's name is different from these names,
+you need to update autoware.launch.xml:
+
+```diff
+- <arg name="lanelet2_map_file" default="lanelet2_map.osm" description="lanelet2 map file name"/>
++ <arg name="lanelet2_map_file" default="<YOUR-LANELET-MAP-NAME>.osm" description="lanelet2 map file name"/>
+- <arg name="pointcloud_map_file" default="pointcloud_map.pcd" description="pointcloud map file name"/>
++ <arg name="pointcloud_map_file" default="<YOUR-POINTCLOUD-MAP-NAME>.pcd" description="pointcloud map file name"/>
+```
+
+!!! note
+
+    The speed bump module is not enabled by default. To enable it, please uncomment it in your [behavior_velocity_planner.param.yaml](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_velocity_planner/behavior_velocity_planner.param.yaml).
+
+Now we are ready to launch the planning simulator:
+
+```bash
+ros2 launch autoware_launch planning_simulator.launch.xml map_path:=<YOUR-MAP-PATH> vehicle_model:=<YOUR-VEHICLE-MODEL> sensor_model:=<YOUR-SENSOR-KIT>
+```
+
+Example for tutorial_vehicle:
+
+```bash
+ros2 launch autoware_launch planning_simulator.launch.xml map_path:=$HOME/Files/autoware_map/tutorial_map/ vehicle_model:=tutorial_vehicle sensor_model:=tutorial_vehicle_sensor_kit vehicle_id:=tutorial_vehicle
+```
+
+1. Click `2D Pose Estimate` button on rviz or press `P` and give a pose for initialization.
+2. Click `2D Goal Pose` button on rviz or press `G` and give a pose for goal point.
+3. You can see the speed bump marker on the rviz screen.
+
+Speed bump markers on rviz:
+
+<figure markdown>
+  ![speed-bump-test](images/speed-bump-test.png){ align=center }
+  <figcaption>Speed bump test on the created map.</figcaption>
+</figure>
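A working speed bump needs two pieces in the export: the polygon typed `speed_bump` and the regulatory element with subtype `speed_bump` that refers to it, so a tag count below two usually means one half is missing. A minimal check, assuming the default `lanelet2_map.osm` export name and one tag per line in the OSM output:

```bash
# Count tags whose value is "speed_bump" (polygon type + regulatory subtype).
# Assumption: default export name lanelet2_map.osm, one tag per line.
if [ -f lanelet2_map.osm ]; then
  echo "speed_bump tags: $(grep -c 'v="speed_bump"' lanelet2_map.osm)"
else
  echo "lanelet2_map.osm not found (adjust the file name to your export)"
fi
```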
+ +You can check your speed bump elements in the planning simulator as this demonstration video: + +![type:video](https://youtube.com/embed/rg_a-ipdNAY) diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/images/stop-line-test.png b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/images/stop-line-test.png new file mode 100644 index 00000000000..aa7a36fc919 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/images/stop-line-test.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/index.md new file mode 100644 index 00000000000..7342ea0ab96 --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/stop-line/index.md @@ -0,0 +1,73 @@ +# Stop Line + +Behavior velocity planner's [stop line module](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_velocity_stop_line_module/) plans velocity +to stop right before stop lines and restart driving after stopped. +In order to operate that, we will add stop line attribute to our lanelet2 map. + +## Creating a stop line regulatory element + +In order to create a stop line on your pointcloud map, please follow these steps: + +1. Please select lanelet which stop line to be added. +2. Click `Abstraction` button on top panel. +3. Select `Stop Line` from the panel. +4. 
Click on the desired area for inserting stop line. + +You can see these steps in the stop line creating demonstration video: + +![type:video](https://youtube.com/embed/cgTSA50Yfyo) + +### Testing created the stop line element with planning simulator + +After the completing of creating the map, we need to save it. +To that please click `File` --> `Export Lanelet2Maps` then download. + +After the download is finished, +we need to put lanelet2 map and pointcloud map on the same location. +The directory structure should be like this: + +```diff ++ / ++ ├─ pointcloud_map.pcd ++ └─ lanelet2_map.osm +``` + +If your .osm or .pcd map file's name is different from these names, +you need to update autoware.launch.xml: + +```diff + +- ++ +- ++ +``` + +Now we are ready to launch the planning simulator: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:= vehicle_model:= sensor_model:= +``` + +Example for tutorial_vehicle: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:=$HOME/Files/autoware_map/tutorial_map/ vehicle_model:=tutorial_vehicle sensor_model:=tutorial_vehicle_sensor_kit vehicle_id:=tutorial_vehicle +``` + +1. Click `2D Pose Estimate` button on rviz or press `P` and give a pose for initialization. +2. Click `2D Goal Pose` button on rviz or press `G` and give a pose for goal point. +3. You can see the stop line marker on the rviz screen. + +Stop line markers on rviz: + +
+<figure markdown>
+  ![stop-line-test](images/stop-line-test.png){ align=center }
+  <figcaption>Stop line test on the created map.</figcaption>
+</figure>
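You can also confirm the stop line from the exported file itself and note its element ID for later debugging. This sketch assumes the default `lanelet2_map.osm` name and the typical OSM layout where the element's `id` attribute appears on the line just before its `type` tag — both are assumptions about your export:

```bash
# Print the element IDs of linestrings tagged as stop lines.
# Assumption: id="..." appears one line above the stop_line tag.
if [ -f lanelet2_map.osm ]; then
  grep -B1 'v="stop_line"' lanelet2_map.osm | grep -o 'id="[0-9]*"' || echo "no stop lines found"
else
  echo "lanelet2_map.osm not found (adjust the file name to your export)"
fi
```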
+ +You can check your stop line elements in the planning simulator as this demonstration video: + +![type:video](https://youtube.com/embed/cAQ_ulo7LHo) diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/images/traffic-light-test.png b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/images/traffic-light-test.png new file mode 100644 index 00000000000..4f231abc4ad Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/images/traffic-light-test.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/index.md new file mode 100644 index 00000000000..4acbe78a01b --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/creating-vector-map/traffic-light/index.md @@ -0,0 +1,76 @@ +# Traffic light + +Behavior velocity planner's [traffic light module](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_velocity_traffic_light_module/) plans velocity +according to the traffic light status. +In order to operate that, we will add traffic light attribute to our lanelet2 map. + +## Creating a traffic light regulatory element + +In order to create a traffic light on your pointcloud map, please follow these steps: + +1. Please select lanelet which traffic light to be added. +2. Click `Abstraction` button on top panel. +3. 
Select `Traffic Light` from the panel. +4. Click on the desired area for inserting traffic light. + +You can see these steps in the traffic-light creating demonstration video: + +![type:video](https://youtube.com/embed/P3xcayPkTOg) + +### Testing created the traffic light element with planning simulator + +After the completing of creating the map, we need to save it. +To that please click `File` --> `Export Lanelet2Maps` then download. + +After the download is finished, +we need to put lanelet2 map and pointcloud map on the same location. +The directory structure should be like this: + +```diff ++ / ++ ├─ pointcloud_map.pcd ++ └─ lanelet2_map.osm +``` + +If your .osm or .pcd map file's name is different from these names, +you need to update autoware.launch.xml: + +```diff + +- ++ +- ++ +``` + +Now we are ready to launch the planning simulator: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:= vehicle_model:= sensor_model:= +``` + +Example for tutorial_vehicle: + +```bash +ros2 launch autoware_launch planning_simulator.launch.xml map_path:=$HOME/Files/autoware_map/tutorial_map/ vehicle_model:=tutorial_vehicle sensor_model:=tutorial_vehicle_sensor_kit vehicle_id:=tutorial_vehicle +``` + +1. Click `2D Pose Estimate` button on rviz or press `P` and give a pose for initialization. +2. Click `Panels` -> `Add new panel`, select `TrafficLightPublishPanel`, and then press `OK`. +3. In TrafficLightPublishPanel, set the ID and color of the traffic light. +4. Then, Click `SET` and `PUBLISH` button. +5. Click `2D Goal Pose` button on rviz or press `G` and give a pose for goal point. +6. You can see the traffic light marker on the rviz screen if you set the traffic light color as `RED`. + +Traffic Light markers on rviz: + +
+<figure markdown>
+  ![traffic-light-test](images/traffic-light-test.png){ align=center }
+  <figcaption>Traffic light test on the created map.</figcaption>
+</figure>
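The `TrafficLightPublishPanel` asks for a traffic light ID, and you can read candidate IDs straight out of the exported map instead of hunting for them in the editor. A sketch under the same assumptions as before (default `lanelet2_map.osm` name, `id` attribute on the line preceding the `traffic_light` tag):

```bash
# List element IDs associated with traffic_light tags in the export.
# Assumption: id="..." appears one line above the traffic_light tag.
if [ -f lanelet2_map.osm ]; then
  grep -B1 'v="traffic_light"' lanelet2_map.osm | grep -o 'id="[0-9]*"' || echo "no traffic lights found"
else
  echo "lanelet2_map.osm not found (adjust the file name to your export)"
fi
```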
+ +You can check your traffic light elements in the planning simulator as this demonstration video: + +![type:video](https://youtube.com/embed/AaFT24uqbJk) diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/LIO-SAM-imu-direction.png b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/LIO-SAM-imu-direction.png new file mode 100644 index 00000000000..a8a33b7eb37 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/LIO-SAM-imu-direction.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/LIO-SAM-output.png b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/LIO-SAM-output.png new file mode 100644 index 00000000000..6c9db62a2ca Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/LIO-SAM-output.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/pcd-map.png b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/pcd-map.png deleted file mode 100644 index d134d1dbaa6..00000000000 Binary files a/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/images/pcd-map.png and /dev/null differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/index.md index c8854637498..40205476987 100644 --- a/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/index.md +++ b/docs/how-to-guides/integrating-autoware/creating-maps/open-source-slam/lio-sam/index.md @@ -1,8 +1,8 @@ -# LIO_SAM +# LIO-SAM -## What is LIO_SAM? +## What is LIO-SAM? 
-- A framework that achieves highly accurate, real-time mobile robot trajectory estimation and map-building. It formulates lidar-inertial odometry atop a factor graph, allowing a multitude of relative and absolute measurements, including loop closures, to be incorporated from different sources as factors into the system +- A framework that achieves highly accurate, real-time mobile robot trajectory estimation and map-building. It formulates lidar-inertial odometry atop a factor graph, allowing a multitude of relative and absolute measurements, including loop closures, to be incorporated from different sources as factors into the system. ## Repository Information @@ -12,64 +12,245 @@ ### Required Sensors -- LIDAR [Livox, Velodyne, Ouster] +- LIDAR [Livox, Velodyne, Ouster, Robosense*] - IMU [9-AXIS] - GPS [OPTIONAL] -

drawing

+\*Robosense lidars aren't supported officially, but their Helios series can be used as Velodyne lidars. + +The system architecture of the LIO-SAM method is described in the following diagram; +please refer to the official repository for more information. + +
+ ![lio-sam-architecture](images/system.png){ align=center width="960"} +
+ System Architecture of LIO-SAM +
+
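To build intuition for the factor-graph formulation described above, here is a deliberately tiny one-dimensional analogue in plain Python (our own illustration, not LIO-SAM code, which optimizes full 6-DoF poses with GTSAM): three poses constrained by a prior, two relative odometry factors, and one absolute GPS-like factor, solved by gradient descent on the sum of squared residuals.

```python
# Toy 1-D pose graph (illustration only; LIO-SAM uses GTSAM/iSAM2 internally).
# Factors: prior x0 ~ 0, odometry x1 - x0 ~ 1, odometry x2 - x1 ~ 1,
# and a GPS-like absolute measurement x2 ~ 2.2 that disagrees slightly with odometry.
def solve_pose_graph(iters=2000, lr=0.1):
    x = [0.0, 0.0, 0.0]
    for _ in range(iters):
        r_prior = x[0] - 0.0        # prior residual
        r_odo1 = x[1] - x[0] - 1.0  # first odometry residual
        r_odo2 = x[2] - x[1] - 1.0  # second odometry residual
        r_gps = x[2] - 2.2          # absolute (GPS) residual
        # Gradient of the summed squared residuals w.r.t. each pose.
        g = [2 * r_prior - 2 * r_odo1,
             2 * r_odo1 - 2 * r_odo2,
             2 * r_odo2 + 2 * r_gps]
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

poses = solve_pose_graph()
print([round(p, 3) for p in poses])
```

The optimizer balances all factors at once: the GPS factor pulls every pose slightly forward instead of correcting only the last one, which is exactly why loop closures and absolute measurements can be added to the graph "as factors" from different sources.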
+ +We are using [Robosense Helios 5515](https://www.robosense.ai/en/rslidar/RS-Helios) and [CLAP B7](https://en.unicorecomm.com/assets/upload/file/CLAP-B7_Product_Brief_En.pdf) sensors on tutorial_vehicle, +so we will use these sensors for running LIO-SAM. + +Additionally, LIO-SAM was tested with an [Applanix POS LVX](https://www.applanix.com/downloads/products/specs/POS_LVX-Datasheet.pdf) and [Hesai Pandar XT32](https://www.hesaitech.com/product/xt32/) sensor setup. Additional information +about these sensors is provided on this page. ### ROS Compatibility -- ROS 1 -- [For ROS 2](https://github.com/TixiaoShan/LIO-SAM/tree/ros2) +Since Autoware currently uses ROS 2 Humble, we will continue with the ROS 2 version of LIO-SAM. + +- [ROS](https://github.com/TixiaoShan/LIO-SAM/tree/master) +- [ROS 2](https://github.com/TixiaoShan/LIO-SAM/tree/ros2) (also compatible with the Humble distro) ### Dependencies -- ROS -- PCL -- [GTSAM](https://gtsam.org/get_started/) (Georgia Tech Smoothing and Mapping library) +ROS 2 dependencies: - ```bash - sudo add-apt-repository ppa:borglab/gtsam-release-4.0 - sudo apt install libgtsam-dev libgtsam-unstable-dev - ``` +- [perception-pcl](https://github.com/ros-perception/perception_pcl) +- [pcl-msgs](https://github.com/ros-perception/pcl_msgs/tree/ros2) +- [vision-opencv](https://github.com/ros-perception/vision_opencv/tree/humble) +- [xacro](https://github.com/ros/xacro/tree/ros2) + +To install these dependencies, you can use this bash command in your terminal: ```bash - sudo apt-get install -y ros-melodic-navigation - sudo apt-get install -y ros-melodic-robot-localization - sudo apt-get install -y ros-melodic-robot-state-publisher +sudo apt install ros-humble-perception-pcl \ + ros-humble-pcl-msgs \ + ros-humble-vision-opencv \ + ros-humble-xacro +``` + +Other dependencies: + +- [gtsam](https://gtsam.org/get_started/) (Georgia Tech Smoothing and Mapping library) + +To install GTSAM, you can use this bash command in your 
terminal: + +```bash + # Add GTSAM-PPA + sudo add-apt-repository ppa:borglab/gtsam-release-4.1 + sudo apt install libgtsam-dev libgtsam-unstable-dev ``` ## Build & Run -### 1) Build +### 1) Installation + +In order to use and build LIO-SAM, we will create a workspace for it: ```bash - mkdir -p ~/catkin_lio_sam/src - cd ~/catkin_lio_sam/src - git clone https://github.com/TixiaoShan/LIO-SAM.git + mkdir -p ~/lio-sam-ws/src + cd ~/lio-sam-ws/src + git clone -b ros2 https://github.com/TixiaoShan/LIO-SAM.git cd .. - catkin_make - source devel/setup.bash + colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release +``` + +### 2) Settings + +After building LIO-SAM, +we need to record a ROS 2 bag file including the necessary topics for LIO-SAM. +The necessary topics are described in the [config file](https://github.com/leo-drive/LIO-SAM/blob/4938d3bb4423b76bf5aa22556dd755526b03a253/config/params.yaml#L4-L8) of LIO-SAM. + +??? note "ROS 2 Bag example for LIO-SAM with Robosense Helios and CLAP B7" + + ```sh + Files: map_bag_13_09_0.db3 + Bag size: 38.4 GiB + Storage id: sqlite3 + Duration: 3295.326s + Start: Sep 13 2023 16:40:23.165 (1694612423.165) + End: Sep 13 2023 17:35:18.492 (1694615718.492) + Messages: 1627025 + Topic information: Topic: /sensing/gnss/clap/ros/imu | Type: sensor_msgs/msg/Imu | Count: 329535 | Serialization Format: cdr + Topic: /sensing/gnss/clap/ros/odometry | Type: nav_msgs/msg/Odometry | Count: 329533 | Serialization Format: cdr + Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 32953 | Serialization Format: cdr + + ``` + +**Note:** +We set `use_odometry` to true in [clap_b7_driver](https://github.com/Robeff-Technology/clap_b7_driver/tree/dev/autoware) to publish the GPS odometry topic from the navsatfix message. + +Please set the topic names and sensor settings in `lio_sam/config/params.yaml`. +Here are some example modifications for our tutorial_vehicle. 
+ +- Topic names: + +```diff +- pointCloudTopic: "/points" ++ pointCloudTopic: "/sensing/lidar/top/pointcloud_raw" +- imuTopic: "/imu/data" ++ imuTopic: "/sensing/gnss/clap/ros/imu" + odomTopic: "odometry/imu" +- gpsTopic: "odometry/gpsz" ++ gpsTopic: "/sensing/gnss/clap/ros/odometry" +``` + +Since we will use GPS information with Autoware, +we need to enable the `useImuHeadingInitialization` parameter. + +- GPS settings: + +```diff +- useImuHeadingInitialization: false ++ useImuHeadingInitialization: true +- useGpsElevation: false ++ useGpsElevation: true +``` + +We will also update the sensor settings. +Since Robosense lidars aren't officially supported, +we will configure our 32-channel Robosense Helios 5515 lidar as a Velodyne: + +- Sensor settings: + +```diff +- sensor: ouster ++ sensor: velodyne +- N_SCAN: 64 ++ N_SCAN: 32 +- Horizon_SCAN: 512 ++ Horizon_SCAN: 1800 +``` + +After that, +we will update the extrinsic transformation between the Robosense lidar and the CLAP B7 GNSS/INS (IMU) system. + +- Extrinsic transformation: + +```diff +- extrinsicTrans: [ 0.0, 0.0, 0.0 ] ++ extrinsicTrans: [-0.91, 0.0, -1.71] +- extrinsicRot: [-1.0, 0.0, 0.0, +- 0.0, 1.0, 0.0, +- 0.0, 0.0, -1.0 ] ++ extrinsicRot: [1.0, 0.0, 0.0, ++ 0.0, 1.0, 0.0, ++ 0.0, 0.0, 1.0 ] +- extrinsicRPY: [ 0.0, 1.0, 0.0, +- -1.0, 0.0, 0.0, +- 0.0, 0.0, 1.0 ] ++ extrinsicRPY: [ 1.0, 0.0, 0.0, ++ 0.0, 1.0, 0.0, ++ 0.0, 0.0, 1.0 ] + +``` + +!!! warning + + The mapping direction follows the direction of travel in the real world. + If the LiDAR sensor faces backwards relative to the direction you are moving, + then you need to change `extrinsicRot` accordingly. + Otherwise the IMU appears to move in the wrong direction, which may cause problems. + +For example, in our Applanix POS LVX and Hesai Pandar XT32 setup, the IMU was aligned with the direction of travel and +the LiDAR was rotated 180 degrees about the Z-axis relative to the IMU. In other words, they were facing in opposite +directions. 
The tool therefore needs an additional transformation for the IMU in such a case. + +- In that situation, the calibration parameters were changed as follows: + +```diff +- extrinsicRot: [-1.0, 0.0, 0.0, +- 0.0, 1.0, 0.0, +- 0.0, 0.0, -1.0 ] ++ extrinsicRot: [-1.0, 0.0, 0.0, ++ 0.0, -1.0, 0.0, ++ 0.0, 0.0, 1.0 ] +- extrinsicRPY: [ 0.0, 1.0, 0.0, +- -1.0, 0.0, 0.0, +- 0.0, 0.0, 1.0 ] ++ extrinsicRPY: [ -1.0, 0.0, 0.0, ++ 0.0, -1.0, 0.0, ++ 0.0, 0.0, 1.0 ] ``` -### 2) Set parameters +- In the end, we got this transform visualization in RViz: + +
+ ![lio-sam-imu-direction](images/LIO-SAM-imu-direction.png){ align=center width="512"} +
+ Transform Visualization of Applanix POS LVX and Hesai Pandar XT32 in RViz +
+
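If you are unsure which rotation matrix corresponds to a given mounting offset, you can derive it rather than guess. The plain-Python sketch below (our own illustration, not part of LIO-SAM) builds the rotation matrix for a yaw offset about the Z-axis; a 180-degree yaw reproduces the `[-1, -1, 1]` diagonal used in the Applanix/Hesai example above.

```python
import math

def yaw_rotation_matrix(yaw_deg):
    """Right-handed rotation of yaw_deg degrees about the Z-axis."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [[c, -s, 0.0],
            [s, c, 0.0],
            [0.0, 0.0, 1.0]]

# A 180-degree yaw offset between the LiDAR and IMU frames; rounding cleans up
# the ~1e-16 floating-point residue from sin(pi), and "+ 0.0" normalizes -0.0.
R = [[round(v, 6) + 0.0 for v in row] for row in yaw_rotation_matrix(180.0)]
print(R)
```

The same approach extends to roll and pitch offsets by composing rotation matrices about the X and Y axes, which is often less error-prone than writing the nine entries by hand.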
+ +Now, we are ready to create a map for Autoware. -- Set topics and sensor settings on `lio_sam/config/params.yaml` +### 3) Usage + +Once you have set the configuration and created a bag file for LIO-SAM, you can launch LIO-SAM with: + +```bash +ros2 launch lio_sam run.launch.py +``` -### 3) Run +An RViz2 window will open; then you can play your bag file: ```bash - # Run the Launch File - roslaunch lio_sam run.launch +ros2 bag play +``` - # Play bag file in the other terminal - rosbag play xxx.bag --clock +When the mapping process is finished, you can save the map by calling this service: + +```bash +ros2 service call /lio_sam/save_map lio_sam/srv/SaveMap "{resolution: 0.2, destination: }" +``` + +Here is a demonstration video of LIO-SAM mapping in our campus environment: + +![type:video](https://youtube.com/embed/0EX9U95oecw) + +The output map format is local UTM; +we will convert the local UTM map to MGRS format for tutorial_vehicle. +Also, if you want to convert a UTM map to MGRS for Autoware, +please follow the [convert-utm-to-mgrs-map](../../converting-utm-to-mgrs-map) page. + ## Example Result -

drawing

+
+ ![lio-sam-output](images/LIO-SAM-output.png){ align=center width="512"} +
+ Sample Map Output for our Campus Environment +
+
## Paper diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/.pages b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/.pages new file mode 100644 index 00000000000..35fd5a113be --- /dev/null +++ b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/.pages @@ -0,0 +1,2 @@ +nav: + - index.md diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/select-map.png b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/select-map.png new file mode 100644 index 00000000000..4b090cbee7c Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/select-map.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/space-subsampling.png b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/space-subsampling.png new file mode 100644 index 00000000000..4e5532c2541 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/space-subsampling.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/subsampled-map.png b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/subsampled-map.png new file mode 100644 index 00000000000..15d4a887d78 Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/images/subsampled-map.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/index.md b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/index.md new file mode 100644 index 00000000000..e0e7336c2e5 --- /dev/null +++ 
b/docs/how-to-guides/integrating-autoware/creating-maps/pointcloud-map-downsampling/index.md @@ -0,0 +1,62 @@ +# Pointcloud map downsampling + +## Overview + +When your created point cloud map is either too dense or too large (i.e., exceeding 300 MB), +you may want to downsample it for improved computational and memory efficiency. +You can also consider using dynamic map loading with partial loading; +please check the [map_loader package](https://github.com/autowarefoundation/autoware.universe/tree/main/map/map_loader) for more information. + +For the tutorial_vehicle implementation we will use the whole map, +so we will downsample it using [CloudCompare](https://www.cloudcompare.org/main.html). + +## Installing CloudCompare + +You can install it via snap: + +```bash +sudo snap install cloudcompare +``` + +Please check the [official page](https://www.cloudcompare.org/release/index.html#CloudCompare) +for other installation options. + +## Downsampling a pointcloud map + +There are three [subsampling methods in CloudCompare](https://www.cloudcompare.org/doc/wiki/index.php/Edit%5CSubsample); +we use the `Space` method for subsampling, but you can use the other methods if you prefer. + +1. Please open CloudCompare and drag your pointcloud into it, then select your pointcloud map by clicking on it in the DB Tree panel. +2. Then click the `Subsample` button on the top panel. + +
+ ![db-tree-panel](images/select-map.png){ align=center } +
+ CloudCompare +
+
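As an aside, CloudCompare's `Space` method enforces a minimum distance between the points it keeps. The closely related voxel-grid idea can be sketched in a few lines of plain Python (our own illustration, not CloudCompare's implementation) if you ever want to script the downsampling instead: points are bucketed into cubes of `voxel_size` and each occupied cube is replaced by its centroid.

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Replace all points falling in the same voxel with their centroid."""
    buckets = defaultdict(list)
    for p in points:
        # Integer voxel index per axis identifies the cube this point falls in.
        key = tuple(int(c // voxel_size) for c in p)
        buckets[key].append(p)
    # One representative (centroid) per occupied voxel.
    return [tuple(sum(cs) / len(ps) for cs in zip(*ps)) for ps in buckets.values()]

# Two nearby points collapse into one representative; the far point survives.
cloud = [(0.00, 0.00, 0.0), (0.05, 0.05, 0.0), (1.00, 1.00, 0.0)]
print(voxel_downsample(cloud, voxel_size=0.2))
```

Libraries such as PCL or Open3D provide production-quality voxel-grid filters; the sketch above is only meant to show why the chosen spacing value directly controls the output density.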
+ +1. Please select your subsampling method; we will use `Space` for tutorial_vehicle. +2. Then you can set the options. For example, we need to determine the minimum space between points. (Please be careful here: the appropriate value depends on your map size, computer performance, etc.) We will set this value to 0.2 for tutorial_vehicle's map. + +
+ ![space-subsampling](images/space-subsampling.png){ align=center width="512" } +
+ Pointcloud subsampling +
+
+ +- After the subsampling process is finished, + you should select the subsampled pointcloud in the DB Tree panel as well. + +
+ ![db-tree-panel](images/subsampled-map.png){ align=center width="360" } +
+ Select your downsampled pointcloud +
+
+
+ +Now, +you can save your downsampled pointcloud with `Ctrl + S` +or by clicking the save button in the `File` menu. +Then, this pointcloud can be used by Autoware. diff --git a/docs/reference-hw/.pages b/docs/reference-hw/.pages index adb1a5ea36a..09ca91592b3 100644 --- a/docs/reference-hw/.pages +++ b/docs/reference-hw/.pages @@ -10,3 +10,4 @@ nav: - vehicle_platform_suppliers.md - remote_drive.md - full_drivers_list.md + - ad_sensor_kit_suppliers.md diff --git a/docs/reference-hw/ad-computers.md b/docs/reference-hw/ad-computers.md index af39cbd9bb9..509287bca23 100644 --- a/docs/reference-hw/ad-computers.md +++ b/docs/reference-hw/ad-computers.md @@ -2,22 +2,25 @@ ## **ADLINK In-Vehicle Computers** +![ad_comp-adlink.png](images/ad_comp-adlink.png) + ADLINK solutions which is used for autonomous driving and tested by one or more community members are listed below: -| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | -| ------------------------------- | ------------------------------ | ------------------------ | ------------------------------------------------------------ | ---------------------------------------------- | --------------------- | -| AVA-351001 | Intel® Xeon® E-2278GE | Dual RTX 5000 | 64GB RAM,CAN, USB, 10G Ethernet, DIO, Hot-Swap SSD, USIM | 9~36 VDC, MIL-STD-810H,ISO 7637-2 & SAE 113-11 | Y | -| SOAFEE’s AVA Developer Platform | Ampere Altra ARMv8 | optional | USB, Ethernet, DIO, M.2 NVMe SSDs | 110/220 AC | Y | -| RQX-58G | Carmel ARMv8.2 2.26GHz | Nvidia Jetson AGX Xavier | USB, Ethernet, M.2 NVME SSD, CAN, USIM, GMSL2 Camera support | 9~36VDC | Y | -| RQX-59G | 8-core Arm® Cortex®-A78AE v8.2 | Nvidia Jetson AGX Orin | USB, Ethernet, M.2 NVME SSD, CAN, USIM, GMSL2 Camera support | 9~36VDC | N | -| SOAFEE’s AVA AP1 | Ampere Altra ARMv8 | optional | CAN, USB, Ethernet, DIO, M.2 NVMe SSDs | 12 Volt | Y | +| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | +| 
------------------------------- | --------------------- | ------------------------ | ------------------------------------------------------------ | ---------------------------------------------------------- | --------------------- | +| AVA-3510 | Intel® Xeon® E-2278GE | Dual MXM RTX 5000 | 64GB RAM,CAN, USB, 10G Ethernet, DIO, Hot-Swap SSD, USIM | 9~36 VDC, MIL-STD-810H,ISO 7637-2 | Y | +| SOAFEE’s AVA Developer Platform | Ampere Altra ARMv8 | optional | USB, Ethernet, DIO, M.2 NVMe SSDs | 110/220 AC | Y | +| RQX-58G | 8-core Arm | Nvidia Jetson AGX Xavier | USB, Ethernet, M.2 NVME SSD, CAN, USIM, GMSL2 Camera support | 9~36VDC, IEC 60068-2-64: Operating 3Grms, 5-500 Hz, 3 axes | Y | +| RQX-59G | 8-core Arm | Nvidia Jetson AGX Orin | USB, Ethernet, M.2 NVME SSD, CAN, USIM, GMSL2 Camera support | 9~36VDC, IEC 60068-2-64: Operating 3Grms, 5-500 Hz, 3 axes | - | Link to company website is [here.](https://www.adlinktech.com/en/Connected-Autonomous-Vehicle-Solutions) ## **NXP In-Vehicle Computers** +![ad_comp-nxp.png](images/ad_comp-nxp.png) + NXP solutions which is used for autonomous driving and tested by one or more community members are listed below: | Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | @@ -28,20 +31,42 @@ Link to company website is [here.](https://www.nxp.com/design/designs/bluebox-3- ## **Neousys In-Vehicle Computers** +![ad_comp-neousys.png](images/ad_comp-neousys.png) + Neousys solutions which is used for autonomous driving and tested by one or more community members are listed below: -| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | -| ----------------------- | --------------------- | --------------------------- | --------------------------------------------------- | ----------------------------------------------- | --------------------- | -| 8208-GC | Intel® Xeon® E-2278GE | Dual RTX 2080ti or RTX 3070 | 128 GB RAM,CAN, USB, Ethernet, Serial, Hot-Swap 
SSD | 8-35 VoltVibration:MIL-STD810G 5-500 Hz, 3 axes | - | +| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | +| ----------------------- | -------------------------- | --------------------------------- | ---------------------------------------------------------------------------------- | ------------------------------------------------------------ | --------------------- | +| 8805-GC | AMD® EPYC™ 7003 | NVIDIA® RTX A6000/ A4500 | 512GB CAN, USB, Ethernet, Serial, Easy-Swap SSD | 8-48 Volt, Vibration: MIL-STD-810G, Method 514.6, Category 4 | Y | +| 10208-GC | Intel® 13th/12th-Gen Core™ | Dual 350W NVIDIA® RTX GPU | 64GB CAN, USB, Ethernet, Serial, M2 NVMe SSD | 8~48 Volt, Vibration: MIL-STD-810H, Method 514.8, Category 4 | Y | +| 9160-GC | Intel® 13th/12th-Gen Core™ | NVIDIA® RTX series up to 130W TDP | 64GB CAN, USB, Ethernet, PoE, Serial, two 2.5" SATA HDD/SSD with RAID, M2 NVMe SSD | 8~48 Volt, Vibration: MIL-STD-810G, Method 514.6, Category 4 | - | -Link to company website is [here.](http://bit.ly/neousys8208GC) +Link to company website is [here.](https://www.neousys-tech.com/en/product/product-lines/edge-ai-gpu-computing) ## **Crystal Rugged In-Vehicle Computers** +![ad_comp-crystal_rugged.png](images/ad_comp-crystal_rugged.png) + Crystal Rugged solutions which is used for autonomous driving and tested by one or more community members are listed below: -| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | -| ----------------------- | --------------------- | ------------------- | ------------------------------------------------ | ---------------------------------------------- | --------------------- | -| AVC 0161-AC | Intel® Xeon® Scalable | Dual GPU RTX Series | 2TB RAM,CAN, USB, Ethernet, Serial, Hot-Swap SSD | 10-32 VoltVibration:2 G RMS 10-1000 Hz, 3 axes | - | +| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | +| 
----------------------- | ------------------------------------------------ | ----------------------- | -------------------------------------------------------- | ----------------------------------------------------------------------------- | --------------------- | +| AVC 0161-AC | Intel® Xeon® Scalable | Dual GPU RTX Series | 2TB RAM,CAN, USB, Ethernet, Serial, Hot-Swap SSD | 10-32 Volt, Vibration: 2 G RMS 10-1000 Hz, 3 axes | - | +| AVC0403 | Intel® Xeon® Scalable or AMD EPYC™ | Optional (5 GPU) | 2TB RAM, CAN, USB, Ethernet, Serial, Hot-Swap SSD | 10-32 Volt, Vibration: 2 G RMS 10-1000 Hz, 3 axes | - | +| AVC1322 | Intel® Xeon® D-1718T or Gen 12/13 Core™ i3/i5/i7 | NVIDIA® Jetson AGX Orin | 128 GB DDR4 RAM, USB, Ethernet, Serial, SATA 2.5” SSD | 10-36 Volt, Vibration: 5.5g, 5-2,000Hz, 60 min/axis, 3 axis | - | +| AVC1753 | 10th Generation Intel® Core™ and Xeon® | Optional (1 GPU) | 128 GB DDR4 RAM, USB, Ethernet, NVMe U.2 SSD/ 3 SATA SSD | 8-36 VDC/ 120-240VAC 50/60Hz, Vibration: 5.5g, 5-2,000Hz, 60 min/axis, 3 axis | - | + +Link to company website is [here.](https://www.crystalrugged.com/products/ai-autonomous-vehicle-technology/) + +## **Vecow In-Vehicle Computers** + +![ad_comp-vecow.png](images/ad_comp-vecow.png) + +Vecow solutions which is used for autonomous driving and tested by one or more community members are listed below: + +| Supported Products List | CPU | GPU | RAM, Interfaces | Environmental | Autoware Tested (Y/N) | +| ----------------------- | -------------------------- | ------------------------------------- | ----------------------------------------------------------------- | ------------------------------------------------------------ | --------------------- | +| ECX-3800 PEG | Intel® 13th/12th-Gen Core™ | 200W power of NVIDIA® or AMD graphics | 64GB RAM, CAN, USB, Ethernet, PoE, Serial, M.2/SATA SSD, SIM Card | 12-50 Volt, Vibration: MIL-STD-810G, Procedure I, 20°C to 45°C | - | +| IVX-1000 | Intel® 13th/12th-Gen Core™ | NVIDIA Quadro® MXM 
Graphics | 64GB RAM, Ethernet, PoE, Serial, M.2/SATA/mSATA SSD, SIM Card | 16-160 Volt, Vibration: IEC 61373 : 2010, 40°C to 85°C | - | -Link to company website is [here.](https://www.crystalrugged.com/product/AVC0161-Ai-Autonomy-Solution/) +Link to company website is [here.](https://www.vecow.com/dispPageBox/vecow/VecowHp.aspx?ddsPageID=VECOW_EN) diff --git a/docs/reference-hw/ad_sensor_kit_suppliers.md b/docs/reference-hw/ad_sensor_kit_suppliers.md new file mode 100644 index 00000000000..0fb20b6aa68 --- /dev/null +++ b/docs/reference-hw/ad_sensor_kit_suppliers.md @@ -0,0 +1,40 @@ +# AD Sensor Kit Suppliers + +## **LEO Drive AD Sensor Kit** + +![images/ad_kit-leodrive.png](images/ad_kit-leodrive.png) + +LEO Drive Autonomy Essentials Kit contents are listed below: + +| Supported Products List | Camera | Lidar | GNSS/INS | ROS 2 Support | Autoware Tested (Y/N) | +| ----------------------- | ----------------------- | ------------------------------------------------------------------ | ---------------- | ------------- | --------------------- | +| Autonomy Essentials Kit | 8x Lucid Vision TRI054S | 4x Velodyne Puck
1x Velodyne Alpha Prime
1x RoboSense Bpearl | 1x SBG Ellipse-D | Y | Y | + +Link to company website: +[https://leodrive.ai/](https://leodrive.ai/) + +## **TIER IV AD Kit** + +![images/ad_kit-tieriv.png](images/ad_kit-tieriv.png) + +TIER IV sensor fusion system contents are listed below: + +| Supported Products List | Camera | Lidar | ECU | ROS 2 Support | Autoware Tested (Y/N) | +| ----------------------- | -------------- | -------------------------------- | -------------------------- | ------------- | --------------------- | +| TIER IV ADK | TIER IV C1, C2 | HESAI (AT-128,XT-32)
Velodyne | ADLINK (RQX-58G, AVA-3510) | Y | Y | + +Link to company website: +[https://sensor.tier4.jp/sensor-fusion-system](https://sensor.tier4.jp/sensor-fusion-system) + +## **RoboSense AD Sensor Kit** + +![images/ad_kit-robosense.png](images/ad_kit-robosense.png) + +RoboSense L4 sensor fusion solution system contents are listed below: + +| Supported Products List | Camera | Lidar | ECU | ROS 2 Support | Autoware Tested (Y/N) | +| ----------------------- | ------ | ------------------------------------- | -------- | ------------- | --------------------- | +| P6 | - | 4x Automotive Grade Solid-state Lidar | Optional | - | - | + +Link to company website: +[https://www.robosense.ai/en/rslidar/RS-Fusion-P6](https://www.robosense.ai/en/rslidar/RS-Fusion-P6) diff --git a/docs/reference-hw/cameras.md b/docs/reference-hw/cameras.md index eddec0c5097..8b82c1f4e02 100644 --- a/docs/reference-hw/cameras.md +++ b/docs/reference-hw/cameras.md @@ -2,6 +2,8 @@ ## **TIER IV Automotive HDR Cameras** +![camera-tieriv.png](images/camera-tieriv.png) + [TIER IV's Automotive HDR cameras](https://sensor.tier4.jp/automotive-camera) which have ROS 2 driver and tested by TIER IV are listed below: | Supported Products List | MP | FPS | Interface | HDR | LFM | Trigger
/Synchronization | Ingress
Protection | ROS 2 Driver | Autoware
Tested (Y/N) | @@ -21,6 +23,8 @@ Link to product web site: ## **FLIR Machine Vision Cameras** +![camera-flir.png](images/camera-flir.png) + FLIR Machine Vision cameras which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | MP | FPS | Interface | HDR | LFM | Trigger
/Synchronization | Ingress
Protection | ROS 2 Driver | Autoware Tested (Y/N) | @@ -36,6 +40,8 @@ Link to company website: ## **Lucid Vision Cameras** +![camera-lucid_vision.png](images/camera-lucid_vision.png) + Lucid Vision cameras which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | MP | FPS | Interface | HDR | LFM | Trigger
/Synchronization | Ingress
Protection | ROS 2 Driver | Autoware Tested (Y/N) | @@ -50,6 +56,8 @@ Link to company website: ## **Allied Vision Cameras** +![camera-allied_vision.png](images/camera-allied_vision.png) + Allied Vision cameras which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | MP | FPS | Interface | HDR | LFM | Trigger
/Synchronization | Ingress
Protection | ROS 2 Driver | Autoware Tested (Y/N) | @@ -61,3 +69,19 @@ Link to ROS 2 driver: Link to company website: [https://www.alliedvision.com/en/products/camera-series/mako-g](https://www.alliedvision.com/en/products/camera-series/mako-g) + +## **Neousys Technology Camera** + +![images/camera-neousys.png](images/camera-neousys.png) + +Neousys Technology cameras which has ROS 2 driver and tested by one or more community members are listed below: + +| Supported Products List | MP | FPS | Interface | Sensor Format | Lens | ROS 2 Driver | Autoware Tested (Y/N) | +| ----------------------- | --- | --- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------- | ----------------------------------------------- | ------------ | --------------------- | +| AC-IMX390 | 2.0 | 30 | GMSL2
(over [PCIe-GL26 Grabber Card](https://www.neousys-tech.com/en/product/product-lines/in-vehicle-computing/vehicle-expansion-card/pcie-gl26-gmsl-frame-grabber-card)) | 1/2.7” | 5-axis active adjustment with adhesive dispense | Y | Y | + +Link to ROS 2 driver: +[https://github.com/ros-drivers/gscam](https://github.com/ros-drivers/gscam) + +Link to company website: +[https://www.neousys-tech.com/en/](https://www.neousys-tech.com/en/) diff --git a/docs/reference-hw/full_drivers_list.md b/docs/reference-hw/full_drivers_list.md index 46a4cb00f30..38b8ba100da 100644 --- a/docs/reference-hw/full_drivers_list.md +++ b/docs/reference-hw/full_drivers_list.md @@ -2,19 +2,22 @@ The list of all drivers listed above for easy access as a table with additional information: -| Type | Maker | Driver links | License | Maintainer | -| ------ | ----------------- | ------------------------------------------------------------------------------ | -------- | ---------------------------------------------- | -| Lidar | Velodyne
Hesai | [Link](https://github.com/tier4/nebula) | Apache 2 | david.wong@tier4.jp
abraham.monrroy@map4.jp | -| Lidar | Velodyne | [Link](https://github.com/ros-drivers/velodyne/tree/ros2/velodyne_pointcloud) | BSD | jwhitley@autonomoustuff.com | -| Lidar | Robosense | [Link](https://github.com/RoboSense-LiDAR/rslidar_sdk) | BSD | zdxiao@robosense.cn | -| Lidar | Hesai | [Link](https://github.com/HesaiTechnology/HesaiLidar_General_ROS) | Apache 2 | wuxiaozhou@hesaitech.com | -| Lidar | Leishen | [Link](https://github.com/leishen-lidar) | - | - | -| Lidar | Livox | [Link](https://github.com/Livox-SDK/livox_ros2_driver) | MIT | dev@livoxtech.com | -| Lidar | Ouster | [Link](https://github.com/ros-drivers/ros2_ouster_drivers) | Apache 2 | stevenmacenski@gmail.com
tom@boxrobotics.ai | -| Radar | smartmicro | [Link](https://github.com/smartmicro/smartmicro_ros2_radars) | Apache 2 | opensource@smartmicro.de | -| Camera | Flir | [Link](https://github.com/berndpfrommer/flir_spinnaker_ros2) | Apache 2 | bernd.pfrommer@gmail.com | -| Camera | Lucid Vision | [Link](https://gitlab.com/leo-drive/Drivers/arena_camera) | - | kcolak@leodrive.ai | -| Camera | Allied Vision | [Link](https://github.com/neil-rti/avt_vimba_camera) | Apache 2 | at@email.com | -| GNSS | NovAtel | [Link](https://github.com/swri-robotics/novatel_gps_driver/tree/dashing-devel) | BSD | preed@swri.org | -| GNSS | SBG Systems | [Link](https://github.com/SBG-Systems/sbg_ros2_driver) | MIT | support@sbg-systems.com | -| GNSS | PolyExplore | [Link](https://github.com/polyexplore/ROS2_Driver) | - | support@polyexplore.com | +| Type | Maker | Driver links | License | Maintainer | +| ------ | ----------------------- | ------------------------------------------------------------------------------ | -------- | --------------------------------------------------- | +| Lidar | Velodyne
Hesai | [Link](https://github.com/tier4/nebula) | Apache 2 | david.wong@tier4.jp
abraham.monrroy@map4.jp | +| Lidar | Velodyne | [Link](https://github.com/ros-drivers/velodyne/tree/ros2/velodyne_pointcloud) | BSD | jwhitley@autonomoustuff.com | +| Lidar | Robosense | [Link](https://github.com/RoboSense-LiDAR/rslidar_sdk) | BSD | zdxiao@robosense.cn | +| Lidar | Hesai | [Link](https://github.com/HesaiTechnology/HesaiLidar_General_ROS) | Apache 2 | wuxiaozhou@hesaitech.com | +| Lidar | Leishen | [Link](https://github.com/leishen-lidar) | - | - | +| Lidar | Livox | [Link](https://github.com/Livox-SDK/livox_ros2_driver) | MIT | dev@livoxtech.com | +| Lidar | Ouster | [Link](https://github.com/ros-drivers/ros2_ouster_drivers) | Apache 2 | stevenmacenski@gmail.com
tom@boxrobotics.ai | +| Radar | smartmicro | [Link](https://github.com/smartmicro/smartmicro_ros2_radars) | Apache 2 | opensource@smartmicro.de | +| Radar | Continental Engineering | [Link](https://github.com/tier4/ars408_driver) | Apache 2 | abraham.monrroy@tier4.jp
satoshi.tanaka@tier4.jp | +| Camera | Flir | [Link](https://github.com/berndpfrommer/flir_spinnaker_ros2) | Apache 2 | bernd.pfrommer@gmail.com | +| Camera | Lucid Vision | [Link](https://gitlab.com/leo-drive/Drivers/arena_camera) | - | kcolak@leodrive.ai | +| Camera | Allied Vision | [Link](https://github.com/neil-rti/avt_vimba_camera) | Apache 2 | at@email.com | +| Camera | Tier IV | [Link](https://github.com/tier4/tier4_automotive_hdr_camera) | GPL | - | +| Camera | Neousys Technology | [Link](https://github.com/ros-drivers/gscam) | BSD | jbo@jhu.edu | +| GNSS | NovAtel | [Link](https://github.com/swri-robotics/novatel_gps_driver/tree/dashing-devel) | BSD | preed@swri.org | +| GNSS | SBG Systems | [Link](https://github.com/SBG-Systems/sbg_ros2_driver) | MIT | support@sbg-systems.com | +| GNSS | PolyExplore | [Link](https://github.com/polyexplore/ROS2_Driver) | - | support@polyexplore.com | diff --git a/docs/reference-hw/images/ad_comp-adlink.png b/docs/reference-hw/images/ad_comp-adlink.png new file mode 100644 index 00000000000..baf19e6e926 Binary files /dev/null and b/docs/reference-hw/images/ad_comp-adlink.png differ diff --git a/docs/reference-hw/images/ad_comp-crystal_rugged.png b/docs/reference-hw/images/ad_comp-crystal_rugged.png new file mode 100644 index 00000000000..f7b59eb0b34 Binary files /dev/null and b/docs/reference-hw/images/ad_comp-crystal_rugged.png differ diff --git a/docs/reference-hw/images/ad_comp-neousys.png b/docs/reference-hw/images/ad_comp-neousys.png new file mode 100644 index 00000000000..904dab8fa54 Binary files /dev/null and b/docs/reference-hw/images/ad_comp-neousys.png differ diff --git a/docs/reference-hw/images/ad_comp-nxp.png b/docs/reference-hw/images/ad_comp-nxp.png new file mode 100644 index 00000000000..339ecac5e38 Binary files /dev/null and b/docs/reference-hw/images/ad_comp-nxp.png differ diff --git a/docs/reference-hw/images/ad_comp-vecow.png b/docs/reference-hw/images/ad_comp-vecow.png new file mode 100644 index 
00000000000..752dcfa5e0a Binary files /dev/null and b/docs/reference-hw/images/ad_comp-vecow.png differ diff --git a/docs/reference-hw/images/ad_kit-leodrive.png b/docs/reference-hw/images/ad_kit-leodrive.png new file mode 100644 index 00000000000..3eacb8dd19b Binary files /dev/null and b/docs/reference-hw/images/ad_kit-leodrive.png differ diff --git a/docs/reference-hw/images/ad_kit-robosense.png b/docs/reference-hw/images/ad_kit-robosense.png new file mode 100644 index 00000000000..2c71e846a95 Binary files /dev/null and b/docs/reference-hw/images/ad_kit-robosense.png differ diff --git a/docs/reference-hw/images/ad_kit-tieriv.png b/docs/reference-hw/images/ad_kit-tieriv.png new file mode 100644 index 00000000000..6db7096feef Binary files /dev/null and b/docs/reference-hw/images/ad_kit-tieriv.png differ diff --git a/docs/reference-hw/images/camera-allied_vision.png b/docs/reference-hw/images/camera-allied_vision.png new file mode 100644 index 00000000000..f19ca97bf0c Binary files /dev/null and b/docs/reference-hw/images/camera-allied_vision.png differ diff --git a/docs/reference-hw/images/camera-flir.png b/docs/reference-hw/images/camera-flir.png new file mode 100644 index 00000000000..3997ab527b5 Binary files /dev/null and b/docs/reference-hw/images/camera-flir.png differ diff --git a/docs/reference-hw/images/camera-lucid_vision.png b/docs/reference-hw/images/camera-lucid_vision.png new file mode 100644 index 00000000000..dff993897fe Binary files /dev/null and b/docs/reference-hw/images/camera-lucid_vision.png differ diff --git a/docs/reference-hw/images/camera-neousys.png b/docs/reference-hw/images/camera-neousys.png new file mode 100644 index 00000000000..a136c456087 Binary files /dev/null and b/docs/reference-hw/images/camera-neousys.png differ diff --git a/docs/reference-hw/images/camera-tieriv.png b/docs/reference-hw/images/camera-tieriv.png new file mode 100644 index 00000000000..0f3d76e4fd1 Binary files /dev/null and 
b/docs/reference-hw/images/camera-tieriv.png differ diff --git a/docs/reference-hw/images/dbw-astuff.png b/docs/reference-hw/images/dbw-astuff.png new file mode 100644 index 00000000000..1d89683f036 Binary files /dev/null and b/docs/reference-hw/images/dbw-astuff.png differ diff --git a/docs/reference-hw/images/dbw-dataspeed.png b/docs/reference-hw/images/dbw-dataspeed.png new file mode 100644 index 00000000000..db84076df80 Binary files /dev/null and b/docs/reference-hw/images/dbw-dataspeed.png differ diff --git a/docs/reference-hw/images/dbw-schaffler.png b/docs/reference-hw/images/dbw-schaffler.png new file mode 100644 index 00000000000..33a3d6b02ee Binary files /dev/null and b/docs/reference-hw/images/dbw-schaffler.png differ diff --git a/docs/reference-hw/images/gnss-applanix.png b/docs/reference-hw/images/gnss-applanix.png new file mode 100644 index 00000000000..1b3a752e6a0 Binary files /dev/null and b/docs/reference-hw/images/gnss-applanix.png differ diff --git a/docs/reference-hw/images/gnss-fixposition.png b/docs/reference-hw/images/gnss-fixposition.png new file mode 100644 index 00000000000..bb666b56bb7 Binary files /dev/null and b/docs/reference-hw/images/gnss-fixposition.png differ diff --git a/docs/reference-hw/images/gnss-novatel.png b/docs/reference-hw/images/gnss-novatel.png new file mode 100644 index 00000000000..10f32ac0957 Binary files /dev/null and b/docs/reference-hw/images/gnss-novatel.png differ diff --git a/docs/reference-hw/images/gnss-polyexplore.png b/docs/reference-hw/images/gnss-polyexplore.png new file mode 100644 index 00000000000..4df8a517757 Binary files /dev/null and b/docs/reference-hw/images/gnss-polyexplore.png differ diff --git a/docs/reference-hw/images/gnss-sbg.png b/docs/reference-hw/images/gnss-sbg.png new file mode 100644 index 00000000000..0b63c6ffcd4 Binary files /dev/null and b/docs/reference-hw/images/gnss-sbg.png differ diff --git a/docs/reference-hw/images/gnss-xsens.png b/docs/reference-hw/images/gnss-xsens.png new 
file mode 100644 index 00000000000..b1558aae862 Binary files /dev/null and b/docs/reference-hw/images/gnss-xsens.png differ diff --git a/docs/reference-hw/images/lidar-hesai.png b/docs/reference-hw/images/lidar-hesai.png new file mode 100644 index 00000000000..eab21f30522 Binary files /dev/null and b/docs/reference-hw/images/lidar-hesai.png differ diff --git a/docs/reference-hw/images/lidar-leishen.png b/docs/reference-hw/images/lidar-leishen.png new file mode 100644 index 00000000000..c7ba35e089e Binary files /dev/null and b/docs/reference-hw/images/lidar-leishen.png differ diff --git a/docs/reference-hw/images/lidar-livox.png b/docs/reference-hw/images/lidar-livox.png new file mode 100644 index 00000000000..f0a8ee61d23 Binary files /dev/null and b/docs/reference-hw/images/lidar-livox.png differ diff --git a/docs/reference-hw/images/lidar-ouster.png b/docs/reference-hw/images/lidar-ouster.png new file mode 100644 index 00000000000..ab4d475e040 Binary files /dev/null and b/docs/reference-hw/images/lidar-ouster.png differ diff --git a/docs/reference-hw/images/lidar-robosense.png b/docs/reference-hw/images/lidar-robosense.png new file mode 100644 index 00000000000..89c2b9e140e Binary files /dev/null and b/docs/reference-hw/images/lidar-robosense.png differ diff --git a/docs/reference-hw/images/lidar-velodyne.png b/docs/reference-hw/images/lidar-velodyne.png new file mode 100644 index 00000000000..7e68a73af74 Binary files /dev/null and b/docs/reference-hw/images/lidar-velodyne.png differ diff --git a/docs/reference-hw/images/platform-autonomoustuff.png b/docs/reference-hw/images/platform-autonomoustuff.png new file mode 100644 index 00000000000..2bd0421967b Binary files /dev/null and b/docs/reference-hw/images/platform-autonomoustuff.png differ diff --git a/docs/reference-hw/images/platform-pix_moving.png b/docs/reference-hw/images/platform-pix_moving.png new file mode 100644 index 00000000000..d5f9affbf28 Binary files /dev/null and 
b/docs/reference-hw/images/platform-pix_moving.png differ diff --git a/docs/reference-hw/images/radar-aptiv.png b/docs/reference-hw/images/radar-aptiv.png new file mode 100644 index 00000000000..dc289eef8d8 Binary files /dev/null and b/docs/reference-hw/images/radar-aptiv.png differ diff --git a/docs/reference-hw/images/radar-continental.png b/docs/reference-hw/images/radar-continental.png new file mode 100644 index 00000000000..3dc6ce4475c Binary files /dev/null and b/docs/reference-hw/images/radar-continental.png differ diff --git a/docs/reference-hw/images/radar-smartmicro.png b/docs/reference-hw/images/radar-smartmicro.png new file mode 100644 index 00000000000..fc04a3c1a0c Binary files /dev/null and b/docs/reference-hw/images/radar-smartmicro.png differ diff --git a/docs/reference-hw/images/remote-fort.png b/docs/reference-hw/images/remote-fort.png new file mode 100644 index 00000000000..cf033d2efc1 Binary files /dev/null and b/docs/reference-hw/images/remote-fort.png differ diff --git a/docs/reference-hw/images/remote-logitech.png b/docs/reference-hw/images/remote-logitech.png new file mode 100644 index 00000000000..c67b9490173 Binary files /dev/null and b/docs/reference-hw/images/remote-logitech.png differ diff --git a/docs/reference-hw/images/thermal_camera-flir.png b/docs/reference-hw/images/thermal_camera-flir.png new file mode 100644 index 00000000000..3a5b9038eff Binary files /dev/null and b/docs/reference-hw/images/thermal_camera-flir.png differ diff --git a/docs/reference-hw/imu_ahrs_gnss_ins.md b/docs/reference-hw/imu_ahrs_gnss_ins.md index 082d6e93610..af5dd26ba0a 100644 --- a/docs/reference-hw/imu_ahrs_gnss_ins.md +++ b/docs/reference-hw/imu_ahrs_gnss_ins.md @@ -2,6 +2,8 @@ ## **NovAtel GNSS/INS Sensors** +![images/gnss-novatel.png](images/gnss-novatel.png) + NovAtel GNSS/INS sensors which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | INS Rate | Roll, Pitch, Yaw Acc. 
| GNSS | ROS 2 Driver&nbsp; | Autoware Tested (Y/N) | @@ -17,6 +19,8 @@ Link to company website: ## **XSens GNSS/INS & IMU Sensors** +![images/gnss-xsens.png](images/gnss-xsens.png) + XSens GNSS/INS sensors which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | INS/IMU Rate | Roll, Pitch, Yaw Acc. | GNSS | ROS 2 Driver&nbsp; | Autoware Tested (Y/N) | @@ -32,6 +36,8 @@ Link to company website: ## **SBG GNSS/INS & IMU Sensors** +![images/gnss-sbg.png](images/gnss-sbg.png) + SBG GNSS/INS sensors which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | INS/IMU Rate | Roll, Pitch, Yaw Acc. | GNSS | ROS 2 Driver&nbsp; | Autoware Tested (Y/N) | @@ -49,6 +55,8 @@ Link to company website: +![images/gnss-applanix.png](images/gnss-applanix.png) + SBG GNSS/INS sensors which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | INS/IMU Rate | Roll, Pitch, Yaw Acc. | GNSS | ROS 2 Driver&nbsp; | Autoware Tested (Y/N) | @@ -64,6 +72,8 @@ Link to company website: ## **PolyExplore GNSS/INS Sensors** +![images/gnss-polyexplore.png](images/gnss-polyexplore.png) + PolyExplore GNSS/INS sensors which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | INS/IMU Rate | Roll, Pitch, Yaw Acc. | GNSS | ROS 2 Driver&nbsp; | Autoware Tested (Y/N) | @@ -76,3 +86,17 @@ Link to ROS 2 driver: Link to company website: [https://www.polyexplore.com/](https://www.polyexplore.com/) + +## **Fixposition GNSS/INS Sensors** + +![images/gnss-fixposition.png](images/gnss-fixposition.png) + +| Supported Products List | INS/IMU Rate | Roll, Pitch, Yaw Acc. | GNSS | ROS 2 Driver&nbsp; | Autoware Tested (Y/N) | +| ----------------------- | ------------ | --------------------- | --------------- | ------------- | --------------------- | +| Vision-RTK 2 | 200Hz | - | 5 Hz
L1 / L2 | Y | - | + +Link to ROS 2 driver: +[https://github.com/fixposition/fixposition_driver](https://github.com/fixposition/fixposition_driver) + +Link to company website: +[https://www.fixposition.com/](https://www.fixposition.com/) diff --git a/docs/reference-hw/index.md b/docs/reference-hw/index.md index ead8e0a6188..46eecc3e8a9 100644 --- a/docs/reference-hw/index.md +++ b/docs/reference-hw/index.md @@ -1,6 +1,6 @@ # Reference HW Design -This document is created to describe and give additional information of the sensors and systems supported by Autoware.Auto software. +This document describes and gives additional information about the sensors and systems supported by Autoware.Universe software. All equipment listed in this document has available ROS 2 drivers and has been tested by one or more of the community members on field in autonomous vehicle and robotics applications. @@ -35,6 +35,8 @@ The documents consists of the sections listed below: - FLIR Machine Vision Cameras - Lucid Vision Cameras - Allied Vision Cameras + - Tier IV Cameras + - Neousys Technology Cameras - Thermal CAMERAs @@ -47,11 +49,11 @@ The documents consists of the sections listed below: - SBG GNSS/INS & IMU Sensors - Applanix GNSS/INS Sensors - PolyExplore GNSS/INS Sensors + - Fixposition GNSS/INS Sensors - Vehicle Drive By Wire Suppliers - - New Eagle DBW Solutions - Dataspeed DBW Solutions - AStuff Pacmod DBW Solutions - Schaeffler-Paravan Space Drive DBW Solutions @@ -61,7 +63,6 @@ The documents consists of the sections listed below: - PIX MOVING Autonomous Vehicle Solutions - Autonomoustuff AV Solutions - NAVYA AV Solutions - - ZING ROBOTICS AV Solutions - Remote Drive @@ -69,3 +70,9 @@ The documents consists of the sections listed below: - LOGITECH - Full Drivers List + +- AD Sensor Kit Suppliers + + - LEO Drive AD Sensor Kit + - TIER IV AD Kit + - RoboSense AD Sensor Kit diff --git a/docs/reference-hw/lidars.md b/docs/reference-hw/lidars.md index 
1e650bc0230..e0ef06be5f3 100644 --- a/docs/reference-hw/lidars.md +++ b/docs/reference-hw/lidars.md @@ -2,6 +2,8 @@ ## **Velodyne 3D LIDAR Sensors** +![lidar-velodyne.png](images/lidar-velodyne.png) + Velodyne Lidars which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | @@ -22,14 +24,18 @@ Link to company website: ## **RoboSense 3D LIDAR Sensors** +![images/lidar-robosense.png](images/lidar-robosense.png) + RoboSense Lidars which has ROS 2 driver and tested by one or more community members are listed below: -| Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | -| ----------------------- | ----- | -------------------- | ------------ | --------------------- | -| RS-Ruby | 250m | (+15°)/(-25°), (360) | Y | - | -| RS-Ruby-Lite | 230m | (+15°)/(-25°), (360) | Y | - | -| RS-LiDAR-32 | 200m | (+15°)/(-25°), (360) | Y | - | -| RS-LiDAR-16 | 150m | (+15°)/(-15), (360) | Y | - | +| Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | +| ----------------------- | ----- | ------------------------------------ | ------------ | --------------------- | +| M1 | 200m | 25°/120° | - | - | +| E1 | 30m | 90°/120° | - | - | +| Bpearl | 100m | 90°/360° | Y | Y | +| Ruby Plus | 250m | 40°/360° | Y | ? | +| Helios 32 | 150m | 70°/360°
31°/360°
26°/360° | Y | Y | +| Helios 16 | 150m | 30°/360° | Y | ? | Link to ROS 2 driver: [https://github.com/RoboSense-LiDAR/rslidar_sdk](https://github.com/RoboSense-LiDAR/rslidar_sdk) @@ -39,15 +45,22 @@ Link to company website: ## **HESAI 3D LIDAR Sensors** +![images/lidar-hesai.png](images/lidar-hesai.png) + Hesai Lidars which has ROS 2 driver and tested by one or more community members are listed below: -| Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | -| ----------------------- | ----- | ---------------------- | ------------ | --------------------- | -| Pandar 128 | 200m | (+15°)/(-25°), (360°) | Y | - | -| Pandar 64 | 200m | (+15°)/(-25°), (360°) | Y | Y | -| Pandar 40P | 200m | (+15°)/(-25°), (360°) | Y | Y | -| Pandar XT | 120m | (+15°)/(-16°), (360°) | Y | Y | -| Pandar QT | 20m | (-52.1°/+52.1°)/(360°) | Y | Y | +| Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | +| ----------------------- | ----- | ----------------------- | ------------ | --------------------- | +| Pandar 128 | 200m | (+15°)/(-25°), (360°) | Y | - | +| Pandar 64 | 200m | (+15°)/(-25°), (360°) | Y | Y | +| Pandar 40P | 200m | (+15°)/(-25°), (360°) | Y | Y | +| QT 128 | 50m | (-52.6°/+52.6°), (360°) | Y | Y | +| QT 64 | 20m | (-52.1°/+52.1°), (360°) | Y | Y | +| AT128 | 200m | (25.4°), (120°) | Y | Y | +| XT32 | 120m | (-16°/+15°), (360°) | Y | Y | +| XT16 | 120m | (-15°/+15°), (360°) | Y | - | +| FT120 | 100m | (75°), (100°) | - | - | +| ET25 | 250m | (25°), (120°) | - | - | Link to ROS 2 drivers: [https://github.com/tier4/nebula](https://github.com/tier4/nebula) @@ -58,6 +71,8 @@ Link to company website: ## **Leishen 3D LIDAR Sensors** +![images/lidar-leishen.png](images/lidar-leishen.png) + Leishen Lidars which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | @@ -66,6 +81,7 @@ Leishen Lidars 
which has ROS 2 driver and tested by one or more community member | LS C32  | 150m | (+15°/-15°), (360°) | Y | - | | CH 32 | 120m | (+3.7°/-6.7°),(120°) | Y | - | | CH 128 | 20m | (+14°/-17°)/(150°) | Y | - | +| C32W | 160m | (+15°/-55°), (360°) | Y | - | Link to ROS 2 driver: [https://github.com/leishen-lidar](https://github.com/leishen-lidar) @@ -75,14 +91,20 @@ Link to company website: ## **Livox 3D LIDAR Sensors** +![images/lidar-livox.png](images/lidar-livox.png) + Livox Lidars which has ROS 2 driver and tested by one or more community members are listed below: -| Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | -| ----------------------- | ----- | ----------------- | ------------ | --------------------- | -| Horizon | 260m | (81.7°), (25.1°) | Y | Y | -| Mid-70 | 90m | (70.4°), (77.2°) | Y | - | -| Avia | 190m | (70.4°), Circular | Y | - | -| HAP | 150m | (25°), (120°) | - | - | +| Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | +| ----------------------- | ----- | ------------------ | ------------ | --------------------- | +| Horizon | 260m | (81.7°), (25.1°) | Y | Y | +| Mid-40 | 260m | (38.4°), Circular | Y | - | +| Mid-70 | 90m | (70.4°), (77.2°) | Y | - | +| Mid-100 | 260m | (38.4°), (98.4°) | Y | - | +| Mid-360 | 70m | (+52°/-7°), (360°) | Y | - | +| Avia | 190m | (70.4°), Circular | Y | - | +| HAP | 150m | (25°), (120°) | - | - | +| Tele-15 | 320m | (16.2°), (14.5°) | - | - | Link to ROS 2 driver: [https://github.com/Livox-SDK/livox_ros2_driver](https://github.com/Livox-SDK/livox_ros2_driver) @@ -92,13 +114,16 @@ Link to company website: ## **Ouster 3D LIDAR Sensors** +![images/lidar-ouster.png](images/lidar-ouster.png) + Ouster Lidars which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | Range | FOV (V), (H) | ROS 2 Driver | Autoware Tested (Y/N) | | ----------------------- | ----- | --------------- | ------------ | 
--------------------- | -| OS0 | 50m | (90°), (360°) | Y | - | -| OS1 | 120m | (45°), (360°) | Y | - | -| OS2 | 240m | (22,5°), (360°) | Y | Y | +| OSDome | 45m | (180°), (360°) | Y | - | +| OS0 | 100m | (90°), (360°) | Y | - | +| OS1 | 200m | (45°), (360°) | Y | - | +| OS2 | 400m | (22.5°), (360°) | Y | Y | Link to ROS 2 driver: [https://github.com/ros-drivers/ros2_ouster_drivers](https://github.com/ros-drivers/ros2_ouster_drivers) diff --git a/docs/reference-hw/radars.md b/docs/reference-hw/radars.md index 4dceecde24b..6f5138d3eeb 100644 --- a/docs/reference-hw/radars.md +++ b/docs/reference-hw/radars.md @@ -2,12 +2,15 @@ ## **Smartmicro Automotive Radars** +![images/radar-smartmicro.png](images/radar-smartmicro.png) + Smartmicro Radars which has ROS 2 driver and tested by one or more community members are listed below: -| Supported Products List | Range | FOV (Azimuth), (Elevation) | ROS 2 Driver | Autoware Tested (Y/N) | -| ----------------------------------------- | --------------------------------------- | ---------------------------------------------------------------- | ------------ | --------------------- | -| Type 153 (Triple Mode Short, Medium Long) | S:0.2...19 m M:0.4...55 m L:0.8...120 m | Short: (130°), (15°) Medium: (130°), (15°)
Long: (100°),(15°) | Y | Y | -| Type 132 ,(Dual Mode ,Medium, Long) | M:0.5...64 m&nbsp; L:1...175 m | Medium: (100°), (15°) Long: (32°), (15°) | Y | Y | +| Supported Products List | Range | FOV (Azimuth), (Elevation) | ROS 2 Driver | Autoware Tested (Y/N) | +| ---------------------------------------------- | --------------------------------------------------------------------- | ------------------------------------------- | ------------ | --------------------- | +| DRVEGRD 152 (Dual Mode Medium, Long) | M: 0.33...66 m
L: 0.9…180 m
| (100°), (20°) | Y | - | +| DRVEGRD 169 (Ultra-Short, Short, Medium, Long) | US: 0.1…9.5 m
S: 0.2…19 m
M: 0.6...56 m
L: 1.3...130 m | US: (140°), (28°)
S/M/L: (130°), (15°) | Y | - | +| DRVEGRD 171 (Triple Mode Short, Medium Long) | S: 0.2...40 m
M: 0.5...100 m
L: 1.2...240 m | (100°), (20°) | Y | - | Link to ROS 2 driver: [https://github.com/smartmicro/smartmicro_ros2_radars](https://github.com/smartmicro/smartmicro_ros2_radars) @@ -17,6 +20,8 @@ Link to company website: ## **Aptiv Automotive Radars** +![images/radar-aptiv.png](images/radar-aptiv.png) + Aptiv Radars which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | Range | FOV (Azimuth), (Elevation) | ROS 2 Driver | Autoware Tested (Y/N) | @@ -29,11 +34,17 @@ Link to company website: ## **Continental Engineering Radars** +![images/radar-continental.png](images/radar-continental.png) + Continental Engineering Radars which has ROS 2 driver and tested by one or more community members are listed below: -| Supported Products List | Range | FOV (Azimuth), (Elevation) | ROS 2 Driver | Autoware Tested (Y/N) | -| ----------------------- | ----- | -------------------------- | ------------ | --------------------- | -| ARS430DI | 250m | (120), (18°) | - | - | +| Supported Products List | Range | FOV (Azimuth), (Elevation) | ROS 2 Driver | Autoware Tested (Y/N) | +| ----------------------- | ------------------------- | ------------------------------------------- | ------------ | --------------------- | +| ARS404 | Near: 70m
Far: 170m | Near: (90°), (18°)
Far: (18°), (18°) | - | - | +| ARS408 | Near: 20m
Far: 250m | Near: (120°), (20°)
Far: (18°), (14°) | - | - | + +Link to ROS 2 driver: +[https://github.com/tier4/ars408_driver](https://github.com/tier4/ars408_driver) Link to company website: [https://conti-engineering.com/components/ars430/](https://conti-engineering.com/components/ars430/) diff --git a/docs/reference-hw/remote_drive.md b/docs/reference-hw/remote_drive.md index 376a75dedad..a4f255e26be 100644 --- a/docs/reference-hw/remote_drive.md +++ b/docs/reference-hw/remote_drive.md @@ -2,6 +2,8 @@ ## **FORT ROBOTICS** +![images/remote-fort.png](images/remote-fort.png) + Fort Robotics remote control & E-stop devices which are used for autonomous driving and tested by one or more community members are listed below: | Supported Products | Op.Frequency | Controller | ROS 2 Support | Autoware Tested (Y/N) | @@ -13,6 +15,8 @@ Link to company website: ## **LOGITECH** +![images/remote-logitech.png](images/remote-logitech.png) + Logitech joysticks which are used for autonomous driving and tested by one or more community members are listed below: | Supported Products | Op.Frequency | Controller | ROS 2 Support | Autoware Tested (Y/N) | | diff --git a/docs/reference-hw/thermal_cameras.md b/docs/reference-hw/thermal_cameras.md index 9102aee893b..8f932d380ec 100644 --- a/docs/reference-hw/thermal_cameras.md +++ b/docs/reference-hw/thermal_cameras.md @@ -2,6 +2,8 @@ ## **FLIR Thermal Automotive Dev. 
Kit** +![images/thermal_camera-flir.png](images/thermal_camera-flir.png) + FLIR ADK Thermal Vision cameras which has ROS 2 driver and tested by one or more community members are listed below: | Supported Products List | MP | FPS | Interface | Spectral Band | FOV | ROS 2 Driver | Autoware Tested (Y/N) | diff --git a/docs/reference-hw/vehicle_drive_by_wire_suppliers.md b/docs/reference-hw/vehicle_drive_by_wire_suppliers.md index a515b9121bb..3b4ef812fea 100644 --- a/docs/reference-hw/vehicle_drive_by_wire_suppliers.md +++ b/docs/reference-hw/vehicle_drive_by_wire_suppliers.md @@ -1,34 +1,27 @@ # Vehicle Drive By Wire Suppliers -## **New Eagle DBW Solutions** - -New Eagle DBW Controllers which is used for autonomous driving and tested by one or more community members are listed below: - -| Supported Vehicles | Power | Remote Control | ROS 2 Support | Autoware Tested (Y/N) | -| ------------------------------------------------------------------------------------------------------- | ------------------------------------------------- | ------------------- | ------------- | --------------------- | -| Jeep Cherokee
Chrysler Pacifica
Toyota Prius
Chevy Bolt
Ford Transit
RAM 1500
Custom  | 500W Sine Inverter
2000 Watts
8 Channel PDS | Optional, Available | Y | Y | - -Link to company website: -[https://neweagle.net/autonomous-machines/](https://neweagle.net/autonomous-machines/) - ## **Dataspeed DBW Solutions** +![images/dbw-dataspeed.png](images/dbw-dataspeed.png) + Dataspeed DBW Controllers which is used for autonomous driving and tested by one or more community members are listed below: -| Supported Vehicles | Power | Remote Control | ROS 2 Support | Autoware Tested (Y/N) | -| ----------------------------------------------------------------------------------------------------------------------------- | -------------------------------- | ------------------- | ------------- | --------------------- | -| Lincoln MKZ, Nautilus
Ford Fusion, F150, Transit Connect, Ranger
Chrysler Pacifica
Jeep Cherokee
Polaris GEM, RZR | 12 Channel PDS,15 A Each at 12 V | Optional, Available | Y | - | +| Supported Vehicles | Power | Remote Control | ROS 2 Support | Autoware Tested (Y/N) | +| ------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------- | ------------------- | ------------- | --------------------- | +| Lincoln MKZ, Nautilus
Ford Fusion, F150, Transit Connect, Ranger
Chrysler Pacifica
Jeep Cherokee
Polaris GEM, RZR, Lincoln Aviator, Jeep Grand Cherokee | 12 Channel PDS,15 A Each at 12 V | Optional, Available | Y | - | Link to company website: [https://www.dataspeedinc.com/](https://www.dataspeedinc.com/) ## **AStuff Pacmod DBW Solutions** +![images/dbw-astuff.png](images/dbw-astuff.png) + Autonomous Stuff Pacmod DBW Controllers which is used for autonomous driving and tested by one or more community members are listed below: -| Supported Vehicles | Power | Remote Control | ROS 2 Support | Autoware Tested (Y/N) | -| ------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------------ | ------------------- | ------------- | --------------------- | -| Polaris GEM Series
Polaris eLXD MY 2016+
Polaris Ranger X900
International ProStar
Lexus RX-450h MY
Ford Ranger
Toyota Minivan | Power distribution panel | Optional, Available | Y | Y | +| Supported Vehicles | Power | Remote Control | ROS 2 Support | Autoware Tested (Y/N) | +| ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------------ | ------------------- | ------------- | --------------------- | +| Polaris GEM Series
Polaris eLXD MY 2016+
Polaris Ranger X900
International ProStar
Lexus RX-450h MY
Ford Ranger
Toyota Minivan
Ford Transit
Honda CR-V | Power distribution panel | Optional, Available | Y | Y | Link to company website: [https://autonomoustuff.com/platform/pacmod](https://autonomoustuff.com/platform/pacmod) @@ -37,6 +30,8 @@ Link to company website: ## **Schaeffler-Paravan Space Drive DBW Solutions** +![images/dbw-schaffler.png](images/dbw-schaffler.png) + Schaeffler-Paravan Space Drive DBW Controllers which is used for autonomous driving and tested by one or more community members are listed below: | Supported Vehicles | Power | Remote Control | ROS 2 Support | Autoware Tested (Y/N) | diff --git a/docs/reference-hw/vehicle_platform_suppliers.md b/docs/reference-hw/vehicle_platform_suppliers.md index e7880700265..7ba062da6c1 100644 --- a/docs/reference-hw/vehicle_platform_suppliers.md +++ b/docs/reference-hw/vehicle_platform_suppliers.md @@ -2,6 +2,8 @@ ## **PIX MOVING Autonomous Vehicle Solutions** +![images/platform-pix_moving.png](images/platform-pix_moving.png) + PIX Moving AV solutions which is used for autonomous development and tested by one or more community members are listed below: | Vehicle Types | Sensors Integrated | Autoware Installed | ROS 2 Support | Autoware Tested (Y/N) | @@ -17,6 +19,8 @@ Different sizes of platforms ## **Autonomoustuff AV Solutions** +![images/platform-autonomoustuff.png](images/platform-autonomoustuff.png) + Autonomoustuff platform solutions which is used for autonomous development and tested by one or more community members are listed below: | Vehicle Types | Sensors Integrated | Autoware Installed | ROS 2 Support | Autoware Tested (Y/N) | @@ -25,25 +29,3 @@ Autonomoustuff platform solutions which is used for autonomous development and t Link to company website: [https://autonomoustuff.com/platform](https://autonomoustuff.com/platform) - -## **NAVYA AV Solutions** - -NAVYA platform solutions which is used for autonomous development and tested by one or more community members are listed below: - -| Vehicle Types | Sensors Integrated | Autoware Installed 
| ROS 2 Support | Autoware Tested (Y/N) | -| ---------------------------------- | ------------------ | ------------------ | ------------- | --------------------- | -| Shuttle Bus, Taxi and Tow Tractors | Y | Y | - | - | - -Link to company website: -[https://navya.tech/en](https://navya.tech/en) - -## **ZING ROBOTICS AV Solutions** - -ZING Robotics platform solutions which is used for autonomous development and tested by one or more community members are listed below: - -| Vehicle Types | Sensors Integrated | Autoware Installed | ROS 2 Support | Autoware Tested (Y/N) | -| ---------------------------------------------------------------------- | ------------------ | ------------------ | ------------- | --------------------- | -| Purpose built electric autonomous vehicles for aviation, military etc. | Y | Y | - | - | - -Link to company website: -[https://www.zingrobotics.com/](https://www.zingrobotics.com/)