ROS bridge for CARLA simulator


This ROS package provides a simple bridge between ROS and the CARLA simulator.

[Image: rviz setup]

This version requires CARLA 0.9.10

Features

  • Provide Sensor Data (Lidar, Semantic lidar, Cameras (depth, segmentation, rgb, dvs), GNSS, Radar, IMU)
  • Provide Object Data (Transforms (via tf), Traffic light status, Visualization markers, Collision, Lane invasion)
  • Control AD Agents (Steer/Throttle/Brake)
  • Control CARLA (Support synchronous mode, Play/pause simulation, Set simulation parameters)

Additional Functionality

Besides the bridging functionality, many more features are provided in separate packages.

| Name | Description |
|------|-------------|
| Carla Ego Vehicle | Provides a generic way to spawn an ego vehicle and attach sensors to it. |
| Carla Manual Control | A ROS-based visualization and control tool for an ego vehicle (similar to carla_manual_control.py provided by CARLA). |
| Carla Infrastructure | Provides a generic way to spawn a set of infrastructure sensors defined in a config file. |
| Carla Waypoint Publisher | Provides routes and access to the CARLA waypoint API. |
| Carla ROS Scenario Runner | A ROS node that wraps the functionality of the CARLA scenario runner to execute scenarios. |
| Carla Ackermann Control | A controller to convert Ackermann commands to steer/throttle/brake. |
| Carla AD Agent | A basic AD agent that follows a route, avoids collisions with other vehicles and stops at red traffic lights. |
| Carla Walker Agent | A basic walker agent that follows a route. |
| RVIZ Carla Plugin | An RVIZ plugin to visualize/control CARLA. |
| RQT Carla Plugin | An RQT plugin to control CARLA. |

For a quick overview, after following the Setup section, please run the CARLA AD Demo. It provides a ready-to-use demonstrator of many of the features.

Setup

For Users

First add the apt repository:

sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 1AF1527DE64CB8D9
sudo add-apt-repository "deb [arch=amd64] http://dist.carla.org/carla $(lsb_release -sc) main"

Then simply install the ROS bridge:

sudo apt-get update
sudo apt-get install carla-ros-bridge

This will install carla-ros-bridge in /opt/carla-ros-bridge

For Developers

Create a catkin workspace and install the carla_ros_bridge package:

#setup folder structure
mkdir -p ~/carla-ros-bridge/catkin_ws/src
cd ~/carla-ros-bridge
git clone https://github.com/carla-simulator/ros-bridge.git
cd ros-bridge
git submodule update --init
cd ../catkin_ws/src
ln -s ../../ros-bridge
source /opt/ros/<kinetic or melodic or noetic>/setup.bash
cd ..

#install required ros-dependencies
rosdep update
rosdep install --from-paths src --ignore-src -r

#build
catkin_make

For more information about configuring a ROS environment, see http://wiki.ros.org/ROS/Tutorials/InstallingandConfiguringROSEnvironment

Start the ROS bridge

First run the simulator (see the CARLA documentation: http://carla.readthedocs.io/en/latest/)

# run carla in background
SDL_VIDEODRIVER=offscreen ./CarlaUE4.sh -opengl

Wait a few seconds, then set up the environment:

export PYTHONPATH=$PYTHONPATH:<path-to-carla>/PythonAPI/carla/dist/carla-<carla_version_and_arch>.egg

For Users:

source /opt/carla-ros-bridge/<kinetic or melodic or noetic>/setup.bash

For Developers:

source ~/carla-ros-bridge/catkin_ws/devel/setup.bash

Start the ros bridge (choose one option):

# Option 1: start the ros bridge
roslaunch carla_ros_bridge carla_ros_bridge.launch

# Option 2: start the ros bridge together with an example ego vehicle
roslaunch carla_ros_bridge carla_ros_bridge_with_example_ego_vehicle.launch

Configuration

Settings file

You can modify the ros bridge configuration by editing carla_ros_bridge/config/settings.yaml.

If the role name is within the list specified by the ROS parameter /carla/ego_vehicle/rolename, the client is interpreted as a controllable ego vehicle and all relevant ROS topics are created.

Launch file

Certain parameters can be set within the launch file carla_ros_bridge.launch.

Map

The bridge is able to load a CARLA map by setting the launch-file parameter town. Either specify an available CARLA town (e.g. 'Town01') or an OpenDRIVE file (with the ending '.xodr').

Mode

Default Mode

In default mode (synchronous_mode: false) data is published:

  • on every world.on_tick() callback
  • on every sensor.listen() callback

Synchronous Mode

CAUTION: In synchronous mode, only the ros-bridge is allowed to tick. Other CARLA clients must passively wait.

In synchronous mode (synchronous_mode: true), the bridge waits for all sensor data that is expected within the current frame. This might slow down the overall simulation but ensures reproducible results.

Additionally, you can set synchronous_mode_wait_for_vehicle_control_command to true to wait for a vehicle control command before executing the next tick.

Control Synchronous Mode

It is possible to control the simulation execution:

  • Pause/Play
  • Execute single step

The following topic allows controlling the stepping:

| Topic | Type |
|-------|------|
| /carla/control | carla_msgs.CarlaControl |

A CARLA Control rqt plugin is available to publish to the topic.
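
The stepping can also be driven from a small Python node. A minimal sketch, assuming the PLAY/PAUSE/STEP_ONCE command constants defined in carla_msgs/CarlaControl:

#!/usr/bin/env python
# Minimal sketch: pause the simulation, advance it by single steps, then resume.
# Assumes the carla_msgs package from this repository is on your ROS package path.
import rospy
from carla_msgs.msg import CarlaControl

rospy.init_node('carla_stepping_example')
pub = rospy.Publisher('/carla/control', CarlaControl, queue_size=10)
rospy.sleep(1.0)  # give the publisher time to connect

pub.publish(CarlaControl(command=CarlaControl.PAUSE))            # pause
for _ in range(10):
    pub.publish(CarlaControl(command=CarlaControl.STEP_ONCE))    # advance one frame
    rospy.sleep(0.1)
pub.publish(CarlaControl(command=CarlaControl.PLAY))             # resume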

Available ROS Topics

Ego Vehicle

Sensors

The ego vehicle sensors are provided via topics with prefix /carla/ego_vehicle/<sensor_topic>

Currently the following sensors are supported:

RGB camera

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/camera/rgb/<SENSOR ROLE NAME>/image_color | sensor_msgs.Image |
| /carla/<ROLE NAME>/camera/rgb/<SENSOR ROLE NAME>/camera_info | sensor_msgs.CameraInfo |

Depth camera

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/camera/depth/<SENSOR ROLE NAME>/image_depth | sensor_msgs.Image |
| /carla/<ROLE NAME>/camera/depth/<SENSOR ROLE NAME>/camera_info | sensor_msgs.CameraInfo |

Semantic segmentation camera

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/camera/semantic_segmentation/<SENSOR ROLE NAME>/image_segmentation | sensor_msgs.Image |
| /carla/<ROLE NAME>/camera/semantic_segmentation/<SENSOR ROLE NAME>/camera_info | sensor_msgs.CameraInfo |

DVS camera

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/camera/dvs/<SENSOR ROLE NAME>/events | sensor_msgs.PointCloud2 |
| /carla/<ROLE NAME>/camera/dvs/<SENSOR ROLE NAME>/image_events | sensor_msgs.Image |
| /carla/<ROLE NAME>/camera/dvs/<SENSOR ROLE NAME>/camera_info | sensor_msgs.CameraInfo |

Lidar

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/lidar/<SENSOR ROLE NAME>/point_cloud | sensor_msgs.PointCloud2 |

Semantic lidar

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/semantic_lidar/<SENSOR ROLE NAME>/point_cloud | sensor_msgs.PointCloud2 |

Radar

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/radar/<SENSOR ROLE NAME>/radar_points | sensor_msgs.PointCloud2 |

IMU

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/imu | sensor_msgs.Imu |

GNSS

| Topic | Type | Description |
|-------|------|-------------|
| /carla/<ROLE NAME>/gnss/<SENSOR ROLE NAME>/fix | sensor_msgs.NavSatFix | publish GNSS location |

Collision Sensor

| Topic | Type | Description |
|-------|------|-------------|
| /carla/<ROLE NAME>/collision | carla_msgs.CarlaCollisionEvent | publish collision events |

Lane Invasion Sensor

| Topic | Type | Description |
|-------|------|-------------|
| /carla/<ROLE NAME>/lane_invasion | carla_msgs.CarlaLaneInvasionEvent | publish events on lane invasion |
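
As an illustration of consuming this sensor data, the sketch below subscribes to an RGB camera and a lidar topic. The role name 'ego_vehicle' and the sensor role names 'front' and 'lidar1' are assumptions; adapt the topic names to the sensors you actually spawned.

#!/usr/bin/env python
# Minimal sketch: subscribe to one camera and one lidar topic of the ego vehicle.
# The sensor role names 'front' and 'lidar1' are examples only.
import rospy
from sensor_msgs.msg import Image, PointCloud2

def on_image(msg):
    rospy.loginfo("camera image: %dx%d, encoding=%s", msg.width, msg.height, msg.encoding)

def on_lidar(msg):
    rospy.loginfo("lidar cloud: %d bytes, frame=%s", len(msg.data), msg.header.frame_id)

rospy.init_node('carla_sensor_listener')
rospy.Subscriber('/carla/ego_vehicle/camera/rgb/front/image_color', Image, on_image)
rospy.Subscriber('/carla/ego_vehicle/lidar/lidar1/point_cloud', PointCloud2, on_lidar)
rospy.spin()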

Object Sensor

| Topic | Type | Description |
|-------|------|-------------|
| /carla/<ROLE NAME>/objects | derived_object_msgs.ObjectArray | all vehicles and walkers, except the ego vehicle |

Control

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/vehicle_control_cmd (subscriber) | carla_msgs.CarlaEgoVehicleControl |
| /carla/<ROLE NAME>/vehicle_control_cmd_manual (subscriber) | carla_msgs.CarlaEgoVehicleControl |
| /carla/<ROLE NAME>/vehicle_control_manual_override (subscriber) | std_msgs.Bool |
| /carla/<ROLE NAME>/vehicle_status | carla_msgs.CarlaEgoVehicleStatus |
| /carla/<ROLE NAME>/vehicle_info | carla_msgs.CarlaEgoVehicleInfo |

There are two modes to control the vehicle.

  1. Normal Mode (reading commands from /carla/<ROLE NAME>/vehicle_control_cmd)
  2. Manual Mode (reading commands from /carla/<ROLE NAME>/vehicle_control_cmd_manual)

This allows you to manually override vehicle control commands published by a software stack. You can toggle between the two modes by publishing to /carla/<ROLE NAME>/vehicle_control_manual_override.

carla_manual_control makes use of this feature.
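
A minimal sketch of this toggle, assuming the role name 'ego_vehicle': enable the manual override, then publish a full-brake command on the manual topic.

#!/usr/bin/env python
# Minimal sketch: enable manual override, then send a full-brake command on the
# manual control topic. Assumes an ego vehicle with role name 'ego_vehicle'.
import rospy
from std_msgs.msg import Bool
from carla_msgs.msg import CarlaEgoVehicleControl

rospy.init_node('manual_override_example')
override_pub = rospy.Publisher('/carla/ego_vehicle/vehicle_control_manual_override', Bool, queue_size=1)
cmd_pub = rospy.Publisher('/carla/ego_vehicle/vehicle_control_cmd_manual', CarlaEgoVehicleControl, queue_size=1)
rospy.sleep(1.0)  # give the publishers time to connect

override_pub.publish(Bool(data=True))                              # switch to manual mode
cmd_pub.publish(CarlaEgoVehicleControl(throttle=0.0, brake=1.0))   # full brake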

For testing purposes, you can steer the ego vehicle from the command line by publishing to the topic /carla/<ROLE NAME>/vehicle_control_cmd.

Examples for an ego vehicle with role_name 'ego_vehicle':

Max forward throttle:

 rostopic pub /carla/ego_vehicle/vehicle_control_cmd carla_msgs/CarlaEgoVehicleControl "{throttle: 1.0, steer: 0.0}" -r 10

Max forward throttle with max steering to the right:

 rostopic pub /carla/ego_vehicle/vehicle_control_cmd carla_msgs/CarlaEgoVehicleControl "{throttle: 1.0, steer: 1.0}" -r 10

The current status of the vehicle can be received via the topic /carla/<ROLE NAME>/vehicle_status. Static information about the vehicle can be received via /carla/<ROLE NAME>/vehicle_info.
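
A small Python sketch for reading the status (role name 'ego_vehicle' assumed; the velocity field of carla_msgs/CarlaEgoVehicleStatus is used as an example):

#!/usr/bin/env python
# Minimal sketch: log the ego vehicle's current velocity reported by the bridge.
import rospy
from carla_msgs.msg import CarlaEgoVehicleStatus

def on_status(msg):
    rospy.loginfo("current velocity: %.2f", msg.velocity)

rospy.init_node('vehicle_status_listener')
rospy.Subscriber('/carla/ego_vehicle/vehicle_status', CarlaEgoVehicleStatus, on_status)
rospy.spin()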

Additional way of controlling

| Topic | Type |
|-------|------|
| /carla/<ROLE NAME>/twist_cmd (subscriber) | geometry_msgs.Twist |

CAUTION: This control method does not respect the vehicle constraints. It allows movements impossible in the real world, like flying or rotating.

You can also control the vehicle by publishing linear and angular velocity as a Twist message.

Currently this method applies the complete linear vector, but only the yaw component of the angular vector.
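
A minimal sketch, assuming the role name 'ego_vehicle': drive forward with a constant yaw rate (the numeric values are illustrative only).

#!/usr/bin/env python
# Minimal sketch: command the ego vehicle via a Twist message (forward motion with
# a constant yaw rate). Note: this bypasses the vehicle constraints as warned above.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('twist_control_example')
pub = rospy.Publisher('/carla/ego_vehicle/twist_cmd', Twist, queue_size=1)
rate = rospy.Rate(10)

cmd = Twist()
cmd.linear.x = 5.0    # forward velocity (illustrative value)
cmd.angular.z = 0.2   # yaw rate (illustrative value)
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()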

Carla Ackermann Control

In certain cases, the Carla Control Command is not ideal to connect to an AD stack. Therefore a ROS-based node carla_ackermann_control is provided which reads AckermannDrive messages. You can find further documentation here.

Other Topics

Object information of other actors

| Topic | Type | Description |
|-------|------|-------------|
| /carla/objects | derived_object_msgs.ObjectArray | all vehicles and walkers |
| /carla/marker | visualization_msgs.Marker | visualization of vehicles and walkers |
| /carla/actor_list | carla_msgs.CarlaActorList | list of all carla actors |
| /carla/traffic_lights | carla_msgs.CarlaTrafficLightStatusList | list of all traffic lights with their status |
| /carla/traffic_lights_info | carla_msgs.CarlaTrafficLightInfoList | static information for all traffic lights (e.g. position) |
| /carla/weather_control | carla_msgs.CarlaWeatherParameters | set the CARLA weather parameters |
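
As an example of consuming this data, the sketch below subscribes to /carla/objects and logs how many objects are currently reported (it relies on the objects field of derived_object_msgs/ObjectArray):

#!/usr/bin/env python
# Minimal sketch: count the vehicles and walkers currently reported by the bridge.
import rospy
from derived_object_msgs.msg import ObjectArray

def on_objects(msg):
    rospy.loginfo("currently tracking %d objects", len(msg.objects))

rospy.init_node('object_listener')
rospy.Subscriber('/carla/objects', ObjectArray, on_objects)
rospy.spin()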

Status of CARLA

| Topic | Type | Description |
|-------|------|-------------|
| /carla/status | carla_msgs.CarlaStatus | |
| /carla/world_info | carla_msgs.CarlaWorldInfo | info about the CARLA world/level (e.g. OpenDRIVE map) |
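
A common pattern is to wait for the world info before starting other nodes. A minimal sketch, assuming the map_name field of carla_msgs/CarlaWorldInfo:

#!/usr/bin/env python
# Minimal sketch: block until the bridge publishes the world info, then print the map name.
import rospy
from carla_msgs.msg import CarlaWorldInfo

rospy.init_node('world_info_example')
info = rospy.wait_for_message('/carla/world_info', CarlaWorldInfo)
rospy.loginfo("connected to CARLA map: %s", info.map_name)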

Walker

| Topic | Type | Description |
|-------|------|-------------|
| /carla/walker/<ID>/walker_control_cmd (subscriber) | carla_msgs.CarlaWalkerControl | control a walker |
| /carla/walker/<ID>/odometry | nav_msgs.Odometry | odometry of walker |
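
A rough sketch of commanding a walker; the walker ID 42 is a placeholder, and the direction and speed fields are assumed from carla_msgs/CarlaWalkerControl:

#!/usr/bin/env python
# Minimal sketch: make a walker move along x at walking speed.
# The walker ID (42) is a placeholder; look up real IDs via /carla/actor_list.
import rospy
from carla_msgs.msg import CarlaWalkerControl

rospy.init_node('walker_control_example')
pub = rospy.Publisher('/carla/walker/42/walker_control_cmd', CarlaWalkerControl, queue_size=1)
rospy.sleep(1.0)

cmd = CarlaWalkerControl()
cmd.direction.x = 1.0   # direction vector
cmd.speed = 1.4         # target speed (assumed m/s)
pub.publish(cmd)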

Other Vehicles

| Topic | Type | Description |
|-------|------|-------------|
| /carla/vehicle/<ID>/odometry | nav_msgs.Odometry | odometry of vehicle |

TF

The tf data is published for all traffic participants and sensors.

TF for traffic participants

The child_frame_id corresponds to the CARLA actor id. If a role name is specified, an additional (static) transform with the role name as child_frame_id is published.

TF for sensors

Sensors publish their transform when a measurement is taken. The child_frame_id corresponds to the prefix of the sensor topics.
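
For illustration, a minimal sketch that looks up the ego vehicle pose in the 'map' frame via the standard tf listener ('ego_vehicle' is the default role name; adapt it to yours):

#!/usr/bin/env python
# Minimal sketch: look up the ego vehicle pose in the 'map' frame via tf.
import rospy
import tf

rospy.init_node('tf_lookup_example')
listener = tf.TransformListener()
rate = rospy.Rate(1)

while not rospy.is_shutdown():
    try:
        trans, rot = listener.lookupTransform('map', 'ego_vehicle', rospy.Time(0))
        rospy.loginfo("ego vehicle position: x=%.1f y=%.1f z=%.1f", *trans)
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        pass  # transform not available yet
    rate.sleep()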

Debug Marker

It is possible to draw markers in CARLA.

Caution: Markers might affect the data published by sensors.

The following marker types are supported in the 'map' frame:

  • Arrow (specified by two points)
  • Points
  • Cube
  • Line Strip

| Topic | Type | Description |
|-------|------|-------------|
| /carla/debug_marker (subscriber) | visualization_msgs.MarkerArray | draw markers in CARLA world |
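
A minimal sketch publishing a cube marker into the CARLA world (standard visualization_msgs types; the position values are arbitrary examples):

#!/usr/bin/env python
# Minimal sketch: draw a red cube at an arbitrary position in the CARLA world.
import rospy
from visualization_msgs.msg import Marker, MarkerArray

rospy.init_node('debug_marker_example')
pub = rospy.Publisher('/carla/debug_marker', MarkerArray, queue_size=1)
rospy.sleep(1.0)

marker = Marker()
marker.header.frame_id = 'map'
marker.type = Marker.CUBE
marker.pose.position.x = 10.0
marker.pose.position.y = 5.0
marker.pose.position.z = 1.0
marker.pose.orientation.w = 1.0
marker.scale.x = marker.scale.y = marker.scale.z = 1.0
marker.color.r = 1.0
marker.color.a = 1.0

pub.publish(MarkerArray(markers=[marker]))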

Troubleshooting

ImportError: No module named carla

The CARLA Python API is not on your PYTHONPATH. Please execute:

export PYTHONPATH=$PYTHONPATH:<path/to/carla/>/PythonAPI/carla/dist/<your_egg_file>

Please note that you have to put in the complete path to the egg file, including the egg file itself. Please use the one that is supported by your Python version. Depending on the type of CARLA (pre-built, or built from source), the egg files are typically located either directly in the PythonAPI folder or in PythonAPI/dist.

Check that the installation was successful by trying to import carla from Python:

python -c 'import carla;print("Success")'

You should see the Success message without any errors.
