
Collision safety node #1899

@SteveMacenski

Description


While this is an area I've typically not considered to belong to navigation, it has come up so many times that maybe it's worth just including. The goal is to make a node that takes in raw sensor data from a lidar (2D, 3D, or I suppose a depth camera, anything that gives ranges in the direction of forward motion) and either stops or slows the robot to avoid collisions.

We've added heartbeat / server-alive and lifecycle transition support, and this falls into the same category of safety awareness / promises that we can round out (e.g. we now know within a timeout that everything is running properly and activated; this would additionally ensure no collisions using raw sensor data, which nicely rounds off the minimum needs for a safe robot regardless of the planning and control algorithms or localization quality).

In most industrial situations, you might use something like a SICK lidar that has this capability built in and is safety rated. For smaller robots falling under less strict functional safety requirements and budgets, a laser like that may be out of reach, so they need a solution based on the ROS sensors they have available. This is not a substitute for safety certification, but it gives comparable, non-certified capability as long as the sensor and ROS driver are publishing data.

There are 2 major options (probably do both and give the user the option of which to use):

  • Define a bounding box and a point threshold: if > N points appear in this box in front of the robot, stop it. Usually there would also be a larger bounding box in front of the stop box to warn of an impending collision and reduce the maximum speeds (see the sketch after this list).
  • Look at the sensor data and take the closest few points. With the current speed, estimate the time to collision to those points. If the time is < N seconds (1, 5, 10, etc.), slow the robot until it's = N seconds. The effect here would be to keep the robot always N seconds from a collision and continuously scale down the speeds.
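A minimal sketch of the zone-checking idea from the first option, assuming the sensor points have already been transformed into the robot's base_link frame. The `Point2D`/`Box`/`SpeedAction` types, names, and parameters here are illustrative placeholders, not an existing Nav2 interface:

```cpp
#include <vector>

struct Point2D { double x; double y; };  // meters, in the base_link frame

struct Box {
  double min_x, max_x;  // forward extent in front of the robot
  double min_y, max_y;  // lateral extent
  bool contains(const Point2D & p) const {
    return p.x >= min_x && p.x <= max_x && p.y >= min_y && p.y <= max_y;
  }
};

enum class SpeedAction { None, SlowDown, Stop };

// Count points falling inside the stop box and the larger slowdown box,
// then decide whether to stop, slow down, or do nothing.
SpeedAction checkZones(
  const std::vector<Point2D> & points,
  const Box & stop_box, const Box & slow_box, int threshold)
{
  int stop_hits = 0, slow_hits = 0;
  for (const auto & p : points) {
    if (stop_box.contains(p)) {stop_hits++;}
    if (slow_box.contains(p)) {slow_hits++;}
  }
  if (stop_hits > threshold) {return SpeedAction::Stop;}
  if (slow_hits > threshold) {return SpeedAction::SlowDown;}
  return SpeedAction::None;
}
```

Requiring > N points rather than a single return is the simplest way to filter out one-off sensor noise; whether the threshold should instead be normalized by range or point density is an open design question.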

This can be trivially made in about a day:

  • Transform data from source frame to base_link
  • Take in user-defined box sizes
  • Check whether data falls inside the boxes -> react
  • Take the current odometry speed and do basic kinematics to estimate the time to collision. If < N, limit the speed (see the sketch below).
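And a minimal sketch of the time-to-collision step, assuming straight-line motion at the current odometry speed toward the closest point ahead (ignoring path curvature and footprint shape); the function and parameter names are hypothetical:

```cpp
// Limit the commanded forward speed so the robot stays at least
// `horizon_s` seconds away from the closest observed point.
double limitSpeed(
  double cmd_speed,       // requested forward speed (m/s)
  double closest_range,   // distance to the closest point ahead (m)
  double horizon_s)       // minimum allowed time to collision (s), i.e. N = 1, 5, 10, ...
{
  if (cmd_speed <= 0.0) {
    return cmd_speed;  // not moving forward, nothing to limit
  }
  const double time_to_collision = closest_range / cmd_speed;
  if (time_to_collision >= horizon_s) {
    return cmd_speed;  // already at least N seconds from a collision
  }
  // Scale down so that closest_range / new_speed == horizon_s
  return closest_range / horizon_s;
}
```

Applied continuously, this scales the commanded speed so the projected time to collision never drops below N seconds, which is the "always N seconds from a collision" behavior described above.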

Useful both for autonomous navigation and as an important element of an assisted teleop system (does this replace the assisted part?).
