The problem: let's say you have a diff-steer telepresence robot, equipped with a forward-looking camera and a pretty nice 16-channel, 360-degree LiDAR scanner. The robot can be remotely driven via a joystick or a keyboard interface. Imagery from the camera is streamed to a human operator at a fairly low frame rate, say 3 Hz, and sometimes with a latency of one second or more. It can be challenging to operate a robot safely under these conditions. How do we help our users avoid crashing the robot?
At my day job, we had this problem, and one of my colleagues solved it by implementing a 3D proximity detector that rejects all command velocity (`cmd_vel`) messages when something above the ground plane is near. It works great. However, for kicks, I implemented a 2D command velocity filter based on `costmap_2d`, since it's already running on the robot. The result is the package that you see here, `cmd_vel_filter`.
It runs a ROS node that, given a `nav_msgs/OccupancyGrid` and a `geometry_msgs/Twist` message, checks whether the resulting circular trajectory intersects any obstacle in the costmap. We drop (i.e., filter) Twist messages that would result in a collision within a configurable time threshold.
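To make the check concrete, here is a minimal sketch of that projection (not the package's actual implementation): it Euler-integrates the unicycle model for the commanded (v, ω), looks up each sampled pose in the grid, and flags a collision if any cell exceeds a cost threshold. The function name, the cost threshold of 50, and the assumption that the robot starts at the costmap frame's origin are all illustrative.

```python
import math

def arc_collides(v, w, lookahead_time, grid, width, height,
                 resolution, origin_x, origin_y,
                 cost_threshold=50, dt=0.05):
    """Return True if the constant-twist arc (v, w) reaches an occupied cell
    within lookahead_time seconds. The robot is assumed to start at the
    origin of the costmap frame; the real node would use tf for this."""
    x = y = theta = 0.0
    t = 0.0
    while t < lookahead_time:
        # One Euler step of the unicycle model along the circular arc.
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += w * dt
        # Map the sampled position to a grid cell (row-major occupancy data).
        col = int((x - origin_x) / resolution)
        row = int((y - origin_y) / resolution)
        if 0 <= col < width and 0 <= row < height:
            if grid[row * width + col] >= cost_threshold:
                return True  # this Twist would reach an obstacle in time
        t += dt
    return False

# Example: 0.5 m/s forward with a gentle left turn on an empty 5 m x 5 m map.
empty = [0] * (100 * 100)
print(arc_collides(0.5, 0.2, 1.0, empty, 100, 100, 0.05, -2.5, -2.5))  # False
```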
This package was designed with a diff-steer robot in mind. In particular, it's tested against a simulated Husky from Clearpath Robotics.
## Pros
- Prevents a user from running headlong into a wall.
## Cons
- Doesn't take the robot footprint into account, so it's possible to graze an obstacle and take some paint off the robot.
- Doesn't work very well if the robot is not moving along a circular trajectory, which is very possible in the real world.
The node assumes that a costmap in the form of a `nav_msgs/OccupancyGrid` is available, along with a source of `geometry_msgs/Twist` messages. For the sake of demonstration, the easiest way to get both is to run a simulation of Clearpath Robotics' Husky: Husky Simulation Tutorial
To run the code in a Gazebo simulation:

```
roslaunch cmd_vel_filter demo.launch --screen
```
Then, you can use the Robot Steering plugin in `rqt_gui` to send velocity commands to the input of the filter (see Subscriptions, below).
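If rqt isn't handy, a throwaway publisher like the sketch below will also exercise the filter. The input topic comes from the Subscriptions list; the node name and speed are arbitrary.

```python
#!/usr/bin/env python
# Publish a constant forward Twist to the filter's input topic at 10 Hz.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node("cmd_vel_filter_test_driver")
pub = rospy.Publisher("/cmd_vel_filter/in", Twist, queue_size=1)

cmd = Twist()
cmd.linear.x = 0.5   # m/s forward
cmd.angular.z = 0.0  # drive straight; the filter should drop this near a wall

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()
```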
## Parameters

- `lookahead_time` (default: 1 s): The current trajectory is estimated forward in time, out to `lookahead_time` seconds.
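The parameter can be set in the usual ROS ways; the sketch below uses the rospy parameter API. The `/cmd_vel_filter/lookahead_time` key assumes the node is named `cmd_vel_filter` and reads this as a private parameter, which may not match your launch setup.

```python
# Hypothetical: raise the lookahead to 2 s before (re)starting the node.
import rospy

rospy.init_node("set_lookahead", anonymous=True)
rospy.set_param("/cmd_vel_filter/lookahead_time", 2.0)
print(rospy.get_param("/cmd_vel_filter/lookahead_time"))
```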
## Subscriptions

- `/cmd_vel_filter/in` [`geometry_msgs/Twist`]: Filter input.
- `/tf` [`tf/tfMessage`], `/tf_static` [`tf2_msgs/TFMessage`]: The filter needs to know the transform between the `map` and `base_link` frames.
- `/move_base/local_costmap/costmap` [`nav_msgs/OccupancyGrid`]: A working `local_costmap` is required.
## Publications

- `/cmd_vel` [`geometry_msgs/Twist`]: Filtered output.
- `/cmd_vel_filter/path` [`nav_msgs/Path`]: The projected trajectory, for visualization purposes.
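For a quick sanity check without RViz, a small monitor like this sketch prints what comes out of the filter and how many poses the projected path contains. The topics are the ones listed above; the node name is arbitrary.

```python
#!/usr/bin/env python
# Print each filtered Twist and the size of the projected path.
import rospy
from geometry_msgs.msg import Twist
from nav_msgs.msg import Path

def on_cmd(msg):
    rospy.loginfo("filtered cmd_vel: %.2f m/s, %.2f rad/s",
                  msg.linear.x, msg.angular.z)

def on_path(msg):
    rospy.loginfo("projected path has %d poses", len(msg.poses))

rospy.init_node("cmd_vel_filter_monitor")
rospy.Subscriber("/cmd_vel", Twist, on_cmd)
rospy.Subscriber("/cmd_vel_filter/path", Path, on_path)
rospy.spin()
```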