-
Hey everyone, I am with a robotics lab at UCLA and we are interested in this product. We're looking for a robot on which we could perform ROS-based SLAM without struggling too much with installation and such. To be more specific, here is what I want to do:
After reading the docs and the issues in the discussion, the following questions remain:
Sorry for this long post, and thank you in advance for your help! Yskandar
-
Hi there,
First things first: don't trust that "shamlian" character.
Yes, I think for your use case, you'll need an SBC capable of running ROS 2, with USB ports to talk to the Create 3 and the RPLIDAR. A Raspberry Pi 4 will definitely work.
The Create 3 publishes ROS 2 messages natively, and irobot_create_msgs is a ROS 2 package, so if you follow any of the compute board setup instructions on the docs site, you should not need another package. I also just added one for a computer running Ubuntu 20.04; you can use those same directions if you've got another OS, you'll just need to use a VM. Here is the one for a Pi 4.
There is now an example in the create3_examples repo which includes code and mounting brackets to get this running on a Create 3 robot as described.
--Steve
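As a quick end-to-end check that the SBC can actually hear the robot, a minimal rclpy sketch like the following should print battery readings; it assumes the default (un-namespaced) topics, and the Create 3 publishes sensor_msgs/BatteryState on /battery_state:

    # Minimal sketch: subscribe to the Create 3's battery topic from the SBC.
    # Best-effort (sensor-data) QoS keeps the subscription compatible whether
    # the robot publishes this topic reliable or best-effort. Adjust the
    # topic name if your robot is namespaced.
    import rclpy
    from rclpy.node import Node
    from rclpy.qos import qos_profile_sensor_data
    from sensor_msgs.msg import BatteryState


    class BatteryEcho(Node):
        def __init__(self):
            super().__init__('battery_echo')
            self.create_subscription(BatteryState, '/battery_state',
                                     self.callback, qos_profile_sensor_data)

        def callback(self, msg):
            self.get_logger().info(f'Battery at {msg.percentage * 100:.0f}%')


    rclpy.init()
    rclpy.spin(BatteryEcho())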
-
Hi Steven, thank you for answering this quickly; I'll make sure not to trust him, don't worry. Looking forward to the step-by-step tutorial!
Yskandar
-
I’ve pulled down the update and am already using it :) I am using rmw_fastrtps_cpp.
I am getting a warning in RViz: no transform for base_footprint and laser_frame. Is that to be expected, or did I miss something?
… On May 16, 2022, at 7:49 PM, Steven Shamlian wrote:
@justinIRBT <https://github.com/justinIRBT> checked in an rviz config file in the branch, which I've independently verified to work. If you are using CycloneDDS, you might have problems with multi-interface on both your computer and your SBC (this happened to me). You can either set up XML profiles as recommended in our network config <https://iroboteducation.github.io/create3_docs/setup/xml-config/> page, or be lazy and use FastDDS. No judgement. ;-)
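If you're not sure which RMW each machine is actually using (defaults differ across ROS 2 releases, and RMW_IMPLEMENTATION may be set in one shell but not another), a quick hedged check — assuming rclpy's get_rmw_implementation_identifier helper, which recent distros provide — is:

    # Print the RMW this Python environment resolves to. Run on both the
    # desktop and the SBC; mismatched or multi-interface DDS setups often
    # show up as missing topics/transforms rather than hard errors.
    from rclpy.utilities import get_rmw_implementation_identifier

    print(get_rmw_implementation_identifier())  # e.g. 'rmw_fastrtps_cpp'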
-
All three scripts were running. I cleaned out my workspace and did a fresh git checkout. I am running colcon build now.
… On May 16, 2022, at 9:18 PM, Steven Shamlian wrote:
just
-
Hi Steve,
I attached a photo of the results I have so far. The orientation of the map looks wrong. I mounted the LIDAR with the scanning motor facing the back end of the Create 3, opposite from what you have in your photos. Do I need to modify the sensors_launch.py or slam_toolbox_launch.py file?
Thanks!
…On Mon, 2022-05-16 at 08:22 -0700, Steven Shamlian wrote:
> Does the data viewed with RViz2 appear correct?
Yes, and you should also be able to see an occupancy grid if you enable it, which might help you localize your brain better to the laser scan.
-
The map would make more sense to me once point cloud data is displayed in RViz2. Is the next step to get the PointCloud publisher running?
-
Is there going to be a mapping example? Also, I'm using the Gazebo simulator examples to get an idea of how to use the IR sensors for obstacle avoidance and add that to wall_follower and coverage. I would like to use both the LIDAR and IR sensors for obstacle avoidance, just in case there's something on the floor the LIDAR does not pick up.
Is there going to be a point cloud example?
…On Tue, 2022-05-17 at 05:46 -0700, Steven Shamlian wrote:
There's no point cloud in this example -- we're using a planar LIDAR
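If you do want points purely for visualization, one option outside this example is the laser_geometry package, which projects a planar LaserScan into a PointCloud2. A rough sketch, assuming the package's Python module is installed (ros-<distro>-laser-geometry) and the scan arrives on /scan:

    # Rough sketch (not part of create3_lidar): republish the planar LIDAR
    # scan as a PointCloud2 so RViz2 can render it as points.
    import rclpy
    from rclpy.node import Node
    from rclpy.qos import qos_profile_sensor_data
    from sensor_msgs.msg import LaserScan, PointCloud2
    from laser_geometry.laser_geometry import LaserProjection


    class ScanToCloud(Node):
        def __init__(self):
            super().__init__('scan_to_cloud')
            self.projector = LaserProjection()
            self.pub = self.create_publisher(PointCloud2, 'cloud', 10)
            self.create_subscription(LaserScan, 'scan', self.callback,
                                     qos_profile_sensor_data)

        def callback(self, scan):
            # projectLaser converts ranges/angles to XYZ points in the scan frame.
            self.pub.publish(self.projector.projectLaser(scan))


    rclpy.init()
    rclpy.spin(ScanToCloud())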
-
I wish the ROS 2 documentation was as good as ROS 1's.
What should I use here to rotate the LIDAR 180 degrees?

    return LaunchDescription([
        Node(package='tf2_ros',
             executable='static_transform_publisher',
             arguments=['-0.012', '0', '0.144', '0', '0', '0',
                        'base_footprint', 'laser_frame']),

Can you send me a link regarding the ROS 2 static_transform_publisher arguments? I found examples, but nothing describing all the parameters that can be passed.
…On Tue, 2022-05-17 at 05:49 -0700, Steven Shamlian wrote:
Yes, as I mentioned in passing, the tf transform is set up in sensors_launch.py. If you've got the sensor in a different geometric arrangement relative to the motion of the robot than the SLAM solver is trying to solve for, it's going to have a very hard time. If you've rotated the LIDAR by 180 degrees, it's going to have an impossible time. Either use our arrangement (center of LIDAR is 12 mm behind the center of the robot, mounted as shown) or modify the parameters in the static transform publisher launched from sensors_launch.py.
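A hedged reference on those positional arguments, since this trips people up: in Foxy/Galactic the usage is x y z yaw pitch roll frame_id child_frame_id (meters and radians), though some tools and docs order the rotation as roll pitch yaw, which is a common source of exactly this confusion; running static_transform_publisher with no arguments prints the usage string for your installed version. A sketch of a 180-degree in-plane rotation, keeping the example's offsets:

    # Hedged sketch (verify the argument order against your distro's usage
    # string): rotate laser_frame 180 degrees about Z while keeping the
    # example's -12 mm X offset and 144 mm height.
    import math

    from launch import LaunchDescription
    from launch_ros.actions import Node


    def generate_launch_description():
        return LaunchDescription([
            Node(package='tf2_ros',
                 executable='static_transform_publisher',
                 arguments=['-0.012', '0', '0.144',   # x y z (meters)
                            str(math.pi), '0', '0',   # yaw pitch roll (radians)
                            'base_footprint', 'laser_frame']),
        ])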
-
Would this work? Do I need to change the offset as well?

#!/usr/bin/env python3
# Copyright 2022 iRobot Corporation. All Rights Reserved.

from launch import LaunchContext
from launch import LaunchDescription
from launch_ros.actions import Node
from ament_index_python.packages import get_package_share_directory
import math


def generate_launch_description():
    return LaunchDescription([
        Node(package='tf2_ros',
             executable='static_transform_publisher',
             arguments=['-0.012', '0', '0.144', '3.14159', '0', '0',
                        'base_footprint', 'laser_frame']),
        Node(
            package='rplidar_ros',
            parameters=[
                get_package_share_directory('create3_lidar') +
                '/config/rplidar_node.yaml'
            ],
            executable='rplidar_composition'
        ),
    ])
-
OK, that makes sense. I'll mount the LIDAR with the motor facing the front.
…On Tue, 2022-05-17 at 12:43 -0700, Steven Shamlian wrote:
I would highly suggest you go through the official tf2 tutorials (https://wiki.ros.org/tf2/Tutorials). What you have done is rotate the laser on the X axis, which I don't think is what you meant to do (unless you mount the laser upside-down). You will need to make the tf tree accurate to wherever it is you've stuck your laser rangefinder. The values supplied are for the location that we have shown it mounted in the example, which is at -12 mm on the X axis (origin in the center of rotation), 144 mm above the ground, with no rotation, as defined by SLAMTEC's documentation (https://github.com/robopeak/rplidar_ros/wiki).
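One way to sanity-check whatever transform ends up on /tf_static is to look it up and print it. A minimal sketch, assuming the Galactic-era tf2_ros Python API:

    # Minimal sketch: print the base_footprint -> laser_frame transform
    # actually being published, to confirm it matches the LIDAR mounting.
    import rclpy
    from rclpy.node import Node
    from rclpy.time import Time
    from tf2_ros import TransformException
    from tf2_ros.buffer import Buffer
    from tf2_ros.transform_listener import TransformListener


    class TfCheck(Node):
        def __init__(self):
            super().__init__('tf_check')
            self.buffer = Buffer()
            self.listener = TransformListener(self.buffer, self)
            self.create_timer(1.0, self.check)

        def check(self):
            try:
                # Time() == zero means "latest available".
                t = self.buffer.lookup_transform('base_footprint',
                                                 'laser_frame', Time())
            except TransformException as ex:
                self.get_logger().warn(str(ex))
                return
            tr, q = t.transform.translation, t.transform.rotation
            self.get_logger().info(
                f'xyz=({tr.x:.3f}, {tr.y:.3f}, {tr.z:.3f})  '
                f'quat=({q.x:.3f}, {q.y:.3f}, {q.z:.3f}, {q.w:.3f})')


    rclpy.init()
    rclpy.spin(TfCheck())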
-
Success!
First pass. The RPi 4 crashed, but the ext3 journal saved the day. It looks like it will take several passes to get a solid map. This looks accurate.