
Add example launch & config for ease of use #18

Merged (8 commits) · Jan 12, 2022

Conversation

@DoanNguyenTrong (Contributor)

It took me a while to figure this out, so I just want to share it with everybody.

@peci1 (Owner) commented Jan 4, 2022

Thank you for your contribution 👍. Adding an example launch file that would run just robot_body_filter via sensor_filters is definitely a good idea.

However, it is not (yet) suitable for merging. There are several issues I see:

  • The README updates do not seem very informative; I'd rather add the note about the need for sensor_filters inside the example launch file, as there is no direct dependency between sensor_filters and robot_body_filter. You can write your own filter runner package, completely different from sensor_filters.
  • The filter config has >75% commented lines. Please clean it up and update the header of the file to reflect which robot you run it on and with which sensor (e.g. the config says VLP-16, but you remap the points topic from /rslidar..., which would point to RoboSense). Please also try to express what is interesting about this filter config (see the sketch after this list).
  • The launch file should be moved to the examples folder, as well as the filter config file.
  • The launch file contains a lot of unnecessary and distracting arguments. I would suggest removing most of them and keeping them at their defaults (both their definitions and their use in the <include>). Keep description_name. This is an example file for the public, so please do not try to use it directly (instead, keep your own copy/modification of the launch file in your own package). Files from the examples folder are not installed by CMake, so they are not available to those who use the binary distribution of this package.
  • I suggest using $(dirname) in the launch file to reference the config file instead of $(find robot_body_filter).
  • Do not start topic names with slashes; it reduces the reusability of the code.
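
For illustration, a minimal sketch of what a cleaned-up config header could look like (the robot, sensor, and frame names here are hypothetical placeholders, not taken from this PR):

```yaml
# Example robot_body_filter configuration for a mobile robot carrying a
# Velodyne VLP-16 rotating lidar publishing PointCloud2 on the "points"
# topic (a relative name on purpose, so it can be remapped freely).
# Interesting bits: the filter runs entirely in the lidar frame and uses
# odom as the fixed frame (see the discussion below on why not map).
frames/sensor: lidar_frame
frames/filtering: lidar_frame
frames/output: lidar_frame
frames/fixed: odom
```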

@peci1 added the labels enhancement (New feature or request) and documentation (Documentation and usage examples) on Jan 4, 2022
@DoanNguyenTrong (Contributor, Author)

Hi @peci1,
Sorry for my late response while you are extremely responsive.
I sadly haven't had enough time to dive deep into the code and contribute more to this amazing project.
All of the points that you mentioned are absolutely valid, and I've fixed them in the last commit.
Sorry for the bad content at the beginning.

Btw, if I want to remove points that are not in the URDF of my robot but are above the robot footprint, can I do it, and how (if possible)?
Thanks

```yaml
frames/sensor: lidar_frame
frames/filtering: lidar_frame
frames/output: lidar_frame
frames/fixed: odom # This example is for an outdoor, non-mapping application. If your robot navigates in a map, change this to the fixed frame (e.g., "map").
```
@peci1 (Owner)

Map should never go here. That would create a cycle if you build your map using laser scans: you would need the map-aligned scan to filter the same scan, but it cannot be aligned until it is filtered. odom usually doesn't depend on laser scans, so it is okay to have it here.

Moreover, as explained in https://github.com/peci1/robot_body_filter#tf-frames , you don't even need a fixed frame for point_by_point: false scans.

However, depending on how fast your robot moves, your performance constraints and the quality of your odometry, I would think about using point_by_point: true for Velodyne, as the scan duration is not negligible.
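
In config form, the suggestion above would look something like this (a sketch only; the frames/fixed and sensor/point_by_point key names follow the robot_body_filter README, and whether the extra cost of per-point transforms is worth it depends on your robot):

```yaml
frames/fixed: odom           # odom does not depend on the laser scans, so no cycle
sensor/point_by_point: true  # rotating lidar: transform each point at its own time
```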

@DoanNguyenTrong (Contributor, Author)

Thank you so much for your explanation. From my understanding, the fixed frame is the global one used for reference. In my case (mobile, outdoor, no map), odom is the root of the TF tree, so I picked it. And as I don't have a map, I probably cannot use it anyway.

I am a little bit confused that you said "Map should never go here". If a mobile robot navigates in a created map, the map frame is the (only) fixed one. If you don't use it, which one will you use?

@peci1 (Owner)

There is probably some misunderstanding. Of course it would be best to set world or some really high-level frame as the fixed frame. However, as this filter usually acts as a preprocessing step for pointclouds entering SLAM, it can only operate on frames "before SLAM", because SLAM depends on the filter's output to generate the map frame. Therefore, even if you have a SLAM-localized robot, robot_body_filter must not set map as frames/fixed, otherwise the filter won't work at all (unless you have some SLAM that does not depend on the pointcloud, but that is usually not the case). So even for a SLAM-localized robot, odom is usually the "closest to fixed" frame available and has to be used as such.

The choice of fixed frame also defines the lowest latency the filter is able to achieve. The filter can finish only when all points from the scan can be transformed to the fixed frame. So if you choose a slow or high-latency fixed frame (e.g. one created by some non-pointcloud SLAM), the pointclouds going through the filter would always have to wait for a SLAM alignment newer than the last point in the cloud (because TF doesn't allow extrapolation into the future). This is another reason why it is wise to set the fixed frame to something faster, which odom usually is (the robots I've seen all had odometry >20 Hz with very low delay, while SLAM ran at most at 10 Hz with latencies of hundreds of milliseconds).

Another "workaround" is pretending the scan was captured in a single time instant (as your config does by setting point_by_point: false). If it were a depth camera pointcloud, point_by_point: false would be the right choice, because the camera actually captures all points at once. However, you configure a rotating lidar, which captures each point at a different time instant; if you set point_by_point: false, all points in the scan use the sensor->fixed frame transform that is formally valid just for the first point in the scan. And as there is only a single time instant in play (the stamp of the first point), the fixed frame loses its meaning (a fixed frame is only helpful if you need to transfer some transform to a different time). As said earlier, if the robot moves slowly (especially rotates slowly), this might be okay, but generally you would want to transform each point with a transform that is valid or very close to valid. That is what point_by_point: true does.

@DoanNguyenTrong (Contributor, Author)

Thanks a lot for this detailed explanation. I now see why you said not to use map as the fixed frame.

Just to share my use case with you: I use GPS as the main source of information for odom (and robot localization). The setup is in RTK mode, so the accuracy is very high (mm-level error), which means the odom frame is relatively "equal" to world. The only drawback is that the frequency is not so high (~10 Hz). I use robot_body_filter to remove points that hit the robot's body. While the robot is moving, the local and global costmaps pick up those points and consider them obstacles. Although it happens only occasionally, that's not good for path planning and control (especially in backward motion).

@peci1 (Owner) commented Jan 9, 2022

> Sorry for my late response while you are extremely responsive.

No hurry, it was just a coincidence ;)

> Sorry for the bad content at the beginning.

Don't be sorry for it; you did a great job addressing my comments!

> Btw, if I want to remove points that are not in the URDF of my robot but are above the robot footprint, can I do it, and how (if possible)?

Take a look at https://wiki.ros.org/pcl_ros/Tutorials/filters#PassThrough . But this filter can only cut by one or two planes, which might not be enough. There is also the undocumented CropBox filter, which would probably be more suitable, as it cuts out a box. But beware that it has problems with organized point clouds in Melodic. Unfortunately, all of these pcl_ros filters are written not against the filters API but as nodelets, so you can't directly plug a pcl_ros filter alongside robot_body_filter in the YAML config file. I'm really thinking about sending a pull request to pcl_ros to fix this, but the repo seems mostly unmaintained, so it's not sure how fast (if ever) it would get merged.
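
For context, this is roughly what a filters-API chain config consumed by a filter runner like sensor_filters looks like (a sketch: the cloud_filter_chain parameter name is an assumption here, and the filter type name is the one exported by robot_body_filter). The point is that only filters-API plugins can be listed in such a chain, which is why a pcl_ros filter has to run separately:

```yaml
cloud_filter_chain:
  - name: body_filter
    type: robot_body_filter/RobotBodyFilterPointCloud2
    params:
      frames/fixed: odom
      sensor/point_by_point: true
  # A pcl_ros CropBox step cannot be appended here: it is a nodelet,
  # not a filters plugin, so it must run as a separate node in the graph.
```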


Please think through the frames/fixed and point_by_point settings as mentioned in the previous comment, and then I think this PR will be ready for merging.

Review thread on examples/vlp16_params.yaml (outdated, resolved)
@peci1 merged commit c72a42c into peci1:master on Jan 12, 2022
@peci1 (Owner) commented Jan 12, 2022

Thank you for the contribution and patience!

@DoanNguyenTrong (Contributor, Author)

I learned a lot from you. Thanks, and have joy with robots.
