Add example launch & config for ease of use #18
Conversation
Add dependencies
minor
Thank you for your contribution 👍. Adding an example launch file that would run just robot_body_filter via sensor_filters is definitely a good idea. However, it is not (yet) suitable for merging; there are several issues I see (detailed in the comments below).
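For context, a minimal sketch of the kind of filter-chain YAML that sensor_filters could load to run robot_body_filter alone. The plugin type name is, to my knowledge, robot_body_filter/RobotBodyFilterPointCloud2, but double-check it against the package's plugin description; whether the chain list sits at the root of the node's private namespace or under a named key depends on how the sensor_filters node is launched, so verify that against the sensor_filters README as well.

```yaml
# One-entry filter chain: run only robot_body_filter on PointCloud2 messages.
# Plugin name and parameter layout are assumptions to double-check (see above).
- name: body_filter
  type: robot_body_filter/RobotBodyFilterPointCloud2
  params:
    frames/sensor: lidar_frame      # frame of the incoming cloud
    frames/filtering: lidar_frame   # frame in which filtering is performed
    frames/output: lidar_frame      # frame of the published, filtered cloud
    frames/fixed: odom              # see the fixed-frame discussion below
```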
Hi @peci1, by the way, if I want to remove points that are not in the URDF of my robot but are above the robot footprint, can I do it, and how (if possible)?
examples/vlp16_params.yaml (outdated)
frames/sensor: lidar_frame
frames/filtering: lidar_frame
frames/output: lidar_frame
frames/fixed: odom # This example is for an outdoor, non-mapping application. If your robot navigates in a map, change to the fixed frame (e.g., "map")
Map should never go here. That would create a cycle if you build your map using laser scans. Basically, you would need the map-aligned scan to filter the same scan, but it cannot be aligned until it is filtered. odom usually doesn't depend on laser scans, so it is okay to have it here.
Moreover, as explained in https://github.com/peci1/robot_body_filter#tf-frames, you don't even need a fixed frame for point_by_point: false scans.
However, depending on how fast your robot moves, your performance constraints and the quality of your odometry, I would think about using point_by_point: true for a Velodyne, as the scan duration is not negligible.
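To make that suggestion concrete, a hedged revision of the reviewed snippet, reusing only parameter names already quoted in this thread; note that point_by_point may live under a namespace in the actual config, so check the robot_body_filter README for the exact, fully-qualified names in your version.

```yaml
frames/sensor: lidar_frame
frames/filtering: lidar_frame
frames/output: lidar_frame
frames/fixed: odom       # odom, not map: the filter runs before SLAM can provide map
point_by_point: true     # rotating lidar: the scan duration is not negligible
```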
Thank you so much for your explanation. From my understanding, the fixed frame is the global one used for reference. In my case (mobile, outdoor, no map), odom is the root of the TF tree, so I picked it. And as I don't have a map, I probably cannot use it anyway.
I am a little bit confused because you said "Map should never go here". If a mobile robot navigates in a created map, the map frame is the (only) fixed one. If you don't use it, which one would you use?
There is probably some misunderstanding. Of course it would be best to set world or some really high-level frame as the fixed frame. However, as this filter usually acts as a preprocessing step for pointclouds entering SLAM, it can only operate on frames "before SLAM", because SLAM depends on the filter's output to generate the map frame. Therefore, even if you have a SLAM-localized robot, robot_body_filter must not set map as frames/fixed, otherwise the filter won't work at all (unless you have some SLAM that does not depend on the pointcloud, but that is usually not the case). So even for a SLAM-localized robot, odom is usually the "closest to fixed" frame available and has to be used as such.
The choice of fixed frame also defines the lowest latency the filter is able to achieve. The filter can finish only when all points from the scan can be transformed to the fixed frame. So if you choose a slow or high-latency fixed frame (e.g. one created by some non-pointcloud SLAM), the pointclouds going through the filter would always have to wait for a SLAM alignment newer than the last point in the cloud (because TF doesn't allow extrapolation into the future). This is another reason why it is wise to set the fixed frame to something faster, which odom usually is (all the robots I've seen had odometry >20 Hz with very low delay, while SLAM ran at 10 Hz at most, with latencies of hundreds of milliseconds).
Another "workaround" is pretending the scan was captured in a single time instant (as your config does by setting point_by_point: false). If it were a depth camera pointcloud, point_by_point: false would be the right choice because the camera actually captures all points at once. However, you are configuring a rotating lidar, which captures each point at a different time instant, and if you set point_by_point: false, all points in the scan use the sensor->fixed frame transform that is formally valid only for the first point in the scan. And as there is only a single time instant in play (the stamp of the first point), the fixed frame loses its meaning (a fixed frame is only helpful if you need to transfer some transform to a different time). As said earlier, if the robot moves slowly (especially if it rotates slowly), this might be okay, but generally you would want to transform each point with a transform that is valid or very close to valid. That is what point_by_point: true does.
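The two cases above, summarized as a config sketch. Only parameter names quoted in this thread are reused, and the exact names and namespaces should still be checked against the README.

```yaml
# Case 1: depth camera - all points share one capture time,
# so the single-stamp transform is exact and a fixed frame adds little.
point_by_point: false
---
# Case 2: rotating lidar on a moving robot - each point gets its own
# sensor->fixed transform, interpolated over the scan duration.
point_by_point: true
frames/fixed: odom   # a fast, low-latency frame that does not depend on the cloud
```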
Thanks a lot for this detailed explanation. I now see the reason why you said not to use map as the fixed frame.
Just to share my use case with you: I use GPS as the main source of information for odom (and robot localization). The setup is in RTK mode, so the accuracy is very high (mm-level error), which means the odom frame is relatively "equal" to world. The only drawback is that the frequency is not so high (~10 Hz). I use robot_body_filter to remove points that hit the robot's body. While the robot is moving, the local and global costmaps pick up those points and consider them obstacles. Although it happens only occasionally, that's not good for path planning and control (especially in backward motion).
No hurry, it was just coincidence ;)
Don't be sorry for it, you did a great job addressing my comments!
Take a look at https://wiki.ros.org/pcl_ros/Tutorials/filters#PassThrough . But this filter can only cut by a plane or two planes, which might not be enough. There is also the undocumented CropBox filter, which would probably be more suitable as it cuts out a box. But beware that it has problems with organized point clouds in Melodic. Unfortunately, all of these pcl_ros filters are not written using the […]. Please, think through the […]
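As an illustration of the CropBox option, a hedged sketch of the parameters such a filter could be given. The frame name and box extents are made-up placeholders, and the parameter names follow the usual pcl_ros filter conventions; verify them against the pcl_ros documentation for your ROS distro before relying on them.

```yaml
# Remove everything inside a box spanning the robot footprint and the space above it.
input_frame: base_link     # interpret the box in the robot's own frame (assumed name)
output_frame: base_link
negative: true             # remove points inside the box instead of keeping them
min_x: -0.6
max_x: 0.6
min_y: -0.4
max_y: 0.4
min_z: 0.0
max_z: 1.5
```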
Thank you for the contribution and patience!
Learned a lot from you. Thanks, and have joy with robots.
It took me a while to figure it out, so I just want to share it with everybody.