Firstly, thank you for developing and sharing the Direct LiDAR-Inertial Odometry (DLIO) project. I have been using it extensively with large datasets such as KITTI, which cover long distances and generate a significant number of keyframes.
During these prolonged runs, I have observed that the memory usage increases substantially, eventually consuming all available RAM and causing the algorithm to crash. This appears to be due to the accumulation of a large number of keyframes and other data structures that are not being efficiently managed over time.
Could you provide guidance on any parameters or configuration options that can help manage and limit memory usage? Specifically:
- Are there settings to control the maximum number of keyframes stored in memory?
- Is there a way to clear or downsample old data while preserving essential information for the algorithm to function correctly?
- Are there any recommended practices for running DLIO with large datasets to avoid excessive memory consumption?
Thank you in advance for your assistance!
Yes, memory management is certainly an area where DLIO could improve. As you mentioned, memory usage scales with the number of keyframes. There is currently no way to cap memory usage directly, but tuning the keyframe thresholds and the voxelization can help a lot in avoiding crashes. Since you are working with the KITTI dataset (which is outdoors), you can probably get away with a higher keyframe distance threshold (e.g., 10 m) and a coarser voxel leaf size (e.g., 0.5 m) without hurting accuracy much. If you're limited on physical memory, I also recommend increasing your swap space so that the algorithm slows down instead of crashing when RAM runs out. Let me know if that helps.
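As a minimal sketch, the two knobs above live in DLIO's YAML config. The exact key names and nesting below are illustrative and may differ between DLIO versions, so check the YAML files shipped in the repo's `cfg/` directory against your installed version:

```yaml
# Hypothetical excerpt of a DLIO config file -- key names are assumptions,
# verify against the cfg/ YAML in your DLIO checkout.
dlio:
  odom:
    keyframe:
      threshD: 10.0   # keyframe distance threshold [m]; raising this (from the
                      # ~1 m indoor-friendly default) creates far fewer keyframes,
                      # which is the dominant driver of memory growth
      threshR: 45.0   # keyframe rotation threshold [deg]
  pointcloud:
    voxelize: true
    voxel:
      res: 0.5        # voxel leaf size [m]; a coarser grid shrinks every stored
                      # keyframe cloud, trading map detail for RAM
```

For a long outdoor sequence like KITTI, the combination of a large `threshD` and a coarse `res` bounds memory growth roughly linearly in distance traveled rather than in raw scan count.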