
[Memory Management] RAM allocation grows unbounded until crash #66

Open
jorgeffonte opened this issue Aug 8, 2024 · 1 comment

jorgeffonte commented Aug 8, 2024

Hi,

Firstly, thank you for developing and sharing the Direct LiDAR-Inertial Odometry (DLIO) project. I have been using it extensively with large datasets such as KITTI, which cover long distances and generate a significant number of keyframes.

During these prolonged runs, I have observed that the memory usage increases substantially, eventually consuming all available RAM and causing the algorithm to crash. This appears to be due to the accumulation of a large number of keyframes and other data structures that are not being efficiently managed over time.
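For reference, here is a minimal sketch of how I'm tracking the growth, assuming a Linux `/proc` filesystem (the pid comes from `pgrep` on the odometry node in my setup; nothing here is part of DLIO itself):

```python
import os
import time

def rss_mib(pid: int) -> float:
    """Resident set size of `pid` in MiB, read from /proc/<pid>/status (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1]) / 1024.0  # value is in kB
    raise RuntimeError(f"VmRSS not found for pid {pid}")

def watch(pid: int, interval_s: float = 5.0) -> None:
    """Print the process's RSS every `interval_s` seconds until it exits."""
    while os.path.exists(f"/proc/{pid}"):
        print(f"{time.strftime('%H:%M:%S')}  {rss_mib(pid):.1f} MiB")
        time.sleep(interval_s)

if __name__ == "__main__":
    # Sample once against this script's own pid as a smoke test;
    # in practice I point `watch()` at the DLIO node's pid.
    print(f"self RSS: {rss_mib(os.getpid()):.1f} MiB")
```

Logging this while replaying KITTI shows the RSS climbing roughly linearly with the number of keyframes until the process is OOM-killed.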

Could you provide guidance on any parameters or configuration options that can help manage and limit memory usage? Specifically:

  • Are there settings to control the maximum number of keyframes stored in memory?
  • Is there a way to clear or downsample old data while preserving essential information for the algorithm to function correctly?
  • Any recommended practices for running DLIO with large datasets to avoid excessive memory consumption?

Thank you in advance for your assistance!

@kennyjchen (Contributor) commented

Hi @jorgeffonte --

Yes, memory management is certainly an area that DLIO could improve on. As you mentioned, memory usage scales with the number of keyframes, and there is currently no hard cap on it. The main levers you have are the keyframe threshold and the voxelization resolution. Since you are working with the KITTI dataset (which is outdoors), you can probably get away with a higher keyframe threshold (e.g., 10m) and a coarser voxelization (e.g., 0.5m), which reduces both the number of keyframes stored and the points kept per keyframe. If you're limited on physical memory, I also recommend increasing your swap space so that the algorithm doesn't crash outright. Let me know if that helps.
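For concreteness, something along these lines in your launch config. The values and key names below are illustrative only; match them against the `cfg/*.yaml` shipped with your version of the repo, since the exact keys vary between releases:

```yaml
# Illustrative sketch, not the verbatim DLIO config -- check cfg/*.yaml
# in your checkout for the actual key names.
dlio:
  odom:
    keyframe:
      threshD: 10.0   # new keyframe every ~10m instead of ~1m (outdoor-friendly)
      threshR: 45.0   # rotational threshold in degrees, left at the default
  pointcloud:
    voxelize: true
    voxelRes: 0.5     # coarser leaf size -> fewer points retained per keyframe
```

For swap, the usual Linux recipe (`sudo fallocate -l 16G /swapfile && sudo chmod 600 /swapfile && sudo mkswap /swapfile && sudo swapon /swapfile`) is enough to keep the node from being OOM-killed, at the cost of speed once it starts paging.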
