Question about memory #189

Open

wbooker opened this issue Sep 19, 2024 · 1 comment
wbooker commented Sep 19, 2024

I was wondering if you could give me a sense of whether the memory use of a big_spLinReg run seems appropriate, and if not, whether there is a way to reduce the memory requirements. My dataset is 300 individuals × 23,626,816 sites stored as a double FBM, and to run efficiently with many cores I need ~600 GB of memory. Does this seem correct to you? I just want to make sure I am not doing something wrong, and to know whether there are ways to reduce memory usage without sacrificing efficiency.

Thanks!
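
For concreteness, a minimal sketch of the kind of setup described above, using bigstatsr; the backing-file name, the phenotype vector `y`, and the core count are placeholders, not taken from the issue:

```r
library(bigstatsr)

# A 300 x 23,626,816 double FBM backed on disk (~53 GB backingfile),
# then a penalized linear regression fit in parallel.
X <- FBM(300, 23626816, type = "double",
         backingfile = "mydata")  # hypothetical file name

y <- rnorm(300)  # placeholder phenotype vector

mod <- big_spLinReg(X, y, ncores = nb_cores())
```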

privefl (Owner) commented Sep 20, 2024

If my calculations are correct, the backingfile should take 53 GB on disk, so you should not need much more memory than that. Using less memory should still be fine, since the model works on a subset of the data most of the time.
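
For reference, the 53 GB figure is just the size of a dense matrix of doubles with those dimensions:

```r
n <- 300            # individuals
m <- 23626816       # sites
n * m * 8 / 1024^3  # 8 bytes per double -> ~52.8 GiB for the backingfile
```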
