not enough memory for BAM sorting #870

Closed
beetlejuice007 opened this issue Mar 31, 2020 · 9 comments

beetlejuice007 commented Mar 31, 2020

Hi

I am running STAR with 400 GB of memory, but I am still getting an error that says there is not enough memory. STAR then recommends using ~41 GB (--limitBAMsortRAM 41143265264, if my interpretation is correct). This does not make sense to me. Can you please explain and provide a solution? I have tried re-running with --limitBAMsortRAM, but then I get a std::bad_alloc error.

Thanks
Hemant

Mar 30 13:37:36 ..... started STAR run
Mar 30 13:37:36 ..... loading genome
Mar 30 13:42:00 ..... started 1st pass mapping
Mar 30 13:50:30 ..... finished 1st pass mapping
Mar 30 13:50:33 ..... inserting junctions into the genome indices
Mar 30 13:53:37 ..... started mapping
Mar 30 14:06:19 ..... finished mapping
Mar 30 14:06:20 ..... started sorting BAM

EXITING because of fatal ERROR: not enough memory for BAM sorting:
SOLUTION: re-run STAR with at least --limitBAMsortRAM 41143265264
Mar 30 14:06:20 ...... FATAL ERROR, exiting

alexdobin (Owner) commented

Hi Hemant

Have you actually specified --limitBAMsortRAM 41143265264 (or even a bit more :) in the STAR parameters?
If you did and still got the error, please send me the Log.out file.

Cheers
Alex
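
For reference, a minimal sketch of such a re-run, with hypothetical paths and a little headroom above the suggested limit:

    # Paths are placeholders; the limit is STAR's suggested value rounded up
    STAR --genomeDir /path/to/genomeDir \
         --readFilesIn reads_1.fastq reads_2.fastq \
         --runThreadN 20 \
         --outSAMtype BAM SortedByCoordinate \
         --limitBAMsortRAM 45000000000 \
         --outFileNamePrefix sample_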

beetlejuice007 (Author) commented

Hi Alex

Here is the Log.out file. I had to re-run as I had deleted the previous files. This time STAR is telling me to use 90 GB with --limitBAMsortRAM. I will try this again and let you know.

SRRXXXXXX_cleanLog.out.txt

alexdobin (Owner) commented

Hi @hemantgujar

What is the size of the input FASTQ files?
You can try increasing --outBAMsortingBinsN from the default 50 to 100 or even 200, though this will require a large number of open files.
It seems there are a few loci that are very highly expressed, which STAR's sorting cannot deal with nicely... So it may be safer to use samtools sort (a sketch follows below).

Cheers
Alex
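
A minimal sketch of the samtools route mentioned above, assuming STAR is asked to write an unsorted BAM first (paths and the output prefix are hypothetical):

    # Skip STAR's internal sorting by requesting unsorted BAM output
    STAR --genomeDir /path/to/genomeDir \
         --readFilesIn reads_1.fastq reads_2.fastq \
         --runThreadN 20 \
         --outSAMtype BAM Unsorted \
         --outFileNamePrefix sample_

    # Sort by coordinate with samtools, capping memory per sorting thread
    samtools sort -@ 8 -m 2G \
        -o sample_Aligned.sortedByCoord.out.bam \
        sample_Aligned.out.bam

With --outSAMtype BAM Unsorted, STAR writes sample_Aligned.out.bam, and samtools sort keeps its memory footprint bounded at roughly -@ threads times the -m per-thread limit.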

beetlejuice007 (Author) commented

Here is the file from the run that gave the bad_alloc error:

Apr 03 22:51:12 ..... started STAR run
Apr 03 22:51:12 ..... loading genome
Apr 03 22:56:06 ..... started 1st pass mapping
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
/var/spool/slurm/slurmd/spool/job6767136/slurm_script: line 139: 27443 Aborted (core dumped) STAR --genomeDir /home/ --readFilesIn /staging/SRRxxxxxx.f_pair.fastq /staging/SRRxxxxxx.r_pair.fastq --runThreadN 20 --limitBAMsortRAM 80000000000 --sjdbOverhang 50 --quantMode GeneCounts --twopassMode Basic --outSAMtype BAM SortedByCoordinate --outFileNamePrefix SRRxxxxxxxx_clean

File size is
3.2G SRRxxxxxx.sra
SRRxxxxx_cleanLog.txt

alexdobin (Owner) commented

Hi @hemantgujar

Sorry for the belated reply. This seems like a different problem: it happened before the completion of the 1st pass, while the original one happened after the completion of the 2nd pass, at the sorting step. It seems that the amount of available RAM on your server varies from run to run, which prevents STAR from allocating enough memory.

Cheers
Alex
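
If the job runs under SLURM, as the command above suggests, one way to make the available RAM predictable is to reserve it explicitly in the batch script. A sketch, where the 100G figure is an assumption sized above the 80 GB sort limit used here:

    #SBATCH --mem=100G           # reserve RAM for the job rather than relying on a default share
    #SBATCH --cpus-per-task=20   # match --runThreadN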

beetlejuice007 (Author) commented

Thanks, Alex.
I was able to solve this using --outBAMsortingBinsN. Thanks for the help. You can close this issue.

alexdobin (Owner) commented

Thanks for letting me know!

peraltomas commented

@hemantgujar Hello, I'm facing a similar problem. Could you tell me which value you used for --outBAMsortingBinsN? Thanks.

mengyuankan commented

@peraltomas Hi, I came across this problem and I set --outBAMsortingBinsN 200, which worked for a 130 GB BAM file. Hope this helps.
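
A sketch combining the settings that resolved this thread (paths are hypothetical; the RAM and bin values are the ones reported above). Since more sorting bins means more files open at once, the shell's open-file limit may need raising as well:

    # Allow more simultaneously open files for the extra sorting bins
    ulimit -n 4096

    STAR --genomeDir /path/to/genomeDir \
         --readFilesIn reads_1.fastq reads_2.fastq \
         --runThreadN 20 \
         --outSAMtype BAM SortedByCoordinate \
         --limitBAMsortRAM 80000000000 \
         --outBAMsortingBinsN 200 \
         --outFileNamePrefix sample_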
