The majority of the errors I've seen while running the pipeline on Google Cloud come down to insufficient disk space or memory.
I've set some reasonable default values for these in the Snakefile and in run-gcp, based on a sample I ran locally on our cluster, but I'm sure those won't be sufficient for all of the samples we want to run in the future.
After discussing with the Snakemake team, they recommended the following:
> Regarding your disk_mb problem, you need to add --default-resources to the snakemake invocation. This way, disk_mb will be automatically set to a reasonable default (max(2*input.size_mb, 1000)). The same for mem_mb.
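As a rough sketch of what this might look like in run-gcp (the flags other than `--default-resources` are assumptions, not what our script currently passes), along with a quick check of the default formula the Snakemake team quoted:

```shell
# A hypothetical sketch of the change to the snakemake call in run-gcp;
# the executor and --jobs flags here are assumptions:
#
#   snakemake --google-lifesciences --default-resources --jobs 10
#
# With no arguments, --default-resources sets each rule's mem_mb and
# disk_mb to max(2 * input.size_mb, 1000) unless the rule sets its own.

# Replicating that default formula for a hypothetical 700 MB input:
input_size_mb=700
default_mb=$(( 2 * input_size_mb > 1000 ? 2 * input_size_mb : 1000 ))
echo "$default_mb"   # prints 1400
```

So a rule with a 700 MB input would get 1400 MB of disk and memory by default, while very small inputs would still get the 1000 MB floor.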
So we should probably try that to see if it will help.
The Snakemake documentation covers this in its execution section, but we may need to do some digging to figure out how to use it properly for our setup.