Error when using file.size() to request memory for job on cluster #922
Comments
Not convinced this is a NF issue. Can mem be a float on PBS?
Nextflow version is 18.10.1, Java 8u152, Linux (CentOS 7). I never had that issue before; PBS was fine with jobs being submitted, and this only came up with that specific feature. According to https://www.osc.edu/supercomputing/batch-processing-at-osc/pbs-directives-summary , one can request memory in bytes, MB or GB. Not sure whether we can have 57.4GB though...
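For reference, Torque expects the memory size as an integer followed by a unit suffix, so a minimal illustration (the values below are arbitrary examples, not taken from the failing job) would be:

```bash
# Equivalent ways of requesting roughly 4 GB from Torque/PBS; the size is an
# integer followed by a unit suffix (b, kb, mb, gb).
#PBS -l mem=4gb
#PBS -l mem=4096mb
#PBS -l mem=4294967296b
```

A fractional value such as 57.4GB does not fit that integer-plus-suffix pattern, which would explain why submission fails.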
What executor is this? PBS or SGE?
PBS/Torque
Somehow this is expected, because the result of the dynamic rule you have specified is taken as-is.
Maybe NF should always use MiB to avoid the decimal. For now you should round to giga units.
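A minimal sketch of that rounding advice, assuming a DSL1-style process; the process name and `params.bam` are placeholders, not taken from the pipeline:

```groovy
// Derive the request from the input size (in bytes), scale it, and round UP to
// whole gigabytes so the scheduler never sees a value like "57.4 GB".
params.bam = 'sample.bam'   // placeholder input path

process make_examples {
    memory { "${Math.ceil(file(params.bam).size() * 2 / (1024 ** 3)) as long} GB" }

    """
    echo "memory request rounded up to a whole number of GB"
    """
}
```

Rounding up rather than down keeps the request at or above whatever the dynamic rule computed.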
I think the problem is this
Thanks @pditommaso. Unfortunately, while the job can now be submitted, the pipeline fails on the
I think this is because the type of
I'm lost here, I don't see the use of
I couldn't get
Bug report
Expected behavior and actual behavior
For the nf-core/deepvariant dev branch, memory is set in base.config as a dynamic rule based on the input file size. When the pipeline is run on a computing cluster, the job fails to submit despite the requested resources being within the limits (the requested node has a maximum of 28 cores, 48 hours of wall time and 128 GB of RAM).
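As a rough illustration of how a fractional request such as 57.4 GB can arise from a byte count (the size used below is a made-up placeholder, not the actual input):

```groovy
// Plain Groovy sketch: dividing a byte count by 1024^3 rarely lands on a whole
// number, so the rendered memory string ends up fractional.
def bamBytes = 61_624_279_040L               // placeholder input size in bytes
def gb = bamBytes / (1024 ** 3)              // ~57.39 GB
println "${Math.round(gb * 10) / 10} GB"     // prints "57.4 GB"
```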
Program output
Requested resources from .command.run:
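As an assumed illustration only (the actual directives from this run are not shown here), the Torque preamble of a .command.run file looks roughly like the following, with the fractional memory figure from this thread substituted in; all other names and values are guesses:

```bash
#!/bin/bash
#PBS -N nf-make_examples      # hypothetical job name
#PBS -l nodes=1:ppn=8         # hypothetical CPU request
#PBS -l walltime=08:00:00     # hypothetical wall-time request
#PBS -l mem=57.4gb            # fractional value that the scheduler refuses
```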
The issue only appeared after using file.size() to set memory; see here for the original issue/thread.
Steps to reproduce the problem
@apeltzer please feel free to add more information, such as any steps and environment details, if possible. Thanks!
Environment