Using `--max_memory 80` should be validated as a proper Nextflow memory unit, and an error should be raised if it is not. The correct way of specifying this would be, for example, `--max_memory '42 GB'`; otherwise the pipeline just allocates a memory value of 0 and still proceeds as normal. This was initially reported here.
This should be handled by the JSON schema parameter validation if we set it to only accept values with a GB (or TB / PB) suffix. This will mean that people are then not able to specify a huge value in bytes, but I think that's ok.
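For reference, a minimal sketch of what the `max_memory` entry in the pipeline's `nextflow_schema.json` could look like with such a constraint. The exact `pattern`, `description`, and `default` shown here are illustrative, not the current nf-core template:

```json
{
  "$schema": "http://json-schema.org/draft-07/schema",
  "type": "object",
  "properties": {
    "max_memory": {
      "type": "string",
      "description": "Maximum amount of memory that can be requested for any single job.",
      "pattern": "^\\d+(\\.\\d+)?\\.?\\s*(GB|TB|PB)$",
      "default": "128.GB"
    }
  }
}
```

With a pattern along these lines, `--max_memory 80` would fail schema validation up front instead of being silently coerced to 0, while values such as `'42 GB'` or `'128.GB'` would pass.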