createBigWig uses /tmp rather than specified scratch location, singularity #115

Closed
Simon-Coetzee opened this issue Oct 29, 2018 · 2 comments

Simon-Coetzee commented Oct 29, 2018

When using this pipeline on our HPC, createBigWig always fails because it fills up its TEMP directory. It writes to /tmp, which is very small (10 M) on the worker nodes, when it should be writing to /scratch.

I have set process.scratch = '/scratch' in the Nextflow config, passed the variable through with envWhitelist = 'TMP' (TMP=/scratch is set on the submit node), and launched Nextflow with -Djava.io.tmpdir=/scratch. If there is anywhere else worth trying, I'd be happy to.
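For reference, this is roughly what the relevant configuration looks like on my side (a sketch; the exact scope layout and the NXF_OPTS launch line are a reconstruction, not a verbatim copy of my files):

  // nextflow.config (sketch)
  process.scratch = '/scratch'

  singularity {
    enabled      = true
    // pass TMP (set to /scratch on the submit node) into the container
    envWhitelist = 'TMP'
  }

and Nextflow itself is launched with its JVM temp directory pointed at scratch, e.g.:

  NXF_OPTS='-Djava.io.tmpdir=/scratch' nextflow run nf-core/rnaseq ...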

This may be the same sort of error as in #82 with regard to TMPDIR behaviour.

The resulting error is listed here:

ERROR ~ Error executing process > 'createBigWig (RNA3-Aligned.)'

Caused by:
  Process `createBigWig (RNA3-Aligned.)` terminated with an error exit status (1)

Command executed:

  samtools index RNA3-Aligned.sortedByCoord.out.bam
  bamCoverage -b RNA3-Aligned.sortedByCoord.out.bam -p 10 -o RNA3-Aligned.sortedByCoord.out.bigwig

Command exit status:
  1

Command output:
  (empty)

Command error:
  minFragmentLength: 0
  verbose: False
  out_file_for_raw_data: None
  numberOfSamples: None
  bedFile: None
  bamFilesList: ['RNA3-Aligned.sortedByCoord.out.bam']
  numberOfProcessors: 10
  samFlag_exclude: None
  save_data: False
  stepSize: 50
  smoothLength: None
  blackListFileName: None
  center_read: False
  ignoreDuplicates: False
  defaultFragmentLength: read length
  chrsToSkip: []
  region: None
  maxPairedFragmentLength: 1000
  samFlag_include: None
  binLength: 50
  maxFragmentLength: 0
  minMappingQuality: None
  zerosToNans: False
  Traceback (most recent call last):
    File "/opt/conda/envs/nf-core-rnaseq-1.1/bin/bamCoverage", line 12, in <module>
      main(args)
    File "/opt/conda/envs/nf-core-rnaseq-1.1/lib/python2.7/site-packages/deeptools/bamCoverage.py", line 256, in main
      format=args.outFileFormat, smoothLength=args.smoothLength)
    File "/opt/conda/envs/nf-core-rnaseq-1.1/lib/python2.7/site-packages/deeptools/writeBedGraph.py", line 152, in run
      numberOfProcessors=self.numberOfProcessors)
    File "/opt/conda/envs/nf-core-rnaseq-1.1/lib/python2.7/site-packages/deeptools/mapReduce.py", line 142, in mapReduce
      res = pool.map_async(func, TASKS).get(9999999)
    File "/opt/conda/envs/nf-core-rnaseq-1.1/lib/python2.7/multiprocessing/pool.py", line 572, in get
      raise self._value
  IOError: [Errno 28] No space left on device

Work dir:
  /hpc/home/coetzeesg/work/a0/2b68269730119025498ad222dd65c0

Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`

 -- Check '.nextflow.log' file for details
[nfcore/rnaseq] Pipeline Complete
WARN: Killing pending tasks (2)
@Simon-Coetzee (Author) commented

So it looks to me as though the problem is that deeptools does not respect, at least in my case, $TMP or $TMPDIR, but instead writes to /tmp (or /var/tmp when /tmp is full). $TEMP is not passed into Singularity at all; when $TEMP is assigned in the main.nf file, I believe deeptools does use it, however. My workaround is to add runOptions = '-B /scratch:/tmp' to the singularity section of the Nextflow config, so that badly behaved apps write to what they think is /tmp but is in reality the directory I want them to use.
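In config form the workaround is just (a sketch; only the runOptions line matters here, the enabled flag is assumed from the rest of my profile):

  singularity {
    enabled    = true
    // Bind the real scratch area over /tmp inside the container, so tools
    // that hard-code /tmp end up writing to the node's scratch space.
    runOptions = '-B /scratch:/tmp'
  }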

@apeltzer (Member) commented

Hi @Simon-Coetzee! We removed the bigWig creation again in the dev branch, so this should hopefully be gone with the next release of the pipeline 👍
