Merge pull request #294 from jfnavarro/dev
Add troubleshooting section to docs with an entry on Spark issues
maxulysse authored Oct 20, 2020
2 parents e565082 + 872489f commit cce883e
Showing 2 changed files with 41 additions and 2 deletions.
3 changes: 1 addition & 2 deletions docs/README.md
@@ -3,8 +3,7 @@
The nf-core/sarek documentation is split into the following pages:

- [Usage](usage.md)
- An overview of how the pipeline works, how to run it and a description of all of the different command-line flags.
- An overview of how the pipeline works, how to run it, a description of all of the different command-line flags and solutions to common issues.
- [Output](output.md)
- An overview of the different results produced by the pipeline and how to interpret them.

You can find a lot more documentation about installing, configuring and running nf-core pipelines on the website: [https://nf-co.re](https://nf-co.re)
40 changes: 40 additions & 0 deletions docs/usage.md
@@ -141,6 +141,8 @@
- [--awsqueue](#--awsqueue)
- [--awsregion](#--awsregion)
- [--awscli](#--awscli)
- [Troubleshooting](#troubleshooting)
- [Spark-related issues](#spark-related-issues)

## Running the pipeline

@@ -1737,3 +1739,41 @@ The [AWS CLI](https://www.nextflow.io/docs/latest/awscloud.html#aws-cli-installa
Default: `/home/ec2-user/miniconda/bin/aws`.

Please make sure to also set the `-w/--work-dir` and `--outdir` parameters to an S3 storage bucket of your choice - you'll get an error message notifying you if you didn't.
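
For illustration only, a hypothetical invocation with both directories pointing at a bucket called `my-bucket` (a placeholder) could look like the following; your usual input and AWS Batch parameters are omitted here:

```bash
# Hypothetical example: work directory and output directory both on S3
# "my-bucket" is a placeholder; add your usual input and AWS Batch parameters
nextflow run nf-core/sarek -w s3://my-bucket/work --outdir s3://my-bucket/results
```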

## Troubleshooting

### Spark-related issues

If you have problems running processes that use Spark, such as `MarkDuplicates`, you are probably hitting the limit of open files on your system.
You can check your current limit by typing the following:

```bash
ulimit -n
```
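
If you only need a higher limit for the current shell session, the soft limit can be raised on the fly as a temporary workaround (it does not persist and cannot exceed the hard limit):

```bash
# Raise the soft limit for this shell session only; the change is not persistent
# and cannot exceed the hard limit reported by `ulimit -Hn`
ulimit -n 65535
```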

The default limit is usually 1024, which is quite low for running Spark jobs.
To increase the limit permanently you can:

Edit the file `/etc/security/limits.conf` and add the lines:

```bash
* soft nofile 65535
* hard nofile 65535
```
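
As a convenience, the same lines can be appended from the command line; this is just a sketch assuming `sudo` access, so adapt it if your distribution stores limits elsewhere:

```bash
# Append the new limits without opening an editor (requires sudo)
echo "* soft nofile 65535" | sudo tee -a /etc/security/limits.conf
echo "* hard nofile 65535" | sudo tee -a /etc/security/limits.conf
```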

Edit the file `/etc/sysctl.conf` and add the line:

```bash
fs.file-max = 65535
```
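
Assuming your system provides the standard `sysctl` utility, the new kernel setting can be applied and checked without rebooting:

```bash
# Reload /etc/sysctl.conf and confirm the new maximum number of file handles
sudo sysctl -p
sysctl fs.file-max
```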

Edit the file `/etc/sysconfig/docker` and add the new limits to `OPTIONS` like this:

```bash
OPTIONS="--default-ulimit nofile=65535:65535"
```
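
Alternatively, if you prefer not to change the Docker daemon defaults, the limit can be raised for a single container at run time; a minimal sketch that just verifies the setting inside a throwaway `ubuntu` container:

```bash
# Check the open file limit inside a container started with an explicit ulimit override
docker run --rm --ulimit nofile=65535:65535 ubuntu bash -c 'ulimit -n'
```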

Finally, restart your session so that the new limits take effect.
Note that the exact way to increase the open file limit may differ between systems or require additional steps.
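
Once you are logged back in, you can confirm that the new limits are active:

```bash
# Show the current soft and hard limits for open files
ulimit -Sn
ulimit -Hn
```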
