Add troubleshooting section to docs with an entry on Spark issues #294

Merged · 6 commits · Oct 20, 2020
Changes from 2 commits
4 changes: 2 additions & 2 deletions docs/README.md
@@ -3,8 +3,8 @@
The nf-core/sarek documentation is split into the following pages:

- [Usage](usage.md)
- - An overview of how the pipeline works, how to run it and a description of all of the different command-line flags.
+ - An overview of how the pipeline works, how to run it, a description of all of the different command-line flags and solutions to common issues.
- [Output](output.md)
- An overview of the different results produced by the pipeline and how to interpret them.

You can find a lot more documentation about installing, configuring and running nf-core pipelines on the website: [https://nf-co.re](https://nf-co.re)
41 changes: 41 additions & 0 deletions docs/usage.md
@@ -141,6 +141,8 @@
- [--awsqueue](#--awsqueue)
- [--awsregion](#--awsregion)
- [--awscli](#--awscli)
- [Troubleshooting](#troubleshooting)
- [Spark related issues](#spark-related-issues)

## Running the pipeline

@@ -1737,3 +1739,42 @@ The [AWS CLI](https://www.nextflow.io/docs/latest/awscloud.html#aws-cli-installa
Default: `/home/ec2-user/miniconda/bin/aws`.

Please make sure to also set the `-w/--work-dir` and `--outdir` parameters to an S3 storage bucket of your choice; you'll get an error message if you don't.

## Troubleshooting


### Spark related issues

If you have problems running processes that use Spark, such as `MarkDuplicates`, you are probably hitting your system's limit on the number of open files. You can check the current limit with:

```bash
ulimit -n
```
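Before editing any system files, you can also inspect both the soft and hard limits and raise the soft limit for the current shell session only; a quick sketch (no root needed, as long as you stay below the hard limit):

```bash
# Soft limit: what processes actually hit in this session
ulimit -Sn
# Hard limit: the ceiling the soft limit may be raised to
ulimit -Hn

# Raise the soft limit for this shell only (must not exceed the hard limit)
ulimit -n 4096
ulimit -Sn
```

This change disappears when the shell exits; the file edits below make it permanent.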

The default limit is usually 1024, which is quite low for running Spark jobs. To increase the limit permanently you can:

Edit the file `/etc/security/limits.conf` and add the lines:

```bash
* soft nofile 65535
* hard nofile 65535
```
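The four columns are `<domain> <type> <item> <value>`, and `*` applies the limit to every user. If you prefer a narrower scope, you can target a single user or group instead (the user and group names below are hypothetical):

```bash
# <domain>  <type>  <item>   <value>
myuser      soft    nofile   65535
@mygroup    hard    nofile   65535
```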
Edit the file `/etc/sysctl.conf` and add the line:

```bash
fs.file-max = 65535
```
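Settings in `/etc/sysctl.conf` are read at boot; on Linux you can apply and verify them without rebooting (the `sysctl -p` step assumes root):

```bash
# Apply /etc/sysctl.conf immediately (run as root):
#   sysctl -p
# Verify the kernel-wide maximum either way:
cat /proc/sys/fs/file-max
```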

If you run the pipeline with Docker, edit the file `/etc/sysconfig/docker` and add the new limits to `OPTIONS` like this:

```bash
OPTIONS="--default-ulimit nofile=65535:65535"
```
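Note that `/etc/sysconfig/docker` only exists on some distributions (e.g. CentOS/RHEL); on others the same default can be set in `/etc/docker/daemon.json`. A sketch, assuming your Docker daemon reads that file:

```json
{
  "default-ulimits": {
    "nofile": {
      "Name": "nofile",
      "Soft": 65535,
      "Hard": 65535
    }
  }
}
```

Restart the Docker daemon after editing either file.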

Restart your session (log out and log back in) for the new limits to take effect.

Note that the exact procedure for increasing the open file limit may differ slightly between systems or require additional steps.