
Merge branch 'dev' into modules_update
jasmezz committed Oct 2, 2023
2 parents 7228cd9 + 6cdd651 commit 5e7cc6b
Showing 16 changed files with 235 additions and 79 deletions.
1 change: 1 addition & 0 deletions .devcontainer/devcontainer.json
@@ -2,6 +2,7 @@
"name": "nfcore",
"image": "nfcore/gitpod:latest",
"remoteUser": "gitpod",
"runArgs": ["--privileged"],

// Configure tool-specific properties.
"customizations": {
4 changes: 3 additions & 1 deletion .github/CONTRIBUTING.md
@@ -9,7 +9,9 @@ Please use the pre-filled template to save time.
However, don't be put off by this template - other more general issues and suggestions are welcome!
Contributions to the code are even more welcome ;)

> If you need help using or modifying nf-core/funcscan then the best place to ask is on the nf-core Slack [#funcscan](https://nfcore.slack.com/channels/funcscan) channel ([join our Slack here](https://nf-co.re/join/slack)).
:::info
If you need help using or modifying nf-core/funcscan then the best place to ask is on the nf-core Slack [#funcscan](https://nfcore.slack.com/channels/funcscan) channel ([join our Slack here](https://nf-co.re/join/slack)).
:::

## Contribution workflow

2 changes: 1 addition & 1 deletion .github/workflows/linting.yml
@@ -78,7 +78,7 @@ jobs:

- uses: actions/setup-python@v4
with:
python-version: "3.8"
python-version: "3.11"
architecture: "x64"

- name: Install dependencies
68 changes: 68 additions & 0 deletions .github/workflows/release-announcments.yml
@@ -0,0 +1,68 @@
name: release-announcements
# Automatic release toot and tweet anouncements
on:
release:
types: [published]
workflow_dispatch:

jobs:
toot:
runs-on: ubuntu-latest
steps:
- uses: rzr/fediverse-action@master
with:
access-token: ${{ secrets.MASTODON_ACCESS_TOKEN }}
host: "mstdn.science" # custom host if not "mastodon.social" (default)
# GitHub event payload
# https://docs.github.com/en/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#release
message: |
Pipeline release! ${{ github.repository }} v${{ github.event.release.tag_name }} - ${{ github.event.release.name }}!
Please see the changelog: ${{ github.event.release.html_url }}
send-tweet:
runs-on: ubuntu-latest

steps:
- uses: actions/setup-python@v4
with:
python-version: "3.10"
- name: Install dependencies
run: pip install tweepy==4.14.0
- name: Send tweet
shell: python
run: |
import os
import tweepy
client = tweepy.Client(
access_token=os.getenv("TWITTER_ACCESS_TOKEN"),
access_token_secret=os.getenv("TWITTER_ACCESS_TOKEN_SECRET"),
consumer_key=os.getenv("TWITTER_CONSUMER_KEY"),
consumer_secret=os.getenv("TWITTER_CONSUMER_SECRET"),
)
tweet = os.getenv("TWEET")
client.create_tweet(text=tweet)
env:
TWEET: |
Pipeline release! ${{ github.repository }} v${{ github.event.release.tag_name }} - ${{ github.event.release.name }}!
Please see the changelog: ${{ github.event.release.html_url }}
TWITTER_CONSUMER_KEY: ${{ secrets.TWITTER_CONSUMER_KEY }}
TWITTER_CONSUMER_SECRET: ${{ secrets.TWITTER_CONSUMER_SECRET }}
TWITTER_ACCESS_TOKEN: ${{ secrets.TWITTER_ACCESS_TOKEN }}
TWITTER_ACCESS_TOKEN_SECRET: ${{ secrets.TWITTER_ACCESS_TOKEN_SECRET }}

bsky-post:
runs-on: ubuntu-latest
steps:
- uses: zentered/bluesky-post-action@v0.0.2
with:
post: |
Pipeline release! ${{ github.repository }} v${{ github.event.release.tag_name }} - ${{ github.event.release.name }}!
Please see the changelog: ${{ github.event.release.html_url }}
env:
BSKY_IDENTIFIER: ${{ secrets.BSKY_IDENTIFIER }}
BSKY_PASSWORD: ${{ secrets.BSKY_PASSWORD }}
#
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -9,6 +9,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### `Fixed`

- [#306](https://github.com/nf-core/funcscan/pull/306) Added new parameter `annotation_prokka_retaincontigheaders` to allow prokka to retain the original contig headers/locus tag. (by @darcy220606)
- [#307](https://github.com/nf-core/funcscan/pull/307) Fixed stability of deepARG tests by using Zenodo copy of database (❤️ to Gustavo Arango and Liqing Zhang for uploading, fix by @jfy133)

### `Dependencies`
133 changes: 102 additions & 31 deletions CODE_OF_CONDUCT.md

Large diffs are not rendered by default.

22 changes: 13 additions & 9 deletions README.md
@@ -1,6 +1,8 @@
# ![nf-core/funscan](docs/images/nf-core-funcscan_logo_flat_light.png#gh-light-mode-only) ![nf-core/funscan](docs/images/nf-core-funcscan_logo_flat_dark.png#gh-dark-mode-only)

[![AWS CI](https://img.shields.io/badge/CI%20tests-full%20size-FF9900?labelColor=000000&logo=Amazon%20AWS)](https://nf-co.re/funcscan/results)[![Cite with Zenodo](http://img.shields.io/badge/DOI-10.5281/zenodo.7643099-1073c8?labelColor=000000)](https://doi.org/10.5281/zenodo.7643099)
[![GitHub Actions CI Status](https://github.com/nf-core/funcscan/workflows/nf-core%20CI/badge.svg)](https://github.com/nf-core/funcscan/actions?query=workflow%3A%22nf-core+CI%22)
[![GitHub Actions Linting Status](https://github.com/nf-core/funcscan/workflows/nf-core%20linting/badge.svg)](https://github.com/nf-core/funcscan/actions?query=workflow%3A%22nf-core+linting%22)[![AWS CI](https://img.shields.io/badge/CI%20tests-full%20size-FF9900?labelColor=000000&logo=Amazon%20AWS)](https://nf-co.re/funcscan/results)[![Cite with Zenodo](http://img.shields.io/badge/DOI-10.5281/zenodo.7643099-1073c8?labelColor=000000)](https://doi.org/10.5281/zenodo.7643099)

[![Nextflow](https://img.shields.io/badge/nextflow%20DSL2-%E2%89%A523.04.0-23aa62.svg)](https://www.nextflow.io/)

[![run with conda](http://img.shields.io/badge/run%20with-conda-3EB049?labelColor=000000&logo=anaconda)](https://docs.conda.io/en/latest/)
@@ -33,10 +35,11 @@ The nf-core/funcscan AWS full test dataset are contigs generated by the MGnify s

## Usage

> **Note**
> If you are new to Nextflow and nf-core, please refer to [this page](https://nf-co.re/docs/usage/installation) on how
> to set-up Nextflow. Make sure to [test your setup](https://nf-co.re/docs/usage/introduction#how-to-run-a-pipeline)
> with `-profile test` before running the workflow on actual data.
:::note
If you are new to Nextflow and nf-core, please refer to [this page](https://nf-co.re/docs/usage/installation) on how
to set-up Nextflow. Make sure to [test your setup](https://nf-co.re/docs/usage/introduction#how-to-run-a-pipeline)
with `-profile test` before running the workflow on actual data.
:::

First, prepare a samplesheet with your input data that looks as follows:

@@ -63,10 +66,11 @@ nextflow run nf-core/funcscan \
--run_bgc_screening
```

> **Warning:**
> Please provide pipeline parameters via the CLI or Nextflow `-params-file` option. Custom config files including those
> provided by the `-c` Nextflow option can be used to provide any configuration _**except for parameters**_;
> see [docs](https://nf-co.re/usage/configuration#custom-configuration-files).
:::warning
Please provide pipeline parameters via the CLI or Nextflow `-params-file` option. Custom config files including those
provided by the `-c` Nextflow option can be used to provide any configuration _**except for parameters**_;
see [docs](https://nf-co.re/usage/configuration#custom-configuration-files).
:::

For more details and further functionality, please refer to the [usage documentation](https://nf-co.re/funcscan/usage) and the [parameter documentation](https://nf-co.re/funcscan/parameters).

4 changes: 2 additions & 2 deletions assets/multiqc_config.yml
@@ -1,7 +1,7 @@
report_comment: >
This report has been generated by the <a href="https://github.com/nf-core/funcscan/1.1.4dev" target="_blank">nf-core/funcscan</a>
This report has been generated by the <a href="https://github.com/nf-core/funcscan/releases/tag/dev" target="_blank">nf-core/funcscan</a>
analysis pipeline. For information about how to interpret these results, please see the
<a href="https://nf-co.re/funcscan/1.1.4dev/output" target="_blank">documentation</a>.
<a href="https://nf-co.re/funcscan/dev/docs/output" target="_blank">documentation</a>.
report_section_order:
"nf-core-funcscan-methods-description":
order: -1000
3 changes: 1 addition & 2 deletions conf/modules.config
@@ -65,8 +65,7 @@ process {
"--mincontiglen ${params.annotation_prokka_mincontiglen}",
"--evalue ${params.annotation_prokka_evalue}",
"--coverage ${params.annotation_prokka_coverage}",
"--centre ${params.annotation_prokka_centre}",
"--locustag ${params.annotation_prokka_locustag}",
params.annotation_prokka_retaincontigheaders ? "--force" : "--locustag PROKKA --centre CENTER" ,
params.annotation_prokka_singlemode ? '' : '--metagenome' ,
params.annotation_prokka_cdsrnaolap ? '--cdsrnaolap' : '',
params.annotation_prokka_rawproduct ? '--rawproduct' : '',
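The replaced `--centre` and `--locustag` arguments are now driven by the single `annotation_prokka_retaincontigheaders` switch. A minimal Groovy sketch of how that ternary resolves (illustrative only, not part of the committed config; the helper name `prokkaHeaderArgs` is made up for the example):

```groovy
// Sketch: how the new ternary in conf/modules.config maps the parameter to Prokka flags.
def prokkaHeaderArgs(Map params) {
    return params.annotation_prokka_retaincontigheaders
        ? "--force"                               // keep the original contig headers
        : "--locustag PROKKA --centre CENTER"     // rename contigs (PROKKA_# / CENTER)
}

assert prokkaHeaderArgs([annotation_prokka_retaincontigheaders: true])  == "--force"
assert prokkaHeaderArgs([annotation_prokka_retaincontigheaders: false]) == "--locustag PROKKA --centre CENTER"
```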
1 change: 1 addition & 0 deletions docs/output.md
@@ -575,6 +575,7 @@ Output Summaries:
- Reports generated by Nextflow: `execution_report.html`, `execution_timeline.html`, `execution_trace.txt` and `pipeline_dag.dot`/`pipeline_dag.svg`.
- Reports generated by the pipeline: `pipeline_report.html`, `pipeline_report.txt` and `software_versions.yml`. The `pipeline_report*` files will only be present if the `--email` / `--email_on_fail` parameter's are used when running the pipeline.
- Reformatted samplesheet files used as input to the pipeline: `samplesheet.valid.csv`.
- Parameters used by the pipeline run: `params.json`.

</details>

16 changes: 12 additions & 4 deletions docs/usage.md
@@ -249,7 +249,9 @@ If you wish to repeatedly use the same parameters for multiple runs, rather than

Pipeline settings can be provided in a `yaml` or `json` file via `-params-file <file>`.

> ⚠️ Do not use `-c <file>` to specify parameters as this will result in errors. Custom config files specified with `-c` must only be used for [tuning process resource specifications](https://nf-co.re/docs/usage/configuration#tuning-workflow-resources), other infrastructural tweaks (such as output directories), or module arguments (args).
:::warning
Do not use `-c <file>` to specify parameters as this will result in errors. Custom config files specified with `-c` must only be used for [tuning process resource specifications](https://nf-co.re/docs/usage/configuration#tuning-workflow-resources), other infrastructural tweaks (such as output directories), or module arguments (args).
:::

The above pipeline run specified with a params file in yaml format:

@@ -286,19 +288,25 @@ This version number will be logged in reports when you run the pipeline, so that

To further assist in reproducibility, you can use share and re-use [parameter files](#running-the-pipeline) to repeat pipeline runs with the same settings without having to write out a command with every single parameter.

> 💡 If you wish to share such profile (such as upload as supplementary material for academic publications), make sure to NOT include cluster specific paths to files, nor institutional specific profiles.
:::tip
If you wish to share such profile (such as upload as supplementary material for academic publications), make sure to NOT include cluster specific paths to files, nor institutional specific profiles.
:::

## Core Nextflow arguments

> **NB:** These options are part of Nextflow and use a _single_ hyphen (pipeline parameters use a double-hyphen).
:::note
These options are part of Nextflow and use a _single_ hyphen (pipeline parameters use a double-hyphen).
:::

### `-profile`

Use this parameter to choose a configuration profile. Profiles can give configuration presets for different compute environments.

Several generic profiles are bundled with the pipeline which instruct the pipeline to use software packaged using different methods (Docker, Singularity, Podman, Shifter, Charliecloud, Apptainer, Conda) - see below.

> We highly recommend the use of Docker or Singularity containers for full pipeline reproducibility, however when this is not possible, Conda is also supported.
:::info
We highly recommend the use of Docker or Singularity containers for full pipeline reproducibility, however when this is not possible, Conda is also supported.
:::

The pipeline also dynamically loads configurations from [https://github.com/nf-core/configs](https://github.com/nf-core/configs) when it runs, making multiple config profiles for various institutional clusters available at run time. For more information and to see if your system is available in these configs please see the [nf-core/configs documentation](https://github.com/nf-core/configs#documentation).

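As a sketch of what such a configuration profile can look like (hypothetical: the profile name `my_cluster`, the executor, and the queue value are assumptions; real institutional profiles live in nf-core/configs rather than in this repository):

```groovy
// Hypothetical custom profile, e.g. kept in a file loaded with `-c my_cluster.config`
// and selected at runtime with `-profile my_cluster`.
profiles {
    my_cluster {
        singularity.enabled    = true
        singularity.autoMounts = true
        process.executor       = 'slurm'
        process.queue          = 'short'
    }
}
```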
16 changes: 16 additions & 0 deletions lib/NfcoreTemplate.groovy
@@ -3,6 +3,7 @@
//

import org.yaml.snakeyaml.Yaml
import groovy.json.JsonOutput

class NfcoreTemplate {

@@ -222,6 +223,21 @@ class NfcoreTemplate {
}
}

//
// Dump pipeline parameters in a json file
//
public static void dump_parameters(workflow, params) {
def output_d = new File("${params.outdir}/pipeline_info/")
if (!output_d.exists()) {
output_d.mkdirs()
}

def timestamp = new java.util.Date().format( 'yyyy-MM-dd_HH-mm-ss')
def output_pf = new File(output_d, "params_${timestamp}.json")
def jsonStr = JsonOutput.toJson(params)
output_pf.text = JsonOutput.prettyPrint(jsonStr)
}

//
// Print pipeline summary on completion
//
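The new `dump_parameters()` helper writes the resolved run parameters to `pipeline_info/params_<timestamp>.json`. How it is invoked is not shown in this diff; a plausible call site (an assumption, mirroring the other `NfcoreTemplate` completion helpers) is the workflow completion handler:

```groovy
// Assumed call site (not part of this diff): persist the run's parameters on completion.
workflow.onComplete {
    NfcoreTemplate.dump_parameters(workflow, params)
}
```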
2 changes: 2 additions & 0 deletions lib/WorkflowFuncscan.groovy
@@ -52,6 +52,8 @@ class WorkflowFuncscan {

public static String toolCitationText(params) {

// Can use ternary operators to dynamically construct based conditions, e.g. params["run_xyz"] ? "Tool (Foo et al. 2023)" : "",
// Uncomment function in methodsDescriptionText to render in MultiQC report
def preprocessing_text = "The pipeline used the following tools: preprocessing included bioawk (Li 2023)."

def annotation_text = [
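The new comment describes building the citation text conditionally with ternary expressions. A small illustrative sketch of that pattern (the tool names, citations, and the `run_amp_screening` flag are used here only as examples, not taken from the method body):

```groovy
// Illustrative only: collect citation fragments for tools that were actually run.
def toolCitationFragments(Map params) {
    return [
        "Prokka (Seemann 2014)",
        params["run_amp_screening"] ? "AMPlify (Li et al. 2022)" : "",
    ].findAll { it }   // drop empty entries for tools that were skipped
}
```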
12 changes: 6 additions & 6 deletions nextflow.config
@@ -62,10 +62,9 @@ params {
annotation_prokka_mincontiglen = 1
annotation_prokka_evalue = 1E-06
annotation_prokka_coverage = 80
annotation_prokka_centre = null
annotation_prokka_compliant = false
annotation_prokka_locustag = 'PROKKA'
annotation_prokka_compliant = true
annotation_prokka_addgenes = false
annotation_prokka_retaincontigheaders = false

// Database downloading options
save_databases = false
@@ -211,7 +210,7 @@ params {
// Schema validation default options
validationFailUnrecognisedParams = false
validationLenientMode = false
validationSchemaIgnoreParams = 'genomes'
validationSchemaIgnoreParams = 'genomes,igenomes_base'
validationShowHiddenParams = false
validate_params = true

@@ -313,6 +312,7 @@ profiles {
}
apptainer {
apptainer.enabled = true
apptainer.autoMounts = true
conda.enabled = false
docker.enabled = false
singularity.enabled = false
@@ -322,8 +322,8 @@
}
gitpod {
executor.name = 'local'
executor.cpus = 16
executor.memory = 60.GB
executor.cpus = 4
executor.memory = 8.GB
}
test { includeConfig 'conf/test.config' }
test_bgc { includeConfig 'conf/test_bgc.config' }
26 changes: 5 additions & 21 deletions nextflow_schema.json
@@ -297,30 +297,22 @@
"help_text": "Activates [RNAmmer](https://services.healthtech.dtu.dk/service.php?RNAmmer-1.2) instead of the Prokka default [Barrnap](https://github.com/tseemann/barrnap) for rRNA prediction during the annotation process. RNAmmer classifies ribosomal RNA genes in genome sequences by using two levels of Hidden Markov Models. Barrnap uses the nhmmer tool that includes HMMER 3.1 for HMM searching in RNA:DNA style.\n\nFor more information please check Prokka [documentation](https://github.com/tseemann/prokka).\n\n> Modifies tool parameter(s):\n> - Prokka: `--rnammer`",
"fa_icon": "fas fa-adjust"
},
"annotation_prokka_centre": {
"type": "string",
"description": "Sequencing centre ID.",
"fa_icon": "fas fa-map-marker-alt",
"help_text": "Add the sequencing center ID used in generating the raw sequences. This flag is typically requested in combination with the `--compliant` flag when contigs need to be renamed due to non-conforming contig headers. For more information please check Prokka [documentation](https://github.com/tseemann/prokka). \n\n> Modifies tool parameter(s):\n> - Prokka: `--centre`"
},
"annotation_prokka_compliant": {
"type": "boolean",
"fa_icon": "far fa-check-circle",
"description": "Force contig name to Genbank/ENA/DDJB naming rules.",
"help_text": "Force the contig headers to conform to the Genbank/ENA/DDJB contig header standards. This is activated in combination with `--centre [X]` when contig headers supplied by the user are non-conforming and therefore need to be renamed before Prokka can start annotation. This flag activates `--genes --mincontiglen 200`. For more information please check Prokka [documentation](https://github.com/tseemann/prokka). \n\n> Modifies tool parameter(s):\n> - Prokka: `--compliant`"
},
"annotation_prokka_locustag": {
"type": "string",
"default": "Prokka",
"fa_icon": "fas fa-tags",
"description": "Assign the locus tag for the contig header.",
"help_text": "Assign a special name to the contig. This is used when a specific group of samples are run in a batch. For more information please check Prokka [documentation](https://github.com/tseemann/prokka). \n\n> Modifies tool parameter(s):\n> - Prokka: `--locustag`"
},
"annotation_prokka_addgenes": {
"type": "boolean",
"fa_icon": "fas fa-dna",
"description": "Add the gene features for each CDS hit.",
"help_text": "For every CDS annotated, this flag adds the gene that encodes for that CDS region. For more information please check Prokka [documentation](https://github.com/tseemann/prokka). \n\n> Modifies tool parameter(s):\n> - Prokka: `--addgenes`"
},
"annotation_prokka_retaincontigheaders": {
"type": "boolean",
"help_text": "This parameter allows prokka to retain the original contig names by activating `PROKKA`'s `--force` flag. If this parameter is set to `false` it activates `PROKKA`'s flags `--locus-tag PROKKA --centre CENTER` so the locus tags (contig names) will be PROKKA_# and the center tag will be CENTER. By default `PROKKA` changes contig headers to avoid errors that might rise due to long contig headers, so this must be turned on if the user has short contig names that should be retained by `PROKKA`. \n\n> Modifies tool parameter(s):\n> - Prokka: `--locus-tag PROKKA --centre CENTER`\n> - Prokka: `--force`",
"description": "Retains contig names."
}
},
"fa_icon": "fas fa-tools",
@@ -1129,14 +1121,6 @@
"fa_icon": "far fa-file-code",
"hidden": true
},
"igenomes_base": {
"type": "string",
"format": "directory-path",
"description": "Directory / URL base for iGenomes references.",
"default": "s3://ngi-igenomes/igenomes",
"fa_icon": "fas fa-cloud-download-alt",
"hidden": true
},
"igenomes_ignore": {
"type": "boolean",
"description": "Do not load the iGenomes reference config.",