Merge branch 'dev' into test-yml-create-cleanup
KevinMenden authored Jun 23, 2021
2 parents 8c4de16 + 9ccd25c commit fed86ce
Showing 86 changed files with 3,163 additions and 2,256 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/create-lint-wf.yml
@@ -37,8 +37,8 @@ jobs:
       - name: nf-core list
         run: nf-core --log-file log.txt list
 
-      - name: nf-core licences
-        run: nf-core --log-file log.txt licences nf-core-testpipeline
+      # - name: nf-core licences
+      #   run: nf-core --log-file log.txt licences nf-core-testpipeline
 
       - name: nf-core sync
         run: nf-core --log-file log.txt sync nf-core-testpipeline/
41 changes: 41 additions & 0 deletions .github/workflows/create-test-wf.yml
@@ -0,0 +1,41 @@
+name: Create a pipeline and test it
+on: [push, pull_request]
+
+# Uncomment if we need an edge release of Nextflow again
+# env: NXF_EDGE: 1
+
+jobs:
+  RunTestWorkflow:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v2
+        name: Check out source-code repository
+
+      - name: Set up Python 3.7
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+
+      - name: Install python dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install .
+      - name: Install Nextflow
+        env:
+          CAPSULE_LOG: none
+        run: |
+          wget -qO- get.nextflow.io | bash
+          sudo mv nextflow /usr/local/bin/
+      - name: Run nf-core/tools
+        run: |
+          nf-core --log-file log.txt create -n testpipeline -d "This pipeline is for testing" -a "Testing McTestface"
+          nextflow run nf-core-testpipeline -profile test,docker
+      - name: Upload log file artifact
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: nf-core-log-file
+          path: log.txt
8 changes: 4 additions & 4 deletions .github/workflows/markdown-lint.yml
@@ -9,13 +9,13 @@ jobs:
 
       - uses: actions/setup-node@v1
         with:
-          node-version: '10'
+          node-version: "10"
 
       - name: Install markdownlint
         run: npm install -g markdownlint-cli
 
      - name: Run Markdownlint
-        run: markdownlint ${GITHUB_WORKSPACE} -c ${GITHUB_WORKSPACE}/.github/markdownlint.yml
+        run: markdownlint .
 
       # If the above check failed, post a comment on the PR explaining the failure
       - name: Post PR comment
@@ -32,8 +32,8 @@ jobs:
             * On Mac: `brew install markdownlint-cli`
             * Everything else: [Install `npm`](https://www.npmjs.com/get-npm) then [install `markdownlint-cli`](https://www.npmjs.com/package/markdownlint-cli) (`npm install -g markdownlint-cli`)
             * Fix the markdown errors
-              * Automatically: `markdownlint . --config .github/markdownlint.yml --fix`
-              * Manually resolve anything left from `markdownlint . --config .github/markdownlint.yml`
+              * Automatically: `markdownlint . --fix`
+              * Manually resolve anything left from `markdownlint .`
 
             Once you push these changes the test should pass, and you can hide this comment :+1:
3 changes: 2 additions & 1 deletion .github/markdownlint.yml → .markdownlint.yml
@@ -1,6 +1,8 @@
 # Markdownlint configuration file
 default: true
 line-length: false
+ul-indent:
+  indent: 4
 no-duplicate-header:
   siblings_only: true
 no-inline-html:
@@ -10,7 +12,6 @@ no-inline-html:
     - kbd
     - details
     - summary
-    - kbd
 # tools only - the {{ jinja variables }} break URLs and cause this to error
 no-bare-urls: false
 # tools only - suppresses error messages for usage of $ in main README
185 changes: 94 additions & 91 deletions CHANGELOG.md

Large diffs are not rendered by default.

34 changes: 17 additions & 17 deletions README.md
@@ -25,12 +25,12 @@ A python package with helper tools for the nf-core community.
 * [`nf-core bump-version` - Update nf-core pipeline version number](#bumping-a-pipeline-version-number)
 * [`nf-core sync` - Synchronise pipeline TEMPLATE branches](#sync-a-pipeline-with-the-template)
 * [`nf-core modules` - commands for dealing with DSL2 modules](#modules)
-  * [`modules list` - List available modules](#list-modules)
-  * [`modules install` - Install a module from nf-core/modules](#install-a-module-into-a-pipeline)
-  * [`modules remove` - Remove a module from a pipeline](#remove-a-module-from-a-pipeline)
-  * [`modules create` - Create a module from the template](#create-a-new-module)
-  * [`modules create-test-yml` - Create the `test.yml` file for a module](#create-a-module-test-config-file)
-  * [`modules lint` - Check a module against nf-core guidelines](#check-a-module-against-nf-core-guidelines)
+    * [`modules list` - List available modules](#list-modules)
+    * [`modules install` - Install a module from nf-core/modules](#install-a-module-into-a-pipeline)
+    * [`modules remove` - Remove a module from a pipeline](#remove-a-module-from-a-pipeline)
+    * [`modules create` - Create a module from the template](#create-a-new-module)
+    * [`modules create-test-yml` - Create the `test.yml` file for a module](#create-a-module-test-config-file)
+    * [`modules lint` - Check a module against nf-core guidelines](#check-a-module-against-nf-core-guidelines)
 * [Citation](#citation)
 
 The nf-core tools package is written in Python and can be imported and used within other packages.
@@ -327,24 +327,24 @@ Do you want to run this command now? [y/n]:
 ### Launch tool options
 
 * `-r`, `--revision`
-  * Specify a pipeline release (or branch / git commit sha) of the project to run
+    * Specify a pipeline release (or branch / git commit sha) of the project to run
 * `-i`, `--id`
-  * You can use the web GUI for nf-core pipelines by clicking _"Launch"_ on the website. Once filled in you will be given an ID to use with this command which is used to retrieve your inputs.
+    * You can use the web GUI for nf-core pipelines by clicking _"Launch"_ on the website. Once filled in you will be given an ID to use with this command which is used to retrieve your inputs.
 * `-c`, `--command-only`
-  * If you prefer not to save your inputs in a JSON file and use `-params-file`, this option will specify all entered params directly in the nextflow command.
+    * If you prefer not to save your inputs in a JSON file and use `-params-file`, this option will specify all entered params directly in the nextflow command.
 * `-p`, `--params-in PATH`
-  * To use values entered in a previous pipeline run, you can supply the `nf-params.json` file previously generated.
-  * This will overwrite the pipeline schema defaults before the wizard is launched.
+    * To use values entered in a previous pipeline run, you can supply the `nf-params.json` file previously generated.
+    * This will overwrite the pipeline schema defaults before the wizard is launched.
 * `-o`, `--params-out PATH`
-  * Path to save parameters JSON file to. (Default: `nf-params.json`)
+    * Path to save parameters JSON file to. (Default: `nf-params.json`)
 * `-a`, `--save-all`
-  * Without this option the pipeline will ignore any values that match the pipeline schema defaults.
-  * This option saves _all_ parameters found to the JSON file.
+    * Without this option the pipeline will ignore any values that match the pipeline schema defaults.
+    * This option saves _all_ parameters found to the JSON file.
 * `-h`, `--show-hidden`
-  * A pipeline JSON schema can define some parameters as 'hidden' if they are rarely used or for internal pipeline use only.
-  * This option forces the wizard to show all parameters, including those labelled as 'hidden'.
+    * A pipeline JSON schema can define some parameters as 'hidden' if they are rarely used or for internal pipeline use only.
+    * This option forces the wizard to show all parameters, including those labelled as 'hidden'.
 * `--url`
-  * Change the URL used for the graphical interface, useful for development work on the website.
+    * Change the URL used for the graphical interface, useful for development work on the website.
 
 ## Downloading pipelines for offline use
 
78 changes: 9 additions & 69 deletions nf_core/bump_version.py
@@ -34,75 +34,13 @@ def bump_pipeline_version(pipeline_obj, new_version):
     log.info("Changing version number from '{}' to '{}'".format(current_version, new_version))
 
     # nextflow.config - workflow manifest version
-    # nextflow.config - process container manifest version
-    docker_tag = "dev"
-    if new_version.replace(".", "").isdigit():
-        docker_tag = new_version
-    else:
-        log.info("New version contains letters. Setting docker tag to 'dev'")
-
     update_file_version(
         "nextflow.config",
         pipeline_obj,
         [
             (
                 r"version\s*=\s*[\'\"]?{}[\'\"]?".format(current_version.replace(".", r"\.")),
                 "version = '{}'".format(new_version),
             ),
-            (
-                r"container\s*=\s*[\'\"]nfcore/{}:(?:{}|dev)[\'\"]".format(
-                    pipeline_obj.pipeline_name.lower(), current_version.replace(".", r"\.")
-                ),
-                "container = 'nfcore/{}:{}'".format(pipeline_obj.pipeline_name.lower(), docker_tag),
-            ),
         ],
     )
-
-    # .github/workflows/ci.yml - docker build image tag
-    # .github/workflows/ci.yml - docker tag image
-    update_file_version(
-        os.path.join(".github", "workflows", "ci.yml"),
-        pipeline_obj,
-        [
-            (
-                r"docker build --no-cache . -t nfcore/{name}:(?:{tag}|dev)".format(
-                    name=pipeline_obj.pipeline_name.lower(), tag=current_version.replace(".", r"\.")
-                ),
-                "docker build --no-cache . -t nfcore/{name}:{tag}".format(
-                    name=pipeline_obj.pipeline_name.lower(), tag=docker_tag
-                ),
-            ),
-            (
-                r"docker tag nfcore/{name}:dev nfcore/{name}:(?:{tag}|dev)".format(
-                    name=pipeline_obj.pipeline_name.lower(), tag=current_version.replace(".", r"\.")
-                ),
-                "docker tag nfcore/{name}:dev nfcore/{name}:{tag}".format(
-                    name=pipeline_obj.pipeline_name.lower(), tag=docker_tag
-                ),
-            ),
-        ],
-    )
-
-    # environment.yml - environment name
-    update_file_version(
-        "environment.yml",
-        pipeline_obj,
-        [
-            (
-                r"name: nf-core-{}-{}".format(pipeline_obj.pipeline_name.lower(), current_version.replace(".", r"\.")),
-                "name: nf-core-{}-{}".format(pipeline_obj.pipeline_name.lower(), new_version),
-            )
-        ],
-    )
-
-    # Dockerfile - ENV PATH and RUN conda env create
-    update_file_version(
-        "Dockerfile",
-        pipeline_obj,
-        [
-            (
-                r"nf-core-{}-{}".format(pipeline_obj.pipeline_name.lower(), current_version.replace(".", r"\.")),
-                "nf-core-{}-{}".format(pipeline_obj.pipeline_name.lower(), new_version),
-            )
-        ],
-    )
@@ -132,8 +70,8 @@ def bump_nextflow_version(pipeline_obj, new_version):
         pipeline_obj,
         [
             (
-                r"nextflowVersion\s*=\s*[\'\"]?>={}[\'\"]?".format(current_version.replace(".", r"\.")),
-                "nextflowVersion = '>={}'".format(new_version),
+                r"nextflowVersion\s*=\s*[\'\"]?!>={}[\'\"]?".format(current_version.replace(".", r"\.")),
+                "nextflowVersion = '!>={}'".format(new_version),
            )
         ],
     )
@@ -157,15 +95,17 @@ def bump_nextflow_version(pipeline_obj, new_version):
         pipeline_obj,
         [
             (
-                r"nextflow-%E2%89%A5{}-brightgreen.svg".format(current_version.replace(".", r"\.")),
-                "nextflow-%E2%89%A5{}-brightgreen.svg".format(new_version),
+                r"nextflow%20DSL2-%E2%89%A5{}-23aa62.svg".format(current_version.replace(".", r"\.")),
+                "nextflow%20DSL2-%E2%89%A5{}-23aa62.svg".format(new_version),
             ),
             (
-                # example: 1. Install [`nextflow`](https://nf-co.re/usage/installation) (`>=20.04.0`)
-                r"1\.\s*Install\s*\[`nextflow`\]\(https://nf-co\.re/usage/installation\)\s*\(`>={}`\)".format(
+                # example: 1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=20.04.0`)
+                r"1\.\s*Install\s*\[`Nextflow`\]\(https://www.nextflow.io/docs/latest/getstarted.html#installation\)\s*\(`>={}`\)".format(
                     current_version.replace(".", r"\.")
                 ),
-                "1. Install [`nextflow`](https://nf-co.re/usage/installation) (`>={}`)".format(new_version),
+                "1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>={}`)".format(
+                    new_version
+                ),
             ),
         ],
     )
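A quick illustration of the pattern/replacement mechanism used above: `bump_pipeline_version` builds regex pairs and hands them to `update_file_version`, which rewrites the matching lines in the target file. The snippet below is only a sketch using `re.sub` on a single line with made-up version strings; the file name, line content and versions are illustrative, not taken from a real pipeline.

import re

# Hypothetical versions, for illustration only
current_version = "1.0dev"
new_version = "1.1"

# Same style of (pattern, replacement) pair as built in bump_pipeline_version() above
pattern = r"version\s*=\s*[\'\"]?{}[\'\"]?".format(current_version.replace(".", r"\."))
replacement = "version = '{}'".format(new_version)

line = "    version = '1.0dev'"  # e.g. the manifest version line in nextflow.config
print(re.sub(pattern, replacement, line))
# prints:     version = '1.1'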
8 changes: 7 additions & 1 deletion nf_core/create.py
@@ -18,7 +18,7 @@
 
 
 class PipelineCreate(object):
-    """Creates a nf-core pipeline a la carte from the nf-core best-practise template.
+    """Creates a nf-core pipeline a la carte from the nf-core best-practice template.
 
     Args:
         name (str): Name for the pipeline.
@@ -89,6 +89,10 @@ def render_template(self):
         template_files = list(pathlib.Path(template_dir).glob("**/*"))
         template_files += list(pathlib.Path(template_dir).glob("*"))
         ignore_strs = [".pyc", "__pycache__", ".pyo", ".pyd", ".DS_Store", ".egg"]
+        rename_files = {
+            "workflows/pipeline.nf": f"workflows/{self.short_name}.nf",
+            "lib/WorkflowPipeline.groovy": f"lib/Workflow{self.short_name[0].upper()}{self.short_name[1:]}.groovy",
+        }
 
         for template_fn_path_obj in template_files:
 
@@ -102,6 +106,8 @@ def render_template(self):
             # Set up vars and directories
             template_fn = os.path.relpath(template_fn_path, template_dir)
             output_path = os.path.join(self.outdir, template_fn)
+            if template_fn in rename_files:
+                output_path = os.path.join(self.outdir, rename_files[template_fn])
             os.makedirs(os.path.dirname(output_path), exist_ok=True)
 
             try:
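For clarity, the new `rename_files` map simply redirects two template paths to names derived from the pipeline's short name; every other template file keeps its relative path. A standalone sketch with a hypothetical `short_name` and `outdir` (in `PipelineCreate` these come from the object itself):

import os

short_name = "testpipeline"      # hypothetical, normally self.short_name
outdir = "nf-core-testpipeline"  # hypothetical, normally self.outdir

rename_files = {
    "workflows/pipeline.nf": f"workflows/{short_name}.nf",
    "lib/WorkflowPipeline.groovy": f"lib/Workflow{short_name[0].upper()}{short_name[1:]}.groovy",
}

for template_fn in ["workflows/pipeline.nf", "lib/WorkflowPipeline.groovy", "main.nf"]:
    # Files in the map are renamed; everything else keeps its template path
    output_path = os.path.join(outdir, rename_files.get(template_fn, template_fn))
    print(f"{template_fn} -> {output_path}")

# workflows/pipeline.nf -> nf-core-testpipeline/workflows/testpipeline.nf
# lib/WorkflowPipeline.groovy -> nf-core-testpipeline/lib/WorkflowTestpipeline.groovy
# main.nf -> nf-core-testpipeline/main.nf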
1 change: 1 addition & 0 deletions nf_core/launch.py
@@ -301,6 +301,7 @@ def launch_web_gui(self):
         try:
             assert "api_url" in web_response
             assert "web_url" in web_response
+            # DO NOT FIX THIS TYPO. Needs to stay in sync with the website. Maintaining for backwards compatability.
             assert web_response["status"] == "recieved"
         except AssertionError:
             log.debug("Response content:\n{}".format(json.dumps(web_response, indent=4)))
8 changes: 3 additions & 5 deletions nf_core/lint/actions_awsfulltest.py
@@ -13,7 +13,7 @@ def actions_awsfulltest(self):
     The GitHub Actions workflow is called ``awsfulltest.yml``, and it can be found in the ``.github/workflows/`` directory.
 
     .. warning:: This workflow incurs AWS costs, therefore it should only be triggered for pipeline releases:
-        ``workflow_run`` (after the docker hub release workflow) and ``workflow_dispatch``.
+        ``release`` (after the pipeline release) and ``workflow_dispatch``.
 
     .. note:: You can manually trigger the AWS tests by going to the `Actions` tab on the pipeline GitHub repository and selecting the
         `nf-core AWS full size tests` workflow on the left.
@@ -23,7 +23,7 @@ def actions_awsfulltest(self):
     The ``.github/workflows/awsfulltest.yml`` file is tested for the following:
 
     * Must be turned on ``workflow_dispatch``.
-    * Must be turned on for ``workflow_run`` with ``workflows: ["nf-core Docker push (release)"]`` and ``types: [completed]``
+    * Must be turned on for ``release`` with ``types: [published]``
     * Should run the profile ``test_full`` that should be edited to provide the links to full-size datasets. If it runs the profile ``test``, a warning is given.
     """
     passed = []
@@ -42,9 +42,7 @@ def actions_awsfulltest(self):
 
     # Check that the action is only turned on for published releases
     try:
-        assert "workflow_run" in wf[True]
-        assert wf[True]["workflow_run"]["workflows"] == ["nf-core Docker push (release)"]
-        assert wf[True]["workflow_run"]["types"] == ["completed"]
+        assert wf[True]["release"]["types"] == ["published"]
         assert "workflow_dispatch" in wf[True]
     except (AssertionError, KeyError, TypeError):
         failed.append("`.github/workflows/awsfulltest.yml` is not triggered correctly")
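A note on the `wf[True]` lookups in this lint test: PyYAML follows YAML 1.1 and resolves a bare `on` key to the boolean `True`, so the parsed workflow's trigger block ends up under that key. A minimal sketch of what the updated assertions expect, assuming PyYAML is available and using an illustrative trigger block rather than the full awsfulltest.yml:

import yaml

# Minimal awsfulltest-style trigger block (illustrative only)
workflow_yaml = """
name: nf-core AWS full size tests
on:
  release:
    types: [published]
  workflow_dispatch:
"""

wf = yaml.safe_load(workflow_yaml)

# PyYAML reads the bare `on` key as boolean True, hence wf[True] in the lint test
assert wf[True]["release"]["types"] == ["published"]
assert "workflow_dispatch" in wf[True]
print(wf[True])
# {'release': {'types': ['published']}, 'workflow_dispatch': None}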
2 changes: 1 addition & 1 deletion nf_core/lint/conda_dockerfile.py
@@ -40,7 +40,7 @@ def conda_dockerfile(self):
 
     # Check if we have both a conda and dockerfile
     if self._fp("environment.yml") not in self.files or self._fp("Dockerfile") not in self.files:
-        return {"ignored": ["No `environment.yml` / `Dockerfile` file found - skipping conda_dockerfile test"]}
+        return {"warned": ["No `environment.yml` / `Dockerfile` file found - skipping conda_dockerfile test"]}
 
     expected_strings = [
         "COPY environment.yml /",
4 changes: 2 additions & 2 deletions nf_core/lint/conda_env_yaml.py
@@ -21,7 +21,7 @@
 def conda_env_yaml(self):
     """Checks that the conda environment file is valid.
 
-    .. note:: This test is ignored if there is not an ``environment.yml``
+    .. note:: This test warns if there is not an ``environment.yml``
              file present in the pipeline root directory.
 
     DSL1 nf-core pipelines use a single Conda environment to manage all software
@@ -61,7 +61,7 @@ def conda_env_yaml(self):
 
     env_path = os.path.join(self.wf_path, "environment.yml")
     if env_path not in self.files:
-        return {"ignored": ["No `environment.yml` file found - skipping conda_env_yaml test"]}
+        return {"warned": ["No `environment.yml` file found - skipping conda_env_yaml test"]}
 
     with open(env_path, "r") as fh:
         raw_environment_yml = fh.read()