Snake8 #723

Open · wants to merge 24 commits into base: main

Changes from all commits
2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -34,7 +34,7 @@ jobs:
source activate ./atlasenv
conda list
else
mamba env create -p ./atlasenv --file atlasenv.yml
conda env create -p ./atlasenv --file atlasenv.yml
fi
- save_cache:
key: atlasenv-d-{{ checksum "atlasenv.yml" }}
33 changes: 33 additions & 0 deletions Dockerfile
@@ -0,0 +1,33 @@
# Start with the Miniconda base image
FROM continuumio/miniconda3:24.9.2-0

# Set the working directory in the container
WORKDIR /main

# Copy the environment file and project code
COPY atlasenv.yml .

# Create a user with a specific UID and GID
RUN groupadd -g 1000 atlasgroup && \
useradd -m -u 1000 -g atlasgroup -s /bin/bash atlasuser

# Set the HOME environment variable
ENV HOME=/home/atlasuser

# Change ownership of the home directory
RUN chown -R atlasuser:atlasgroup $HOME

# Switch to the new user
USER atlasuser

# Create and activate the environment
RUN conda env create -n atlas -f atlasenv.yml && \
conda clean -afy && \
echo "source activate atlas" > ~/.bashrc

# Set the working directory
WORKDIR /main


# Set the default command
CMD ["bash"]
2 changes: 1 addition & 1 deletion README.md
@@ -13,7 +13,7 @@

You can start using atlas with three commands:
```
mamba install -y -c bioconda -c conda-forge metagenome-atlas={latest_version}
conda install -y -c bioconda -c conda-forge metagenome-atlas={latest_version}
atlas init --db-dir databases path/to/fastq/files
atlas run all
```
@@ -38,7 +38,7 @@
> doi: [10.1186/s12859-020-03585-4](https://doi.org/10.1186/s12859-020-03585-4)


# Development/Extensions

Here are some ideas I am working on, or would like to work on when I have time. If you want to contribute or have ideas of your own, let me know via a feature request issue.

4 changes: 2 additions & 2 deletions atlas/atlas.py
@@ -7,7 +7,7 @@
import click


from snakemake.io import load_configfile
from snakemake.common.configfile import load_configfile
from .make_config import validate_config
from .init.atlas_init import run_init # , run_init_sra

@@ -247,7 +247,7 @@ def run_download(db_dir, jobs, snakemake_args):
cmd = (
"snakemake --snakefile {snakefile} "
"--jobs {jobs} --rerun-incomplete "
"--conda-frontend mamba --scheduler greedy "
"--scheduler greedy "
"--nolock --use-conda --conda-prefix {conda_prefix} "
" --show-failed-logs "
"--config database_dir='{db_dir}' {add_args} "
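Both locations of `load_configfile` appear in this diff: the old `snakemake.io` path (Snakemake 7) and the new `snakemake.common.configfile` path (Snakemake 8). The PR pins Snakemake >= 8 and imports from the new path directly; the snippet below is only a hedged sketch of a shim that tolerates both layouts, in case mixed environments ever need to be supported.

```
# Compatibility import for load_configfile across Snakemake major versions
try:
    from snakemake.common.configfile import load_configfile  # Snakemake >= 8
except ImportError:
    from snakemake.io import load_configfile  # Snakemake 7.x

config = load_configfile("config.yaml")  # illustrative path; returns a dict
```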
2 changes: 1 addition & 1 deletion atlas/make_config.py
@@ -1,6 +1,6 @@
from .default_values import *
from snakemake.utils import update_config as snakemake_update_config
from snakemake.io import load_configfile
from snakemake.common.configfile import load_configfile
import tempfile
import sys
import os
8 changes: 4 additions & 4 deletions atlasenv.yml
@@ -3,16 +3,16 @@ channels:
- bioconda
- defaults
dependencies:
- python >=3.8, < 3.12
- mamba
- python >=3.10, < 3.12
- libmamba= 2
- bbmap >= 39.01, <40
- snakemake-minimal >= 7.18.1, <7.26
- snakemake-minimal >= 8.12, <8.26
- pygments
- networkx
- graphviz
- pandas >=1.2, <1.6
- pyarrow # for parquet reading
- click >=7
- ruamel.yaml >=0.17
- cookiecutter
- wget
- snakemake-executor-plugin-slurm
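The updated environment also adds `snakemake-executor-plugin-slurm`, reflecting that Snakemake 8 moved cluster submission out of the core and into executor plugins. A hedged sketch of how the plugin is typically invoked (standard Snakemake 8 CLI flags with illustrative values, not commands taken from this PR):

```
# Submit rules as SLURM jobs through the executor plugin
snakemake --executor slurm --jobs 50 --use-conda
```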
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -26,7 +26,7 @@ It handles all steps from QC, Assembly, Binning, to Annotation.

You can start using atlas with three commands::

mamba install -c bioconda -c conda-forge metagenome-atlas={latest_version}
conda install -c bioconda -c conda-forge metagenome-atlas={latest_version}
atlas init --db-dir databases path/to/fastq/files
atlas run

17 changes: 5 additions & 12 deletions docs/usage/getting_started.rst
@@ -1,5 +1,5 @@
.. _conda: http://anaconda.org/
.. _mamba: https://github.com/TheSnakePit/mamba


Getting Started
***************
@@ -30,15 +30,8 @@ Setting strict channel priority can prevent quite some annoyances.

The order is important by the way.

Install mamba
-------------

Conda can be a bit slow because there are so many packages. A good way around this is to use mamba_ (another snake).::

conda install mamba


From now on, you can replace ``conda install`` with ``mamba install`` and see how much faster this snake is.
*Previously, Atlas recommended using mamba, which was faster. Since version 24.9, conda uses the same library (libmamba) as its backend, so we suggest updating conda (``conda update -n base conda``) and using it directly.*
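A hedged sketch of the corresponding setup, combining the update suggested in the note above with the strict channel priority recommended earlier on this page (exact steps may vary with your conda installation):

```
conda update -n base conda                  # matches the conda >= 24.9 note above
conda config --set channel_priority strict  # as recommended earlier on this page
```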

Install metagenome-atlas
------------------------
@@ -48,7 +41,7 @@ We also recommend to specify the latest version of metagenome-atlas.

.. code-block:: bash

mamba create -y -n atlasenv metagenome-atlas={latest_version}
conda create -y -n atlasenv metagenome-atlas={latest_version}
source activate atlasenv

where `{latest_version}` should be replaced by
@@ -73,7 +66,7 @@ Alternatively, you can install metagenome Atlas directly from GitHub. This allow
# git checkout branchname

# create dependencies for atlas
mamba env create -n atlas-dev --file atlasenv.yml
conda env create -n atlas-dev --file atlasenv.yml
conda activate atlas-dev

# install atlas version. Changes in the files are directly available in the atlas dev version
1 change: 0 additions & 1 deletion docs/usage/output.rst
@@ -1,7 +1,6 @@
.. |scheme| image:: ../../resources/images/atlas_list.png
:alt: Atlas is a workflow for assembly and binning of metagenomic reads

.. _thesis: https://github.com/TheSnakePit/mamba

Expected output
***************
2 changes: 1 addition & 1 deletion workflow/envs/fasta.yaml
@@ -3,7 +3,7 @@ channels:
- bioconda
- defaults
dependencies:
- pyfastx=0.9
- pyfastx=2.1
- pandas=1.2
- pyarrow
- biopython
2 changes: 1 addition & 1 deletion workflow/envs/required_packages.yaml
@@ -9,4 +9,4 @@ dependencies:
- bzip2 >=1.0
- pandas >=1.2, <2
- samtools >=1.13, <2
- sambamba <1
- sambamba
4 changes: 2 additions & 2 deletions workflow/rules/assemble.smk
@@ -505,7 +505,7 @@ if config["filter_contigs"]:
resources:
mem_mb=config["mem"] * 1000,
wrapper:
"v1.19.0/bio/minimap2/aligner"
"v5.5.0/bio/minimap2/aligner"

rule pileup_prefilter:
input:
@@ -635,7 +635,7 @@ rule align_reads_to_final_contigs:
resources:
mem_mb=config["mem"] * 1000,
wrapper:
"v1.19.0/bio/minimap2/aligner"
"v5.5.0/bio/minimap2/aligner"


rule pileup_contigs_sample:
2 changes: 1 addition & 1 deletion workflow/rules/derep.smk
@@ -110,4 +110,4 @@ rule build_bin_report:
log:
"logs/binning/report_{binner}.log",
script:
"../report/bin_report.py"
"../../report/bin_report.py"
4 changes: 2 additions & 2 deletions workflow/rules/genecatalog.smk
@@ -234,7 +234,7 @@ rule index_genecatalog:
params:
index_size="12G",
wrapper:
"v1.19.0/bio/minimap2/index"
"v5.5.0/bio/minimap2/index"


rule concat_all_reads:
@@ -266,7 +266,7 @@ rule align_reads_to_Genecatalog:
extra="-x sr --split-prefix {sample}_split_ ",
sort="coordinate",
wrapper:
"v1.19.0/bio/minimap2/aligner"
"v5.5.0/bio/minimap2/aligner"


rule pileup_Genecatalog:
4 changes: 2 additions & 2 deletions workflow/rules/genomes.smk
@@ -206,7 +206,7 @@ if config["genome_aligner"] == "minimap":
resources:
mem_mb=config["mem"] * 1000,
wrapper:
"v1.19.0/bio/minimap2/index"
"v5.5.0/bio/minimap2/index"

rule align_reads_to_genomes:
input:
Expand All @@ -223,7 +223,7 @@ if config["genome_aligner"] == "minimap":
resources:
mem_mb=config["mem"] * 1000,
wrapper:
"v1.19.0/bio/minimap2/aligner"
"v5.5.0/bio/minimap2/aligner"

elif config["genome_aligner"] == "bwa":

12 changes: 4 additions & 8 deletions workflow/rules/qc.smk
@@ -158,8 +158,7 @@ if not SKIP_QC:
dupesubs=config["duplicates_allow_substitutions"],
only_optical=("t" if config.get("duplicates_only_optical") else "f"),
log:
sterr="{sample}/logs/QC/deduplicate.err",
stout="{sample}/logs/QC/deduplicate.log",
"{sample}/logs/QC/deduplicate.log",
conda:
"%s/required_packages.yaml" % CONDAENV
threads: config.get("threads", 1)
@@ -177,8 +176,7 @@ if not SKIP_QC:
" threads={threads} "
" pigz=t unpigz=t "
" -Xmx{resources.java_mem}G "
" 2> {log.sterr} "
" 1> {log.stout} "
" &> {log} "

PROCESSED_STEPS.append("filtered")

@@ -229,8 +227,7 @@ if not SKIP_QC:
output.reads, key="out", allow_singletons=False
),
log:
sterr="{sample}/logs/QC/quality_filter.err",
stout="{sample}/logs/QC/quality_filter.log",
"{sample}/logs/QC/quality_filter.log",
conda:
"%s/required_packages.yaml" % CONDAENV
threads: config.get("threads", 1)
@@ -260,8 +257,7 @@ if not SKIP_QC:
" prealloc={params.prealloc} "
" pigz=t unpigz=t "
" -Xmx{resources.java_mem}G "
" 2> {log.sterr} "
" 1> {log.stout} "
" &> {log} "

# if there are no references, decontamination will be skipped
if len(config.get("contaminant_references", {}).keys()) > 0:
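The QC rules above now send both output streams to a single log file instead of separate `stout`/`sterr` entries. A minimal Snakemake sketch of that pattern, with purely illustrative rule, file, and tool names:

```
rule example_qc_step:
    input:
        "{sample}/reads.fastq.gz",
    output:
        "{sample}/reads.cleaned.fastq.gz",
    log:
        "{sample}/logs/QC/example.log",  # one unnamed log instead of stout/sterr
    shell:
        "reformat.sh in={input} out={output} &> {log}"  # both streams go to {log}
```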