This issue was moved to a discussion.

You can continue the conversation there.


Poetry and CI builds -- best practices, knowledge share (travisci), and discussion #366

Closed
2 tasks done
nackjicholson opened this issue Aug 2, 2018 · 28 comments
Labels
area/docs Documentation issues/improvements

Comments

@nackjicholson

nackjicholson commented Aug 2, 2018

  • I have searched the issues of this repo and believe that this is not a duplicate.
  • I have searched the documentation and believe that my question is not covered.

Question

Hi! I've been working with Poetry recently and am happy that I can publish packages to PyPI or my own hosted package repository with it. I would like to share my current setup with TravisCI in hopes that it could help others. I would also like to learn from those who have more experience with Poetry: tips on how I might make this setup better, how it works on other continuous integration platforms, or more in-depth CI/CD requirements.

My requirements for Travis CI are fairly common.

  1. Use poetry to install my project dependencies and dev dependencies.
  2. Run lint and unit tests for my project.
  3. When the project is git-tagged, build and publish it to my company's internal PyPI registry.

.travis.yml

language: python
python:
- 3.6
env:
  global:
  - secure: "<encrypted MYPYPI_USER=username>"
  - secure: "<encrypted MYPYPI_PASS=p@ssword>"
before_install:
- pip install poetry
install:
- poetry install
script:
- poetry run flake8 my_package test
- poetry run coverage run --source=my_package -m unittest discover -b
before_deploy:
- poetry config repositories.mypypi http://mypypi.example.com/simple
- poetry config http-basic.mypypi $MYPYPI_USER $MYPYPI_PASS
- poetry build -f sdist
deploy:
  provider: script
  script: poetry publish -r mypypi
  skip_cleanup: true
  on:
    tags: true

I have in the past used the built-in Travis PyPI deployment, but it requires a setup.py (which I don't have anymore! 🙌). So instead I'm running poetry publish as a script deployment when I tag my repo.

So when master is at a spot where I want to deploy a new version of the package, I do something like:

poetry version minor
git commit -am 'bumped the version'
git tag <version>
# SIDE NOTE: it would be nice to be able to do `git tag $(poetry version --show)`
# or possibly have the bump command output the NEW_VERSION=poetry version minor --output
git push --tags
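On the side note above: a small sketch of how the version could be recovered from the `poetry version` output, which prints "<package-name> <version>" (the helper name here is made up; newer Poetry releases also offer `poetry version -s`, which prints only the version):

```python
# Hypothetical helper: pull the version out of `poetry version` output
# ("<package-name> <version>") so it can be fed to `git tag`.
def version_from_output(output: str) -> str:
    return output.strip().split()[-1]

print(version_from_output("my-package 1.2.3"))  # 1.2.3
# In a shell, the equivalent one-liner would be something like:
#   git tag "$(poetry version | awk '{print $2}')"
```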

In order to configure poetry with the credentials to push to our repository I have set $MYPYPI_USER and $MYPYPI_PASS encrypted environment variables in travis.

That's what I have. Cheers 🍺

@epage
Contributor

epage commented Oct 8, 2018

If someone is checking the lock file in, it'd also be nice to include best practices for updating it. Ideally we'd get pyup support for it.

@madig

madig commented Feb 27, 2019

Your .travis.yml inspired my own:

language: python
python: "3.6"
dist: xenial

stages:
  - lint
  - test
  - name: deploy
    if: tag IS present

cache:
  pip: true
  directories:
    - "$HOME/.cache/pre-commit"

jobs:
  include:
    - stage: lint
      install:
        - pip install pre-commit
        - pre-commit install-hooks
      script:
        - pre-commit run --all-files
    - stage: test
      install:
        - pip install --upgrade pip
        - pip install poetry
        - poetry install -v
      script:
        - pytest --cov
    - stage: deploy
      script:
        - echo Deploying to PyPI...

before_deploy:
  # User and password environment variables are set as hidden variables through
  # the web interface in the project settings.
  - pip install --upgrade pip
  - pip install poetry
  - poetry config http-basic.pypi $PYPI_USER $PYPI_PASS
  - poetry build

deploy:
  provider: script
  script: poetry publish
  skip_cleanup: true
  on:
    all_branches: true  # Travis recognizes tag names as "branches"
    condition: $TRAVIS_BUILD_STAGE_NAME = Deploy
    repo: daltonmaag/statmake
    tags: true

Notes:

  1. I use pre-commit to lint my sources, which I put into its own stage. The lint stage runs before the test and deploy stages.
  2. Pip packages and the pre-commit installation are cached to speed things up.
  3. The deploy stage is only enabled when Travis is looking at a tagged commit. It's a bit of a hack: the deploy: section at the end tests whether it is run from the deploy stage (see the condition). Reading the output of the stages makes me think it would otherwise deploy after every stage, but I'm not sure. That would be a gross misfeature.

@vancouverwill

On GitLab, setting virtualenvs.in-project to true and then caching the .venv directory worked for me.

The relevant sections of the .gitlab-ci.yml file are:

...

cache:
  paths:
    - .venv
  key: "${CI_COMMIT_REF_SLUG}"

...
  script:
    - poetry config settings.virtualenvs.in-project true
    - poetry install

@matthieugouel

matthieugouel commented May 5, 2019

I came up with a slightly different .travis.yml based on your suggestions:

language: python
dist: xenial
python:
  - "3.7"

stages:
  - Quality
  - Publish

before_install:
  - pip install poetry

install:
  - poetry install

jobs:
  include:
    - stage: Quality
      name: lint
      script: make lint
    - name: type
      script: make type
    - name: tests
      script: make tests
    - stage: Publish
      script: skip
      before_deploy:
        - poetry config http-basic.pypi $PYPI_USERNAME $PYPI_PASSWORD
        - poetry build -f sdist
      deploy:
        provider: script
        script: poetry publish
        skip_cleanup: true
        on:
          tags: true
      if: tag IS present

after_success:
  poetry run coveralls

@FranklinYu

FranklinYu commented Jun 13, 2019

Is it recommended to install Poetry through pip? From the README:

Be aware, however, that it will also install poetry's dependencies which might cause conflicts.

In addition, #166 may be related.

@piccolbo

@FranklinYu isn't that true of all dev deps, say pytest? To test a package I need pytest, so I typically install all dev deps with poetry install, which may create conflicts or, more often for me, hide a missing dep. I had the same problem before using Poetry, I just didn't think about it.

@dmontagu

dmontagu commented Sep 19, 2019

@piccolbo Bad things happen if you install poetry via pip but have it configured not to create a virtual environment. (As far as I can tell this isn't relevant to the configs discussed above though.)

It might be possible to speed up your builds if you don't have Poetry create a virtual environment, and rely on the installer script to ensure the vendored dependencies are present. (This probably depends on your CI system to some degree though.)

@FranklinYu

@piccolbo I'm lost at what you mean. If you add pytest through Poetry then it shouldn't conflict with other dependencies. This is the whole point of Poetry, isn't it?

@piccolbo

@FranklinYu I consider it a conflict when even Poetry cannot satisfy all dependency constraints. Maybe I am not using the term in the most appropriate way. More generally, I was referring to the fact that the dev environment is more complicated than the regular one, yet it is the only one in which we can test. So one may be unable to test the environment a user would get in, say, an empty env. Let's say mypackage requires pandas >=3.14.15 but installing pytest requires pandas >=4.0.0 (I am making this up). I can't test with pandas <4 anymore. Maybe I am worrying for no reason, since one can't test all dep combinations anyway.

@FranklinYu

Let's say mypackage requires pandas >=3.14.15 but installing pytest requires pandas >=4.0.0 (I am making this up).

In this case Poetry will try to come up with a solution (like finding another pytest version). When this is impossible, it raises an error before trying to install anything. This happens during version resolution, not during installation. This is the point of using Poetry instead of requirements.txt.
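As a toy illustration of that resolution step (this is not Poetry's actual solver, just the idea that constraints are checked before anything is installed; versions are modeled as tuples and every constraint as a ">=" lower bound):

```python
# Toy resolver check: keep only candidate versions that satisfy every
# ">=" lower bound. A real resolver errors out when this set is empty,
# before installing anything.
def satisfying_versions(lower_bounds, candidates):
    return [v for v in candidates if all(v >= low for low in lower_bounds)]

candidates = [(3, 14, 15), (3, 20, 0), (4, 0, 0), (4, 1, 0)]
# mypackage needs pandas >= 3.14.15; pytest (in the made-up example) needs >= 4.0.0
print(satisfying_versions([(3, 14, 15), (4, 0, 0)], candidates))  # [(4, 0, 0), (4, 1, 0)]
print(satisfying_versions([(99, 0, 0)], candidates))              # [] -> resolution fails
```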

@piccolbo

@FranklinYu @dmontagu thanks for your answers but they address different questions. Unless I find a better way to explain my question, I guess it's better to leave it at that.

@kasteph kasteph added the area/docs Documentation issues/improvements label Oct 18, 2019
@pawamoy

pawamoy commented Nov 29, 2019

Can I continue the sharing 🙂 ?

Here is my typical .gitlab-ci.yml:

# Global --------------------------

variables:
  PIP_CACHE_DIR: "${CI_PROJECT_DIR}/.cache/pip"

cache:
  key: "${CI_JOB_NAME}"
  paths:
    - .cache/pip
    - .venv

stages:
  - quality
  - tests

# Jobs templates ------------------

.install-deps-template: &install-deps
  before_script:
    - pip install poetry
    - poetry --version
    - poetry config settings.virtualenvs.in-project true
    - poetry install -vv

.quality-template: &quality
  <<: *install-deps
  image: python:3.6
  stage: quality

.test-template: &test
  <<: *install-deps
  stage: tests
  coverage: '/TOTAL.*\s(\d+\.\d+\%)/'
  script: make test
  artifacts:
    paths:
      - tests/logs
    when: always
    expire_in: 1 week

# Quality jobs ----------------------

check-bandit:
  <<: *quality
  script: make check-bandit

check-black:
  <<: *quality
  script: make check-black

check-flake8:
  <<: *quality
  script: make check-flake8

check-isort:
  <<: *quality
  script: make check-isort

check-safety:
  <<: *quality
  script: make check-safety

# Tests jobs ------------------------

python3.6:
  <<: *test
  image: python:3.6

python3.7:
  <<: *test
  image: python:3.7

python3.8:
  <<: *test
  image: python:3.8

[pipeline screenshot]

@tarkatronic
Contributor

Adding to the conversation here: I took some of the recommendations from this thread and elsewhere, mixed in some of my own past patterns, and came up with the following .travis.yml:

language: python
python:
  - "3.5"
  - "3.6"
  - "3.7"
  - "3.8"
  - "pypy3"
install: pip install poetry tox-travis codecov
script: tox
after_success: codecov

stages:
  - test
  - lint
  - name: deploy
    if: tag IS present

jobs:
  fast_finish: true
  include:
    - stage: test
      python: 3.7
      env: TOXENV=docs
    - stage: lint
      python: 3.7
      env: TOXENV=black
    - python: 3.7
      env: TOXENV=mypy
    - python: 3.7
      env: TOXENV=pylint
    - python: 3.7
      env: TOXENV=vulture
    - stage: deploy
      python: 3.7
      install: true
      script: true
      after_success: true

before_deploy:
  - pip install --upgrade pip
  - pip install poetry
  - poetry config pypi-token.pypi $PYPI_PASSWORD

deploy:
- provider: script
  script: poetry publish
  on:
    branch: master
    tags: true
- provider: script
  script: ./scripts/build_and_publish_docker.sh
  on:
    branch: master
    tags: true

This has allowed me to continue using tox for development, with the following:

[testenv]
whitelist_externals =
    poetry
setenv =
    PYTHONDONTWRITEBYTECODE=1
    PYTHONHASHSEED=0
    PYTHONWARNINGS=ignore
commands =
    poetry install --no-root -v
    poetry run pytest {posargs}

@louwers

louwers commented Aug 18, 2020

Related: #2102

Installing Poetry into the same virtual environment as the package being built creates issues.

@gazorby

gazorby commented Aug 24, 2020

Here is my .travis.yml based on those above:

dist: xenial
language: python
python:
- 3.8
- 3.6
- 3.7
stages:
- test
- name: deploy
  if: tag IS present

# Tests
before_install:
- pip install poetry
install:
- export PYTHONPATH=$PYTHONPATH:$(pwd)/turbulette
- poetry install --no-root -v
services:
- postgresql
env:
- DB_DRIVER=postgresql DB_HOST=localhost DB_PORT=5432 DB_USER=postgres DB_PASSWORD=""
  PYTEST_TURBULETTE_SETTINGS=tests.settings
before_script:
- psql -c 'create database test;' -U postgres
script: pytest

jobs:
  include:
  - stage: deploy
    script: skip
    before_deploy:
    - pip install --upgrade pip
    - pip install poetry
    - poetry config http-basic.pypi $PYPI_USERNAME $PYPI_PASSWORD
    deploy:
      provider: script
      script: poetry publish --build
      on:
        tags: true
        repo: gazorby/turbulette

after_success:
- bash <(curl -s https://codecov.io/bash)

@pawamoy

pawamoy commented Aug 25, 2020

And here is my CI config with a GitHub workflow (migrated from my GitLab CI config, previously commented here):

name: ci

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master

defaults:
  run:
    shell: bash

jobs:

  quality:

    runs-on: ubuntu-latest

    steps:
    - name: Checkout
      uses: actions/checkout@v2

    - name: Set up Python 3.6
      uses: actions/setup-python@v1
      with:
        python-version: 3.6

    - name: Set up the cache
      uses: actions/cache@v1
      with:
        path: .venv
        key: cache-python-packages

    - name: Set up the project
      run: |
        pip install poetry safety
        poetry config virtualenvs.in-project true
        make setup

    - name: Check if the documentation builds correctly
      run: make check-docs

    - name: Check the code quality
      run: make check-code-quality

    - name: Check if the code is correctly typed
      run: make check-types

    - name: Check for vulnerabilities in dependencies
      run: make check-dependencies

  tests:

    strategy:
      max-parallel: 6
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        python-version: [3.6,3.7,3.8]

    runs-on: ${{ matrix.os }}

    steps:
    - name: Checkout
      uses: actions/checkout@v2

    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v1
      with:
        python-version: ${{ matrix.python-version }}

    - name: Set up the cache
      uses: actions/cache@v1
      env:
        cache-name: cache-python-packages
      with:
        path: .venv
        key: ${{ matrix.os }}-${{ matrix.python-version }}-${{ env.cache-name }}
        restore-keys: |
          ${{ matrix.os }}-${{ matrix.python-version }}-
          ${{ matrix.os }}-

    - name: Set up the project
      run: |
        pip install poetry
        poetry config virtualenvs.in-project true
        make setup

    - name: Run the test suite
      run: make test

The make targets almost all run tools with poetry run ....

@earonesty

When I run "poetry config" via CI/CD, I invariably get keyring lock errors.

@sisp
Contributor

sisp commented Sep 2, 2020

@earonesty In case you're using Tox, this suggestion solved a keyring error I had been getting with poetry config in CI/CD: jaraco/keyring#283 (comment)
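For anyone else hitting this: one commonly used workaround (an assumption on my part, so check it against the linked comment for your setup) is to select the keyring library's no-op backend, so that poetry config never touches a real keyring:

```shell
# Point the keyring library at its null (no-op) backend for the CI session.
# PYTHON_KEYRING_BACKEND and keyring.backends.null.Keyring come from the
# keyring library's own documentation.
export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring
echo "$PYTHON_KEYRING_BACKEND"
```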

@br3ndonland

br3ndonland commented Sep 12, 2020

Poetry tips for GitHub Actions workflows

Use caching to speed up workflows

Use actions/cache with a variation on their pip cache example to cache Poetry dependencies for faster installation.

- name: Set up Poetry cache for Python dependencies
  uses: actions/cache@v2
  if: startsWith(runner.os, 'Linux')
  with:
    path: ~/.cache/pypoetry
    key: ${{ runner.os }}-poetry-${{ hashFiles('**/poetry.lock') }}
    restore-keys: ${{ runner.os }}-poetry-

Use the custom installer

Installing Poetry via pip can lead to dependency conflicts, so the custom installer is recommended. The command listed in the docs exits with code 127 in GitHub Actions (Poetry is not on $PATH).

There are some additional modifications needed for GitHub Actions:

  • Add -y to avoid prompts.
  • Add Poetry to $GITHUB_PATH (note that the ::set-env syntax has been deprecated).
  • Move poetry install to a separate step to ensure Poetry is on $GITHUB_PATH.
- name: Install Poetry
  run: |
    curl -fsS -o get-poetry.py https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py
    python get-poetry.py -y
    echo "$HOME/.poetry/bin" >> $GITHUB_PATH
- name: Install dependencies
  run: poetry install --no-interaction

Use environment variables for config

Poetry can be configured with the poetry config command or with environment variables. Environment variables are a more dependable way to configure Poetry in CI.

env:
  POETRY_VIRTUALENVS_CREATE: false
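The naming rule Poetry documents for these variables is mechanical: prefix with POETRY_, uppercase, and turn dots and dashes into underscores. A quick sketch:

```python
# Map a `poetry config` key to its environment-variable form, per the
# documented POETRY_* naming convention.
def poetry_env_var(setting: str) -> str:
    return "POETRY_" + setting.upper().replace(".", "_").replace("-", "_")

print(poetry_env_var("virtualenvs.create"))      # POETRY_VIRTUALENVS_CREATE
print(poetry_env_var("virtualenvs.in-project"))  # POETRY_VIRTUALENVS_IN_PROJECT
```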

Build and publish in one step

  • Create a PyPI token.
  • Add it to the GitHub Secrets store for the repo (Settings -> Secrets).
  • Use the secret in your workflow with ${{ secrets.PYPI_TOKEN }} (secret name is PYPI_TOKEN in this example, and username for PyPI tokens is __token__).
  • Use poetry publish --build to build and publish in one step.
- name: Build Python package and publish to PyPI
  if: startsWith(github.ref, 'refs/tags/')
  run: poetry publish --build -u __token__ -p ${{ secrets.PYPI_TOKEN }}

That's why they call it Poetry. Beautiful.

Example workflow

Here is an example workflow from br3ndonland/inboard that uses these tips.
name: builds

on:
  push:
    branches: [develop, master]
    tags:
      - "[0-9v]+.[0-9]+.[0-9a-z]+"
  workflow_dispatch:

jobs:
  python:
    runs-on: ubuntu-latest
    env:
      POETRY_VIRTUALENVS_CREATE: false
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: 3.8
      - name: Set up Poetry cache for Python dependencies
        uses: actions/cache@v2
        if: startsWith(runner.os, 'Linux')
        with:
          path: ~/.cache/pypoetry
          key: ${{ runner.os }}-poetry-${{ hashFiles('**/poetry.lock') }}
          restore-keys: ${{ runner.os }}-poetry-
      - name: Set up pre-commit cache
        uses: actions/cache@v2
        if: startsWith(runner.os, 'Linux')
        with:
          path: ~/.cache/pre-commit
          key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
          restore-keys: ${{ runner.os }}-pre-commit-
      - name: Install Poetry
        run: |
          curl -fsS -o get-poetry.py https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py
          python get-poetry.py -y
          echo "$HOME/.poetry/bin" >> $GITHUB_PATH
      - name: Install dependencies
        run: poetry install --no-interaction -E fastapi
      - name: Run pre-commit hooks
        run: pre-commit run --all-files
      - name: Run unit tests
        run: pytest
      - name: Build Python package and publish to PyPI
        if: startsWith(github.ref, 'refs/tags/')
        run: poetry publish --build -u __token__ -p ${{ secrets.PYPI_TOKEN }}
  docker:
    runs-on: ubuntu-latest
    needs: [python]
    steps:
      - uses: actions/checkout@v2
      - name: Log in to Docker registry
        run: docker login ghcr.io -u ${{ github.actor }} -p ${{ secrets.PAT_GHCR }}
      - name: Build Docker images
        run: |
          docker build . --rm --target base -t ghcr.io/br3ndonland/inboard:base --cache-from python:3.8
          docker build . --rm --target starlette -t ghcr.io/br3ndonland/inboard:starlette
          docker build . --rm --target fastapi -t ghcr.io/br3ndonland/inboard:fastapi
      - name: Push Docker images to registry
        run: |
          docker push ghcr.io/br3ndonland/inboard:base
          docker push ghcr.io/br3ndonland/inboard:starlette
          docker push ghcr.io/br3ndonland/inboard:fastapi
      - name: Add Git tag to Docker images
        if: startsWith(github.ref, 'refs/tags/')
        run: |
          GIT_TAG=$(echo ${{ github.ref }} | cut -d / -f 3)
          docker tag ghcr.io/br3ndonland/inboard:base ghcr.io/br3ndonland/inboard:base-"$GIT_TAG"
          docker tag ghcr.io/br3ndonland/inboard:starlette ghcr.io/br3ndonland/inboard:starlette-"$GIT_TAG"
          docker tag ghcr.io/br3ndonland/inboard:fastapi ghcr.io/br3ndonland/inboard:fastapi-"$GIT_TAG"
          docker push ghcr.io/br3ndonland/inboard:base-"$GIT_TAG"
          docker push ghcr.io/br3ndonland/inboard:starlette-"$GIT_TAG"
          docker push ghcr.io/br3ndonland/inboard:fastapi-"$GIT_TAG"
      - name: Tag and push latest image
        run: |
          docker tag ghcr.io/br3ndonland/inboard:fastapi ghcr.io/br3ndonland/inboard:latest
          docker push ghcr.io/br3ndonland/inboard:latest

Bonus: automated dependency updates with Dependabot

Dependabot now offers automated version updates, with (preliminary) support for Poetry 🎉. If you have access to the Dependabot beta, set up .github/dependabot.yml as described in the docs:

version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"

Dependabot will now send you PRs when dependency updates are available. Although package-ecosystem must be set to pip, it will pick up the pyproject.toml and poetry.lock. Check the status of the repo at Insights -> Dependency graph -> Dependabot.

@alexpovel
Contributor

alexpovel commented Nov 10, 2020

Can I continue the sharing 🙂 ?

Here is my typical .gitlab-ci.yml:

[pawamoy's .gitlab-ci.yml and pipeline screenshot, quoted in full from the comment above]

This is great. @pawamoy, do you have any pointers on what PIP_CACHE_DIR does exactly? It doesn't seem widely used. I am trying to get Poetry installed into the local directory when doing pip install poetry. For this, the --target option works: it installs the entire thing, modules and binaries included, into a subdirectory of the working directory, which allows for caching.

However, it also means we have to set PATH as well as PYTHONPATH to find that new custom location, which is not the most elegant approach. PIP_CACHE_DIR seems to alleviate this, but it does not actually change the install path. So as far as I can see, your pip install poetry is not cached and runs each time; can you confirm this?

Thanks for your inspirational file!

Update 2020-11-12

I have split this into two steps for now:

    variables:
        PIP_DOWNLOAD_DIR: ".pip"
    before_script:
        # Allow caching by only downloading first:
        - pip download --dest=${PIP_DOWNLOAD_DIR} poetry  # STEP 1
        - pip install --find-links=${PIP_DOWNLOAD_DIR} poetry  # STEP 2
        # Make available for caching by installing to current directory:
        - poetry config virtualenvs.in-project true
        - poetry install -vv

pip download will just download the dependencies, allowing them to be cached. A later pip install --find-links pointed at that same directory installs into the system-wide (container) Python. This lets us ignore PATH and PYTHONPATH entirely; it will Just Work™. The downside is that the install process itself is not cached and has to rerun each time.

@cglacet

cglacet commented Nov 13, 2020

@pawamoy Thanks a lot for sharing, I discovered you could have templates in .gitlab-ci.yml with your example. On the other hand, is there a reason to run quality check jobs in parallel instead of running them sequentially?

Since they have no side effects, I feel like it's a waste of resources (CI/CD pipeline minutes).

Thanks again for sharing your example.

@alexpovel
Contributor

@cglacet You mean the other way around, parallel vs. sequential? For that, you might like the needs keyword. It allows you to build a DAG of job dependencies, speeding things up.

The templates used there are YAML anchors, so they are not GitLab-specific. The extends keyword lets you do the same thing (essentially job templates) and is GitLab-specific. I find it more readable, but I don't know if the features differ. For simple cases, they do the same thing.

Lastly, if your config is Makefile-based and calls make <target> in each job's script, where job names correspond to target names, those concepts can be combined:

.make:
    # Note we cannot have `default: script:`, so this approach works better.
    script: make ${CI_JOB_NAME}

Then, later, for example:

preflight:
    extends: .make
    stage: .pre

(The .pre stage is always available, as is .post, as well as build, test, and deploy, if no other stages are defined.) The above will call make preflight, with the term preflight occurring only once in the config. Pretty DRY! I suppose it gets more cumbersome to cross-check against existing make targets, though.

@pawamoy

pawamoy commented Nov 13, 2020

@alexpovel about PIP_CACHE_DIR, I'm not sure anymore 😅 I guess I just picked it from another example somewhere 🙂 Thanks for sharing your solution, this is interesting!

@cglacet I think what you mean is: why don't I run all the checks in a single job, to avoid installing deps 5 times instead of just one? Well indeed, it's a waste of resources. I think I just liked seeing more green checkmarks in GitLab CI. But if wasting time and resources by installing too many deps is an issue to you (well, it should be to everyone), you can always reduce the installed dependencies to a minimum by making use of extras! For example, a flake8 extra with flake8 and all its plugins, then a tests extra with only the dependencies required for running the tests, etc. You'd then adapt the "install deps" step to make use of these extras. This way you can have your parallel jobs without any waste 🙂 And it's faster than sequential jobs 😉

@alexpovel I didn't know all these GitLab CI configuration features, very nice! The snippet I shared was written when GitLab didn't have the extends ability yet 🙂 The trick with make and the CI job name is really nice, thanks for sharing!
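For reference, the extras approach could look like this in pyproject.toml (a sketch; the package names and versions are placeholders, and packages referenced by an extra must be marked optional):

```toml
[tool.poetry.dependencies]
python = "^3.6"
pytest = { version = "^6.0", optional = true }
flake8 = { version = "^3.8", optional = true }

[tool.poetry.extras]
tests = ["pytest"]
flake8 = ["flake8"]
```

A CI job would then run, e.g., poetry install -E tests to get only the test dependencies.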

@cglacet

cglacet commented Nov 19, 2020

@pawamoy

you can always reduce the installed dependencies to a minimum by making use of extras ... This way you can have your parallel jobs without any waste

I feel it's still a waste because you have to start 5 Docker instances instead of one. But I agree that parallel jobs aren't only a cost: reducing build time is clearly a plus if you build often and don't want to wait. I guess the choice depends on the use case. From my personal perspective, I would love to stay under 400 minutes per month, so even if it only saves 20%, I would gladly sacrifice a 3x speedup in build time.

Now I just need to wait and see when I change my mind about this 😄. Future me will maybe think I was ignorant.

For those who wonder how to publish to GitLab from your CI: I struggled to find the info, so here is how I did it (I'm not sure it's the right way, but it works decently):

publish-package:
  stage: publish
  image: python:3.7
  before_script:
    - pip install poetry
    - poetry --version
    - poetry config repositories.gitlab https://gitlab.com/api/v4/projects/${CI_PROJECT_ID}/packages/pypi
    - poetry config http-basic.gitlab gitlab-ci-token ${CI_JOB_TOKEN}
  script:
    - make publish
  only:
    - tags

In my Makefile:

build:  
	@poetry build

publish: build
	@poetry publish -r gitlab

A link that might help building better CI on GitLab: Predefined environment variables reference.

@cglacet

cglacet commented Nov 19, 2020

@alexpovel No, I meant it this way. When building in parallel you save "real time" because you wait less for the whole build to finish. On the other hand, you consume more computation time (because you start several containers and install the dependencies many times).

@Casyfill

Thanks for putting this together.

I am trying to publish a Python wrapper for a C# library and hope to build a bunch of wheels for different OSes using GitHub CI. I can build with Poetry on each OS separately in a different task, but when I hit publish, only one set of wheels arrives. Perhaps anyone has a recipe with multiple artifacts published all at once? 🙏

@1ace
Contributor

1ace commented Mar 29, 2021

@Casyfill I don't know which Python package index you're publishing to (I'm assuming pypi.org, which I'm not familiar with), but it likely doesn't allow you to publish a package with the same name and version number as an existing one, even if it's a variant for a different OS or Python version.

Instead of building and publishing your packages one by one, the solution is to build everything as you do now, and then have a separate job that gathers the packages and publishes them in one command.
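A sketch of that pattern as a GitHub Actions workflow fragment (the job names, artifact names, and omitted Python/Poetry setup steps are assumptions, not taken from this thread):

```yaml
jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      # ... set up Python and Poetry here ...
      - run: poetry build
      - uses: actions/upload-artifact@v2
        with:
          name: dist-${{ matrix.os }}
          path: dist/

  publish:
    needs: [build]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # ... set up Python and Poetry here ...
      # download-artifact places each artifact in its own subdirectory,
      # so gather everything back into dist/ before publishing once:
      - uses: actions/download-artifact@v2
        with:
          path: artifacts/
      - run: |
          mkdir -p dist
          find artifacts -type f \( -name '*.whl' -o -name '*.tar.gz' \) -exec mv {} dist/ \;
      - run: poetry publish -u __token__ -p ${{ secrets.PYPI_TOKEN }}
```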

@Casyfill

Casyfill commented Mar 30, 2021

Thanks, @1ace! That's what I thought. It turned out to be easier than I expected, actually. Here is the link to our implementation:
https://github.com/Casyfill/pyCombo/blob/master/.github/workflows/release.yml

@python-poetry python-poetry locked and limited conversation to collaborators Jun 23, 2021
