Release process
Here are some guidelines on how to do a release of pyodbc to PyPI:
- You'll need an account on the Python Package Index (PyPI) website, with the privileges to upload files to the pyodbc project.
- (Optional) An account on the Test Python Package Index (TestPyPI) website, with the privileges to upload files to the pyodbc project. Note, accounts on PyPI and TestPyPI are completely separate; you'll need two accounts.
- The Python `twine` utility, for uploading build files to PyPI. It's recommended to install this using `pipx` so it is universally available but doesn't pollute your system Python installation(s).
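  For example, a typical way to install it (assuming `pipx` itself is already installed):

  ```
  pipx install twine
  ```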
- The package version for the new release should already be defined in `setup.py` (see the `VERSION` constant), for example "5.2.0". However, if that value is not correct, change it and commit, then wait for the CI pipelines to complete (manually triggering them if necessary).
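  One quick way to confirm the value (this assumes the constant is defined on a single line in `setup.py`):

  ```
  grep -n "VERSION" setup.py
  ```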
- Choose the git commit that will be used for the release. This will almost always be the head of the `master` branch, but it can theoretically be any commit.
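  For example, to see the commit SHA at the head of `master` (assuming your local branch is up to date):

  ```
  git rev-parse master
  ```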
- In your local pyodbc repository, create an annotated git tag for the chosen commit, as follows:

  ```
  git tag -a <package version> <commit SHA>
  ```

  You will be prompted to enter a message for the tag. Give a short description of the release; a fuller description will be added to the GitHub release notes later. Push this tag to GitHub.
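  For example, for a hypothetical 5.2.0 release (the tag name and SHA here are placeholders):

  ```
  git tag -a 5.2.0 0a1b2c3
  git push origin 5.2.0
  ```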
- Delete all the files in the top-level `/dist` directory in your local pyodbc repository (create the `/dist` directory if necessary). The `/dist` directory will be used to store all the build files before uploading to PyPI. In principle, any directory could serve this purpose, but by convention `/dist` is used.
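  One way to do this from the top-level pyodbc directory:

  ```
  rm -rf dist && mkdir dist
  ```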
- From the GitHub Actions page, find the chosen commit in the "Ubuntu build" workflow and download the two zipped artifacts "sdist" and "wheels" into your `/dist` directory. These artifacts cover all the files for Python 3: https://github.com/mkleehammer/pyodbc/actions/workflows/ubuntu_build.yml
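  Alternatively, the artifacts can be fetched with the GitHub CLI, if you have it installed (the run ID is a placeholder; it is shown on the workflow run page):

  ```
  gh run download <run-id> -n sdist -D dist
  gh run download <run-id> -n wheels -D dist
  ```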
- From the AppVeyor History page, find the chosen commit and download the artifacts for Python 2.7 (both 32-bit and 64-bit) from the build page into your `/dist` directory: https://ci.appveyor.com/project/mkleehammer/pyodbc/history (If there are no artifacts, go to the Environment page in Settings, set the environment variable "APVYR_GENERATE_WHEELS" to "true", then re-build the commit from the build page.) Note, this step will become obsolete when pyodbc no longer supports Python 2.7.
- In your `/dist` directory, unzip the two zip files "sdist" and "wheels" to retrieve the sdist/wheel files within them, then delete the zip files. Don't unzip the AppVeyor .tar.gz files.
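  For example, assuming the downloaded artifacts are named `sdist.zip` and `wheels.zip`:

  ```
  cd dist
  unzip sdist.zip && unzip wheels.zip
  rm sdist.zip wheels.zip
  ```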
From the command prompt, navigate to the top-level pyodbc directory, and then upload the contents of the
/dist
directory to the Test PyPi:twine upload -r testpypi dist/*
Check https://test.pypi.org/project/pyodbc/ to make sure the files were uploaded successfully.
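  Optionally, as an extra check, you can try installing the release from TestPyPI into a scratch virtual environment (this example assumes the new version is "5.2.0"):

  ```
  pip install --index-url https://test.pypi.org/simple/ pyodbc==5.2.0
  ```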
- If the upload to TestPyPI is successful, upload the same contents of the `/dist` directory to the real PyPI. Bear in mind, you get only one shot at this! You can't replace files on PyPI, so make sure you get this right the first time:

  ```
  twine upload dist/*
  ```

  Check https://pypi.org/project/pyodbc/ to make sure the files were uploaded successfully.
- Create release notes from the GitHub portal, based on the git tag created earlier. This will be the main source of information about the release, so make it reasonably comprehensive.
- Finally, increment the `VERSION` constant in `setup.py` to the next release, and commit. For example, change "5.2.0" to "5.3.0". This ensures there's no confusion about newly-generated build files.

And that's it!