diff --git a/airflow-ctl/src/airflowctl/__init__.py b/airflow-ctl/src/airflowctl/__init__.py index 93ffa6bbc4107..6769fff006043 100644 --- a/airflow-ctl/src/airflowctl/__init__.py +++ b/airflow-ctl/src/airflowctl/__init__.py @@ -19,4 +19,4 @@ __path__ = __import__("pkgutil").extend_path(__path__, __name__) -__version__ = "1.0.0b1" +__version__ = "1.0.0" diff --git a/dev/README_RELEASE_AIRFLOWCTL.md b/dev/README_RELEASE_AIRFLOWCTL.md new file mode 100644 index 0000000000000..4ceb68dc8f8e1 --- /dev/null +++ b/dev/README_RELEASE_AIRFLOWCTL.md @@ -0,0 +1,933 @@ + + + +**Table of contents** + +- [What the airflow-ctl distribution is](#what-the-airflow-ctl-distribution-is) +- [The airflow-ctl distributions](#the-airflow-ctl-distributions) +- [Perform review of security issues that are marked for the release](#perform-review-of-security-issues-that-are-marked-for-the-release) +- [Decide when to release](#decide-when-to-release) +- [Airflow-ctl versioning](#airflow-ctl-versioning) +- [Prepare Regular airflow-ctl distributions (RC)](#prepare-regular-airflow-ctl-distributions-rc) + - [Generate release notes](#generate-release-notes) + - [Build airflow-ctl distributions for SVN apache upload](#build-airflow-ctl-distributions-for-svn-apache-upload) + - [Build and sign the source and convenience packages](#build-and-sign-the-source-and-convenience-packages) + - [Commit the source packages to Apache SVN repo](#commit-the-source-packages-to-apache-svn-repo) + - [Publish the Regular distributions to PyPI (release candidates)](#publish-the-regular-distributions-to-pypi-release-candidates) + - [Add tags in git](#add-tags-in-git) + - [Prepare documentation in Staging](#prepare-documentation-in-staging) + - [Prepare issue in GitHub to keep status of testing](#prepare-issue-in-github-to-keep-status-of-testing) + - [Prepare voting email for airflow-ctl release candidate](#prepare-voting-email-for-airflow-ctl-release-candidate) + - [Verify the release candidate by PMC 
members](#verify-the-release-candidate-by-pmc-members)
  - [Verify the release candidate by Contributors](#verify-the-release-candidate-by-contributors)
- [Publish release](#publish-release)
  - [Summarize the voting for the Apache Airflow release](#summarize-the-voting-for-the-apache-airflow-release)
  - [Publish release to SVN](#publish-release-to-svn)
  - [Publish the packages to PyPI](#publish-the-packages-to-pypi)
  - [Add tags in git](#add-tags-in-git-1)
  - [Publish documentation](#publish-documentation)
  - [Notify developers of release](#notify-developers-of-release)
  - [Send announcements about security issues fixed in the release](#send-announcements-about-security-issues-fixed-in-the-release)
  - [Announce about the release in social media](#announce-about-the-release-in-social-media)
  - [Add release data to Apache Committee Report Helper](#add-release-data-to-apache-committee-report-helper)
  - [Close the testing status issue](#close-the-testing-status-issue)
  - [Remove Airflow-ctl distributions scheduled for removal](#remove-airflow-ctl-distributions-scheduled-for-removal)

------------------------------------------------------------------------------------------------------------

# What the airflow-ctl distribution is

The distribution is a separate package that implements a remote CLI for Apache Airflow.

The Release Manager prepares `airflow-ctl` packages separately from the main Airflow release, using
`breeze` commands and accompanying scripts. This document provides an overview of the command line tools
needed to prepare the packages.

NOTE!! If you have problems with any of the commands that run inside the `breeze` docker image, you
can re-run the command with the `--debug` flag, which will drop you into a shell inside the image and
print the command that you should run.

# The airflow-ctl distributions

The prerequisites to release airflow-ctl are described in [README.md](README.md).
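Before starting, it can be worth sanity-checking that the version declared in the airflow-ctl sources matches the version you are about to release. This is not part of the official process, just a minimal sketch; the literal line below stands in for the contents of `airflow-ctl/src/airflowctl/__init__.py`, which is where `__version__` is defined:

```shell
# Extract the version string from a line like the one in
# airflow-ctl/src/airflowctl/__init__.py (shown literally so the
# snippet is self-contained):
line='__version__ = "1.0.0"'
version=$(printf '%s\n' "$line" | sed -n 's/^__version__ = "\(.*\)"$/\1/p')
echo "$version"
```

In practice you would point the `sed` expression at the real `__init__.py` and compare the result with the `VERSION` variable set below.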
# Perform review of security issues that are marked for the release

We currently keep track of security issues in the [Security Issues](https://github.com/airflow-s/airflow-s/issues)
repository. As a release manager, you should have access to the repository.
Please review and ensure that all security issues marked for the release have been
addressed and resolved. Ping the security team (comment in the issues) if anything is missing or
an issue does not seem to be addressed.

Additionally, the [dependabot alerts](https://github.com/apache/airflow/security/dependabot) and
[code scanning alerts](https://github.com/apache/airflow/security/code-scanning) should be reviewed,
and the security team should be pinged to review and resolve them.

# Decide when to release

You can release `airflow-ctl` distributions separately from the main Airflow on an ad-hoc basis,
whenever we find that airflow-ctl needs to be released - due to new features or due to bug fixes.

# Airflow-ctl versioning

We are using the [SEMVER](https://semver.org/) versioning scheme for the `airflow-ctl` distributions. This is in order
to give the users confidence about maintaining backwards compatibility in the new releases of those
packages.

Set the version env variables:

```shell script
VERSION=1.0.0
VERSION_RC=1.0.0rc1
```

# Prepare Regular airflow-ctl distributions (RC)

## Generate release notes

TODO: Describe release notes preparation

## Build airflow-ctl distributions for SVN apache upload

Those packages might get promoted to "final" packages by just renaming the files, so internally they
should keep the final version number without the rc suffix, even if they are rc1/rc2/... candidates.

They also need to be signed and have checksum files. You can generate the checksum/signature files by running
the "dev/sign.sh" script (assuming you have the right PGP key set up for signing).
The script
generates corresponding .asc and .sha512 files for each file to sign.

Note: the sign script uses `libassuan` and `gnupg`. If you don't have them installed, run:

```shell script
brew install libassuan
brew install gnupg
```

## Build and sign the source and convenience packages

* Cleanup dist folder:

```shell script
export AIRFLOW_REPO_ROOT=$(pwd -P)
rm -rf ${AIRFLOW_REPO_ROOT}/dist/*
```

* Release candidate packages:

```shell script
breeze release-management prepare-airflow-ctl-distributions --distribution-format both
```

* Sign all your packages:

```shell script
pushd dist
../dev/sign.sh *
popd
```

If you see a ``Library not loaded`` error, it means that you are missing `libassuan` and `gnupg`.
Check the steps above to install them.

## Commit the source packages to Apache SVN repo

* Push the artifacts to the ASF dev dist repo:

```shell script
# First clone the repo if you do not have it
cd ..
[ -d asf-dist ] || svn checkout --depth=immediates https://dist.apache.org/repos/dist asf-dist
svn update --set-depth=infinity asf-dist/dev/airflow

# Enter the folder for the release.
cd asf-dist/dev/airflow/airflow-ctl

# Remove previously released versions
svn rm *

# Move the artifacts to svn folder
mv ${AIRFLOW_REPO_ROOT}/dist/* .

# Add and commit
svn add *
svn commit -m "Add artifacts for Airflow CTL $(date "+%Y-%m-%d%n")"

cd ${AIRFLOW_REPO_ROOT}
```

Verify that the files are available at
[airflow-ctl](https://dist.apache.org/repos/dist/dev/airflow/airflow-ctl/)

You should see only the airflow-ctl artifacts that you are about to release.
If you are seeing others, there is an issue.
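One way to spot leftovers is to filter the folder listing for anything that does not carry the version being released. A small sketch; the sample listing below is hypothetical, and in practice you would pipe in the output of `svn ls https://dist.apache.org/repos/dist/dev/airflow/airflow-ctl/`:

```shell
VERSION="1.0.0"
# Hypothetical sample listing; replace with the real `svn ls` output:
listing="apache_airflow_airflow_ctl-1.0.0.tar.gz
apache_airflow_airflow_ctl-1.0.0.tar.gz.asc
apache_airflow_airflow_ctl-0.9.0.tar.gz"
# Anything not matching the version being released is a leftover:
leftovers=$(echo "$listing" | grep -v "${VERSION}" || true)
echo "$leftovers"
```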
You can remove the redundant airflow-ctl files manually with:

```shell script
svn rm file_name  # repeat that for every folder/file
svn commit -m "delete old airflow-ctl"
```

## Publish the Regular distributions to PyPI (release candidates)

In order to publish a release candidate to PyPI you just need to build and release the packages.
The packages should however contain the rcN suffix in the version file name but not internally in the package,
so you need to use the `--version-suffix` switch to prepare those packages.
Note that these are different packages than the ones used for the SVN upload,
though they should be generated from the same sources.

* Generate the packages with the rc1 version (specify the version suffix with the PyPI switch). Note that
you should clean up the dist folder before generating the packages, so you will only have the right packages there.

```shell script
rm -rf ${AIRFLOW_REPO_ROOT}/dist/*

breeze release-management prepare-airflow-ctl-distributions --version-suffix rc1 --distribution-format both
```

* Verify the artifacts that would be uploaded:

```shell script
twine check ${AIRFLOW_REPO_ROOT}/dist/*
```

* Upload the package to PyPI:

```shell script
twine upload -r pypi ${AIRFLOW_REPO_ROOT}/dist/*
```

* Confirm that the packages are available under the links printed and look good.


## Add tags in git

Assuming that your remote for the apache repository is called `apache`, you should now
set tags for airflow-ctl in the repo.

Sometimes, when there is a connectivity issue to GitHub, local tags may get created
and lead to annoying errors. The default behaviour is to clean such local tags up.

If you want to disable this behaviour, set the env **CLEAN_LOCAL_TAGS** to false.
```shell script
git tag -s "airflow-ctl/${VERSION_RC}"
git push apache --tags
```

## Prepare documentation in Staging

Documentation is an essential part of the product and should be made available to users.
In our case, documentation for the released versions is published in the staging S3 bucket, and the site is
kept in a separate repository - [`apache/airflow-site`](https://github.com/apache/airflow-site),
but the documentation source code and build tools are available in the `apache/airflow` repository, so
you need to run several workflows to publish the documentation. More details about it can be found in
[Docs README](../docs/README.md) showing the architecture and workflows including manual workflows for
emergency cases.

We have two options for publishing the documentation:

1. Using breeze commands
2. Manually using GitHub Actions

### Using breeze commands

You can use the `breeze` command to publish the documentation.
The command does the following:

1. Triggers [Publish Docs to S3](https://github.com/apache/airflow/actions/workflows/publish-docs-to-s3.yml).
2. Triggers the workflow in apache/airflow-site to refresh.
3. Triggers the S3 to GitHub sync.

```shell script
breeze workflow-run publish-docs --ref "airflow-ctl/${VERSION_RC}" --site-env staging apache-airflow-ctl
```

The `--ref` parameter should be the tag of the release candidate you are publishing.

The `--site-env` parameter should be set to `staging` for pre-release versions or `live` for final releases.
The default option is `auto`: if the tag is an rc tag, it publishes to the `staging` bucket, otherwise it
publishes to the `live` bucket.

One of the interesting features of publishing this way is that you can also rebuild historical versions of
the documentation with patches applied to the documentation (if they can be applied cleanly).
You should specify the `--apply-commits` parameter with the list of commits you want to apply,
separated by commas, and the workflow will apply those commits to the documentation before
building it (don't forget to add `--skip-write-to-stable-folder` if you are publishing
a previous version of the distribution). Example:

```shell script
breeze workflow-run publish-docs --ref airflow-ctl/1.0.0rc1 --site-env live \
    --apply-commits 4ae273cbedec66c87dc40218c7a94863390a380d --skip-write-to-stable-folder \
    apache-airflow-ctl
```

Other available parameters can be found with:

```shell
breeze workflow-run publish-docs --help
```

### Manually using GitHub Actions

There are two steps to publish the documentation:

1. Publish the documentation to the staging S3 bucket.

The release manager publishes the documentation using the GitHub Actions workflow
[Publish Docs to S3](https://github.com/apache/airflow/actions/workflows/publish-docs-to-s3.yml).

You should specify the final tag to use to build the docs and `apache-airflow-ctl` as the package.

After that step, the documentation should be available under the http://airflow.staged.apache.org URL
(also present in the PyPI packages) but stable links and drop-down boxes are not yet updated.

2. Invalidate the Fastly cache, update the version drop-down and stable links with the new versions of the documentation.

Before doing it, review the state of removed, suspended, and new packages in
[the docs index](https://github.com/apache/airflow-site/blob/master/landing-pages/site/content/en/docs/_index.md).
Make sure to use the `staging` branch to run the workflow.

```shell script
cd "${AIRFLOW_SITE_DIRECTORY}"
branch="add-documentation-airflow-ctl-${VERSION_RC}"
git checkout -b "${branch}"
git add .
git commit -m "Add documentation for airflow-ctl - ${VERSION_RC}"
git push --set-upstream origin "${branch}"
```

Merging the PR with the index changes to `staging` will trigger site publishing.
+ +If you do not need to merge a PR, you should manually run the +[Build docs](https://github.com/apache/airflow-site/actions/workflows/build.yml) +workflow in `airflow-site` repository to refresh indexes and drop-downs. + +After that build from PR or workflow completes, the new version should be available in the drop-down +list and stable links should be updated, also Fastly cache will be invalidated. + +## Prepare issue in GitHub to keep status of testing + +TODO: prepare an issue + +## Prepare voting email for airflow-ctl release candidate + +Make sure the packages are in https://dist.apache.org/repos/dist/dev/airflow/airflow-ctl/ + +TODO: prepare email + +## Verify the release candidate by PMC members + +### SVN check + +The files should be present in +[Airflow dist](https://dist.apache.org/repos/dist/dev/airflow/airflow-ctl/) + +The following files should be present (6 files): + +* .tar.gz + .asc + .sha512 (one set of files) +* -py3-none-any.whl + .asc + .sha512 (one set of files) + +As a PMC member, you should be able to clone the SVN repository: + +```shell script +cd .. +[ -d asf-dist ] || svn checkout --depth=immediates https://dist.apache.org/repos/dist asf-dist +svn update --set-depth=infinity asf-dist/dev/airflow +``` + +Or update it if you already checked it out: + +```shell script +cd asf-dist/dev/airflow +svn update . +``` + +Set an environment variable: PATH_TO_SVN to the root of folder where you have airflow-ctl + +``` shell +cd asf-dist/dev/airflow +export PATH_TO_SVN=$(pwd -P) +``` + +TODO: implement check in ``check_files.py`` + +Optionally you can use the [`check_files.py`](https://github.com/apache/airflow/blob/main/dev/check_files.py) +script to verify that all expected files are present in SVN. This script will produce a `Dockerfile.pmc` which +may help with verifying installation of the packages. 
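Until the `check_files.py` support mentioned in the TODO above exists, a tiny hand-rolled check can confirm that the `.asc` and `.sha512` companions are present for each artifact. This is a sketch only; the helper name is made up and not part of the repo:

```shell
# Hypothetical helper: check that every artifact has its .asc signature
# and .sha512 checksum file next to it.
check_companions() {
  for artifact in "$@"; do
    for ext in asc sha512; do
      if [ ! -f "${artifact}.${ext}" ]; then
        echo "Missing ${artifact}.${ext}"
        return 1
      fi
    done
  done
  echo "All signature/checksum files present"
}
# Usage, from inside the airflow-ctl SVN folder:
#   check_companions *.tar.gz *.whl
```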
+ +Once you have cloned/updated the SVN repository, copy the pypi URLs shared in the email to a file called `packages.txt` in the $AIRFLOW_REPO_ROOT/dev +directory and cd into it. + +```shell script +uv run check_files.py airflow-ctl -p ${PATH_TO_SVN} +``` + +After the above script completes you can build `Dockerfile.pmc` to trigger an installation of airflow-ctl +package and verify the correct versions are installed: + +```shell script +docker build -f Dockerfile.pmc --tag local/airflow . +docker run --rm --entrypoint "airflow-ctl" local/airflow info +docker image rm local/airflow +``` + +### Reproducible package builds checks + +For Airflow-ctl distributions we introduced a reproducible build mechanism - which means that whoever wants +to use sources of Airflow from the release tag, can reproducibly build the same "wheel" and "sdist" +packages as the release manager and they will be byte-by-byte identical, which makes them easy to +verify - if they came from the same sources. This build is only done using released dependencies +from PyPI and source code in our repository - no other binary dependencies are used during the build +process and if the packages produced are byte-by-byte identical with the one we create from tagged sources +it means that the build has a verified provenance. 
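Byte-by-byte identity of two builds can be checked with `cmp` just as well as with `diff`. A small helper sketch (the function name is ours, not part of the official scripts):

```shell
# Hypothetical helper: cmp -s compares two files byte-by-byte and
# only reports via its exit status.
same_bytes() {
  cmp -s "$1" "$2" && echo "identical" || echo "different"
}
# Usage:
#   same_bytes apache_airflow_airflow_ctl-1.0.0.tar.gz \
#     ${AIRFLOW_REPO_ROOT}/dist/apache_airflow_airflow_ctl-1.0.0.tar.gz
```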
How to verify it:

1) Change directory to where your airflow sources are checked out:

```shell
cd "${AIRFLOW_REPO_ROOT}"
```

2) Check out the tag you used for the release:

```shell
git fetch apache --tags
git checkout airflow-ctl/1.0.0rc1
```

3) Remove all the packages you have in the dist folder:

```shell
rm -rf dist/*
```

4) Build the packages using the checked-out sources:

```shell
breeze release-management prepare-airflow-ctl-distributions --distribution-format both
```

5) Switch to the folder where you checked out the SVN dev files:

```shell
cd ${PATH_TO_SVN}
cd airflow-ctl
```

6) Compare the packages in SVN to the ones you just built:

```shell
for i in *.tar.gz *.whl
do
  echo -n "$i:"; diff $i ${AIRFLOW_REPO_ROOT}/dist/$i && echo "No diff found"
done
```

You should see output similar to:

```
apache_airflow_airflow_ctl-1.0.0.tar.gz:No diff found
```

### Licences check

This can be done with the Apache RAT tool.

* Download the latest jar from https://creadur.apache.org/rat/download_rat.cgi (unpack the binary,
  the jar is inside)
* Unpack the release source archive (the `.tar.gz` file) to a folder
* Enter the sources folder and run the check

```shell script
# Get rat if you do not have it
if command -v wget >/dev/null 2>&1; then
  echo "Using wget to download Apache RAT..."
  wget -qO- https://dlcdn.apache.org//creadur/apache-rat-0.17/apache-rat-0.17-bin.tar.gz | gunzip | tar -C /tmp -xvf -
else
  echo "ERROR: wget not found. Install with: brew install wget (macOS) or apt-get install wget (Linux)"
  exit 1
fi
# Cleanup old folders (if needed)
find . -type d -maxdepth 1 | grep -v "^.$" > /tmp/files.txt
cat /tmp/files.txt | xargs rm -rf
# Unpack all packages
for i in *.tar.gz
do
  tar -xvzf $i
done
# Generate list of unpacked packages
find . -type d -maxdepth 1 | grep -v "^.$" > /tmp/files.txt
# Check licences
for d in $(cat /tmp/files.txt | sort)
do
  pushd $d 2>&1 >/dev/null
  echo "Checking licences for $d"
  java -jar /tmp/apache-rat-0.17/apache-rat-0.17.jar --input-exclude-file ${AIRFLOW_REPO_ROOT}/.rat-excludes . 2>/dev/null | grep '! '
  popd 2>&1 >/dev/null
done
```

You should see output similar to:

```
Checking licences for ./apache_airflow_airflow_ctl-1.0.0
...
```

You will see the files that the RAT tool considers problematic (RAT prints such files preceded by "! ").

Cleanup:

```shell script
cat /tmp/files.txt | xargs rm -rf
```

### Signature check

Make sure you have imported the PGP key of the person signing the release into your GPG. You can find the valid keys in
[KEYS](https://dist.apache.org/repos/dist/release/airflow/KEYS).

Download the KEYS file from the above link and save it locally.

You can import the whole KEYS file into gpg by running the following command:

```shell script
gpg --import KEYS
```

You can also import the keys individually from a keyserver. The example below uses Kaxil's key and
retrieves it from the default GPG keyserver
[OpenPGP.org](https://keys.openpgp.org):

```shell script
gpg --keyserver keys.openpgp.org --receive-keys CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F
```

You should choose to import the key when asked.

Note that, by default, the OpenPGP server tends to be overloaded and might respond with
errors or timeouts. Many of the release managers also uploaded their keys to the
[GNUPG.net](https://keys.gnupg.net) keyserver, and you can retrieve the key from there.
```shell script
gpg --keyserver keys.gnupg.net --receive-keys CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F
```

Once you have the keys, the signatures can be verified by running this:

```shell script
for i in *.asc
do
  echo -e "Checking $i\n"; gpg --verify $i
done
```

This should produce results similar to the below. The "Good signature from ..." is an indication
that the signatures are correct. Do not worry about the "not certified with a trusted signature"
warning. Most of the certificates used by release managers are self-signed, and that's why you get this
warning. By importing the key either from the server in the previous step or from the
[KEYS](https://dist.apache.org/repos/dist/release/airflow/KEYS) page, you know that
this is a valid key already. To suppress the warning you may edit the key's trust level
by running `gpg --edit-key <key-id> trust` and entering `5` to assign trust level `ultimate`.

```
Checking apache-airflow-ctl-1.0.0rc1.tar.gz.asc
gpg: assuming signed data in 'apache-airflow-2.0.2rc4.tar.gz'
gpg: Signature made sob, 22 sie 2020, 20:28:28 CEST
gpg: using RSA key 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
gpg: Good signature from "Kaxil Naik " [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 1271 7556 040E EF2E EAF1 B9C2 75FC CD0A 25FA 0E4B
```

### SHA512 check

Run this:

```shell script
for i in *.sha512
do
  echo "Checking $i"; shasum -a 512 `basename $i .sha512` | diff - $i
done
```

You should get output similar to:

```
Checking apache-airflow-ctl-1.0.0rc1.tar.gz.sha512
Checking apache_airflow-ctl-1.0.0rc1-py3-none-any.whl.sha512
...
```

## Verify the release candidate by Contributors

This can be done (and we encourage it) by any of the Contributors. In fact, it's best if the
actual users of Apache Airflow test it in their own staging/test installations.
Each release candidate
is available on PyPI apart from the SVN packages, so everyone should be able to install
the release candidate version.

Breeze allows you to easily install and run pre-release candidates by following the simple instructions
described in
[Manually testing release candidate packages](https://github.com/apache/airflow/blob/main/contributing-docs/testing/testing_packages.rst)

But you can use any of the installation methods you prefer (you can even install it via the binary wheels
downloaded from the SVN).

### Installing in your local virtualenv

```shell
pip install apache-airflow-ctl==${VERSION_RC}
```

### Additional Verification

Once you install and run airflow-ctl, you can perform any verification you see as necessary to check
that it works as you expected.


# Publish release

Replace DAYS_BACK with how many days ago you prepared the release.
Normally it is 3, but change it if the voting period was longer. The output should match the prepare date.

```
export DAYS_BACK=3
export RELEASE_DATE=$(LANG=en_US.UTF-8 date -u -v-${DAYS_BACK}d "+%B %d, %Y")
export RELEASE_MANAGER_NAME="Elad Kalif"
echo "prepare release date is ${RELEASE_DATE}"
```

## Summarize the voting for the Apache Airflow release

Once the vote has been passed, you will need to send the vote result to dev@airflow.apache.org:

Email subject:

```
cat <
```

## Publish release to SVN

```shell script
# Set AIRFLOW_REPO_ROOT to the path of your git repo
export AIRFLOW_REPO_ROOT="$(pwd -P)"

# Go to the folder where you have checked out the release repo from SVN
# Make sure this is a real directory and not a symbolic link
# Otherwise 'svn mv' errors out with "E200033: Another process is blocking the working copy database"
cd ""

export ASF_DIST_PARENT="$(pwd -P)"
# make sure the physical path is used, in case the original directory is symbolically linked
cd "${ASF_DIST_PARENT}"

# or clone it if it's not done yet
[ -d asf-dist ] || svn checkout --depth=immediates https://dist.apache.org/repos/dist asf-dist
# Update to latest version
svn update --set-depth=infinity asf-dist/dev/airflow asf-dist/release/airflow

SOURCE_DIR="${ASF_DIST_PARENT}/asf-dist/dev/airflow/airflow-ctl"

# Create the airflow-ctl folder if it does not exist
# All latest releases are kept in this one folder without version sub-folders
cd "${ASF_DIST_PARENT}/asf-dist/release/airflow"
mkdir -pv airflow-ctl
cd airflow-ctl

# Copy your airflow-ctl artifacts with the target name to the dist directory and to SVN
rm -rf "${AIRFLOW_REPO_ROOT}"/dist/*

for file in "${SOURCE_DIR}"/*
do
  base_file=$(basename ${file})
  cp -v "${file}" "${AIRFLOW_REPO_ROOT}/dist/${base_file//rc[0-9]/}"
  svn mv "${file}" "${base_file//rc[0-9]/}"
done

# TODO: add cleanup

# You need to go to the asf-dist directory in order to commit both dev and release together
cd ${ASF_DIST_PARENT}/asf-dist
# Commit to SVN
svn commit -m "Release Airflow Ctl ${VERSION}"
```

Verify that the packages appear in
[airflow-ctl](https://dist.apache.org/repos/dist/release/airflow/airflow-ctl)

You are expected to see all the latest versions of airflow-ctl: the ones you are about to release
(with the new version) and the ones that are not part of the current release.

## Publish the packages to PyPI

By that time the packages should be in your dist folder.

```shell script
cd ${AIRFLOW_REPO_ROOT}
git checkout
```

For example, `git checkout airflow-ctl/1.0.0rc1`.

Note you will probably see the message `You are in 'detached HEAD' state.`
This is expected; the RC tag is most likely behind the main branch.

* Verify the artifacts that would be uploaded:

```shell script
twine check ${AIRFLOW_REPO_ROOT}/dist/*.whl ${AIRFLOW_REPO_ROOT}/dist/*.tar.gz
```

* Upload the package to PyPI:

```shell script
twine upload -r pypi ${AIRFLOW_REPO_ROOT}/dist/*.whl ${AIRFLOW_REPO_ROOT}/dist/*.tar.gz
```

* Verify that the packages are available under the links printed.
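As an aside, the `${base_file//rc[0-9]/}` substitution used when promoting the SVN artifacts above is plain bash pattern replacement; you can check what it produces for a given file name (the file name below is illustrative):

```shell
# Bash pattern substitution: "${var//pattern/}" deletes every occurrence
# of the pattern, so "rc1" is stripped from the file name.
base_file="apache_airflow_airflow_ctl-1.0.0rc1-py3-none-any.whl"
final_file="${base_file//rc[0-9]/}"
echo "${final_file}"
```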
Copy the links to the updated packages, sort them alphabetically and save them on the side. You will
need them for the announcement message.

* Again, confirm that the packages are available under the links printed.


## Add tags in git

Assuming that your remote for the apache repository is called `apache`, you should now
set tags for airflow-ctl in the repo.

Sometimes, when there is a connectivity issue to GitHub, local tags may get created
and lead to annoying errors. The default behaviour is to clean such local tags up.

If you want to disable this behaviour, set the env **CLEAN_LOCAL_TAGS** to false.

```shell script
git tag -s "airflow-ctl/${VERSION}"
```

## Publish documentation

Documentation is an essential part of the product and should be made available to users.
In our case, documentation for the released versions is published in the `live` S3 bucket, and the site is
kept in a separate repository - [`apache/airflow-site`](https://github.com/apache/airflow-site),
but the documentation source code and build tools are available in the `apache/airflow` repository, so
you need to run several workflows to publish the documentation. More details about it can be found in
[Docs README](../docs/README.md) showing the architecture and workflows including manual workflows for
emergency cases.

We have two options for publishing the documentation:

1. Using breeze commands
2. Manually using GitHub Actions

### Using breeze commands

You can use the `breeze` command to publish the documentation.
The command does the following:

1. Triggers [Publish Docs to S3](https://github.com/apache/airflow/actions/workflows/publish-docs-to-s3.yml).
2. Triggers the workflow in apache/airflow-site to refresh.
3. Triggers the S3 to GitHub sync.

```shell script
unset GITHUB_TOKEN
breeze workflow-run publish-docs --ref "airflow-ctl/${VERSION}" --site-env live apache-airflow-ctl
```

The `--ref` parameter should be the tag of the final release you are publishing.
The `--site-env` parameter should be set to `staging` for pre-release versions or `live` for final releases.
The default option is `auto`: if the tag is an rc tag, it publishes to the `staging` bucket, otherwise it
publishes to the `live` bucket.

Other available parameters can be found with:

```shell
breeze workflow-run publish-docs --help
```

### Manually using GitHub Actions

There are two steps to publish the documentation:

1. Publish the documentation to the live S3 bucket.

The release manager publishes the documentation using the GitHub Actions workflow
[Publish Docs to S3](https://github.com/apache/airflow/actions/workflows/publish-docs-to-s3.yml).

After that step, the documentation should be available under the http://airflow.apache.org URL
(also present in the PyPI packages) but stable links and drop-down boxes are not yet updated.

2. Invalidate the Fastly cache, update the version drop-down and stable links with the new versions of the documentation.

Before doing it, review the state of removed, suspended, and new packages in
[the docs index](https://github.com/apache/airflow-site/blob/master/landing-pages/site/content/en/docs/_index.md).
Make sure to use the `main` branch to run the workflow.

```shell script
cd airflow-site
export AIRFLOW_SITE_DIRECTORY="$(pwd -P)"
cd "${AIRFLOW_SITE_DIRECTORY}"
branch="add-documentation-$(date "+%Y-%m-%d%n")"
git checkout -b "${branch}"
git add .
git commit -m "Add documentation for packages - $(date "+%Y-%m-%d%n")"
git push --set-upstream origin "${branch}"
```

Merging the PR with the index changes to `main` will trigger site publishing.

If you do not need to merge a PR, you should manually run the
[Build docs](https://github.com/apache/airflow-site/actions/workflows/build.yml)
workflow in the `airflow-site` repository to refresh indexes and drop-downs.
After the build from the PR or the workflow completes, the new version should be available in the
drop-down list, stable links should be updated, and the Fastly cache will be invalidated.

## Notify developers of release

Notify users@airflow.apache.org (cc'ing dev@airflow.apache.org) that
the artifacts have been published.

Subject:

```
cat <
```

## Send announcements about security issues fixed in the release

The release manager should review the security issues fixed in the release
in the CVE management tool
at https://cveprocess.apache.org/. Then the release manager should announce the issues via the tool.

Once announced, each of the issues should be linked with a 'reference' with tag 'vendor advisory' with the
URL to the announcement published automatically by the CVE management tool.
Note that announce@apache.org is moderated, and the link to the email thread will not be published
immediately; that's why it is recommended to add the link to users@airflow.apache.org, which usually takes
a few seconds to publish after the CVE tool sends it.

The ASF Security team will be notified and will submit to the CVE project and will set the state to 'PUBLIC'.

## Announce about the release in social media

NOTE!

As a rule we announce only new airflow-ctl distributions that were added.
If you believe there is a reason to announce in social media for another case, consult with PMC
members about it.

------------------------------------------------------------------------------------------------------------
Announcement is done from the official Apache Airflow accounts.

* X: https://x.com/ApacheAirflow
* LinkedIn: https://www.linkedin.com/company/apache-airflow/
* Fosstodon: https://fosstodon.org/@airflow
* Bluesky: https://bsky.app/profile/apache-airflow.bsky.social

Make sure to attach the release image generated with Figma to the post.
If you don't have access to the account, ask a PMC member to post.
------------------------------------------------------------------------------------------------------------

## Add release data to Apache Committee Report Helper

Add the release data (version and date) at: https://reporter.apache.org/addrelease.html?airflow

## Close the testing status issue

Don't forget to thank the folks who tested, and close the issue tracking the testing status.

```
Thank you everyone. Airflow-ctl is released.
```

## Remove Airflow-ctl distributions scheduled for removal

If there are Airflow-ctl distributions scheduled for removal, create a PR and merge it to remove them.