Attempt to add Python 3.11 support
Python 3.11 was released as scheduled on October 25, 2022, and this is the
first attempt to see how far Airflow (mostly its dependencies) is from being
ready to officially support 3.11.

So far we have had to exclude the following dependencies and providers:

- [ ] Pyarrow dependency: apache/arrow#14499
- [ ] Google Provider: #27292
  and googleapis/python-bigquery#1386
- [ ] Databricks Provider:
  databricks/databricks-sql-python#59
- [ ] Papermill Provider: nteract/papermill#700
- [ ] Azure Provider: Azure/azure-uamqp-python#334
  and Azure/azure-sdk-for-python#27066
- [ ] Apache Beam Provider: apache/beam#23848
- [ ] Snowflake Provider:
  snowflakedb/snowflake-connector-python#1294
- [ ] JDBC Provider: jpype-project/jpype#1087
- [ ] Hive Provider: cloudera/python-sasl#30

We might eventually decide to release Airflow with Python 3.11 support with
those providers disabled if they keep lagging behind, but for the moment we
want to work with all of the projects in concert so that all providers can be
released (the Google provider requires quite a lot of work, and will likely
need the Google team stepping up and the community helping with the migration
to the latest Google Cloud libraries).
potiuk committed Oct 27, 2022
1 parent 55f8a63 commit 71cbff2
Showing 80 changed files with 2,936 additions and 2,695 deletions.
2 changes: 1 addition & 1 deletion CI.rst
@@ -59,7 +59,7 @@ Container Registry used as cache
We are using GitHub Container Registry to store the results of the ``Build Images``
workflow which is used in the ``Tests`` workflow.

Currently in main version of Airflow we run tests in 4 different versions of Python (3.7, 3.8, 3.9, 3.10)
Currently in main version of Airflow we run tests in those versions of Python (3.7, 3.8, 3.9, 3.10, 3.11)
which means that we have to build 8 images (4 CI ones and 4 PROD ones). Yet we run around 12 jobs
with each of the CI images. That is a lot of time to just build the environment to run. Therefore
we are utilising ``pull_request_target`` feature of GitHub Actions.
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1215,7 +1215,7 @@ ARG ADDITIONAL_PYTHON_DEPS=""
# are compatible with the new protobuf version. All the google python client libraries need
# to be upgraded to >=2.0.0 in order to able to lift that limitation
# https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 pyarrow>=6.0.0 protobuf<4.21.0"
ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 pyarrow>=6.0.0;python_version==\"3.10\" protobuf<4.21.0"

ENV ADDITIONAL_PYTHON_DEPS=${ADDITIONAL_PYTHON_DEPS} \
INSTALL_PACKAGES_FROM_CONTEXT=${INSTALL_PACKAGES_FROM_CONTEXT} \
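The `python_version=="3.10"` suffix added to the pyarrow pin above is a standard PEP 508 environment marker: the requirement is only applied when the marker evaluates to true for the interpreter performing the install, so the pin no longer interferes with dependency resolution on 3.11, where pyarrow does not yet build. A minimal sketch of how such a marker is evaluated, using the `packaging` library (shown for illustration only, not part of this change):

```python
# Evaluate a PEP 508 environment marker the same way pip does.
# Requires the `packaging` library (pip install packaging).
from packaging.markers import Marker

marker = Marker('python_version == "3.10"')

# Evaluates against the running interpreter: True under CPython 3.10,
# False under 3.11 (or any other version).
print(marker.evaluate())

# Markers can also be evaluated against an explicit environment, e.g. to
# check what a different interpreter would do.
print(marker.evaluate({"python_version": "3.11"}))  # -> False
```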
2 changes: 1 addition & 1 deletion Dockerfile.ci
@@ -1142,7 +1142,7 @@ RUN echo "Airflow version: ${AIRFLOW_VERSION}"
# are compatible with the new protobuf version. All the google python client libraries need
# to be upgraded to >= 2.0.0 in order to able to lift that limitation
# https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 pyarrow>=6.0.0 protobuf<4.21.0"
ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 pyarrow>=6.0.0;python_version==\"3.10\" protobuf<4.21.0"
ARG UPGRADE_TO_NEWER_DEPENDENCIES="false"
ENV EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS=${EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS} \
UPGRADE_TO_NEWER_DEPENDENCIES=${UPGRADE_TO_NEWER_DEPENDENCIES}
12 changes: 6 additions & 6 deletions LOCAL_VIRTUALENV.rst
@@ -51,7 +51,7 @@ Required Software Packages
Use system-level package managers like yum, apt-get for Linux, or
Homebrew for macOS to install required software packages:

* Python (One of: 3.7, 3.8, 3.9, 3.10)
* Python (One of: 3.7, 3.8, 3.9, 3.10, 3.11)
* MySQL 5.7+
* libxml

@@ -102,7 +102,7 @@ Creating a Local virtualenv

To use your IDE for Airflow development and testing, you need to configure a virtual
environment. Ideally you should set up virtualenv for all Python versions that Airflow
supports (3.7, 3.8, 3.9, 3.10).
supports (3.7, 3.8, 3.9, 3.10, 3.11).

To create and initialize the local virtualenv:

@@ -122,7 +122,7 @@ To create and initialize the local virtualenv:

.. code-block:: bash
conda create -n airflow python=3.7 # or 3.8, 3.9, 3.10
conda create -n airflow python=3.7 # or 3.8, 3.9, 3.10, 3.11
conda activate airflow
2. Install Python PIP requirements:
@@ -150,7 +150,7 @@ for different python versions). For development on current main source:

.. code-block:: bash
# use the same version of python as you are working with, 3.7, 3.8, 3.9, or 3.10
# use the same version of python as you are working with, 3.7, 3.8, 3.9, 3.10 or 3.11
pip install -e ".[devel,<OTHER EXTRAS>]" \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-source-providers-3.7.txt"
@@ -163,7 +163,7 @@ You can also install Airflow in non-editable mode:

.. code-block:: bash
# use the same version of python as you are working with, 3.7, 3.8, 3.9, or 3.10
# use the same version of python as you are working with, 3.7, 3.8, 3.9, 3.10 or 3.11
pip install ".[devel,<OTHER EXTRAS>]" \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-source-providers-3.7.txt"
@@ -173,7 +173,7 @@ sources, unless you set ``INSTALL_PROVIDERS_FROM_SOURCES`` environment variable

.. code-block:: bash
# use the same version of python as you are working with, 3.7, 3.8, 3.9, or 3.10
# use the same version of python as you are working with, 3.7, 3.8, 3.9, 3.10 or 3.11
INSTALL_PROVIDERS_FROM_SOURCES="true" pip install ".[devel,<OTHER EXTRAS>]" \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-source-providers-3.7.txt"
2 changes: 1 addition & 1 deletion README.md
@@ -87,7 +87,7 @@ Apache Airflow is tested with:

| | Main version (dev) | Stable version (2.4.2) |
|---------------------|------------------------------|------------------------------|
| Python | 3.7, 3.8, 3.9, 3.10 | 3.7, 3.8, 3.9, 3.10 |
| Python | 3.7, 3.8, 3.9, 3.10, 3.11 | 3.7, 3.8, 3.9, 3.10, 3.11 |
| Platform | AMD64/ARM64(\*) | AMD64/ARM64(\*) |
| Kubernetes | 1.21, 1.22, 1.23, 1.24, 1.25 | 1.21, 1.22, 1.23, 1.24, 1.25 |
| PostgreSQL | 10, 11, 12, 13, 14 | 10, 11, 12, 13, 14 |
5 changes: 5 additions & 0 deletions airflow/providers/apache/beam/provider.yaml
@@ -38,6 +38,11 @@ dependencies:
- apache-airflow>=2.3.0
- apache-beam>=2.39.0

# Excluded because apache-beam and a number of dependent libraries are not
# compatible with 3.11 yet. See https://github.com/apache/beam/issues/23848
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Apache Beam
external-doc-url: https://beam.apache.org/
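The `excluded-python-versions` list added to each affected provider.yaml gives the tooling a declarative way to skip building and releasing a provider on interpreters it cannot support yet. A minimal sketch of how such a field could be consumed; `provider_supports_python` is a hypothetical helper written for illustration, not the actual Airflow build code:

```python
# Hypothetical helper: decide whether a provider can be built for a given
# Python version, based on the `excluded-python-versions` field shown above.
from __future__ import annotations

import sys

import yaml  # PyYAML


def provider_supports_python(provider_yaml_path: str, python_version: str | None = None) -> bool:
    """Return False when the provider.yaml excludes the given Python version."""
    if python_version is None:
        python_version = f"{sys.version_info.major}.{sys.version_info.minor}"
    with open(provider_yaml_path) as f:
        provider = yaml.safe_load(f)
    excluded = provider.get("excluded-python-versions", [])
    return python_version not in excluded


# Example: the Beam provider would be skipped on 3.11.
if not provider_supports_python("airflow/providers/apache/beam/provider.yaml", "3.11"):
    print("Skipping apache.beam provider: excluded on this Python version")
```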
5 changes: 5 additions & 0 deletions airflow/providers/apache/hive/provider.yaml
@@ -53,6 +53,11 @@ dependencies:
- sasl>=0.3.1; python_version>="3.9"
- thrift>=0.9.2

# Excluded because SASL is not yet compatible
# with 3.11. See https://github.com/cloudera/python-sasl/issues/30
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Apache Hive
external-doc-url: https://hive.apache.org/
6 changes: 6 additions & 0 deletions airflow/providers/databricks/provider.yaml
@@ -46,6 +46,12 @@ dependencies:
- databricks-sql-connector>=2.0.0, <3.0.0
- aiohttp>=3.6.3, <4

# Excluded because databricks-sql-connector and a number of dependent libraries are not compatible
# with 3.11 yet See https://pyreadiness.org/3.11/. (Pyarrow is one of the important ones)
# See https://github.com/databricks/databricks-sql-python/issues/59 for issue that tracks it
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Databricks
external-doc-url: https://databricks.com/
6 changes: 6 additions & 0 deletions airflow/providers/google/provider.yaml
@@ -117,6 +117,12 @@ dependencies:
# Google bigtable client require protobuf <= 3.20.0. We can remove the limitation
# when this limitation is removed
- protobuf<=3.20.0
# Excluded because bigquery and a number of dependent libraries are not compatible with 3.11 yet
# See https://pyreadiness.org/3.11/. Also some of Google libraries are in pretty ancient
# version (pre 2.0.0) and they MUST be migrated in order to add 3.11 support
# This is tracked in https://github.com/apache/airflow/issues/27292
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Google Analytics360
5 changes: 5 additions & 0 deletions airflow/providers/jdbc/provider.yaml
@@ -40,6 +40,11 @@ dependencies:
- apache-airflow-providers-common-sql>=1.3.0
- jaydebeapi>=1.1.1

# Excluded because Jpype1 is not yet compatible
# with 3.11 yet. See https://github.com/jpype-project/jpype/pull/1087
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Java Database Connectivity (JDBC)
external-doc-url: https://docs.oracle.com/javase/8/docs/technotes/guides/jdbc/
8 changes: 8 additions & 0 deletions airflow/providers/microsoft/azure/provider.yaml
@@ -65,6 +65,14 @@ dependencies:
- azure-servicebus>=7.6.1; platform_machine != "aarch64"
- azure-synapse-spark

# Excluded because azure-uamqp-python and azure-core and a number of dependent libraries are not compatible
# with 3.11 yet See https://pyreadiness.org/3.11/.
# See for issues that track it:
# * https://github.com/Azure/azure-uamqp-python/issues/334
# * https://github.com/Azure/azure-sdk-for-python/issues/27066
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Microsoft Azure Batch
external-doc-url: https://azure.microsoft.com/en-us/services/batch/
7 changes: 7 additions & 0 deletions airflow/providers/papermill/provider.yaml
@@ -39,6 +39,13 @@ dependencies:
- papermill[all]>=1.2.1
- scrapbook[all]

# Excluded because papermill and a number of dependent libraries are not compatible
# with 3.11 yet See https://pyreadiness.org/3.11/. (Pyarrow is one of the important ones)
# See for issue that tracks it https://github.com/nteract/papermill/issues/700
excluded-python-versions:
- "3.11"


integrations:
- integration-name: Papermill
external-doc-url: https://github.com/nteract/papermill
7 changes: 7 additions & 0 deletions airflow/providers/snowflake/provider.yaml
@@ -50,6 +50,13 @@ dependencies:
- snowflake-connector-python>=2.4.1
- snowflake-sqlalchemy>=1.1.0

# Excluded because snowflake connector python and a number of dependent libraries are not
# compatible with 3.11 yet. See https://pyreadiness.org/3.11/. One of the important dependencies
# is Pyarrow: https://github.com/apache/arrow/pull/14499 but the
# https://github.com/snowflakedb/snowflake-connector-python/issues/1289 is the issue
excluded-python-versions:
- "3.11"

integrations:
- integration-name: Snowflake
external-doc-url: https://snowflake.com/
2 changes: 1 addition & 1 deletion dev/README_RELEASE_AIRFLOW.md
@@ -986,7 +986,7 @@ the older branches, you should set the "skip" field to true.
## Verify production images
```shell script
for PYTHON in 3.7 3.8 3.9 3.10
for PYTHON in 3.7 3.8 3.9 3.10 3.11
do
docker pull apache/airflow:${VERSION}-python${PYTHON}
breeze prod-image verify --image-name apache/airflow:${VERSION}-python${PYTHON}
4 changes: 2 additions & 2 deletions dev/breeze/SELECTIVE_CHECKS.md
@@ -109,8 +109,8 @@ The selective check outputs available are described below:

| Output | Meaning of the output | Example value |
|------------------------------------|--------------------------------------------------------------------------------------------------------|---------------------------------------------------------------|
| all-python-versions | List of all python versions there are available in the form of JSON array | ['3.7', '3.8', '3.9', '3.10'] |
| all-python-versions-list-as-string | List of all python versions there are available in the form of space separated string | 3.7 3.8 3.9 3.10 |
| all-python-versions | List of all python versions there are available in the form of JSON array | ['3.7', '3.8', '3.9', '3.10', '3.11'] |
| all-python-versions-list-as-string | List of all python versions there are available in the form of space separated string | 3.7 3.8 3.9 3.10 3.11 |
| basic-checks-only | Whether to run all static checks ("false") or only basic set of static checks ("true") | false |
| cache-directive | Which cache should be be used for images ("registry", "local" , "disabled") | registry |
| debug-resources | Whether resources usage should be printed during parallel job execution ("true"/ "false") | false |
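The example value for `all-python-versions` is shown with single quotes (a Python-style list literal) rather than strict JSON; if that is the actual output format, `ast.literal_eval` is a convenient way to parse it, while the space-separated variant suits shell loops. A small parsing sketch, assuming the formats shown in the table above:

```python
# Parse the two representations of the selective-check Python version outputs.
import ast

all_python_versions = "['3.7', '3.8', '3.9', '3.10', '3.11']"
as_string = "3.7 3.8 3.9 3.10 3.11"

# ast.literal_eval accepts the single-quoted list form that json.loads rejects.
versions_from_list = ast.literal_eval(all_python_versions)
versions_from_string = as_string.split()

assert versions_from_list == versions_from_string == ["3.7", "3.8", "3.9", "3.10", "3.11"]
print(versions_from_list)
```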
4 changes: 2 additions & 2 deletions dev/breeze/src/airflow_breeze/global_constants.py
@@ -37,7 +37,7 @@
APACHE_AIRFLOW_GITHUB_REPOSITORY = "apache/airflow"

# Checked before putting in build cache
ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS = ["3.7", "3.8", "3.9", "3.10"]
ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]
DEFAULT_PYTHON_MAJOR_MINOR_VERSION = ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS[0]
ALLOWED_ARCHITECTURES = [Architecture.X86_64, Architecture.ARM]
ALLOWED_BACKENDS = ["sqlite", "mysql", "postgres", "mssql"]
@@ -153,7 +153,7 @@ def get_default_platform_machine() -> str:
PYTHONDONTWRITEBYTECODE = True

PRODUCTION_IMAGE = False
ALL_PYTHON_MAJOR_MINOR_VERSIONS = ["3.7", "3.8", "3.9", "3.10"]
ALL_PYTHON_MAJOR_MINOR_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]
CURRENT_PYTHON_MAJOR_MINOR_VERSIONS = ALL_PYTHON_MAJOR_MINOR_VERSIONS
CURRENT_POSTGRES_VERSIONS = ["10", "11", "12", "13", "14"]
DEFAULT_POSTGRES_VERSION = CURRENT_POSTGRES_VERSIONS[0]
2 changes: 1 addition & 1 deletion dev/breeze/src/airflow_breeze/utils/selective_checks.py
@@ -360,7 +360,7 @@ def mysql_exclude(self) -> list[dict[str, str]]:

@cached_property
def sqlite_exclude(self) -> list[dict[str, str]]:
return [{"python-version": "3.9"}] if self._full_tests_needed else []
return [{"python-version": "3.11"}] if self._full_tests_needed else []

@cached_property
def kubernetes_versions(self) -> list[str]:
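The dictionaries returned by `sqlite_exclude` (and its MySQL/Postgres counterparts) are presumably turned into matrix exclusions for the test workflow, so that one Python version is skipped per backend when full tests are needed and the job count stays manageable. A small illustration of the pruning effect, with made-up matrix values and a matching rule modelled on how GitHub Actions `exclude` entries behave:

```python
# Illustration of how matrix `exclude` entries prune job combinations.
# The values below are examples only; the real matrix is generated by breeze.
from itertools import product

python_versions = ["3.7", "3.8", "3.9", "3.10", "3.11"]
backends = ["sqlite", "mysql", "postgres", "mssql"]
excludes = [{"python-version": "3.11", "backend": "sqlite"}]

jobs = [{"python-version": py, "backend": be} for py, be in product(python_versions, backends)]

# Drop a job when every key/value pair of some exclude entry matches it.
kept = [
    job
    for job in jobs
    if not any(all(job.get(k) == v for k, v in ex.items()) for ex in excludes)
]
print(f"{len(jobs)} combinations, {len(kept)} kept after exclusions")  # 20 -> 19
```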
4 changes: 2 additions & 2 deletions dev/breeze/tests/test_cache.py
@@ -36,8 +36,8 @@
[
("backend", "mysql", (True, ["sqlite", "mysql", "postgres", "mssql"]), None),
("backend", "xxx", (False, ["sqlite", "mysql", "postgres", "mssql"]), None),
("python_major_minor_version", "3.8", (True, ["3.7", "3.8", "3.9", "3.10"]), None),
("python_major_minor_version", "3.5", (False, ["3.7", "3.8", "3.9", "3.10"]), None),
("python_major_minor_version", "3.8", (True, ["3.7", "3.8", "3.9", "3.10", "3.11"]), None),
("python_major_minor_version", "3.5", (False, ["3.7", "3.8", "3.9", "3.10", "3.11"]), None),
("missing", "value", None, AttributeError),
],
)
40 changes: 20 additions & 20 deletions dev/breeze/tests/test_selective_checks.py
@@ -206,8 +206,8 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
pytest.param(
("setup.py",),
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"needs-helm-tests": "true",
"run-tests": "true",
@@ -223,8 +223,8 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
pytest.param(
("generated/provider_dependencies.json",),
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"needs-helm-tests": "true",
"run-tests": "true",
@@ -260,8 +260,8 @@ def test_expected_output_pull_request_main(
("full tests needed",),
"main",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"run-tests": "true",
"docs-build": "true",
@@ -280,8 +280,8 @@ def test_expected_output_pull_request_main(
),
"main",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"run-tests": "true",
"docs-build": "true",
@@ -298,8 +298,8 @@ def test_expected_output_pull_request_main(
("full tests needed",),
"main",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"run-tests": "true",
"docs-build": "true",
@@ -316,8 +316,8 @@ def test_expected_output_pull_request_main(
("full tests needed",),
"v2-3-stable",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"run-tests": "true",
"docs-build": "true",
@@ -513,8 +513,8 @@ def test_expected_output_pull_request_target(
(),
"main",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"needs-helm-tests": "true",
"run-tests": "true",
@@ -529,8 +529,8 @@ def test_expected_output_pull_request_target(
(),
"v2-3-stable",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"needs-helm-tests": "false",
"run-tests": "true",
@@ -546,8 +546,8 @@ def test_expected_output_pull_request_target(
(),
"main",
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"needs-helm-tests": "true",
"run-tests": "true",
@@ -595,8 +595,8 @@ def test_no_commit_provided_trigger_full_build_for_any_event_type(github_event):
)
assert_outputs_are_printed(
{
"all-python-versions": "['3.7', '3.8', '3.9', '3.10']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10",
"all-python-versions": "['3.7', '3.8', '3.9', '3.10', '3.11']",
"all-python-versions-list-as-string": "3.7 3.8 3.9 3.10 3.11",
"image-build": "true",
"needs-helm-tests": "true",
"run-tests": "true",
2 changes: 1 addition & 1 deletion dev/provider_packages/prepare_provider_packages.py
@@ -53,7 +53,7 @@
from rich.syntax import Syntax
from yaml import safe_load

ALL_PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10"]
ALL_PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]

INITIAL_CHANGELOG_CONTENT = """
2 changes: 1 addition & 1 deletion dev/retag_docker_images.py
@@ -31,7 +31,7 @@

import rich_click as click

PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10"]
PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]

GHCR_IO_PREFIX = "ghcr.io"

2 changes: 1 addition & 1 deletion docs/apache-airflow/installation/prerequisites.rst
@@ -20,7 +20,7 @@ Prerequisites

Starting with Airflow 2.3.0, Airflow is tested with:.

* Python: 3.7, 3.8, 3.9, 3.10
* Python: 3.7, 3.8, 3.9, 3.10, 3.11

* Databases:

2 changes: 1 addition & 1 deletion docs/apache-airflow/upgrading-from-1-10/index.rst
@@ -34,7 +34,7 @@ Airflow 1.10 was the last release series to support Python 2. Airflow 2.0.0
requires Python 3.6+ and has been tested with Python versions 3.6, 3.7 and 3.8.
Python 3.9 support was added from Airflow 2.1.2.

Airflow 2.3.0 dropped support for Python 3.6. It's tested with Python 3.7, 3.8, 3.9 and 3.10.
Airflow 2.3.0 dropped support for Python 3.6. It's tested with Python 3.7, 3.8, 3.9, 3.10, 3.11.

If you have a specific task that still requires Python 2 then you can use the ``@task.virtualenv``, ``@task.docker`` or ``@task.kubernetes`` decorators for this.
