64 changes: 49 additions & 15 deletions INSTALL
@@ -47,6 +47,21 @@ you get all sources in one place. This is the most convenient way to develop Air
Otherwise, you have to install Airflow and Providers separately from sources in the same environment, which
is not as convenient.


Content of the source archive
-----------------------------

The archive contains a complete snapshot of the whole "apache-airflow" repository, including all
distributions that can be built from the sources: the `apache-airflow` meta-distribution (which you can use
to install all other distributions), `apache-airflow-core`, `apache-airflow-task-sdk`, `apache-airflow-ctl`,
`apache-airflow-go-sdk`, more than 90 `apache-airflow-providers-<provider>` distributions, and
all the other distributions that are part of the monorepo and are needed to build the other packages and
their documentation. The archive also allows you to run tests for all those distributions.

We are using `uv` and workspace tooling to build and manage the packages together - see below for more details.
Whatever distribution you choose to install, you need to locate the right `pyproject.toml` file in the
repository; this is the file you should use to build the distribution you need.
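
For example, a minimal sketch of building a single distribution from its own folder (the folder name and
the use of `uv build` here are assumptions - the Hatch workflow described later in this document is an
alternative):

    cd airflow-core   # folder containing the pyproject.toml of apache-airflow-core
    uv build          # builds wheel and sdist into the local dist/ folder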

Using ``uv`` to manage your Python, virtualenvs, and install airflow for development (recommended)
==================================================================================================

@@ -62,6 +77,7 @@ Installing ``uv``

You can install uv following the instructions: https://docs.astral.sh/uv/getting-started/installation/
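
For example, a minimal sketch using the standalone installer from the linked instructions (package managers
such as Homebrew or ``pipx`` are common alternatives):

    curl -LsSf https://astral.sh/uv/install.sh | sh
    uv --version   # confirm the installation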


Using ``uv`` to manage your project dependencies
------------------------------------------------

@@ -107,8 +123,8 @@ You can run any command in the virtual environment created by `uv` by prefixing
This will automatically synchronize your environment with the latest dependencies needed.
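
For example (a sketch only - the exact commands and test paths depend on what you are working on):

    uv run airflow version                 # run the airflow CLI inside the managed virtualenv
    uv run pytest airflow-core/tests       # assumed test location - adjust to the tests you need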


Compiling front-end assets
--------------------------
Compiling front-end assets for Airflow Core
-------------------------------------------

In order to see the UI in Airflow, you need to compile the front-end assets first.

@@ -142,12 +158,25 @@ it in the UI) that should contain the git commit hash of the build and it will g
The result of this command is an airflow sdist package built in the `dist` folder of the `airflow-core`
package as well. It requires ``prek`` to be installed on your system.

There are also similar ``prek`` commands for other packages in the repository - for example:

```
compile-edge-assets -- Compile Edge provider assets
compile-fab-assets -- Compile FAB provider assets
```

However, the compiled, generated assets for those are checked into the repository and you do not need to
compile them manually before building the packages - you only need to do it when you modify the original
UI files for those packages.
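
A minimal sketch of running those hooks manually, assuming ``prek`` follows the usual pre-commit-style
``run <hook-id>`` interface:

    prek run compile-edge-assets --all-files
    prek run compile-fab-assets --all-files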

Using pip and manually managing your virtualenv
===============================================

While `uv` manages dependencies and venv automatically you might want to manage both manually with
pip and virtualenv. You need to have Python installed in your preferred way for that to work. It is also
way slower than with `uv` and you need to manage your environment manually.
While `uv` manages dependencies, the venv and the workspace automatically, you might want
to manage both manually with pip and virtualenv. You need to have Python installed in your preferred
way for that to work. It is also way slower than with `uv` and you need to manage your environment manually,
and sometimes install several distributions together to make tests and documentation work - emulating what
the workspace tooling does automatically.
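
For example, after creating and activating a virtualenv as described below, a sketch of installing several
distributions together in editable mode (the folder names follow the repository layout referenced in this
document - install only the ones you actually need):

    pip install -e ./airflow-core -e ./task-sdk -e ./airflow-ctl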

Creating virtualenv
-------------------
@@ -193,16 +222,19 @@ It's important to keep your hatch up to date. You can do this by running:
uv tool upgrade hatch


Using Hatch to build your packages
----------------------------------
Using Hatch to build packages
-----------------------------

You can use Hatch to build installable packages from the Airflow sources. Such a package will
include all metadata configured in `pyproject.toml` and will be installable with ``pip`` and any other
PEP-compliant packaging front-end. You can run those commands in:

* root folder of the repository to build "meta" airflow package that install other packages
* `airflow-core` folder to build the core airflow package
* any of the `providers` folders that has a pyproject.toml file to build the provider package
* root folder of the repository to build "meta" airflow distribution that install other distribution
* `airflow-core` folder to build the core airflow distribution
* any of the `providers` folders that has a pyproject.toml file to build the provider distribution
* task-sdk to build the task-sdk distribution
* airflow-ctl to build the airflow-ctl distribution
* task-go-sdk to build the task-go-sdk distribution

The packages will have pre-installed dependencies for providers that are available when Airflow is
installed from PyPI. Both `wheel` and `sdist` packages are built by default.
@@ -214,11 +246,13 @@ You can also build only `wheel` or `sdist` packages:
hatch build -t wheel
hatch build -t sdist
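
For example, a sketch of building a single provider distribution from its folder (the `providers/google`
path is an assumption about the monorepo layout - use whichever provider folder contains the
`pyproject.toml` you need):

    cd providers/google
    hatch build -t wheel -t sdist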

In the `airflow-core` folder, you can also build the package with the `custom` target that will clean
the build directory, update the `git_version` file, and build the assets:
In the `airflow-core` folder, you should also build the package with the `custom` target, which will clean
the build directory, update the `git_version` file, and build the assets (in case you have not already built
them manually with `prek`):

hatch build -t custom -t wheel -t sdist


Installing recommended version of dependencies
==============================================

Expand All @@ -231,15 +265,15 @@ that are used in main CI tests and by other contributors.
There are different constraint files for different Python versions. For example, this command will install
all basic devel requirements and the requirements of the Google provider as last successfully tested for Python 3.10:

uv pip install -e ".[devel,google]"" \
pip install -e ".[devel,google]" \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.10.txt"

Using the 'constraints-no-providers' constraint files, you can upgrade Airflow without paying attention to providers' dependencies. This allows you to keep installed provider dependencies and install the latest supported Airflow core.

uv pip install -e ".[devel]" \
pip install -e ".[devel]" \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-no-providers-3.10.txt"

Note that you can also use `pip install` if you do not use `uv`.
Note that you can also use `uv pip install` if you use `uv`.

Airflow extras
==============
7 changes: 7 additions & 0 deletions airflow-core/docs/conf.py
@@ -244,13 +244,20 @@ def add_airflow_core_exclude_patterns_to_sphinx(exclude_patterns: list[str]):
config_descriptions = retrieve_configuration_description(include_providers=False)
configs, deprecated_options = get_configs_and_deprecations(airflow_version, config_descriptions)

# TODO: remove it when we start releasing task-sdk separately from airflow-core
airflow_version_split = PACKAGE_VERSION.split(".")
TASK_SDK_VERSION = f"1.{airflow_version_split[1]}.{airflow_version_split[2]}"

jinja_contexts = {
"config_ctx": {"configs": configs, "deprecated_options": deprecated_options},
"quick_start_ctx": {"doc_root_url": f"https://airflow.apache.org/docs/apache-airflow/{PACKAGE_VERSION}/"},
"official_download_page": {
"base_url": f"https://downloads.apache.org/airflow/{PACKAGE_VERSION}",
"base_url_task_sdk": f"https://downloads.apache.org/airflow/task-sdk/{TASK_SDK_VERSION}",
"closer_lua_url": f"https://www.apache.org/dyn/closer.lua/airflow/{PACKAGE_VERSION}",
"closer_lua_url_task_sdk": f"https://www.apache.org/dyn/closer.lua/airflow/task-sdk/{TASK_SDK_VERSION}",
"airflow_version": PACKAGE_VERSION,
"task_sdk_version": TASK_SDK_VERSION,
},
}

13 changes: 7 additions & 6 deletions airflow-core/docs/installation/installing-from-sources.rst
@@ -23,8 +23,7 @@ Released packages

.. jinja:: official_download_page

This page describes downloading and verifying Airflow® version
``{{ airflow_version }}`` using officially released packages.
This page describes downloading and verifying Airflow® version ``|version|`` using officially released packages.
You can also install ``Apache Airflow`` - as most Python packages - via :doc:`PyPI <installing-from-pypi>`.
You can choose a different version of Airflow by selecting it from the drop-down at
the top-left of the page.
@@ -46,10 +45,12 @@ The |version| downloads of Airflow® are available at:
.. jinja:: official_download_page

* `Sources package for airflow <{{ closer_lua_url }}/apache_airflow-{{ airflow_version }}-source.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow-{{ airflow_version }}-source.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow-{{ airflow_version }}-source.tar.gz.sha512>`__)
* `Sdist package for airflow meta package <{{ closer_lua_url }}/apache_airflow-{{ airflow_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow-{{ airflow_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow-{{ airflow_version }}.tar.gz.sha512>`__)
* `Whl package for airflow meta package <{{ closer_lua_url }}/apache_airflow-{{ airflow_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow-{{ airflow_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow-{{ airflow_version }}-py3-none-any.whl.sha512>`__)
* `Sdist package for airflow core package <{{ closer_lua_url }}/apache_airflow_core-{{ airflow_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow_core-{{ airflow_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow_core-{{ airflow_version }}.tar.gz.sha512>`__)
* `Whl package for airflow core package <{{ closer_lua_url }}/apache_airflow_core-{{ airflow_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow_core-{{ airflow_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow_core-{{ airflow_version }}-py3-none-any.whl.sha512>`__)
* `Sdist package for airflow meta distribution <{{ closer_lua_url }}/apache_airflow-{{ airflow_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow-{{ airflow_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow-{{ airflow_version }}.tar.gz.sha512>`__)
* `Whl package for airflow meta distribution <{{ closer_lua_url }}/apache_airflow-{{ airflow_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow-{{ airflow_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow-{{ airflow_version }}-py3-none-any.whl.sha512>`__)
* `Sdist package for airflow core distribution <{{ closer_lua_url }}/apache_airflow_core-{{ airflow_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow_core-{{ airflow_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow_core-{{ airflow_version }}.tar.gz.sha512>`__)
* `Whl package for airflow core distribution <{{ closer_lua_url }}/apache_airflow_core-{{ airflow_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow_core-{{ airflow_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow_core-{{ airflow_version }}-py3-none-any.whl.sha512>`__)
* `Sdist package for airflow task-sdk distribution <{{ closer_lua_url_task_sdk }}/apache_airflow_task_sdk-{{ task_sdk_version }}.tar.gz>`__ (`asc <{{ base_url_task_sdk }}/apache_airflow_task_sdk-{{ task_sdk_version }}.tar.gz.asc>`__, `sha512 <{{ base_url_task_sdk }}/apache_airflow_task_sdk-{{ task_sdk_version }}.tar.gz.sha512>`__)
* `Whl package for airflow task-sdk distribution <{{ closer_lua_url_task_sdk }}/apache_airflow_task_sdk-{{ task_sdk_version }}-py3-none-any.whl>`__ (`asc <{{ base_url_task_sdk }}/apache_airflow_task_sdk-{{ task_sdk_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url_task_sdk }}/apache_airflow_task_sdk-{{ task_sdk_version }}-py3-none-any.whl.sha512>`__)

If you want to install from the source code, you can download it from the sources link above; it will contain
an ``INSTALL`` file with details on how you can build and install Airflow.
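
A minimal sketch of verifying a downloaded package, assuming you have already imported the Airflow release
signing keys and replacing ``<version>`` with the version you actually downloaded::

    gpg --verify apache_airflow_core-<version>.tar.gz.asc apache_airflow_core-<version>.tar.gz
    shasum -a 512 apache_airflow_core-<version>.tar.gz   # compare the output with the published .sha512 file
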
12 changes: 9 additions & 3 deletions airflow-ctl/docs/conf.py
@@ -207,12 +207,18 @@ def add_airflow_ctl_exclude_patterns_to_sphinx(exclude_patterns: list[str]):
"config_ctx": {"configs": configs, "deprecated_options": deprecated_options},
"quick_start_ctx": {"doc_root_url": f"https://airflow.apache.org/docs/apache-airflow/{PACKAGE_VERSION}/"},
"official_download_page": {
"base_url": f"https://downloads.apache.org/airflow/{PACKAGE_VERSION}",
"closer_lua_url": f"https://www.apache.org/dyn/closer.lua/airflow/{PACKAGE_VERSION}",
"airflow_version": PACKAGE_VERSION,
"base_url": f"https://downloads.apache.org/airflow/airflow-ctl/{PACKAGE_VERSION}",
"closer_lua_url": f"https://www.apache.org/dyn/closer.lua/airflow/airflow-ctl/{PACKAGE_VERSION}",
"airflowctl_version": PACKAGE_VERSION,
},
}

# Used to generate rst_epilog and other post-generation substitutions
global_substitutions = {
    "version": PACKAGE_VERSION,
    "experimental": "This is an :ref:`experimental feature <experimental>`.",
}

# -- Options for sphinx.ext.autodoc --------------------------------------------
# See: https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html

22 changes: 6 additions & 16 deletions airflow-ctl/docs/installation/installing-from-sources.rst
@@ -23,9 +23,8 @@ Released packages

.. jinja:: official_download_page

This page describes downloading and verifying Airflow® version
``{{ airflow_version }}`` using officially released packages.
You can also install ``Apache airflowctl`` - as most Python packages - via :doc:`PyPI <installing-from-pypi>`.
This page describes downloading and verifying Airflow Ctl version ``|version|`` using officially released packages.
You can also install ``airflowctl`` - as most Python packages - via :doc:`PyPI <installing-from-pypi>`.
You can choose a different version of Airflow by selecting it from the drop-down at
the top-left of the page.

@@ -34,22 +33,13 @@ can use if you want to verify the origin of the packages and want to verify chec
the packages. The packages are available via the
`Official Apache Software Foundations Downloads <https://dlcdn.apache.org/>`_

As of version 2.8 Airflow follows PEP 517/518 and uses ``pyproject.toml`` file to define build dependencies
and build process and it requires relatively modern versions of packaging tools to get airflow built from
local sources or ``sdist`` packages, as PEP 517 compliant build hooks are used to determine dynamic build
dependencies. In case of ``pip`` it means that at least version 22.1.0 is needed (released at the beginning of
2022) to build or install Airflow from sources. This does not affect the ability of installing Airflow from
released wheel packages.

The |version| downloads of airflowctl are available at:
The {{ airflowctl_version }} downloads of Airflow Ctl are available at:

.. jinja:: official_download_page

* `Sources package for airflow <{{ closer_lua_url }}/apache-airflow-ctl-{{ airflowctl_version }}-source.tar.gz>`__ (`asc <{{ base_url }}/apache-airflow-ctl-{{ airflowctl_version }}-source.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache-airflow-ctl-{{ airflowctl_version }}-source.tar.gz.sha512>`__)
* `Sdist package for airflow meta package <{{ closer_lua_url }}/apache-airflow-ctl-{{ airflowctl_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache-airflow-ctl-{{ airflowctl_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache-airflow-ctl-{{ airflowctl_version }}.tar.gz.sha512>`__)
* `Whl package for airflow meta package <{{ closer_lua_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl.sha512>`__)
* `Sdist package for airflow core package <{{ closer_lua_url }}/apache-airflow_ctl-{{ airflowctl_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache-airflow_ctl-{{ airflowctl_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache-airflow_ctl-{{ airflowctl_version }}.tar.gz.sha512>`__)
* `Whl package for airflow core package <{{ closer_lua_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl.sha512>`__)
* `Sources package for airflow-ctl <{{ closer_lua_url }}/apache_airflow_ctl-{{ airflowctl_version }}-source.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-source.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-source.tar.gz.sha512>`__)
* `Sdist package for airflow-ctl distribution <{{ closer_lua_url }}/apache_airflow_ctl-{{ airflowctl_version }}.tar.gz>`__ (`asc <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}.tar.gz.asc>`__, `sha512 <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}.tar.gz.sha512>`__)
* `Whl package for airflow-ctl distribution <{{ closer_lua_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl>`__ (`asc <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl.asc>`__, `sha512 <{{ base_url }}/apache_airflow_ctl-{{ airflowctl_version }}-py3-none-any.whl.sha512>`__)

If you want to install from the source code, you can download it from the sources link above; it will contain
an ``INSTALL`` file with details on how you can build and install airflowctl.
4 changes: 2 additions & 2 deletions dev/README_RELEASE_AIRFLOWCTL.md
@@ -768,8 +768,8 @@ SOURCE_DIR="${ASF_DIST_PARENT}/asf-dist/dev/airflow/airflow-ctl"
# Create airflow-ctl folder if it does not exist
# All latest releases are kept in this one folder without version sub-folder
cd "${ASF_DIST_PARENT}/asf-dist/release/airflow"
mkdir -pv airflow-ctl
cd airflow-ctl
mkdir -pv airflow-ctl/${VERSION}
cd airflow-ctl/${VERSION}

# Copy your airflow-ctl with the target name to dist directory and to SVN
rm -rf "${AIRFLOW_REPO_ROOT}"/dist/*
@@ -56,14 +56,23 @@
</div>
</div>

<div class="row">
<div class="col">
<h2><a href="/docs/task-sdk/stable/index.html">Task SDK</a></h2>
<p>
Task-SDK interface that is used to communicate with airflow core from other components.
</p>
</div>
</div>
<div class="row">
<div class="col">
<h2><a href="/docs/apache-airflow-ctl/stable/index.html"><code>apache-airflow-ctl</code></a></h2>
<p>
Apache Airflow CTL, which is a remote CLI for Airflow.
</p>
</div>
</div>

<div class="row">
<div class="col">
<h2><a href="/docs/task-sdk/stable/index.html">Task SDK</a></h2>
<p>
Task-SDK interface that is used to communicate with airflow core from other components.
</p>
</div>
</div>

<div class="row">
<div class="col-md order-md-1">
1 change: 1 addition & 0 deletions docs/spelling_wordlist.txt
@@ -366,6 +367,7 @@ csrf
CSRFProtect
css
csv
Ctl
ctor
Ctrl
cubeName