
Commit

…16692-show-cron-schedule-description-in-ui-v2
pateash committed Jul 15, 2021
2 parents acdd561 + 2ce6e8d commit 070f641
Showing 160 changed files with 1,515 additions and 542 deletions.
7 changes: 7 additions & 0 deletions .asf.yaml
@@ -53,3 +53,10 @@ github:
v2-1-stable:
required_pull_request_reviews:
required_approving_review_count: 1

+ collaborators:
+ - bbovenzi
+ - auvipy
+ - paolaperaza
+ - petedejoy
+ - gmcdonald
2 changes: 1 addition & 1 deletion .github/workflows/stale.yml
@@ -29,7 +29,7 @@ jobs:
stale:
runs-on: ubuntu-20.04
steps:
- - uses: actions/stale@v3
+ - uses: actions/stale@v4
with:
stale-pr-message: >
This pull request has been automatically marked as stale because it has not had
12 changes: 6 additions & 6 deletions .pre-commit-config.yaml
@@ -143,7 +143,7 @@ repos:
files: \.mermaid$
exclude: ^airflow/_vendor/
- repo: https://github.com/thlorenz/doctoc.git
- rev: v2.0.0
+ rev: v2.0.1
hooks:
- id: doctoc
name: Add TOC for md files
@@ -154,7 +154,7 @@
- "--maxlevel"
- "2"
- repo: https://github.com/psf/black
- rev: 21.5b1
+ rev: 21.6b0
hooks:
- id: black
args: [--config=./pyproject.toml]
@@ -164,7 +164,7 @@
hooks:
- id: blacken-docs
alias: black
- additional_dependencies: [black==21.5b1]
+ additional_dependencies: [black==21.6b0]
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.0.1
hooks:
@@ -196,13 +196,13 @@ repos:
files: ^chart/values\.schema\.json$|^chart/values_schema\.schema\.json$
pass_filenames: true
- repo: https://github.com/asottile/pyupgrade
- rev: v2.19.0
+ rev: v2.21.0
hooks:
- id: pyupgrade
args: ["--py36-plus"]
exclude: ^airflow/_vendor/
- repo: https://github.com/pre-commit/pygrep-hooks
- rev: v1.8.0
+ rev: v1.9.0
hooks:
- id: rst-backticks
exclude: ^airflow/_vendor/
@@ -218,7 +218,7 @@
exclude:
^.*init_git_sync\.template\.yaml$|^.*airflow\.template\.yaml$|^chart/(?:templates|files)/.*\.yaml$|^airflow/_vendor/
- repo: https://github.com/timothycrosley/isort
- rev: 5.8.0
+ rev: 5.9.2
hooks:
- id: isort
name: Run isort to sort imports in Python files
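The `rev` bumps in this file follow pre-commit's pinning model: every hook repo is pinned to an exact tag, so upgrading a tool is a one-line change. A minimal illustrative fragment (shape only, not the full Airflow config) using the black repo and tag from the hunk above:

```yaml
# Illustrative .pre-commit-config.yaml fragment: each hook repo is pinned
# to an exact tag via `rev`, which is what this commit bumps.
repos:
  - repo: https://github.com/psf/black
    rev: 21.6b0        # bumped from 21.5b1 in this commit
    hooks:
      - id: black
```

Running `pre-commit autoupdate` rewrites every `rev` to the latest tag, which is typically how batches of bumps like these are produced.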
2 changes: 1 addition & 1 deletion BREEZE.rst
@@ -1200,7 +1200,7 @@ This is the current syntax for `./breeze <./breeze>`_:
you want to pull the image with specific COMMIT_SHA tag.
'breeze shell \
- -- github-image-id 9a621eaa394c0a0a336f8e1b31b35eff4e4ee86e' - pull/use image with SHA
+ --github-image-id 9a621eaa394c0a0a336f8e1b31b35eff4e4ee86e' - pull/use image with SHA
'breeze \
--github-image-id 9a621eaa394c0a0a336f8e1b31b35eff4e4ee86e' - pull/use image with SHA
2 changes: 1 addition & 1 deletion CHANGELOG.txt
@@ -1,4 +1,4 @@
- Airflow 2.1.2, 2021-07-13
+ Airflow 2.1.2, 2021-07-14
-------------------------

Bug Fixes
6 changes: 3 additions & 3 deletions CI.rst
@@ -78,7 +78,7 @@ this image can be built only once and used by all the jobs running tests. The im
rather than build it from the scratch. Pulling such image takes ~ 1 minute, thanks to that we are saving
a lot of precious time for jobs.

- We use `GitHub Container Registry <https://docs.github.com/en/packages/guides/about-github-container-registry>`_
+ We use `GitHub Container Registry <https://docs.github.com/en/packages/guides/about-github-container-registry>`_.
``GITHUB_TOKEN`` is needed to push to the registry and we configured scopes of the tokens in our jobs
to be able to write to the registry.

@@ -411,9 +411,9 @@ The following components are part of the CI infrastructure
* **GitHub Actions** - (GA) UI + execution engine for our jobs
* **GA CRON trigger** - GitHub Actions CRON triggering our jobs
* **GA Workers** - virtual machines running our jobs at GitHub Actions (max 20 in parallel)
- * **GitHub Image Registry**- image registry used as build cache for CI jobs.
+ * **GitHub Image Registry** - image registry used as build cache for CI jobs.
It is at https://ghcr.io/apache/airflow
- * **DockerHub Image Registry**- image registry used to pull base Python images and (manually) publish
+ * **DockerHub Image Registry** - image registry used to pull base Python images and (manually) publish
the released Production Airflow images. It is at https://dockerhub.com/apache/airflow
* **Official Images** (future) - these are official images that are prominently visible in DockerHub.
We aim our images to become official images so that you will be able to pull them
2 changes: 0 additions & 2 deletions CONTRIBUTING.rst
@@ -872,8 +872,6 @@ This can be done by running this (it utilizes parallel preparation of the constr
for python_version in $(echo "${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}")
do
./breeze build-image --upgrade-to-newer-dependencies --python ${python_version} --build-cache-local
- ./breeze build-image --upgrade-to-newer-dependencies --python ${python_version} --build-cache-local
- ./breeze build-image --upgrade-to-newer-dependencies --python ${python_version} --build-cache-local
done
GENERATE_CONSTRAINTS_MODE="pypi-providers" ./scripts/ci/constraints/ci_generate_all_constraints.sh
1 change: 1 addition & 0 deletions INTHEWILD.md
@@ -127,6 +127,7 @@ Currently, **officially** using Airflow:
1. [Credit Karma](https://www.creditkarma.com/) [[@preete-dixit-ck](https://github.com/preete-dixit-ck) & [@harish-gaggar-ck](https://github.com/harish-gaggar-ck) & [@greg-finley-ck](https://github.com/greg-finley-ck)]
1. [Creditas](https://www.creditas.com.br) [[@dcassiano](https://github.com/dcassiano)]
1. [CreditCards.com](https://www.creditcards.com/) [[@vmAggies](https://github.com/vmAggies) & [@jay-wallaby](https://github.com/jay-wallaby)]
+ 1. [CRST - The Transportation Solution, Inc.](https://crst.com)
1. [Cryptalizer.com](https://www.cryptalizer.com/)
1. [Currency](https://www.gocurrency.com/) [[@FCLI](https://github.com/FCLI) & [@alexbegg](https://github.com/alexbegg)]
1. [Custom Ink](https://www.customink.com/) [[@david-dalisay](https://github.com/david-dalisay), [@dmartin11](https://github.com/dmartin11) & [@mpeteuil](https://github.com/mpeteuil)]
18 changes: 9 additions & 9 deletions README.md
@@ -82,9 +82,9 @@ Airflow is not a streaming solution, but it is often used to process real-time d

Apache Airflow is tested with:

- | | Main version (dev) | Stable version (2.1.1) |
+ | | Main version (dev) | Stable version (2.1.2) |
| -------------------- | ------------------------- | ------------------------ |
- | Python | 3.6, 3.7, 3.8, 3.9 | 3.6, 3.7, 3.8 |
+ | Python | 3.6, 3.7, 3.8, 3.9 | 3.6, 3.7, 3.8, 3.9 |
| Kubernetes | 1.20, 1.19, 1.18 | 1.20, 1.19, 1.18 |
| PostgreSQL | 9.6, 10, 11, 12, 13 | 9.6, 10, 11, 12, 13 |
| MySQL | 5.7, 8 | 5.7, 8 |
@@ -142,15 +142,15 @@ them to appropriate format and workflow that your tool requires.


```bash
- pip install apache-airflow==2.1.1 \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.1/constraints-3.7.txt"
+ pip install apache-airflow==2.1.2 \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-3.7.txt"
```

2. Installing with extras (for example postgres,google)

```bash
- pip install apache-airflow[postgres,google]==2.1.1 \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.1/constraints-3.7.txt"
+ pip install apache-airflow[postgres,google]==2.1.2 \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-3.7.txt"
```

For information on installing provider packages check
@@ -231,7 +231,7 @@ packages:
* **Airflow Providers**: SemVer rules apply to changes in the particular provider's code only.
SemVer MAJOR and MINOR versions for the packages are independent from Airflow version.
For example `google 4.1.0` and `amazon 3.0.3` providers can happily be installed
- with `Airflow 2.1.1`. If there are limits of cross-dependencies between providers and Airflow packages,
+ with `Airflow 2.1.2`. If there are limits of cross-dependencies between providers and Airflow packages,
they are present in providers as `install_requires` limitations. We aim to keep backwards
compatibility of providers with all previously released Airflow 2 versions but
there will be sometimes breaking changes that might make some, or all
@@ -254,7 +254,7 @@ Apache Airflow version life cycle:

| Version | Current Patch/Minor | State | First Release | Limited Support | EOL/Terminated |
|---------|---------------------|-----------|---------------|-----------------|----------------|
- | 2 | 2.1.1 | Supported | Dec 17, 2020 | Dec 2021 | TBD |
+ | 2 | 2.1.2 | Supported | Dec 17, 2020 | Dec 2021 | TBD |
| 1.10 | 1.10.15 | EOL | Aug 27, 2018 | Dec 17, 2020 | June 17, 2021 |
| 1.9 | 1.9.0 | EOL | Jan 03, 2018 | Aug 27, 2018 | Aug 27, 2018 |
| 1.8 | 1.8.2 | EOL | Mar 19, 2017 | Jan 03, 2018 | Jan 03, 2018 |
@@ -280,7 +280,7 @@ They are based on the official release schedule of Python and Kubernetes, nicely

2. The "oldest" supported version of Python/Kubernetes is the default one. "Default" is only meaningful
in terms of "smoke tests" in CI PRs which are run using this default version and default reference
- image available. Currently ``apache/airflow:latest`` and ``apache/airflow:2.1.1` images
+ image available. Currently ``apache/airflow:latest`` and ``apache/airflow:2.1.2` images
are both Python 3.6 images, however the first MINOR/MAJOR release of Airflow release after 23.12.2021 will
become Python 3.7 images.

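The README hunks above pin the install to a versioned constraints file. As a sketch (assuming only the `constraints-<airflow-version>/constraints-<python-version>.txt` layout visible in those commands, not any official Airflow helper), the URL can be derived from the running interpreter instead of hardcoding `3.7`:

```python
# Sketch: derive the constraints URL used by the README's pip commands.
# Assumes the constraints-<airflow-version>/constraints-<python-version>.txt
# branch layout shown above; not an official Airflow helper.
import sys

AIRFLOW_VERSION = "2.1.2"
python_version = f"{sys.version_info.major}.{sys.version_info.minor}"
constraint_url = (
    "https://raw.githubusercontent.com/apache/airflow/"
    f"constraints-{AIRFLOW_VERSION}/constraints-{python_version}.txt"
)
print(constraint_url)
```

On Python 3.7 this prints the same URL the README hardcodes; on other interpreters it picks the matching file.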
2 changes: 1 addition & 1 deletion STATIC_CODE_CHECKS.rst
@@ -350,7 +350,7 @@ Run all tests for last commit :

.. code-block:: bash
- ./breeze static-check all -- --ref-from HEAD^ --ref-to HEAD
+ ./breeze static-check all -- --from-ref HEAD^ --to-ref HEAD
The ``license`` check is run via a separate script and a separate Docker image containing the
6 changes: 2 additions & 4 deletions UPDATING.md
@@ -954,8 +954,6 @@ in `SubDagOperator`.

#### `airflow.providers.http.operators.http.SimpleHttpOperator`

- #### `airflow.providers.http.operators.http.SimpleHttpOperator`
-
The `do_xcom_push` flag (a switch to push the result of an operator to xcom or not) was appearing in different incarnations in different operators. It's function has been unified under a common name (`do_xcom_push`) on `BaseOperator`. This way it is also easy to globally disable pushing results to xcom.

The following operators were affected:
@@ -1934,7 +1932,7 @@ We deprecated a number of extras in 2.0.

For example:

- If you want to install integration for Microsoft Azure, then instead of `pip install apache-airflow[atlas]`
+ If you want to install integration for Apache Atlas, then instead of `pip install apache-airflow[atlas]`
you should use `pip install apache-airflow[apache.atlas]`.


@@ -1946,7 +1944,7 @@ If you want to install integration for Microsoft Azure, then instead of
pip install 'apache-airflow[azure_blob_storage,azure_data_lake,azure_cosmos,azure_container_instances]'
```

- you should execute `pip install 'apache-airflow[azure]'`
+ you should run `pip install 'apache-airflow[microsoft.azure]'`

If you want to install integration for Amazon Web Services, then instead of
`pip install 'apache-airflow[s3,emr]'`, you should execute `pip install 'apache-airflow[aws]'`
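The `do_xcom_push` unification that UPDATING.md describes above can be sketched as a single switch on a common base class, consulted once after `execute()` (class and method names here are illustrative, not Airflow's internals):

```python
# Illustrative sketch of unifying a per-operator "push result to XCom" switch
# on a common base class, as UPDATING.md describes for `do_xcom_push`.
class BaseOperator:
    def __init__(self, do_xcom_push: bool = True):
        self.do_xcom_push = do_xcom_push

    def execute(self):
        # Overridden by concrete operators.
        raise NotImplementedError

    def run(self, xcom: dict, key: str):
        result = self.execute()
        if self.do_xcom_push:  # the single, globally controllable switch point
            xcom[key] = result
        return result


class EchoOperator(BaseOperator):
    def execute(self):
        return "hello"
```

Because the flag lives on the base class, every operator gets the same behavior and it can be disabled globally rather than per operator.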
8 changes: 0 additions & 8 deletions airflow/migrations/env.py
@@ -98,15 +98,7 @@ def run_migrations_online():
)

with context.begin_transaction():
- if connection.dialect.name == 'mysql' and connection.dialect.server_version_info >= (5, 6):
-     connection.execute("select GET_LOCK('alembic',1800);")
- if connection.dialect.name == 'postgresql':
-     context.get_context()._ensure_version_table()
-     connection.execute("LOCK TABLE alembic_version IN ACCESS EXCLUSIVE MODE")
  context.run_migrations()
- if connection.dialect.name == 'mysql' and connection.dialect.server_version_info >= (5, 6):
-     connection.execute("select RELEASE_LOCK('alembic');")
- # for Postgres lock is released when transaction ends


if context.is_offline_mode():
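The env.py hunk above removes dialect-specific locking that serialized concurrent schema migrations: a named advisory lock on MySQL, an exclusive table lock on Postgres. A simplified sketch of the removed pattern (the plain-function shape, names, and omission of the MySQL version check are mine, not Airflow's actual API):

```python
# Simplified sketch of the migration-locking pattern removed by this commit:
# take a global lock before running migrations so two schedulers cannot
# migrate the schema at the same time.
def run_migrations_locked(connection, run_migrations):
    """Serialize schema migrations behind a dialect-specific global lock."""
    dialect = connection.dialect.name
    if dialect == "mysql":
        # MySQL: named advisory lock with a 1800 s timeout.
        connection.execute("SELECT GET_LOCK('alembic', 1800);")
    try:
        if dialect == "postgresql":
            # Postgres: exclusive table lock, released when the transaction ends.
            connection.execute("LOCK TABLE alembic_version IN ACCESS EXCLUSIVE MODE")
        run_migrations()
    finally:
        if dialect == "mysql":
            # MySQL advisory locks must be released explicitly.
            connection.execute("SELECT RELEASE_LOCK('alembic');")
```

The try/finally mirrors the intent of the removed code: on MySQL the advisory lock is released even if the migration fails, while on Postgres the lock lifetime is tied to the surrounding transaction.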
