Changes from all commits — 20 commits
fd0e266  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 12, 2025)
e6cafb7  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 12, 2025)
2feee69  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 12, 2025)
f7f9845  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 15, 2025)
79e802b  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 15, 2025)
41078f1  Fix dataflow java system test + link (#56286) (VladaZakharova, Oct 2, 2025)
b9d91c3  Bump ruff to 0.13.3 and prek to 0.2.3 (#56355) (amoghrajesh, Oct 3, 2025)
6d3e841  fix: allow mapped tasks to accept zero-length inputs on rerun (#56162) (kevinhongzl, Oct 3, 2025)
d0da0c0  Chart gitsync vars (#56245) (ido177, Oct 3, 2025)
ce55be3  Update provider's metadata (#56357) (potiuk, Oct 3, 2025)
767ae70  Fix install_java.sh (#56211) (VladaZakharova, Oct 3, 2025)
6583b6d  Add new PL translations. (#56359) (potiuk, Oct 3, 2025)
e8bc7b4  feature: Add OpenLineage support for CloudDataTransferServiceS3ToGCSO… (pawelgrochowicz, Oct 3, 2025)
6a93283  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Oct 3, 2025)
19e13ce  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 12, 2025)
2db1bf6  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 12, 2025)
8f7fa67  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 15, 2025)
5004c01  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Sep 15, 2025)
9a9305a  Merge branch 'feat-ol-cloud-data-fusion-start-pipeline-operator' of h… (pawelgrochowicz, Oct 3, 2025)
fce7ffc  feature: Add OpenLineage support for CloudDataFusionStartPipelineOper… (pawelgrochowicz, Oct 3, 2025)
2 changes: 1 addition & 1 deletion .github/actions/install-prek/action.yml
@@ -27,7 +27,7 @@ inputs:
     default: "0.8.22" # Keep this comment to allow automatic replacement of uv version
   prek-version:
     description: 'prek version to use'
-    default: "0.2.1" # Keep this comment to allow automatic replacement of prek version
+    default: "0.2.3" # Keep this comment to allow automatic replacement of prek version
   skip-prek-hooks:
     description: "Skip some prek hooks from installation"
     default: ""
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -412,7 +412,7 @@ repos:
        types_or: [python, pyi]
        args: [--fix]
        require_serial: true
-       additional_dependencies: ['ruff==0.13.2']
+       additional_dependencies: ['ruff==0.13.3']
        exclude: ^airflow-core/tests/unit/dags/test_imports\.py$|^performance/tests/test_.*\.py$
      - id: ruff-format
        name: Run 'ruff format'
2 changes: 1 addition & 1 deletion Dockerfile.ci
@@ -1678,7 +1678,7 @@ COPY --from=scripts common.sh install_packaging_tools.sh install_additional_depe
ARG AIRFLOW_PIP_VERSION=25.2
# ARG AIRFLOW_PIP_VERSION="git+https://github.com/pypa/pip.git@main"
ARG AIRFLOW_UV_VERSION=0.8.22
-ARG AIRFLOW_PREK_VERSION="0.2.1"
+ARG AIRFLOW_PREK_VERSION="0.2.3"

# UV_LINK_MODE=copy is needed since we are using cache mounted from the host
ENV AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION} \
2 changes: 1 addition & 1 deletion airflow-core/docs/best-practices.rst
@@ -310,7 +310,7 @@ Installing and Using ruff

   .. code-block:: bash

-      pip install "ruff>=0.13.2"
+      pip install "ruff>=0.13.3"

2. **Running ruff**: Execute ``ruff`` to check your Dags for potential issues:
7 changes: 6 additions & 1 deletion airflow-core/src/airflow/models/dagrun.py
@@ -1547,7 +1547,12 @@ def _expand_mapped_task_if_needed(ti: TI) -> Iterable[TI] | None:
                )
            )
            revised_map_index_task_ids.add(schedulable.task.task_id)
-           ready_tis.append(schedulable)
+
+           # _revise_map_indexes_if_mapped might mark the current task as REMOVED
+           # after calculating mapped task length, so we need to re-check
+           # the task state to ensure it's still schedulable
+           if schedulable.state in SCHEDULEABLE_STATES:
+               ready_tis.append(schedulable)

        # Check if any ti changed state
        tis_filter = TI.filter_for_tis(old_states)
@@ -113,6 +113,8 @@
    },
    "filter": "Filtr",
    "filters": {
+     "durationFrom": "Czas trwania od",
+     "durationTo": "Czas trwania do",
      "logicalDateFrom": "Data logiczna od",
      "logicalDateTo": "Data logiczna do",
      "runAfterFrom": "Uruchom po (od)",
2 changes: 2 additions & 0 deletions airflow-core/src/airflow/ui/public/i18n/locales/pl/dag.json
@@ -10,6 +10,7 @@
    "hourly": "Godzinowo",
    "legend": {
      "less": "Mniej",
+     "mixed": "Mieszane",
      "more": "Więcej"
    },
    "navigation": {
@@ -19,6 +20,7 @@
      "previousYear": "Poprzedni rok"
    },
    "noData": "Brak danych",
+   "noFailedRuns": "Brak nieudanych wykonań",
    "noRuns": "Brak wykonań",
    "totalRuns": "Łączna liczba wykonań",
    "week": "Tydzień {{weekNumber}}",
55 changes: 54 additions & 1 deletion airflow-core/tests/unit/models/test_dagrun.py
@@ -43,7 +43,7 @@
from airflow.providers.standard.operators.bash import BashOperator
from airflow.providers.standard.operators.empty import EmptyOperator
from airflow.providers.standard.operators.python import PythonOperator, ShortCircuitOperator
-from airflow.sdk import DAG, BaseOperator, setup, task, task_group, teardown
+from airflow.sdk import DAG, BaseOperator, get_current_context, setup, task, task_group, teardown
from airflow.sdk.definitions.deadline import AsyncCallback, DeadlineAlert, DeadlineReference
from airflow.serialization.serialized_objects import LazyDeserializedDAG, SerializedDAG
from airflow.stats import Stats
@@ -2253,6 +2253,59 @@ def tg(x, y):
    }


+@pytest.mark.parametrize("rerun_length", [0, 1, 2, 3])
+def test_mapped_task_rerun_with_different_length_of_args(session, dag_maker, rerun_length):
+    @task
+    def generate_mapping_args():
+        context = get_current_context()
+        if context["ti"].try_number == 0:
+            args = [i for i in range(2)]
+        else:
+            args = [i for i in range(rerun_length)]
+        return args
+
+    @task
+    def mapped_print_value(arg):
+        return arg
+
+    with dag_maker(session=session):
+        args = generate_mapping_args()
+        mapped_print_value.expand(arg=args)
+
+    # First run
+    dr = dag_maker.create_dagrun()
+    dag_maker.run_ti("generate_mapping_args", dr)
+
+    decision = dr.task_instance_scheduling_decisions(session=session)
+    for ti in decision.schedulable_tis:
+        dag_maker.run_ti(ti.task_id, dr, map_index=ti.map_index)
+
+    clear_task_instances(dr.get_task_instances(), session=session)
+
+    # Second run
+    ti = dr.get_task_instance(task_id="generate_mapping_args", session=session)
+    ti.try_number += 1
+    session.merge(ti)
+    dag_maker.run_ti("generate_mapping_args", dr)
+
+    # Check that the new mapped task instances are correctly scheduled
+    decision = dr.task_instance_scheduling_decisions(session=session)
+    assert len(decision.schedulable_tis) == rerun_length
+    assert all(ti.task_id == "mapped_print_value" for ti in decision.schedulable_tis)
+
+    # Check that the mapped task reran successfully
+    for ti in decision.schedulable_tis:
+        dag_maker.run_ti(ti.task_id, dr, map_index=ti.map_index)
+    query = select(TI).where(
+        TI.dag_id == dr.dag_id,
+        TI.run_id == dr.run_id,
+        TI.task_id == "mapped_print_value",
+        TI.state == TaskInstanceState.SUCCESS,
+    )
+    success_tis = session.execute(query).all()
+    assert len(success_tis) == rerun_length
+
+
def test_operator_mapped_task_group_receives_value(dag_maker, session):
    with dag_maker(session=session):
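The rerun semantics this test exercises can be sketched independently of Airflow. The helper below is a hypothetical simplification of map-index revision (not Airflow code): after clearing, the mapped task is re-expanded against the new upstream length, so existing indices beyond that length are removed and any missing indices are created, including the zero-length case from #56162.

```python
def revise_map_indexes(prev_indices, new_length):
    """Simulate re-expanding a mapped task against a new upstream length.

    Returns (kept, removed, created):
      kept    - indices that still exist after re-expansion
      removed - indices beyond the new length (marked REMOVED in Airflow)
      created - indices that did not exist in the previous run
    """
    kept = [i for i in prev_indices if i < new_length]
    removed = [i for i in prev_indices if i >= new_length]
    created = [i for i in range(new_length) if i not in prev_indices]
    return kept, removed, created


# First run expanded to 2 instances; the rerun's upstream returns 0 items,
# so both prior instances are removed and nothing is schedulable.
print(revise_map_indexes([0, 1], 0))  # ([], [0, 1], [])

# A rerun that grows the input keeps the old instances and adds a new one.
print(revise_map_indexes([0, 1], 3))  # ([0, 1], [], [2])
```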
22 changes: 13 additions & 9 deletions chart/templates/_helpers.yaml
@@ -255,27 +255,31 @@ If release name contains chart name it will be used as a full name.
    - name: GITSYNC_SSH_KNOWN_HOSTS
      value: "false"
    {{- end }}
-   {{ else if .Values.dags.gitSync.credentialsSecret }}
+   {{- else if .Values.dags.gitSync.credentialsSecret }}
+   {{- $tag := trimPrefix "v" .Values.images.gitSync.tag }}
+   {{- if or (eq $tag "latest") (semverCompare ">=4.0.0" $tag) }}
-   - name: GIT_SYNC_USERNAME
+   - name: GITSYNC_USERNAME
      valueFrom:
        secretKeyRef:
          name: {{ .Values.dags.gitSync.credentialsSecret | quote }}
-         key: GIT_SYNC_USERNAME
-   - name: GITSYNC_USERNAME
+         key: GITSYNC_USERNAME
+   - name: GITSYNC_PASSWORD
      valueFrom:
        secretKeyRef:
          name: {{ .Values.dags.gitSync.credentialsSecret | quote }}
-         key: GITSYNC_USERNAME
-   - name: GIT_SYNC_PASSWORD
+         key: GITSYNC_PASSWORD
+   {{- else }}
+   - name: GIT_SYNC_USERNAME
      valueFrom:
        secretKeyRef:
          name: {{ .Values.dags.gitSync.credentialsSecret | quote }}
-         key: GIT_SYNC_PASSWORD
-   - name: GITSYNC_PASSWORD
+         key: GIT_SYNC_USERNAME
+   - name: GIT_SYNC_PASSWORD
      valueFrom:
        secretKeyRef:
          name: {{ .Values.dags.gitSync.credentialsSecret | quote }}
-         key: GITSYNC_PASSWORD
+         key: GIT_SYNC_PASSWORD
+   {{- end }}
    {{- end }}
    - name: GIT_SYNC_REV
      value: {{ .Values.dags.gitSync.rev | quote }}
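The `semverCompare` gate introduced in this template can be sketched in Python. This is a hypothetical helper, not part of the chart; it mirrors the template's rule that a `latest` tag or any version >= 4.0.0 selects the new `GITSYNC_*` credential names, while older git-sync v3 images get the legacy `GIT_SYNC_*` names.

```python
def gitsync_credential_keys(tag: str) -> list[str]:
    """Pick git-sync credential env var names for a given image tag,
    mirroring the Helm template's `semverCompare ">=4.0.0"` gate."""
    version = tag.removeprefix("v")  # same as `trimPrefix "v"` in the template
    if version == "latest" or int(version.split(".")[0]) >= 4:
        return ["GITSYNC_USERNAME", "GITSYNC_PASSWORD"]
    return ["GIT_SYNC_USERNAME", "GIT_SYNC_PASSWORD"]


print(gitsync_credential_keys("v4.4.2"))  # new-style GITSYNC_* names
print(gitsync_credential_keys("v3.6.9"))  # legacy GIT_SYNC_* names
```

The major-version check is a simplification of full semver comparison, but it matches the only boundary the template cares about (the 3.x to 4.x rename).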
3 changes: 3 additions & 0 deletions chart/values.yaml
@@ -124,6 +124,9 @@ images:
  gitSync:
    repository: registry.k8s.io/git-sync/git-sync
    tag: v4.4.2
+   # NOTE:
+   # - If the tag is "v3.x.x" or any version < 4.0.0 - use GIT_SYNC_* env variables
+   # - If the tag is "v4.x.x" or "latest" - use GITSYNC_* env variables
    pullPolicy: IfNotPresent

# Select certain nodes for airflow pods.
2 changes: 1 addition & 1 deletion dev/breeze/README.md
@@ -135,6 +135,6 @@ PLEASE DO NOT MODIFY THE HASH BELOW! IT IS AUTOMATICALLY UPDATED BY PREK.

---------------------------------------------------------------------------------------------------------

-Package config hash: 500727c0d31e88142b441b8e58c9db20a495af2994d65a4cc1a3e1e7d6eab8466a5e87242fbf361b888c5dd271861284dfb286409adb26b8f50e25d5fdfec034
+Package config hash: 71b4a2063359e3487de35d5f6b6acc11096799ccf64bf9e4497a3c7d258cba5fca384e8e199b5933bb1c46c262bdf3ff636167c8863b48905cb31cd10b3312c5

---------------------------------------------------------------------------------------------------------
2 changes: 1 addition & 1 deletion dev/breeze/doc/ci/02_images.md
@@ -444,7 +444,7 @@ can be used for CI images:
| `ADDITIONAL_DEV_APT_ENV` | | Additional env variables defined when installing dev deps |
| `AIRFLOW_PIP_VERSION` | `25.2` | `pip` version used. |
| `AIRFLOW_UV_VERSION` | `0.8.22` | `uv` version used. |
-| `AIRFLOW_PREK_VERSION` | `0.2.1` | `prek` version used. |
+| `AIRFLOW_PREK_VERSION` | `0.2.3` | `prek` version used. |
| `AIRFLOW_USE_UV` | `true` | Whether to use UV for installation. |
| `PIP_PROGRESS_BAR` | `on` | Progress bar for PIP installation |
2 changes: 1 addition & 1 deletion dev/breeze/pyproject.toml
@@ -60,7 +60,7 @@ dependencies = [
    "jinja2>=3.1.5",
    "jsonschema>=4.19.1",
    "packaging>=25.0",
-   "prek>=0.2.1",
+   "prek>=0.2.3",
    "psutil>=5.9.6",
    "pygithub>=2.1.1",
    "pytest-xdist>=3.3.1",
@@ -264,7 +264,7 @@ class VersionedFile(NamedTuple):
AIRFLOW_USE_UV = False
GITPYTHON_VERSION = "3.1.45"
RICH_VERSION = "14.1.0"
-PREK_VERSION = "0.2.1"
+PREK_VERSION = "0.2.3"
HATCH_VERSION = "1.14.2"
PYYAML_VERSION = "6.0.3"