
Conversation

@kaxil (Member) commented Jul 23, 2025

This ensures DAG callbacks receive the same rich context as task callbacks, improving consistency and providing access to template variables and macros similar to Airflow 2.

This has been a blocker for a few users, similar to #53058

Closes #52824
Closes #51402
Closes #51949
Closes #53654
Related to #53618

Logs from an example run:

{"timestamp":"2025-07-23T21:07:02.487673","level":"info","event":"Filling up the DagBag from /files/dags/daga.py","logger":"airflow.models.dagbag.DagBag"}
{"timestamp":"2025-07-23T21:07:02.514790","level":"info","event":"Executing on_success dag callback","dag_id":"a_dag","logger":"task"}
{"timestamp":"2025-07-23T21:07:02.520721Z","level":"info","event":"{'conn': <ConnectionAccessor (dynamic access)>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.520836Z","level":"info","event":" 'dag': <DAG: a_dag>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.520874Z","level":"info","event":" 'dag_run': DagRun(dag_id='a_dag', run_id='manual__2025-07-23T21:06:58.041559+00:00', logical_date=datetime.datetime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')), data_interval_start=datetime.datetime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')), data_interval_end=datetime.datetime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')), run_after=datetime.datetime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')), start_date=datetime.datetime(2025, 7, 23, 21, 6, 58, 575761, tzinfo=Timezone('UTC')), end_date=datetime.datetime(2025, 7, 23, 21, 7, 1, 507739, tzinfo=Timezone('UTC')), clear_number=0, run_type=<DagRunType.MANUAL: 'manual'>, state=<DagRunState.SUCCESS: 'success'>, conf={}, consumed_asset_events=[]),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.520925Z","level":"info","event":" 'data_interval_end': DateTime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.520964Z","level":"info","event":" 'data_interval_start': DateTime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.520993Z","level":"info","event":" 'ds': '2025-07-23',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521024Z","level":"info","event":" 'ds_nodash': '20250723',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521058Z","level":"info","event":" 'inlet_events': InletEventsAccessors(_inlets=[], _assets={}, _asset_aliases={}),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521087Z","level":"info","event":" 'inlets': [],","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521115Z","level":"info","event":" 'logical_date': DateTime(2025, 7, 23, 21, 6, 56, 583000, tzinfo=Timezone('UTC')),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521142Z","level":"info","event":" 'macros': <MacrosAccessor (dynamic access to macros)>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521170Z","level":"info","event":" 'map_index_template': None,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521197Z","level":"info","event":" 'outlet_events': <airflow.sdk.execution_time.context.OutletEventAccessors object at 0xffffa86d8520>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521225Z","level":"info","event":" 'outlets': [],","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.521256Z","level":"info","event":" 'params': {},","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.540978Z","level":"info","event":" 'prev_data_interval_end_success': <Proxy at 0xffffa86d9510 with factory <function RuntimeTaskInstance.get_template_context.<locals>.<lambda> at 0xffffa86b9900>>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541156Z","level":"info","event":" 'prev_data_interval_start_success': <Proxy at 0xffffa86d94e0 with factory <function RuntimeTaskInstance.get_template_context.<locals>.<lambda> at 0xffffa86b9870>>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541273Z","level":"info","event":" 'prev_end_date_success': <Proxy at 0xffffa86d8ee0 with factory <function RuntimeTaskInstance.get_template_context.<locals>.<lambda> at 0xffffa86bad40>>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541346Z","level":"info","event":" 'prev_start_date_success': <Proxy at 0xffffa86d8a30 with factory <function RuntimeTaskInstance.get_template_context.<locals>.<lambda> at 0xffffab843400>>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541404Z","level":"info","event":" 'reason': 'success',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541512Z","level":"info","event":" 'run_id': 'manual__2025-07-23T21:06:58.041559+00:00',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541622Z","level":"info","event":" 'task': <Task(_PythonDecoratedOperator): print_the_context>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541730Z","level":"info","event":" 'task_instance': RuntimeTaskInstance(id=UUID('0198391c-619c-77b9-b26b-a41462239ff8'), task_id='print_the_context', dag_id='a_dag', run_id='manual__2025-07-23T21:06:58.041559+00:00', try_number=1, dag_version_id=UUID('01983904-64e0-7aa9-bda6-592268defae2'), map_index=-1, hostname='aa8528989b89', context_carrier=None, task=<Task(_PythonDecoratedOperator): print_the_context>, max_tries=0, end_date=None, state=None, is_mapped=None, rendered_map_index=None, log_url=None),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541849Z","level":"info","event":" 'task_instance_key_str': 'a_dag__print_the_context__20250723',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.541985Z","level":"info","event":" 'task_reschedule_count': 0,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542045Z","level":"info","event":" 'ti': RuntimeTaskInstance(id=UUID('0198391c-619c-77b9-b26b-a41462239ff8'), task_id='print_the_context', dag_id='a_dag', run_id='manual__2025-07-23T21:06:58.041559+00:00', try_number=1, dag_version_id=UUID('01983904-64e0-7aa9-bda6-592268defae2'), map_index=-1, hostname='aa8528989b89', context_carrier=None, task=<Task(_PythonDecoratedOperator): print_the_context>, max_tries=0, end_date=None, state=None, is_mapped=None, rendered_map_index=None, log_url=None),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542088Z","level":"info","event":" 'triggering_asset_events': TriggeringAssetEventsAccessor(_events=defaultdict(<class 'list'>, {})),","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542118Z","level":"info","event":" 'ts': '2025-07-23T21:06:56.583000+00:00',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542148Z","level":"info","event":" 'ts_nodash': '20250723T210656',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542176Z","level":"info","event":" 'ts_nodash_with_tz': '20250723T210656.583000+0000',","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542204Z","level":"info","event":" 'var': {'json': <VariableAccessor (dynamic access)>,","chan":"stdout","logger":"processor"}
{"timestamp":"2025-07-23T21:07:02.542232Z","level":"info","event":"         'value': <VariableAccessor (dynamic access)>}}","chan":"stdout","logger":"processor"}

Dag used:

from pprint import pprint

from airflow.sdk import dag, task, Context


def test_cb(context: Context):
    # Dag-level callback: with this change it receives the full template context.
    pprint(context)


@dag(
    on_success_callback=test_cb,
)
def a_dag():
    @task(task_id="print_the_context")
    def print_context(**kwargs):
        pass

    print_context()


example_dag = a_dag()
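
For comparison, a callback can also pull individual keys out of the same context instead of printing all of it. A minimal sketch (the key names are taken from the log output above; the Dag and function names are illustrative, not from this PR):

from airflow.sdk import dag, task, Context


def report_result(context: Context):
    # These keys match the ones printed in the log output above.
    dag_run = context["dag_run"]
    reason = context.get("reason")  # "success" in the run shown above
    ds = context.get("ds")          # logical date as YYYY-MM-DD
    print(f"Dag {dag_run.dag_id} run {dag_run.run_id} finished with reason={reason} on {ds}")


@dag(on_success_callback=report_result)
def reporting_dag():
    @task
    def do_nothing():
        pass

    do_nothing()


example_reporting_dag = reporting_dag()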

^ Add meaningful description above
Read the Pull Request Guidelines for more information.
In case of fundamental code changes, an Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in a newsfragment file, named {pr_number}.significant.rst or {issue_number}.significant.rst, in airflow-core/newsfragments.

@kaxil kaxil added this to the Airflow 3.0.4 milestone Jul 23, 2025
@boring-cyborg boring-cyborg bot added area:DAG-processing area:Scheduler including HA (high availability) scheduler area:task-sdk labels Jul 23, 2025
@kaxil kaxil force-pushed the add-proper-dagrun-context branch from 1b96bfe to 9436213 Compare July 23, 2025 21:18
@kaxil kaxil requested a review from uranusjr July 23, 2025 21:22
@kaxil kaxil force-pushed the add-proper-dagrun-context branch 2 times, most recently from 1747b78 to 9cb63eb Compare July 24, 2025 08:40
This ensures DAG callbacks receive the same rich context as task callbacks,
improving consistency and providing access to template variables and macros similar to Airflow 2.

This has been a blocker for a few users, similar to apache#53058

Fixes apache#52824
Fixes apache#51402
Closes apache#51949
Related to apache#53654
Related to apache#53618
@kaxil kaxil force-pushed the add-proper-dagrun-context branch from 9cb63eb to af266d5 Compare July 24, 2025 11:09
@kaxil kaxil requested a review from gopidesupavan July 24, 2025 19:39
@kaxil kaxil merged commit ef80507 into apache:main Jul 24, 2025
77 checks passed
@kaxil kaxil deleted the add-proper-dagrun-context branch July 24, 2025 22:27
@github-actions

Backport failed to create: v3-0-test. View the failure log for run details.

Status   Branch      Result
failed   v3-0-test   Commit Link

You can attempt to backport this manually by running:

cherry_picker ef80507 v3-0-test

This should apply the commit to the v3-0-test branch and leave it in a conflicted state,
marking the files that need manual conflict resolution.

After you have resolved the conflicts, you can continue the backport process by running:

cherry_picker --continue

ferruzzi pushed a commit to aws-mwaa/upstream-to-airflow that referenced this pull request Aug 7, 2025
kaxil added a commit that referenced this pull request Aug 9, 2025

(cherry picked from commit ef80507)
@kaxil kaxil modified the milestones: Airflow 3.0.4, Airflow 3.0.5 Aug 9, 2025
@frodo2000 (Contributor)

I have upgraded my Airflow instance to 3.0.4, but the context for Dag success callbacks is still limited: the notify method in the Notifier class doesn't get dag_run or task_instance (needed for host information).
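
For context, here is a minimal sketch of the kind of notifier affected, assuming the BaseNotifier base class with a notify(context) method (imported here from airflow.notifications.basenotifier; the exact import path may differ by version, and the class name and hostname handling are illustrative):

from airflow.notifications.basenotifier import BaseNotifier


class HostInfoNotifier(BaseNotifier):
    """Illustrative notifier that wants host information from the task instance."""

    def notify(self, context):
        # On Dag-level success in 3.0.4 these keys are not passed in,
        # so they have to be fetched defensively.
        ti = context.get("task_instance")
        dag_run = context.get("dag_run")
        host = ti.hostname if ti else "unknown"
        run_id = dag_run.run_id if dag_run else "unknown"
        print(f"Run {run_id} finished on host {host}")

An instance of such a notifier would typically be passed as on_success_callback=HostInfoNotifier() on the Dag.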

@potiuk (Member) commented Aug 10, 2025

I have upgraded my Airflow instance to 3.0.4, but the context for Dag success callbacks is still limited: the notify method in the Notifier class doesn't get dag_run or task_instance (needed for host information).

I suggest opening a new "feature" issue describing the use case and what should be added to the context.

@kaxil (Member, Author) commented Aug 10, 2025

I have upgraded my Airflow instance to 3.0.4, but the context for Dag success callbacks is still limited: the notify method in the Notifier class doesn't get dag_run or task_instance (needed for host information).

This was missed in 3.0.4; it will now be part of 3.0.5.

fweilun pushed a commit to fweilun/airflow that referenced this pull request Aug 11, 2025
