
end_date and duration are not available in task_instance variable if task's state is failed #52630

@Nevioxis

Description


Apache Airflow version

3.0.2

If "Other Airflow 2 version" selected, which one?

No response

What happened?

After upgrading to Airflow 3 I ran into a Slack notification issue (using airflow.providers.slack.notifications.slack_webhook.send_slack_webhook_notification), similar to #50754, and an email notification issue using airflow.utils.email.send_email with context variables in on_failure_callback. I checked the logs of my Slack and email callbacks and got this exception:

Slack

ERROR - Failed to run task callback: kind="on_failure_callback": index=0: callback="<airflow.providers.slack.notifications.slack_webhook.SlackWebhookNotifier object at 0x734bc9afe960>": source="task"
TypeError: unsupported operand type(s) for +: 'NoneType' and 'datetime.timedelta'
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1096 in _run_task_state_change_callbacks

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/callback_runner.py", line 81 in run

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/notifier.py", line 103 in __call__

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/notifier.py", line 75 in render_template_fields

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 121 in _do_render_template_fields

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 190 in render_template

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 192 in render_template

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 190 in render_template

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 192 in render_template

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 177 in render_template

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/notifier.py", line 57 in _render

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/_internal/templater.py", line 133 in _render

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/helpers.py", line 244 in render_template_to_string

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/helpers.py", line 239 in render_template

File "<template>", line 14 in root

Email

ERROR - Failed to run task callback: kind="on_failure_callback": index=1: callback="<function failure_email at 0x734bc99f6840>": source="task"
TypeError: unsupported operand type(s) for +: 'NoneType' and 'datetime.timedelta'
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1096 in _run_task_state_change_callbacks

File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/callback_runner.py", line 81 in run

File "/opt/airflow/dags/email_integration.py", line 26 in failure_email

(email_integration.py is my module for sending email.)

So I checked line 26 of email_integration.py:

def failure_email(context):
    ...
    task_instance = context['task_instance']  # Line 17
    ...
    "end_date": (task_instance.end_date + datetime.timedelta(hours=7)).strftime('%A, %d %B %Y \n%H:%M:%S')  # Line 26
    ...

Then I tried to print out the task_instance variable and got:

id=UUID('0197b0bb-801d-7d98-981c-242e83a6cee3') task_id='test_email' dag_id='test_email' run_id='manual__2025-06-27T09:32:47.498866+00:00' try_number=1 map_index=-1 hostname='airflow-worker' context_carrier={} task=<Task(BashOperator): test_email> bundle_instance=LocalDagBundle(name=dags-folder) max_tries=0 start_date=datetime.datetime(2025, 6, 27, 9, 32, 48, 366437, tzinfo=TzInfo(UTC)) end_date=None state=<TaskInstanceState.FAILED: 'failed'> is_mapped=False rendered_map_index=None: chan="stdout": source="task"

As you can see, end_date is set to None and no duration is present on this object at all, even though the Web UI and the scheduler log show both end_date and duration.
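The traceback matches what plain Python gives for that situation; a minimal illustration (not Airflow code, just the failing arithmetic from line 26):

```python
import datetime

# What task_instance.end_date holds for the failed task in this issue:
end_date = None

try:
    end_date + datetime.timedelta(hours=7)
except TypeError as exc:
    # unsupported operand type(s) for +: 'NoneType' and 'datetime.timedelta'
    print(exc)
```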

Web UI

[Screenshot: Web UI showing end_date and duration for the failed task]

Scheduler log

[2025-06-27T09:32:49.670+0000] {scheduler_job_runner.py:852} INFO - TaskInstance Finished: dag_id=test_email, task_id=test_email, run_id=manual__2025-06-27T09:32:47.498866+00:00, map_index=-1, run_start_date=2025-06-27 09:32:48.366437+00:00, run_end_date=2025-06-27 09:32:48.705769+00:00, run_duration=0.339332, state=failed, executor=CeleryExecutor(parallelism=32), executor_state=success, try_number=1, max_tries=0, pool=default_pool, queue=worker, priority_weight=1, operator=BashOperator, queued_dttm=2025-06-27 09:32:48.304738+00:00, scheduled_dttm=2025-06-27 09:32:48.281697+00:00,queued_by_job_id=56436, pid=42623

After I removed the use of task_instance.end_date and task_instance.duration from the Slack Jinja2 template and from inside the failure_email function, both Slack and email notifications came back online. But is it possible to bring these two variables back, like Airflow 2 does?
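As a stopgap while this is open, a callback can guard against the missing values instead of dropping them entirely. A sketch, assuming the UTC+7 offset from my snippet above; safe_end_date and safe_duration are hypothetical helper names, not Airflow API:

```python
import datetime

def safe_end_date(task_instance, tz_offset_hours=7):
    # Fall back to "now" when end_date is None (the failed-task case in
    # this issue); the formatted value is then only approximate.
    end = task_instance.end_date or datetime.datetime.now(datetime.timezone.utc)
    local_end = end + datetime.timedelta(hours=tz_offset_hours)
    return local_end.strftime('%A, %d %B %Y %H:%M:%S')

def safe_duration(task_instance):
    # Recompute the duration from start_date when the attribute is
    # missing or None, mirroring run_duration in the scheduler log.
    duration = getattr(task_instance, 'duration', None)
    if duration is not None:
        return duration
    end = task_instance.end_date or datetime.datetime.now(datetime.timezone.utc)
    return (end - task_instance.start_date).total_seconds()
```

The callback then uses safe_end_date(task_instance) and safe_duration(task_instance) instead of touching the raw attributes.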

What you think should happen instead?

task_instance.end_date should return a datetime.datetime object for when the failed task ended, and task_instance.duration should return a float for how long the task ran before it failed, matching what the scheduler log and the Web UI show.

How to reproduce

Call task_instance.end_date and/or task_instance.duration in a callback function or a Slack Jinja2 template registered via a task's on_failure_callback parameter, and let the task fail.
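A minimal sketch of the failing callback (the task wiring is abbreviated; the dag/task ids are illustrative):

```python
import datetime

def failure_callback(context):
    ti = context["task_instance"]
    # On Airflow 3.0.2 this raises TypeError in the failure callback,
    # because ti.end_date arrives as None for the failed task:
    return (ti.end_date + datetime.timedelta(hours=7)).strftime('%A, %d %B %Y')

# Attach it to any task that always fails, e.g.:
#   BashOperator(task_id="test_email", bash_command="exit 1",
#                on_failure_callback=failure_callback)
```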

Operating System

Debian 12

Versions of Apache Airflow Providers

apache-airflow-providers-amazon           9.8.0
apache-airflow-providers-celery           3.11.0
apache-airflow-providers-cncf-kubernetes  10.5.0
apache-airflow-providers-common-compat    1.7.0
apache-airflow-providers-common-io        1.6.0
apache-airflow-providers-common-messaging 1.0.2
apache-airflow-providers-common-sql       1.27.1
apache-airflow-providers-docker           4.4.0
apache-airflow-providers-elasticsearch    6.3.0
apache-airflow-providers-fab              2.2.0
apache-airflow-providers-ftp              3.13.0
apache-airflow-providers-git              0.0.2
apache-airflow-providers-google           15.1.0
apache-airflow-providers-grpc             3.8.0
apache-airflow-providers-hashicorp        4.2.0
apache-airflow-providers-http             5.3.0
apache-airflow-providers-microsoft-azure  12.4.0
apache-airflow-providers-mysql            6.3.0
apache-airflow-providers-odbc             4.10.0
apache-airflow-providers-openlineage      2.3.0
apache-airflow-providers-postgres         6.2.0
apache-airflow-providers-redis            4.1.0
apache-airflow-providers-sendgrid         4.1.0
apache-airflow-providers-sftp             5.3.0
apache-airflow-providers-slack            9.1.0
apache-airflow-providers-smtp             2.1.0
apache-airflow-providers-snowflake        6.3.1
apache-airflow-providers-ssh              4.1.0
apache-airflow-providers-standard         1.2.0

Deployment

Docker-Compose

Deployment details

  • Separate Celery worker on a different node from the other components
  • Redis and PostgreSQL as managed services on a cloud provider
  • Custom Airflow image based on the official Airflow image, with FAB Auth and php-cli for executing PHP files

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

Metadata


    Labels

    area:core, good first issue, kind:bug (This is clearly a bug), priority:medium (Bug that should be fixed before the next release but would not block a release)
