
dag_run timestamp incorrect due to setting time zone #54006

@linwanlong

Description

Apache Airflow version

3.0.3

If "Other Airflow 2 version" selected, which one?

No response

What happened?

Problem description

This is the time zone setting in the Airflow configuration. Whether it is set to UTC or Asia/Shanghai, the following problems occur.

# Default timezone in case supplied date times are naive
# can be `UTC` (default), `system`, or any `IANA <https://www.iana.org/time-zones>`
# timezone string (e.g. Europe/Amsterdam)
#
# Variable: AIRFLOW__CORE__DEFAULT_TIMEZONE
#
default_timezone = utc
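
The comment in the config explains that default_timezone only applies to naive datetimes. A minimal stdlib sketch (illustrative only, not Airflow's actual code path) of why that choice matters:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A naive datetime, as a client might supply in an API request body.
naive = datetime(2025, 7, 31, 17, 34, 53)

# With default_timezone = utc, the naive value is read as UTC;
# with Asia/Shanghai it would be read in that zone instead.
as_utc = naive.replace(tzinfo=timezone.utc)
as_shanghai = naive.replace(tzinfo=ZoneInfo("Asia/Shanghai"))

# The same wall-clock reading names two instants eight hours apart.
print((as_utc - as_shanghai).total_seconds() / 3600)  # 8.0
```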
  1. I set the time zone of all containers in the Airflow cluster to Asia/Shanghai.

  2. I manually triggered a dag_run via the REST API:
    http://<your-airflow-host>/api/v2/dags/{dag_id}/dagRuns

  3. The dag_run_id returned was manual__2025-07-31T17:34:53.759767+08:00.

  4. Confusingly, the logical_date stored in the database was 2025-07-31 09:34:53.597894+08.
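
The run_id and the stored logical_date are eight hours apart as instants: the stored value carries the UTC wall-clock time but is stamped with the host's +08 offset. A stdlib sketch of this hypothetical reconstruction of the symptom (not Airflow source, and ignoring the differing microsecond fields in the report):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

trigger = datetime.fromisoformat("2025-07-31T17:34:53.759767+08:00")

# Correct normalization to UTC preserves the instant:
utc = trigger.astimezone(timezone.utc)  # 2025-07-31 09:34:53.759767+00:00

# The symptom looks as if that UTC wall-clock time was then re-stamped
# with the host's Asia/Shanghai offset, shifting the instant by 8 hours:
mislabeled = utc.replace(tzinfo=ZoneInfo("Asia/Shanghai"))

print((trigger - mislabeled).total_seconds() / 3600)  # 8.0
```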

Results shown

Below is a screenshot of my postgres database:

(screenshot)

More details

After many experiments on my own, I found some more detailed problems:

curl -X POST http://<your-airflow-host>/api/v2/dags/{dag_id}/dagRuns \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <your-token>" \
  -d '{
    "conf": {
    },
    "logical_date": "2025-07-31T09:34:53.759767+00:00"
  }'

If the logical_date field in the request body is given in UTC, the following incorrect record appears in the database.

(screenshot)

If the logical_date field in the request body is given in Asia/Shanghai time, the following seemingly normal record appears in the database.

(screenshot)

Both situations also produce abnormal log timestamps.

(screenshot)

If the container's time zone is set to UTC instead, the problem does not occur.

What you think should happen instead?

I think there is a problem with how the program parses timezone-aware timestamps and generates the run_id: the logic depends on the host's time zone but does not handle the time zone conversion correctly.

How to reproduce

  1. Set the time zone of the platform running the Airflow processes to Asia/Shanghai.
  2. Manually trigger a dag_run via the API, supplying logical_date in two different time zone formats: UTC and Asia/Shanghai.
  3. Compare the returned run_id with the stored logical_date.
  4. Observe the log timestamps of the tasks in the dag.

Operating System

Ubuntu 20.04.6 LTS

Versions of Apache Airflow Providers

apache-airflow-providers-fab==2.3.0

Deployment

Docker-Compose

Deployment details

I manually built a custom image from a Dockerfile; the base image is python:3.10.13-slim.

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
