Description
Apache Airflow version
3.0.3
If "Other Airflow 2 version" selected, which one?
No response
What happened?
Problem description
This concerns the time zone in the Airflow configuration: whether it is set to UTC or Asia/Shanghai, the following problems occur.
# Default timezone in case supplied date times are naive
# can be `UTC` (default), `system`, or any `IANA <https://www.iana.org/time-zones>`
# timezone string (e.g. Europe/Amsterdam)
#
# Variable: AIRFLOW__CORE__DEFAULT_TIMEZONE
#
default_timezone = utc
I set the time zone of all containers in the Airflow cluster to Asia/Shanghai, then manually triggered a dag_run via the REST API:
http://<your-airflow-host>/api/v2/dags/{dag_id}/dagRuns
The dag_run_id returned is manual__2025-07-31T17:34:53.759767+08:00.
The confusing part: logical_date turned out to be 2025-07-31 09:34:53.597894+08.
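The mismatch can be checked with Python's datetime (values copied from the report above; the microseconds in the two database fields differ slightly). The run_id timestamp 17:34:53+08:00 names the same instant as 09:34:53 UTC, but the stored logical_date 09:34:53+08 is a different instant, roughly eight hours earlier:

```python
from datetime import datetime, timezone, timedelta

CST = timezone(timedelta(hours=8))  # Asia/Shanghai, fixed UTC+8, no DST

# Timestamp embedded in the returned run_id
run_id_ts = datetime(2025, 7, 31, 17, 34, 53, 759767, tzinfo=CST)
# What I expected logical_date to be: the same instant, rendered in UTC
expected = datetime(2025, 7, 31, 9, 34, 53, 759767, tzinfo=timezone.utc)
# What the database actually stored: 09:34:53+08
stored = datetime(2025, 7, 31, 9, 34, 53, 597894, tzinfo=CST)

print(run_id_ts == expected)  # same instant, different offset
print(run_id_ts - stored)     # about 8 hours: a genuinely different instant
```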
Results shown
Below is a screenshot of my postgres database:
More details
After many experiments on my own, I found some more detailed problems:
curl -X POST http://<your-airflow-host>/api/v2/dags/{dag_id}/dagRuns \
-H "Content-Type: application/json" \
-H "Authorization: Bearer <your-token>" \
-d '{
"conf": {
},
"logical_date": "2025-07-31T09:34:53.759767+00:00"
}'
If the logical_date field in the request body is given in UTC, the following error appears in the database.
If the logical_date field in the request body is given in Asia/Shanghai, the following seemingly normal record appears in the database.
Both of these situations will cause the log timestamp format to be abnormal.
Just set the time zone in the container to UTC and the above problem will not occur.
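A minimal sketch of why the container's time zone matters: any code path that renders a timestamp via the process-local zone (rather than an explicit UTC conversion) will pick up whatever TZ the container is configured with. This only simulates the environment, it is not Airflow's code:

```python
import os
import time
from datetime import datetime, timezone

# Simulate the container's timezone setting (Unix only; tzset() re-reads TZ)
os.environ["TZ"] = "Asia/Shanghai"
time.tzset()

local = datetime.now().astimezone()   # picks up the process-local zone
utc = datetime.now(timezone.utc)

# Both name (almost) the same instant; only the rendered offset differs.
print(local.utcoffset())  # 8:00:00 under Asia/Shanghai
```

With TZ=UTC the local offset collapses to +00:00, which is why setting the container to UTC hides the bug rather than fixing it.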
What you think should happen instead?
I think there is a problem with how the program parses timestamps that carry a time zone, and with the logic that generates run_id: it depends on the host's time zone but does not handle the time zone conversion correctly.
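What I would expect, as a sketch: normalize the supplied logical_date to UTC (or at least to one fixed zone) before deriving the run_id, so the same instant always yields the same run_id regardless of the offset in the request. make_run_id below is a hypothetical helper for illustration, not Airflow's actual implementation:

```python
from datetime import datetime, timezone, timedelta

def make_run_id(logical_date: datetime) -> str:
    # Hypothetical: convert to UTC before embedding in the run_id,
    # so the host's time zone never influences the result.
    return "manual__" + logical_date.astimezone(timezone.utc).isoformat()

cst = timezone(timedelta(hours=8))  # Asia/Shanghai
d = datetime(2025, 7, 31, 17, 34, 53, 759767, tzinfo=cst)
print(make_run_id(d))  # manual__2025-07-31T09:34:53.759767+00:00
```

The same instant supplied as 09:34:53.759767+00:00 would produce an identical run_id.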
How to reproduce
- Set the time zone of the platform where the Airflow process runs to Asia/Shanghai.
- Manually trigger the dag_run interface with logical_date in two different time zone formats: UTC and Asia/Shanghai.
- Observe the correspondence between the returned run_id and the logical_date.
- Observe the log timestamps of the tasks in the DAG.
Operating System
Ubuntu 20.04.6 LTS
Versions of Apache Airflow Providers
apache-airflow-providers-fab==2.3.0
Deployment
Docker-Compose
Deployment details
I manually built a custom image from a Dockerfile; the base image is python:3.10.13-slim.
Anything else?
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct