Apache Airflow version
3.0.4
If "Other Airflow 2 version" selected, which one?
No response
What happened?
After upgrading our Airflow deployment from 3.0.3 to 3.0.4, all tasks started to fail with RuntimeError: generator didn't yield. Our Airflow instance is deployed on Kubernetes and is configured to use S3 for remote logging.
The error occurs because a missing aws_default connection causes a failure during the per-task logging setup phase. This seems to be either a new requirement in 3.0.4 that wasn't present in 3.0.3, or a regression that turns the absent connection into a critical failure.
Worker logs traceback:
2025-08-14 09:02:05.978736 [warning ] Server error [airflow.sdk.api.client] detail={'detail': {'reason': 'not_found', 'message': 'Connection with ID aws_default not found'}}
2025-08-14 09:02:05.979060 [error ] Connection not found [airflow.sdk.api.client] conn_id=aws_default detail={'detail': {'reason': 'not_found', 'message': 'Connection with ID aws_default not found'}} status_code=404
2025-08-14 09:02:05.985273 [error ] Task execute_workload[fe7b5051-82c5-4dec-880d-e4e7c37b3ef2] raised unexpected: RuntimeError("generator didn't yield") [celery.app.trace]
The traceback points to this line in the code:
https://github.com/apache/airflow/blob/3.0.4/task-sdk/src/airflow/sdk/execution_time/supervisor.py#L1700
│ /app/.venv/lib/python3.11/site-packages/airflow/sdk/execution_time/supervisor.py:1700 in │
│ _configure_logging │
│ │
│ ❱ 1700 with _remote_logging_conn(client): │
...
RuntimeError: generator didn't yield
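For context, the RuntimeError itself comes from contextlib, not from the S3 code: when a @contextmanager-decorated generator returns before reaching its yield, contextlib's __enter__ turns the resulting StopIteration into RuntimeError("generator didn't yield"). Below is a minimal standalone sketch of one plausible way the missing connection surfaces as this error; _remote_logging_conn_sketch and fetch_connection are made-up names for illustration, not the actual Task SDK internals:

```python
# Minimal sketch of the failure mode, assuming the real context manager
# bails out when the connection lookup fails.
from contextlib import contextmanager


def fetch_connection(conn_id):
    # Stand-in for the SDK API call that returned 404 for aws_default.
    raise LookupError(f"Connection with ID {conn_id} not found")


@contextmanager
def _remote_logging_conn_sketch(conn_id="aws_default"):
    try:
        conn = fetch_connection(conn_id)
    except LookupError:
        return  # generator returns before yielding ...
    yield conn


with _remote_logging_conn_sketch():
    pass  # ... so the `with` raises RuntimeError: generator didn't yield
```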
What you think should happen instead?
Tasks should execute successfully. If a connection is required for remote logging, Airflow should:
- Use a different, non-critical logging path if the connection is missing and warn users (a sketch of this follows the list).
- Provide a clear, helpful error message about the missing connection instead of a cryptic RuntimeError: generator didn't yield.
- Include a warning in the release notes about the new requirement for an aws_default connection when using S3 remote logging.
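To illustrate the first option, here is a minimal sketch, not a proposed patch: the context-manager name mirrors the traceback, but fetch_remote_logging_conn is a hypothetical helper and the fallback behaviour is an assumption for illustration.

```python
# Sketch of option 1 (assumed helper names; not the actual Task SDK API):
# warn and yield None instead of aborting task startup when the
# remote-logging connection cannot be resolved.
import logging
from contextlib import contextmanager

log = logging.getLogger(__name__)


def fetch_remote_logging_conn(client, conn_id):
    """Hypothetical stand-in for the SDK client's connection lookup."""
    raise LookupError(f"Connection with ID {conn_id} not found")


@contextmanager
def _remote_logging_conn(client, conn_id="aws_default"):
    try:
        conn = fetch_remote_logging_conn(client, conn_id)
    except Exception as exc:  # e.g. the 404 "Connection not found"
        log.warning(
            "Remote logging connection %r could not be resolved (%s); "
            "continuing with local task logs only.", conn_id, exc,
        )
        conn = None
    # Always yield, so logging setup never kills the task; callers would
    # skip the remote handler when conn is None.
    yield conn
```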
How to reproduce
- Deploy an Airflow instance on Kubernetes.
- Configure remote_logging to use S3 in airflow.cfg or through environment variables.
- Ensure that there is no Airflow connection named aws_default.
- Upgrade the Airflow version to 3.0.4.
- Run a DAG with one or more tasks (a minimal example DAG is sketched below).
- Observe the task failure and the logs.
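For completeness, a minimal DAG for the "run a DAG" step; the dag/task decorators are the Airflow 3 Task SDK imports, and the bucket name and env-var values in the comments are placeholders for our actual configuration:

```python
# Any trivial DAG reproduces the failure; the task body never runs because
# logging setup fails first. Remote logging was configured via environment
# variables along these lines (bucket name is a placeholder):
#   AIRFLOW__LOGGING__REMOTE_LOGGING=True
#   AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://example-bucket/airflow-logs
#   AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=aws_default
import pendulum
from airflow.sdk import dag, task


@dag(schedule=None, start_date=pendulum.datetime(2025, 1, 1), catchup=False)
def s3_logging_repro():
    @task
    def say_hello():
        print("hello")

    say_hello()


s3_logging_repro()
```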
Operating System
Airflow in k8s
Versions of Apache Airflow Providers
No response
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct