
Custom logging handler is ignored #58053

@AradOz

Description

Apache Airflow version

3.1.2

If "Other Airflow 2/3 version" selected, which one?

No response

What happened?

Hi there,
I am using Airflow 3.1.2 and I am trying to add an additional custom logging handler to the default logger.
I have seen other people report similar problems, but I have not found a solution yet.

The goal of the handler is to merge all task logs of a DAG run into a single file:
logs/dag_id=<dag_id>/run_id=<run_id>/dag.log
NOTE: I want to keep the normal per-task log files as well.
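The handler class itself is not shown in the report; purely for illustration, a minimal sketch of such an aggregating handler might look like the following (the `set_context` hook and the attribute names are assumptions about the actual implementation, not taken from it):

```python
import logging
import os


class DAGRunLogAggregatorHandler(logging.Handler):
    """Append every task log record of a DAG run to one shared file.

    Hypothetical sketch: Airflow calls set_context(ti) on task handlers
    before task execution; here dag_id/run_id are stored as plain attributes.
    """

    def __init__(self, base_log_folder: str, level=logging.NOTSET):
        super().__init__(level=level)
        self.base_log_folder = base_log_folder
        self.dag_id = None
        self.run_id = None

    def set_context(self, ti) -> None:
        # Remember which DAG run this task instance belongs to.
        self.dag_id = ti.dag_id
        self.run_id = ti.run_id

    def emit(self, record: logging.LogRecord) -> None:
        if not (self.dag_id and self.run_id):
            return  # no DAG-run context yet
        path = os.path.join(
            self.base_log_folder,
            f"dag_id={self.dag_id}",
            f"run_id={self.run_id}",
            "dag.log",
        )
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "a") as f:
            f.write(self.format(record) + "\n")
```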

I added it using this piece of code:

from copy import deepcopy
from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

LOGGING_CONFIG["handlers"]["dag_run_log_aggregator"] = {
    "class": "config.handlers.dag_run_log_aggregator.DAGRunLogAggregatorHandler",
    "formatter": "airflow",
    "base_log_folder": "/opt/airflow/logs",
    "level": "INFO",
}

LOGGING_CONFIG["loggers"]["airflow.task"] = {
    "handlers": ["dag_run_log_aggregator", "task"],
    "level": "INFO",
    "propagate": True,
}
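For anyone reproducing this: a config module like the above is only picked up when Airflow is pointed at it via the `logging_config_class` option. A sketch, assuming the module is saved as `config/log_config.py` under `$AIRFLOW_HOME` (Airflow adds the `config/` folder to `sys.path` by default):

```shell
# Point Airflow at the custom logging config.
# "log_config" is a hypothetical module name; adjust to your layout.
export AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS="log_config.LOGGING_CONFIG"
```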

REMOTE_LOGGING = False
REMOTE_TASK_LOG = None

As the `airflow info` output shows, the config is loaded:

airflow info

Apache Airflow
version                | 3.1.2                                                                  
executor               | CeleryExecutor                                                         
task_logging_handler   | config.handlers.dag_run_log_aggregator.DAGRunLogAggregatorHandler,     
                       | airflow.utils.log.file_task_handler.FileTaskHandler                    
sql_alchemy_conn       | postgresql+psycopg2://airflow:airflow@postgres/airflow                 
dags_folder            | /opt/airflow/dags                                                      
plugins_folder         | /opt/airflow/plugins                                                   
base_log_folder        | /opt/airflow/logs                                                      
remote_base_log_folder | 

But Airflow ignores all of it.
I then checked the handlers of the airflow.task logger at runtime:

from airflow.sdk import task  # Airflow 3 public API


@task()
def airflow() -> None:
    import logging

    logger = logging.getLogger("airflow.task")

    logger.info(logger.hasHandlers())
    logger.info(logger.handlers)

    # Walk up the logger hierarchy and print each logger's handlers
    parent = logger
    while parent:
        print(parent.name, parent.handlers)
        parent = parent.parent if hasattr(parent, "parent") else None

The output was strange: hasHandlers() returns True (because the root logger has a handler), yet airflow.task and airflow themselves have no handlers at all:

[2025-11-07 17:09:36] INFO - Filling up the DagBag from /opt/airflow/dags/dag.py
[2025-11-07 17:09:36] INFO - True
[2025-11-07 17:09:36] INFO - []
[2025-11-07 17:09:36] INFO - airflow.task []
[2025-11-07 17:09:36] INFO - airflow []
[2025-11-07 17:09:36] INFO - Done. Returned value was: None
[2025-11-07 17:09:36] INFO - root [<StreamHandler 98 (INFO)>]
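For what it's worth, the same logger/handler shape does attach handlers when applied with plain `logging.config.dictConfig` outside Airflow, which suggests the dict itself is well-formed and that the handlers are being replaced at task-runtime. A simplified, self-contained sketch (a `StreamHandler` stands in for the custom class):

```python
import logging
import logging.config

# Simplified stand-in for the Airflow logging config above: one custom
# handler attached to the "airflow.task" logger via dictConfig.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {"airflow": {"format": "%(levelname)s - %(message)s"}},
    "handlers": {
        "dag_run_log_aggregator": {
            "class": "logging.StreamHandler",  # stand-in for the custom class
            "formatter": "airflow",
            "level": "INFO",
        }
    },
    "loggers": {
        "airflow.task": {
            "handlers": ["dag_run_log_aggregator"],
            "level": "INFO",
            "propagate": True,
        }
    },
}

logging.config.dictConfig(LOGGING_CONFIG)
logger = logging.getLogger("airflow.task")
print(logger.handlers)  # non-empty if dictConfig applied the config
```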

I would be grateful for any help with this situation; I have tried almost everything suggested in Airflow's logging documentation.

What you think should happen instead?

No response

How to reproduce

Set up Airflow 3.1.2 with Docker Compose and try to add a custom logging handler as described above.

Operating System

Ubuntu 20.04.6 LTS

Versions of Apache Airflow Providers

No response

Deployment

Docker-Compose

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
