Replies: 6 comments 8 replies
-
Thanks for opening your first issue here! Be sure to follow the issue template!
-
You are using the logging module improperly. Per https://docs.python.org/3/library/logging.html#logging.info, logging.info() logs a message on the root logger. Instead you should log messages to the "airflow.task" logger if you want them to show up in the task log:
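A minimal stdlib-only sketch of why the logger name matters: logging.info() delivers records to the root logger, while a named logger such as "airflow.task" delivers them to its own handlers (in Airflow, the handler that writes the task log file). The list-collecting handler below is just for demonstration, not part of Airflow:

```python
import logging

captured = []

class ListHandler(logging.Handler):
    """Demo handler that collects formatted messages into a list."""
    def emit(self, record):
        captured.append(record.getMessage())

# Records sent to this named logger reach its handlers; inside Airflow,
# the "airflow.task" logger is wired to the task log file.
task_logger = logging.getLogger("airflow.task")
task_logger.addHandler(ListHandler())
task_logger.setLevel(logging.INFO)
task_logger.propagate = False  # keep the demo independent of root config

task_logger.info("reaches the airflow.task handler")
logging.info("root-logger message, not seen by the airflow.task handler")

print(captured)
```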
-
Actually I have tried to use that logger. Moreover, considering the level and handler of ...
-
Cannot reproduce it with the "standard" Airflow configuration. You must have some custom logging configuration in local settings that causes it:

```python
import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def test_log():
    import logging
    logger = logging.getLogger("airflow.task")
    logger.info("airflow.task >>> INFO logger test")
    logger.warning("airflow.task >>> WARN logger test")  # warn() is deprecated
    logger.error("airflow.task >>> ERROR logger test")
    logger.critical("airflow.task >>> CRITICAL logger test")


with DAG(
    'test_log',
    description='test log',
    schedule_interval=None,
    start_date=datetime.datetime(2022, 1, 19),
    catchup=False,
    tags=['log'],
) as dag:
    task = PythonOperator(
        task_id='test_log_task',
        python_callable=test_log,
        dag=dag,
    )
```

Resulting log:
-
Hey! Have you found a solution?
-
We are experiencing the same problems. I tried several ways to log information in a virtualenv operator: print statements, logging.info, and the special logger (logging.getLogger("airflow.task")). None of them worked for us. We are using Airflow in a Kubernetes setup (very similar to issue #34865). Is the fix provided by SamJia (adding the red-marked lines to the ...)? Given what I read here and in the other issue, I got the feeling that this might be related to the config somehow: with your "standard" config this seems to work for you, but other people had problems. Do you have any idea which config properties might be the culprits here?
-
Apache Airflow version
2.2.3 (latest released)
What happened
Operator logging does not work.
For example, for a task with logging:
the log in webserver is:
No log information is recorded.
In my view, the reason is this: when a TaskInstance is loaded through a session with DagRun.run_id in the function `_get_ti` in airflow/cli/commands/task_command.py, the `ti._log` object is not properly initialized. Then, when `ti.init_run_context` is called, `ti._log` is set to `logging.getLogger(self.__class__.__module__ + '.' + self.__class__.__name__)` by the `log` property of the LoggingMixin class in airflow/utils/log/logging_mixin.py, and the context of `logging.getLogger('airflow.task')` is never set.
Subsequently, in the function `_capture_task_logs` in airflow/cli/commands/task_command.py, the root logger configuration is replaced with the airflow.task configuration, whose context is not set.
This means none of the logging that happens after the replacement is recorded.
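The fallback path described above can be illustrated with a simplified, stand-in version of the LoggingMixin `log` property (this is a sketch for illustration, not the real Airflow class): when `_log` was never set, the logger name is built from the class's module and name instead of being "airflow.task".

```python
import logging

class LoggingMixin:
    """Simplified stand-in for Airflow's LoggingMixin, showing only the
    lazy-logger fallback discussed above."""
    _log = None

    @property
    def log(self):
        # If _log was never initialized, fall back to a logger named
        # after the class, not "airflow.task".
        if self._log is None:
            self._log = logging.getLogger(
                self.__class__.__module__ + "." + self.__class__.__name__
            )
        return self._log

class TaskInstance(LoggingMixin):
    pass

ti = TaskInstance()
print(ti.log.name)  # e.g. "__main__.TaskInstance", not "airflow.task"
```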
Possible way to fix:
In the function `refresh_from_task`, initialize `self._log` as follows:
Moreover, it may also be added in the function `refresh_from_db`.
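The snippet originally attached to the report was not captured here, but a hypothetical sketch of the fix direction described above (eagerly pointing `_log` at the "airflow.task" logger so later `ti.log` lookups do not fall back to the class-named logger) might look like this; `TaskInstanceSketch` and its `refresh_from_task` signature are illustrative, not the real Airflow code:

```python
import logging

class TaskInstanceSketch:
    """Illustrative stand-in for TaskInstance, showing only the proposed
    _log initialization."""
    _log = None

    def refresh_from_task(self, task):
        self.task = task
        # Proposed fix direction: bind the task logger up front so the
        # LoggingMixin fallback never picks a class-named logger.
        self._log = logging.getLogger("airflow.task")

ti = TaskInstanceSketch()
ti.refresh_from_task(task=None)
print(ti._log.name)  # "airflow.task"
```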
What you expected to happen
I wish the log looked like this:
How to reproduce
My config related to logging in airflow.cfg:
My DAG and task:
Operating System
NAME="Ubuntu" VERSION="18.04.4 LTS (Bionic Beaver)" ID=ubuntu ID_LIKE=debian PRETTY_NAME="Ubuntu 18.04.4 LTS" VERSION_ID="18.04" HOME_URL="https://www.ubuntu.com/" SUPPORT_URL="https://help.ubuntu.com/" BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/" PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy" VERSION_CODENAME=bionic UBUNTU_CODENAME=bionic
Versions of Apache Airflow Providers
None
Deployment
Virtualenv installation
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
Code of Conduct