Add Traceback in LogRecord in JSONFormatter
#15414
Conversation
Currently the traceback is not included when ``JSONFormatter`` is used (`[logging] json_format = True`), although the default handler includes the stack trace. To include the trace today we need to add `json_fields = asctime, filename, lineno, levelname, message, exc_text`. This is a bigger problem when using Elasticsearch logging with:

```ini
[elasticsearch]
write_stdout = True
json_format = True
json_fields = asctime, filename, lineno, levelname, message, exc_text

[logging]
log_format = [%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s - %(exc_text)s
```

Running the following DAG with the above config won't show the trace:

```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id='example_error',
    schedule_interval=None,
    start_date=days_ago(2),
) as dag:

    def raise_error(**kwargs):
        raise Exception("I am an exception from task logs")

    task_1 = PythonOperator(
        task_id='task_1',
        python_callable=raise_error,
    )
```

Before:

```
[2021-04-17 00:11:00,152] {taskinstance.py:877} INFO - Dependencies all met for <TaskInstance: example_python_operator.print_the_context 2021-04-17T00:10:57.110189+00:00 [queued]>
...
...
[2021-04-17 00:11:00,298] {taskinstance.py:1482} ERROR - Task failed with exception
[2021-04-17 00:11:00,300] {taskinstance.py:1532} INFO - Marking task as FAILED. dag_id=example_python_operator, task_id=print_the_context, execution_date=20210417T001057, start_date=20210417T001100, end_date=20210417T001100
[2021-04-17 00:11:00,325] {local_task_job.py:146} INFO - Task exited with return code 1
```

After:

```
[2021-04-17 00:11:00,152] {taskinstance.py:877} INFO - Dependencies all met for <TaskInstance: example_python_operator.print_the_context 2021-04-17T00:10:57.110189+00:00 [queued]>
...
...
[2021-04-17 00:11:00,298] {taskinstance.py:1482} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 117, in execute
    return_value = self.execute_callable()
  File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 128, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/usr/local/airflow/dags/eg-2.py", line 25, in print_context
    raise Exception("I am an exception from task logs")
Exception: I am an exception from task logs
[2021-04-17 00:11:00,300] {taskinstance.py:1532} INFO - Marking task as FAILED. dag_id=example_python_operator, task_id=print_the_context, execution_date=20210417T001057, start_date=20210417T001100, end_date=20210417T001100
[2021-04-17 00:11:00,325] {local_task_job.py:146} INFO - Task exited with return code 1
```
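The behaviour described above can be reproduced outside Airflow with the standard `logging` module. The sketch below uses a hypothetical `JSONFormatter` (not Airflow's actual class) that, like the `json_fields` option, serialises only the named record attributes, so the traceback is silently dropped unless `exc_text` is listed explicitly:

```python
import json
import logging
import sys


class JSONFormatter(logging.Formatter):
    """Minimal sketch: emit only the requested LogRecord attributes as JSON."""

    def __init__(self, json_fields):
        super().__init__()
        self.json_fields = json_fields

    def format(self, record):
        # Formatter.format() populates record.message and, when exc_info is
        # present, caches the rendered traceback on record.exc_text.
        super().format(record)
        return json.dumps({f: getattr(record, f, None) for f in self.json_fields})


def make_error_record():
    """Build a LogRecord carrying exception info, as log.exception() would."""
    try:
        raise Exception("I am an exception from task logs")
    except Exception:
        return logging.LogRecord(
            "demo", logging.ERROR, __file__, 1,
            "Task failed with exception", None, sys.exc_info(),
        )


# Traceback is dropped unless exc_text is explicitly listed in json_fields:
without = JSONFormatter(["levelname", "message"]).format(make_error_record())
with_tb = JSONFormatter(["levelname", "message", "exc_text"]).format(make_error_record())
```

With the first formatter the JSON line contains only the bare error message; only the second, with `exc_text` in its field list, carries the traceback.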
```python
msg = record_dict["message"]
if record.exc_text:
    if msg[-1:] != "\n":
        msg = msg + "\n"
    msg = msg + record.exc_text
if record.stack_info:
    if msg[-1:] != "\n":
        msg = msg + "\n"
    msg = msg + self.formatStack(record.stack_info)
```
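For illustration, the appended logic can be exercised with a plain `LogRecord`; `append_exc_and_stack` is a hypothetical stand-in for the method body above (with `self.formatStack` replaced by a passed-in formatter), mirroring how `logging.Formatter.format()` folds `exc_text` and `stack_info` into the rendered message:

```python
import logging
import sys


def append_exc_and_stack(record_dict, record, formatter):
    """Sketch of the patched logic: fold exc_text and stack_info into
    the message field, one newline-separated block each."""
    msg = record_dict["message"]
    if record.exc_text:
        if msg[-1:] != "\n":
            msg = msg + "\n"
        msg = msg + record.exc_text
    if record.stack_info:
        if msg[-1:] != "\n":
            msg = msg + "\n"
        msg = msg + formatter.formatStack(record.stack_info)
    return msg


try:
    raise ValueError("boom")
except ValueError:
    record = logging.LogRecord(
        "demo", logging.ERROR, __file__, 1, "Task failed", None, sys.exc_info()
    )

formatter = logging.Formatter()
formatter.format(record)  # populates record.message and record.exc_text
msg = append_exc_and_stack({"message": record.getMessage()}, record, formatter)
```

The resulting `msg` starts with the original message and carries the full traceback after a newline, which is exactly what the JSON `message` field gains from this PR.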
The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly - please rebase it to the latest master at your convenience, or amend the last commit of the PR, and push it with `--force-with-lease`.
(cherry picked from commit 99ec208)
(cherry picked from commit cb2d35a09789492b579925b4bf808cdd6b82833e)
Currently the traceback is not included when ``JSONFormatter`` is used (`[logging] json_format = True`), although the default handler includes the stack trace. Running the DAG above with that config won't show the trace (because of the fix in #14456); the "Before" and "After fix in this PR" log excerpts above show the difference. To include the trace without this PR we need to add `json_fields = asctime, filename, lineno, levelname, message, exc_text`. The change in this commit removes the need to add `exc_text`, which otherwise renders a literal `- None` on every message without exception info. This is a bigger problem when using Elasticsearch logging, as shown above.
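The `- None` artifact can be reproduced with a plain `logging.Formatter` (a minimal sketch, independent of Airflow): `exc_text` defaults to `None` on a `LogRecord` without exception info, so `%(exc_text)s` in the format string renders the literal string `None` on every ordinary line:

```python
import logging

# Same trailing pattern as the [logging] log_format option, minus the
# timestamp/file fields, to keep the output short.
formatter = logging.Formatter("%(levelname)s - %(message)s - %(exc_text)s")

# An ordinary record with no exception attached: exc_text defaults to None.
record = logging.LogRecord("demo", logging.INFO, __file__, 1, "all good", None, None)
line = formatter.format(record)
print(line)  # INFO - all good - None
```

This is why having the formatter fold `exc_text` into `message` itself, rather than asking users to put `exc_text` in the format string, is the cleaner fix.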