28 changes: 0 additions & 28 deletions airflow-core/docs/core-concepts/debug.rst
@@ -69,15 +69,6 @@ is manually ingested. The cleanup step is also skipped, making the intermediate
run = dag.test(mark_success_pattern="wait_for_.*|cleanup")
print(f"Intermediate csv: {run.get_task_instance('collect_stats').xcom_pull(task_id='collect_stats')}")

Comparison with DebugExecutor
-----------------------------

The ``dag.test`` command has the following benefits over the :class:`~airflow.executors.debug_executor.DebugExecutor`
class, which is now deprecated:

1. It does not require running an executor at all. Tasks are run one at a time with no executor or scheduler logs.
2. It is faster than running code with a DebugExecutor as it does not need to go through a scheduler loop.

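A minimal sketch of the pattern (the DAG id, task, and the Airflow 3 ``airflow.sdk`` import path below are illustrative assumptions, not taken from this page):

.. code-block:: python

    import datetime

    # Airflow 3 import path; on Airflow 2 use ``from airflow import DAG``
    # and ``from airflow.decorators import task``.
    from airflow.sdk import DAG, task

    with DAG(dag_id="debug_me", start_date=datetime.datetime(2024, 1, 1), schedule=None) as dag:

        @task
        def say_hello():
            print("hello from dag.test()")

        say_hello()

    if __name__ == "__main__":
        # Runs the whole DAG in a single process, one task at a time,
        # with no scheduler or separate executor involved.
        dag.test()

The same ``main`` block is what makes the file runnable for the command-line and IDE debugging workflows described below.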

Debugging Airflow dags on the command line
******************************************
@@ -98,25 +89,6 @@ Run ``python -m pdb <path to dag file>.py`` for an interactive debugging experience
(Pdb) run_this_last
<Task(EmptyOperator): run_this_last>

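If stepping through the whole file in ``pdb`` is too coarse, the standard-library ``breakpoint()`` call can be placed inside a task instead; a small sketch, where ``collect_stats`` is just an illustrative task name:

.. code-block:: python

    def collect_stats():
        # Pauses here in pdb when the task runs in-process, e.g. under
        # ``dag.test()`` or ``python -m pdb``; remove before deploying.
        breakpoint()
        ...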
.. _executor:DebugExecutor:

Debug Executor (deprecated)
***************************

The :class:`~airflow.executors.debug_executor.DebugExecutor` is meant as
a debug tool and can be used from an IDE. It is a single-process executor that
queues :class:`~airflow.models.taskinstance.TaskInstance` objects and executes them by running
the ``_run_raw_task`` method.

Due to its nature, the executor can be used with a SQLite database. When used
with sensors, the executor will change the sensor mode to ``reschedule`` to avoid
blocking the execution of the DAG.

Additionally, ``DebugExecutor`` can be used in a fail-fast mode that makes all other
running or scheduled tasks fail immediately once one task fails. To enable this option, set
``AIRFLOW__DEBUG__FAIL_FAST=True`` or adjust the ``fail_fast`` option in your ``airflow.cfg``.
For more information on setting the configuration, see :doc:`../../howto/set-config`.
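Expressed in ``airflow.cfg``, the same option looks like this (the section and key match the ``debug.fail_fast`` entry in ``config.yml``):

.. code-block:: ini

    [debug]
    fail_fast = True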

**IDE setup steps:**

1. Add a ``main`` block at the end of your DAG file to make it runnable.
11 changes: 0 additions & 11 deletions airflow-core/src/airflow/config_templates/config.yml
@@ -1290,17 +1290,6 @@ secrets:
type: integer
example: ~
default: "900"
debug:
description: ~
options:
fail_fast:
description: |
Used only with ``DebugExecutor``. If set to ``True``, the DAG will fail with the
first failed task. Helpful for debugging purposes.
version_added: 1.10.8
type: string
example: ~
default: "False"
api:
description: ~
options:
155 changes: 0 additions & 155 deletions airflow-core/src/airflow/executors/debug_executor.py

This file was deleted.

2 changes: 0 additions & 2 deletions airflow-core/src/airflow/executors/executor_constants.py
@@ -31,13 +31,11 @@ class ConnectorSource(Enum):
SEQUENTIAL_EXECUTOR = "SequentialExecutor"
CELERY_EXECUTOR = "CeleryExecutor"
KUBERNETES_EXECUTOR = "KubernetesExecutor"
DEBUG_EXECUTOR = "DebugExecutor"
MOCK_EXECUTOR = "MockExecutor"
CORE_EXECUTOR_NAMES = {
LOCAL_EXECUTOR,
SEQUENTIAL_EXECUTOR,
CELERY_EXECUTOR,
KUBERNETES_EXECUTOR,
DEBUG_EXECUTOR,
MOCK_EXECUTOR,
}
2 changes: 0 additions & 2 deletions airflow-core/src/airflow/executors/executor_loader.py
@@ -26,7 +26,6 @@
from airflow.executors.executor_constants import (
CELERY_EXECUTOR,
CORE_EXECUTOR_NAMES,
DEBUG_EXECUTOR,
KUBERNETES_EXECUTOR,
LOCAL_EXECUTOR,
SEQUENTIAL_EXECUTOR,
@@ -61,7 +60,6 @@ class ExecutorLoader:
CELERY_EXECUTOR: "airflow.providers.celery.executors.celery_executor.CeleryExecutor",
KUBERNETES_EXECUTOR: "airflow.providers.cncf.kubernetes."
"executors.kubernetes_executor.KubernetesExecutor",
DEBUG_EXECUTOR: "airflow.executors.debug_executor.DebugExecutor",
}

@classmethod
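For context on what the removed mapping entry did: ``ExecutorLoader`` resolves a core executor name to its import path through the dictionary shown above, so ``DebugExecutor`` no longer resolves as a core executor after this change. A rough sketch of the lookup for a still-supported name, assuming the ``load_executor`` API that accepts a plain name string:

from airflow.executors.executor_loader import ExecutorLoader

# Resolves "LocalExecutor" through the core mapping above and returns an instance;
# "DebugExecutor" is no longer a recognized core name after this change.
executor = ExecutorLoader.load_executor("LocalExecutor")
print(type(executor).__name__)  # LocalExecutor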
@@ -26,7 +26,6 @@
from airflow.executors import executor_loader
from airflow.executors.executor_constants import (
CELERY_EXECUTOR,
DEBUG_EXECUTOR,
KUBERNETES_EXECUTOR,
LOCAL_EXECUTOR,
SEQUENTIAL_EXECUTOR,
@@ -41,12 +40,10 @@ class TestStandaloneCommand:
(SEQUENTIAL_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
(CELERY_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
(KUBERNETES_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
(DEBUG_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
(LOCAL_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
(SEQUENTIAL_EXECUTOR, "other_db_conn_string", SEQUENTIAL_EXECUTOR),
(CELERY_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
(KUBERNETES_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
(DEBUG_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
],
)
def test_calculate_env(self, conf_executor_name, conf_sql_alchemy_conn, expected_standalone_executor):