@@ -172,7 +172,7 @@ Serving logs from workers and triggerer

Most task handlers send logs upon completion of a task. In order to view logs in real time, Airflow starts an HTTP server to serve the logs in the following cases:

-- If ``SequentialExecutor`` or ``LocalExecutor`` is used, then when ``airflow scheduler`` is running.
+- If ``LocalExecutor`` is used, then when ``airflow scheduler`` is running.
- If ``CeleryExecutor`` is used, then when ``airflow worker`` is running.

In the triggerer, logs are served unless the service is started with the ``--skip-serve-logs`` option.
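
For example, components can opt out of the log server when logs are shipped elsewhere; the ``--skip-serve-logs`` flag is documented above for the triggerer and exercised for the scheduler in this PR's tests:

.. code-block:: bash

    # Skip the in-process HTTP log server for these components
    airflow scheduler --skip-serve-logs
    airflow triggerer --skip-serve-logs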
@@ -79,7 +79,7 @@ code you will need to restart those processes. However, it will not be reflected
By default, task execution uses forking. This avoids the slowdown associated with creating a new Python interpreter
and re-parsing all of Airflow's code and startup routines. This approach offers significant benefits, especially for shorter tasks.
This does mean that if you use plugins in your tasks and want them to update, you will either
-need to restart the worker (if using CeleryExecutor) or scheduler (Local or Sequential executors). The other
+need to restart the worker (if using CeleryExecutor) or scheduler (LocalExecutor). The other
option is to accept the speed hit at start up and set the ``core.execute_tasks_new_python_interpreter``
config setting to True, resulting in launching a whole new python interpreter for tasks.
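
As a sketch, assuming the standard ``AIRFLOW__{SECTION}__{KEY}`` environment-variable mapping for config options:

.. code-block:: bash

    # Trade slower task startup for a fresh interpreter (and freshly
    # re-imported plugins) per task.
    export AIRFLOW__CORE__EXECUTE_TASKS_NEW_PYTHON_INTERPRETER=True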

@@ -56,11 +56,8 @@ Once that is done, you can run -
Multi-Node Cluster
==================

-Airflow uses :class:`~airflow.executors.sequential_executor.SequentialExecutor` by default. However, by its
-nature, the user is limited to executing at most one task at a time. ``Sequential Executor`` also pauses
-the scheduler when it runs a task, hence it is not recommended in a production setup. You should use the
-:class:`~airflow.executors.local_executor.LocalExecutor` for a single machine.
-For a multi-node setup, you should use the :doc:`Kubernetes executor <apache-airflow-providers-cncf-kubernetes:kubernetes_executor>` or
+Airflow uses :class:`~airflow.executors.local_executor.LocalExecutor` by default. For a multi-node setup,
+you should use the :doc:`Kubernetes executor <apache-airflow-providers-cncf-kubernetes:kubernetes_executor>` or
the :doc:`Celery executor <apache-airflow-providers-celery:celery_executor>`.
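
For example, assuming the standard ``AIRFLOW__{SECTION}__{KEY}`` environment-variable mapping, a multi-node deployment would select a remote executor like so:

.. code-block:: bash

    # Multi-node setup: pick a remote executor instead of the default
    export AIRFLOW__CORE__EXECUTOR=CeleryExecutor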


8 changes: 5 additions & 3 deletions airflow-core/docs/core-concepts/executor/index.rst
@@ -46,14 +46,17 @@ If you want to check which executor is currently set, you can use the ``airflow
.. code-block:: bash

    $ airflow config get-value core executor
-    SequentialExecutor
+    LocalExecutor



Executor Types
--------------

-There are two types of executors - those that run tasks *locally* (inside the ``scheduler`` process), and those that run their tasks *remotely* (usually via a pool of *workers*). Airflow comes configured with the ``SequentialExecutor`` by default, which is a local executor, and the simplest option for execution. However, the ``SequentialExecutor`` is not suitable for production since it does not allow for parallel task running and due to that, some Airflow features (e.g. running sensors) will not work properly. You should instead use the ``LocalExecutor`` for small, single-machine production installations, or one of the remote executors for a multi-machine/cloud installation.
+There are two types of executors - those that run tasks *locally* (inside the ``scheduler`` process), and those that run their tasks *remotely* (usually via a pool of *workers*). The repo tree ships only one local executor, though custom ones can be written to achieve similar results. Airflow comes configured with the ``LocalExecutor`` by default, which is a local executor and the simplest option for execution. However, because the ``LocalExecutor`` runs task processes inside the scheduler process, it can have an impact on the performance of the scheduler. You can use the ``LocalExecutor``
+for small, single-machine production installations, or one of the remote executors for a multi-machine/cloud installation.
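
As an illustrative check (assuming an installed, configured Airflow environment), the ``ExecutorLoader`` API used later in this PR's ``standalone_command.py`` can report whether the configured executor is local:

.. code-block:: bash

    # import_default_executor_cls() returns (executor_class, source);
    # is_local is the flag standalone mode checks before forcing LocalExecutor.
    python -c "from airflow.executors.executor_loader import ExecutorLoader as L; print(L.import_default_executor_cls()[0].is_local)"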


.. _executor-types-comparison:
@@ -73,7 +76,6 @@ Airflow tasks are run locally within the scheduler process.
    :maxdepth: 1

    local
-    sequential

Remote Executors
^^^^^^^^^^^^^^^^
5 changes: 0 additions & 5 deletions airflow-core/docs/core-concepts/executor/local.rst
@@ -43,11 +43,6 @@ The following strategies are implemented:
| LocalExecutor receives the call to shutdown the executor, a poison token is sent to the
| workers to terminate them. Processes used in this strategy are of class :class:`~airflow.executors.local_executor.QueuedLocalWorker`.

-Arguably, :class:`~airflow.executors.sequential_executor.SequentialExecutor` could be thought of as a ``LocalExecutor`` with limited
-parallelism of just 1 worker, i.e. ``self.parallelism = 1``.
-This option could lead to the unification of the executor implementations, running
-locally, into just one :class:`~airflow.executors.local_executor.LocalExecutor` with multiple modes.

.. note::

When multiple Schedulers are configured with ``executor = LocalExecutor`` in the ``[core]`` section of your ``airflow.cfg``, each Scheduler will run a LocalExecutor. This means tasks would be processed in a distributed fashion across the machines running the Schedulers.
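
Illustratively, assuming the standard environment-variable mapping, the same setting on each scheduler host yields one LocalExecutor per scheduler, as the note above describes:

.. code-block:: bash

    # Set identically on every scheduler machine
    export AIRFLOW__CORE__EXECUTOR=LocalExecutor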
26 changes: 0 additions & 26 deletions airflow-core/docs/core-concepts/executor/sequential.rst

This file was deleted.

2 changes: 1 addition & 1 deletion airflow-core/docs/howto/set-config.rst
@@ -148,7 +148,7 @@ the example below.
.. code-block:: bash

    $ airflow config get-value core executor
-    SequentialExecutor
+    LocalExecutor

.. note::
    For more information on configuration options, see :doc:`../configurations-ref`
6 changes: 3 additions & 3 deletions airflow-core/docs/howto/set-up-database.rst
@@ -72,8 +72,8 @@ Setting up a SQLite Database
----------------------------

SQLite database can be used to run Airflow for development purposes as it does not require any database server
-(the database is stored in a local file). There are many limitations of using the SQLite database (for example
-it only works with Sequential Executor) and it should NEVER be used for production.
+(the database is stored in a local file). There are plenty of limitations of using the SQLite database, which you can
+easily find online, and it should NEVER be used for production.
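
As a development-only sketch (assuming the standard environment-variable mapping and a database file under ``$AIRFLOW_HOME``), followed by a quick check of the sqlite3 version requirement discussed below:

.. code-block:: bash

    # Development only - NEVER use SQLite in production
    export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN="sqlite:///$AIRFLOW_HOME/airflow.db"
    # Print the SQLite version linked into your Python interpreter
    python -c "import sqlite3; print(sqlite3.sqlite_version)"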

There is a minimum version of sqlite3 required to run Airflow 2.0+: 3.15.0. Some of the
older systems have an earlier version of sqlite installed by default, and for those systems you need to manually
@@ -468,5 +468,5 @@ Caution
What's next?
------------

-By default, Airflow uses ``SequentialExecutor``, which does not provide parallelism. You should consider
+By default, Airflow uses ``LocalExecutor``. You should consider
configuring a different :doc:`executor </core-concepts/executor/index>` for better performance.
1 change: 0 additions & 1 deletion airflow-core/docs/redirects.txt
@@ -146,7 +146,6 @@ executor/dask.rst core-concepts/executor/dask.rst
executor/debug.rst core-concepts/executor/debug.rst
executor/celery.rst core-concepts/executor/celery.rst
executor/local.rst core-concepts/executor/local.rst
-executor/sequential.rst core-concepts/executor/sequential.rst
upgrading-from-1-10/upgrade-check.rst howto/upgrading-from-1-10/upgrade-check.rst
core-concepts/dag-run.rst authoring-and-scheduling/cron.rst
core-concepts/executor/debug.rst core-concepts/debug.rst
8 changes: 0 additions & 8 deletions airflow-core/docs/start.rst
@@ -102,14 +102,6 @@ the ``Admin->Configuration`` menu. The PID file for the webserver will be stored
in ``$AIRFLOW_HOME/airflow-webserver.pid`` or in ``/run/airflow/webserver.pid``
if started by systemd.

-Out of the box, Airflow uses a SQLite database, which you should outgrow
-fairly quickly since no parallelization is possible using this database
-backend. It works in conjunction with the
-:class:`~airflow.executors.sequential_executor.SequentialExecutor` which will
-only run task instances sequentially. While this is very limiting, it allows
-you to get up and running quickly and take a tour of the UI and the
-command line utilities.

As you grow and deploy Airflow to production, you will also want to move away
from the ``standalone`` command we use here to running the components
separately. You can read more in :doc:`/administration-and-deployment/production-deployment`.
8 changes: 2 additions & 6 deletions airflow-core/src/airflow/cli/commands/standalone_command.py
@@ -172,12 +172,8 @@ def calculate_env(self):
        # Make sure we're using a local executor flavour
        executor_class, _ = ExecutorLoader.import_default_executor_cls()
        if not executor_class.is_local:
-            if "sqlite" in conf.get("database", "sql_alchemy_conn"):
-                self.print_output("standalone", "Forcing executor to SequentialExecutor")
-                env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.SEQUENTIAL_EXECUTOR
-            else:
-                self.print_output("standalone", "Forcing executor to LocalExecutor")
-                env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.LOCAL_EXECUTOR
+            self.print_output("standalone", "Forcing executor to LocalExecutor")
+            env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.LOCAL_EXECUTOR

        # Make sure we're using SimpleAuthManager
        simple_auth_manager_classpath = (
2 changes: 1 addition & 1 deletion airflow-core/src/airflow/config_templates/config.yml
@@ -70,7 +70,7 @@ core:
  executor:
    description: |
      The executor class that airflow should use. Choices include
-      ``SequentialExecutor``, ``LocalExecutor``, ``CeleryExecutor``,
+      ``LocalExecutor``, ``CeleryExecutor``,
      ``KubernetesExecutor`` or the full import path to the class when using a custom executor.
    version_added: ~
    type: string
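
For example, a custom executor is selected by its full import path; the module path below is hypothetical:

.. code-block:: bash

    # Hypothetical custom executor referenced by full import path
    export AIRFLOW__CORE__EXECUTOR=my_company.executors.MyCustomExecutor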
2 changes: 0 additions & 2 deletions airflow-core/src/airflow/executors/executor_constants.py
@@ -28,13 +28,11 @@ class ConnectorSource(Enum):


LOCAL_EXECUTOR = "LocalExecutor"
-SEQUENTIAL_EXECUTOR = "SequentialExecutor"
CELERY_EXECUTOR = "CeleryExecutor"
KUBERNETES_EXECUTOR = "KubernetesExecutor"
MOCK_EXECUTOR = "MockExecutor"
CORE_EXECUTOR_NAMES = {
    LOCAL_EXECUTOR,
-    SEQUENTIAL_EXECUTOR,
    CELERY_EXECUTOR,
    KUBERNETES_EXECUTOR,
    MOCK_EXECUTOR,
2 changes: 0 additions & 2 deletions airflow-core/src/airflow/executors/executor_loader.py
@@ -28,7 +28,6 @@
    CORE_EXECUTOR_NAMES,
    KUBERNETES_EXECUTOR,
    LOCAL_EXECUTOR,
-    SEQUENTIAL_EXECUTOR,
    ConnectorSource,
)
from airflow.executors.executor_utils import ExecutorName
@@ -56,7 +55,6 @@ class ExecutorLoader:

    executors = {
        LOCAL_EXECUTOR: "airflow.executors.local_executor.LocalExecutor",
-        SEQUENTIAL_EXECUTOR: "airflow.executors.sequential_executor.SequentialExecutor",
        CELERY_EXECUTOR: "airflow.providers.celery.executors.celery_executor.CeleryExecutor",
        KUBERNETES_EXECUTOR: "airflow.providers.cncf.kubernetes."
        "executors.kubernetes_executor.KubernetesExecutor",
101 changes: 0 additions & 101 deletions airflow-core/src/airflow/executors/sequential_executor.py

This file was deleted.

@@ -43,7 +43,6 @@ def setup_class(cls):
        [
            ("CeleryExecutor", False),
            ("LocalExecutor", True),
-            ("SequentialExecutor", True),
            ("KubernetesExecutor", False),
        ],
    )
@@ -64,7 +63,7 @@ def test_serve_logs_on_scheduler(self, mock_process, mock_scheduler_job, executo

@mock.patch("airflow.cli.commands.scheduler_command.SchedulerJobRunner")
@mock.patch("airflow.cli.commands.scheduler_command.Process")
@pytest.mark.parametrize("executor", ["LocalExecutor", "SequentialExecutor"])
@pytest.mark.parametrize("executor", ["LocalExecutor"])
def test_skip_serve_logs(self, mock_process, mock_scheduler_job, executor):
mock_scheduler_job.return_value.job_type = "SchedulerJob"
args = self.parser.parse_args(["scheduler", "--skip-serve-logs"])
@@ -100,7 +99,7 @@ def test_check_migrations_is_true(self, mock_process, mock_scheduler_job, mock_l

@mock.patch("airflow.cli.commands.scheduler_command.SchedulerJobRunner")
@mock.patch("airflow.cli.commands.scheduler_command.Process")
@pytest.mark.parametrize("executor", ["LocalExecutor", "SequentialExecutor"])
@pytest.mark.parametrize("executor", ["LocalExecutor"])
def test_graceful_shutdown(self, mock_process, mock_scheduler_job, executor):
mock_scheduler_job.return_value.job_type = "SchedulerJob"
args = self.parser.parse_args(["scheduler"])
20 changes: 5 additions & 15 deletions airflow-core/tests/unit/cli/commands/test_standalone_command.py
@@ -28,33 +28,23 @@
    CELERY_EXECUTOR,
    KUBERNETES_EXECUTOR,
    LOCAL_EXECUTOR,
-    SEQUENTIAL_EXECUTOR,
)


class TestStandaloneCommand:
    @pytest.mark.parametrize(
-        "conf_executor_name, conf_sql_alchemy_conn, expected_standalone_executor",
-        [
-            (LOCAL_EXECUTOR, "sqlite_conn_string", LOCAL_EXECUTOR),
-            (SEQUENTIAL_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
-            (CELERY_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
-            (KUBERNETES_EXECUTOR, "sqlite_conn_string", SEQUENTIAL_EXECUTOR),
-            (LOCAL_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
-            (SEQUENTIAL_EXECUTOR, "other_db_conn_string", SEQUENTIAL_EXECUTOR),
-            (CELERY_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
-            (KUBERNETES_EXECUTOR, "other_db_conn_string", LOCAL_EXECUTOR),
-        ],
+        "conf_executor_name",
+        [LOCAL_EXECUTOR, CELERY_EXECUTOR, KUBERNETES_EXECUTOR],
    )
-    def test_calculate_env(self, conf_executor_name, conf_sql_alchemy_conn, expected_standalone_executor):
+    def test_calculate_env(self, conf_executor_name):
"""Should always force a local executor compatible with the db."""
with mock.patch.dict(
"os.environ",
{
"AIRFLOW__CORE__EXECUTOR": conf_executor_name,
"AIRFLOW__DATABASE__SQL_ALCHEMY_CONN": conf_sql_alchemy_conn,
},
):
reload(executor_loader)
env = StandaloneCommand().calculate_env()
assert env["AIRFLOW__CORE__EXECUTOR"] == expected_standalone_executor
+            # All non-local executors fall back to LocalExecutor.
+            assert env["AIRFLOW__CORE__EXECUTOR"] == LOCAL_EXECUTOR
2 changes: 0 additions & 2 deletions airflow-core/tests/unit/cli/test_cli_parser.py
@@ -352,7 +352,6 @@ def test_positive_int(self):
    )
    def test_executor_specific_commands_not_accessible(self, command):
        with (
-            conf_vars({("core", "executor"): "SequentialExecutor"}),
            contextlib.redirect_stderr(StringIO()) as stderr,
        ):
            reload(executor_loader)
@@ -369,7 +368,6 @@ def test_executor_specific_commands_not_accessible(self, command):
("CeleryExecutor", ["celery"]),
("KubernetesExecutor", ["kubernetes"]),
("LocalExecutor", []),
("SequentialExecutor", []),
# custom executors are mapped to the regular ones in `conftest.py`
("custom_executor.CustomLocalExecutor", []),
("custom_executor.CustomCeleryExecutor", ["celery"]),