diff --git a/contributing-docs/03_contributors_quick_start.rst b/contributing-docs/03_contributors_quick_start.rst
index acc33accc3346..bffdbc5bb30c2 100644
--- a/contributing-docs/03_contributors_quick_start.rst
+++ b/contributing-docs/03_contributors_quick_start.rst
@@ -510,8 +510,8 @@ Using Breeze

 1. Starting the Breeze environment using ``breeze start-airflow`` starts the Breeze environment with last configuration run(
    In this case Python version and backend are picked up from last execution ``breeze --python 3.9 --backend postgres``)
-   It also automatically starts the webserver, triggerer, dag processor, FastAPI api and scheduler. It drops you in tmux with triggerer to the right, and
-   Scheduler, FastAPI API, DAG processor and webserver from left to right at the bottom. Use ``[Ctrl + B] and Arrow keys`` to navigate.
+   It also automatically starts the API server (FastAPI API and UI), triggerer, Dag processor and scheduler. It drops you in tmux with triggerer to the right, and
+   Scheduler, API server (FastAPI API and UI), and Dag processor from left to right at the bottom. Use ``[Ctrl + B] and Arrow keys`` to navigate.

    .. code-block:: bash

diff --git a/contributing-docs/05_pull_requests.rst b/contributing-docs/05_pull_requests.rst
index fbd74ffb09a20..a0aa446ba1252 100644
--- a/contributing-docs/05_pull_requests.rst
+++ b/contributing-docs/05_pull_requests.rst
@@ -246,7 +246,7 @@ Airflow Operators might have some fields added to the list of ``template_fields`
 set in the constructor (``__init__`` method) of the operator and usually their values should come from
 the ``__init__`` method arguments. The reason for that is that the templated fields are evaluated
 at the time of the operator execution and when you pass arguments to the operator
-in the DAG, the fields that are set on the class just before the ``execute`` method is called
+in the Dag, the fields that are set on the class just before the ``execute`` method is called
 are processed through templating engine and the fields values are set to the result of applying
 the templating engine to the fields (in case the field is a structure such as dict or list, the
 templating engine is applied to all the values of the structure).
diff --git a/contributing-docs/09_testing.rst b/contributing-docs/09_testing.rst
index 7496acb1c47f7..842d009d1183b 100644
--- a/contributing-docs/09_testing.rst
+++ b/contributing-docs/09_testing.rst
@@ -42,7 +42,7 @@ includes:
   rendered correctly for various configuration parameters.
 * `System tests `__ are automatic tests that use external systems like
-  Google Cloud and AWS. These tests are intended for an end-to-end DAG execution.
+  Google Cloud and AWS. These tests are intended for an end-to-end Dag execution.

 You can also run other kinds of tests when you are developing airflow packages:
@@ -52,7 +52,7 @@ You can also run other kinds of tests when you are developing airflow packages:
 * `Python client tests `__ are tests we run to check if the Python API
   client works correctly.

-* `DAG testing `__ is a document that describes how to test DAGs in a local environment
+* `Dag testing `__ is a document that describes how to test Dags in a local environment
   with ``dag.test()``.

 ------
diff --git a/contributing-docs/12_provider_distributions.rst b/contributing-docs/12_provider_distributions.rst
index 434ebccbe99a2..18bed77d03168 100644
--- a/contributing-docs/12_provider_distributions.rst
+++ b/contributing-docs/12_provider_distributions.rst
@@ -235,7 +235,7 @@ The rules are as follows:
       * PROVIDER
         * system
           * PROVIDER
-            * example_dags -> example DAGs are stored here (used for documentation and System Tests)
+            * example_dags -> example Dags are stored here (used for documentation and System Tests)

 * Module names do not contain word "hooks", "operators" etc. The right type comes
   from the python package. For example 'hooks.datastore' module contains DataStore hook and
@@ -293,7 +293,7 @@ Well documented provider contains those:
 You can see for example ``google`` provider which has very comprehensive documentation:

 * `Documentation <../../providers/google/docs>`_
-* `System tests/Example DAGs <../providers/google/tests/system/google/>`_
+* `System tests/Example Dags <../providers/google/tests/system/google/>`_

 Part of the documentation are example dags (placed in the ``tests/system`` folder). The reason why they are
 in ``tests/system`` is because we are using the example dags for various purposes:
diff --git a/contributing-docs/13_airflow_dependencies_and_extras.rst b/contributing-docs/13_airflow_dependencies_and_extras.rst
index 0cb3d26820307..4cfb717b3feb6 100644
--- a/contributing-docs/13_airflow_dependencies_and_extras.rst
+++ b/contributing-docs/13_airflow_dependencies_and_extras.rst
@@ -29,7 +29,7 @@ For applications, pinning the dependencies makes it more stable to install in th
 be open to allow several different libraries with the same requirements to be installed at the same time.

 The problem is that Apache Airflow is a bit of both - application to install and library to be used when
-you are developing your own operators and DAGs.
+you are developing your own operators and Dags.

 This - seemingly unsolvable - puzzle is solved by having pinned constraints files.
diff --git a/contributing-docs/16_adding_api_endpoints.rst b/contributing-docs/16_adding_api_endpoints.rst
index 54cbe6163e722..408b8c791ba25 100644
--- a/contributing-docs/16_adding_api_endpoints.rst
+++ b/contributing-docs/16_adding_api_endpoints.rst
@@ -54,7 +54,7 @@ Example:
         paused: bool | None = None,
         order_by: str = "dag_id",
         session: SessionDep,
-    ) -> DAGCollectionResponse:
+    ) -> DagCollectionResponse:
         pass
@@ -88,8 +88,8 @@ In some cases, you may need to define additional models for new data structures.

 .. code-block:: python

-    class DAGModelResponse(BaseModel):
-        """DAG serializer for responses."""
+    class DagModelResponse(BaseModel):
+        """Dag serializer for responses."""

         dag_id: str
         dag_display_name: str
diff --git a/contributing-docs/20_debugging_airflow_components.rst b/contributing-docs/20_debugging_airflow_components.rst
index c6502f787fb2a..b24b983428735 100644
--- a/contributing-docs/20_debugging_airflow_components.rst
+++ b/contributing-docs/20_debugging_airflow_components.rst
@@ -46,10 +46,10 @@ To start Airflow with debugging enabled, use the ``--debug`` flag to specify whi
 Available Components for Debugging
 ----------------------------------

-* **scheduler** - The Airflow scheduler that monitors DAGs and triggers task instances
+* **scheduler** - The Airflow scheduler that monitors Dags and triggers task instances
 * **triggerer** - The triggerer service that handles deferred tasks and triggers
 * **api-server** - The Airflow REST API server
-* **dag-processor** - The DAG processor service (when using standalone DAG processor)
+* **dag-processor** - The Dag processor service (when using standalone Dag processor)
 * **edge-worker** - The edge worker service (when using EdgeExecutor)
 * **celery-worker** - Celery worker processes (when using CeleryExecutor)

@@ -113,7 +113,7 @@ Setting up VSCode for Remote Debugging
    when you start Airflow with debugging enabled:

    * **Scheduler**: 50231
-   * **DAG Processor**: 50232
+   * **Dag Processor**: 50232
    * **Triggerer**: 50233
    * **API Server**: 50234
    * **Celery Worker**: 50235
@@ -147,10 +147,10 @@ Debugging Workflow
    Perform an action that will trigger the code path with your breakpoint:

-   - For scheduler: Trigger a DAG or wait for scheduled execution
+   - For scheduler: Trigger a Dag or wait for scheduled execution
    - For API server: Make an API call
    - For triggerer: Create a deferred task
-   - For DAG processor: Parse a DAG file
+   - For Dag processor: Parse a Dag file

 5. **Debug Session**
diff --git a/contributing-docs/quick-start-ide/contributors_quick_start_pycharm.rst b/contributing-docs/quick-start-ide/contributors_quick_start_pycharm.rst
index a7e8de7c1614c..276ed80fd83b6 100644
--- a/contributing-docs/quick-start-ide/contributors_quick_start_pycharm.rst
+++ b/contributing-docs/quick-start-ide/contributors_quick_start_pycharm.rst
@@ -152,7 +152,7 @@ It requires "airflow-env" virtual environment configured locally.
 - Now set ``sql_alchemy_conn = mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4`` in file
   ``~/airflow/airflow.cfg`` on local machine.

-2. Debugging an example DAG
+2. Debugging an example Dag

 - Add Interpreter to PyCharm pointing interpreter path to ``~/.pyenv/versions/airflow-env/bin/python``, which is
   virtual environment ``airflow-env`` created with pyenv earlier. For adding an Interpreter go to ``File -> Setting -> Project:
@@ -165,13 +165,13 @@ It requires "airflow-env" virtual environment configured locally.
       alt="Adding existing interpreter">

-- In PyCharm IDE open airflow project, directory ``/files/dags`` of local machine is by default mounted to docker
-  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+- In PyCharm IDE open Airflow project, directory ``/files/dags`` of local machine is by default mounted to docker
+  machine when breeze Airflow is started. So any Dag file present in this directory will be picked automatically by
   scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.

-- Copy any example DAG present in the ``/airflow/example_dags`` directory to ``/files/dags/``.
+- Copy any example Dag present in the ``/airflow/example_dags`` directory to ``/files/dags/``.

-- Add a ``__main__`` block at the end of your DAG file to make it runnable:
+- Add a ``__main__`` block at the end of your Dag file to make it runnable:

 .. code-block:: python
diff --git a/contributing-docs/quick-start-ide/contributors_quick_start_vscode.rst b/contributing-docs/quick-start-ide/contributors_quick_start_vscode.rst
index 34991d027155c..e95fe7b722524 100644
--- a/contributing-docs/quick-start-ide/contributors_quick_start_vscode.rst
+++ b/contributing-docs/quick-start-ide/contributors_quick_start_vscode.rst
@@ -93,15 +93,15 @@ Setting up debugging
 - Now set ``sql_alchemy_conn = mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4`` in file
   ``~/airflow/airflow.cfg`` on local machine.

-1. Debugging an example DAG
+1. Debugging an example Dag

-- In Visual Studio Code open airflow project, directory ``/files/dags`` of local machine is by default mounted to docker
-  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+- In Visual Studio Code open Airflow project, directory ``/files/dags`` of local machine is by default mounted to docker
+  machine when breeze Airflow is started. So any Dag file present in this directory will be picked automatically by
   scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.

-- Copy any example DAG present in the ``/airflow/example_dags`` directory to ``/files/dags/``.
+- Copy any example Dag present in the ``/airflow/example_dags`` directory to ``/files/dags/``.

-- Add a ``__main__`` block at the end of your DAG file to make it runnable. It will run a ``back_fill`` job:
+- Add a ``__main__`` block at the end of your Dag file to make it runnable. It will run a ``back_fill`` job:

 .. code-block:: python
diff --git a/contributing-docs/testing/dag_testing.rst b/contributing-docs/testing/dag_testing.rst
index 8fbe8a27efb8a..522cde734b4d9 100644
--- a/contributing-docs/testing/dag_testing.rst
+++ b/contributing-docs/testing/dag_testing.rst
@@ -16,15 +16,15 @@
    specific language governing permissions and limitations
    under the License.

-DAG Testing
+Dag Testing
 ===========

-To ease and speed up the process of developing DAGs, you can use
+To ease and speed up the process of developing Dags, you can use
 py:meth:`~airflow.models.dag.DAG.test`, which will run a dag in a single process.

 To set up the IDE:

-1. Add ``main`` block at the end of your DAG file to make it runnable.
+1. Add ``main`` block at the end of your Dag file to make it runnable.

 .. code-block:: python

@@ -32,7 +32,7 @@ To set up the IDE:
         dag.test()

-2. Run and debug the DAG file.
+2. Run and debug the Dag file.

 You can also run the dag in the same manner with the Airflow CLI command ``airflow dags test``:
@@ -46,7 +46,7 @@ By default ``/files/dags`` folder is mounted from your local ``
 the directory used by airflow scheduler and webserver to scan dags for.
 You can place your dags there to test them.

-The DAGs can be run in the main version of Airflow but they also work
+The Dags can be run in the main version of Airflow but they also work
 with older versions.
 -----
diff --git a/contributing-docs/testing/docker_compose_tests.rst b/contributing-docs/testing/docker_compose_tests.rst
index 921a3cafb193b..8e86a8ce0e42b 100644
--- a/contributing-docs/testing/docker_compose_tests.rst
+++ b/contributing-docs/testing/docker_compose_tests.rst
@@ -34,8 +34,8 @@ The way the tests work:
 1. They first build the Airflow production image
 2. Then they take the Docker Compose file of ours and use the image to start it
-3. Then they perform some simple DAG trigger tests which checks whether Airflow is up and can process
-   an example DAG
+3. Then they perform some simple Dag trigger tests which check whether Airflow is up and can process
+   an example Dag

 This is done in a local environment, not in the Breeze CI image. It uses ``COMPOSE_PROJECT_NAME`` set to
 ``quick-start`` to avoid conflicts with other docker compose deployments you might have.
@@ -95,7 +95,7 @@ but make sure to use the docker-compose file from the sources in
 ``docs/apache-airflow/stable/howto/docker-compose/`` folder.

 Then, the usual ``docker compose`` and ``docker`` commands can be used to debug such running instances.
-The test performs a simple API call to trigger a DAG and wait for it, but you can follow our
+The test performs a simple API call to trigger a Dag and wait for it, but you can follow our
 documentation to connect to such running docker compose instances and test it manually.

 -----
diff --git a/contributing-docs/testing/python_client_tests.rst b/contributing-docs/testing/python_client_tests.rst
index 9731fc3dd366d..2b4851dd0c83c 100644
--- a/contributing-docs/testing/python_client_tests.rst
+++ b/contributing-docs/testing/python_client_tests.rst
@@ -30,7 +30,7 @@ The way the tests work:
 1. The Airflow Python API client package is first built into a wheel file and placed in the dist folder.
 2. The ``breeze testing python-api-client-tests`` command is used to initiate the tests.
 3. This command installs the package from the dist folder.
-4. Example DAGs are then parsed and executed to validate the Python API client.
+4. Example Dags are then parsed and executed to validate the Python API client.
 5. The webserver is started with the credentials admin/admin, and tests are run against the webserver.

 If you have python client repository not cloned, you can clone it by running the following command:
diff --git a/contributing-docs/testing/system_tests.rst b/contributing-docs/testing/system_tests.rst
index 298eac36c4c15..a1e8b44a0e797 100644
--- a/contributing-docs/testing/system_tests.rst
+++ b/contributing-docs/testing/system_tests.rst
@@ -18,8 +18,8 @@
 Airflow System Tests
 ====================

-System tests verify the correctness of Airflow Operators by running them in DAGs and allowing to communicate with
-external services. A system test tries to look as close to a regular DAG as possible, and it generally checks the
+System tests verify the correctness of Airflow Operators by running them in Dags and allowing them to communicate with
+external services. A system test tries to look as close to a regular Dag as possible, and it generally checks the
 "happy path" (a scenario featuring no errors) ensuring that the Operator works as expected.
 System tests need to communicate with external services/systems that are available
@@ -35,7 +35,7 @@ The purpose of these tests is to:
 - assure high quality of providers and their integration with Airflow core,
 - avoid regression in providers when doing changes to the Airflow,
 - autogenerate documentation for Operators from code,
-- provide runnable example DAGs with use cases for different Operators,
+- provide runnable example Dags with use cases for different Operators,
 - serve both as examples and test files.
 - the excerpts from these system tests are used to generate documentation
@@ -57,17 +57,17 @@ set it before running test command.
 Running the System Tests
 ------------------------

-There are multiple ways of running system tests. Each system test is a self-contained DAG, so it can be run as any
-other DAG. Some tests may require access to external services, enabled APIs or specific permissions. Make sure to
+There are multiple ways of running system tests. Each system test is a self-contained Dag, so it can be run as any
+other Dag. Some tests may require access to external services, enabled APIs or specific permissions. Make sure to
 prepare your environment correctly, depending on the system tests you want to run - some may require additional
 configuration which should be documented by the relevant providers in their subdirectory
 ``tests/system//README.md``.

-Running as Airflow DAGs
+Running as Airflow Dags
 .......................

 If you have a working Airflow environment with a scheduler and a webserver, you can import system test files into
-your Airflow instance as DAGs and they will be automatically triggered. If the setup of the environment is correct
+your Airflow instance as Dags and they will be automatically triggered. If the setup of the environment is correct
 (depending on the type of tests you want to run), they should be executed without any issues. The instructions on how
 to set up the environment is documented in each provider's system tests directory. Make sure that all resource
 required by the tests are also imported.
diff --git a/dev/breeze/README.md b/dev/breeze/README.md
index 2b6895c4dbbd7..9f04a8021d4b6 100644
--- a/dev/breeze/README.md
+++ b/dev/breeze/README.md
@@ -135,6 +135,6 @@ PLEASE DO NOT MODIFY THE HASH BELOW! IT IS AUTOMATICALLY UPDATED BY PREK.
 ---------------------------------------------------------------------------------------------------------

-Package config hash: 9828608f74f22e9289563572007befdc15a1149e840bbe4acceff8273bdd15b52d8106fd891c15344709921e2aec6c8c6c7e234f7199926c84b28797f2475af3
+Package config hash: eb5378d0e948bc7b1a1b72678913f093d19bf27f8049705abc5ab5626a376eb5fc349115573156918d5e63c536cde5885563d186dfb0e089bcfac326314134f8

 ---------------------------------------------------------------------------------------------------------
diff --git a/dev/breeze/doc/03_developer_tasks.rst b/dev/breeze/doc/03_developer_tasks.rst
index c0a55f22de5f2..ca4489b2d572e 100644
--- a/dev/breeze/doc/03_developer_tasks.rst
+++ b/dev/breeze/doc/03_developer_tasks.rst
@@ -171,7 +171,7 @@ Remote Debugging in IDE
 One of the possibilities (albeit only easy if you have a paid version of IntelliJ IDEs for example) with Breeze
 is an option to run remote debugging in your IDE graphical interface.

-When you run tests, airflow, example DAGs, even if you run them using unit tests, they are run in a separate
+When you run tests, airflow, example Dags, even if you run them using unit tests, they are run in a separate
 container.
 This makes it a little harder to use with IDE built-in debuggers. Fortunately, IntelliJ/PyCharm provides an
 effective remote debugging feature (but only in paid versions). See additional details on
diff --git a/dev/breeze/doc/10_advanced_breeze_topics.rst b/dev/breeze/doc/10_advanced_breeze_topics.rst
index 364d0a1d7a1df..7b4c3d8c0aaf2 100644
--- a/dev/breeze/doc/10_advanced_breeze_topics.rst
+++ b/dev/breeze/doc/10_advanced_breeze_topics.rst
@@ -76,7 +76,7 @@ When you are in the CI container, the following directories are used:
     unittest.cfg - unit test configuration generated when entering the environment;
     webserver_config.py - webserver configuration generated when running Airflow in the container.
   /files - files mounted from "files" folder in your sources. You can edit them in the host as well
-    dags - this is the folder where Airflow DAGs are read from
+    dags - this is the folder where Airflow Dags are read from
     airflow-breeze-config - this is where you can keep your own customization configuration of breeze

 Note that when running in your local environment, the ``/root/airflow/logs`` folder is actually mounted
@@ -95,7 +95,7 @@ When you are in the production container, the following directories are used:
     unittest.cfg - unit test configuration generated when entering the environment;
     webserver_config.py - webserver configuration generated when running Airflow in the container.
   /files - files mounted from "files" folder in your sources. You can edit them in the host as well
-    dags - this is the folder where Airflow DAGs are read from
+    dags - this is the folder where Airflow Dags are read from

 Note that when running in your local environment, the ``/root/airflow/logs`` folder is actually mounted
 from your ``logs`` directory in the Airflow sources, so all logs created in the container are automatically
@@ -118,7 +118,7 @@ configure and run Docker. They will not be removed between Docker runs.

 By default ``/files/dags`` folder is mounted from your local ``/files/dags`` and this is
 the directory used by Airflow scheduler and api-server to scan dags for. You can use it to test your dags
-from local sources in Airflow. If you wish to add local DAGs that can be run by Breeze.
+from local sources in Airflow. If you wish to add local Dags that can be run by Breeze.

 The ``/files/airflow-breeze-config`` folder contains configuration files that might be used to customize your
 breeze instance. Those files will be kept across checking out a code from different
diff --git a/devel-common/src/docs/shared/template-examples/taskflow-kwargs.rst b/devel-common/src/docs/shared/template-examples/taskflow-kwargs.rst
index ed92b991ff322..c8dd9e23ee549 100644
--- a/devel-common/src/docs/shared/template-examples/taskflow-kwargs.rst
+++ b/devel-common/src/docs/shared/template-examples/taskflow-kwargs.rst
@@ -28,4 +28,4 @@
     print(f"Duration: {ti.duration}")  # Duration: 0.972019

     dr: DagRun = kwargs["dag_run"]
-    print(f"DAG Run queued at: {dr.queued_at}")  # 2023-08-10 00:00:01+02:20
+    print(f"Dag Run queued at: {dr.queued_at}")  # 2023-08-10 00:00:01+02:20
diff --git a/devel-common/src/docs/shared/template-examples/taskflow.rst b/devel-common/src/docs/shared/template-examples/taskflow.rst
index ee5251ba256fa..2caecaa4b1144 100644
--- a/devel-common/src/docs/shared/template-examples/taskflow.rst
+++ b/devel-common/src/docs/shared/template-examples/taskflow.rst
@@ -25,4 +25,4 @@
     def print_ti_info(task_instance: TaskInstance, dag_run: DagRun):
         print(f"Run ID: {task_instance.run_id}")  # Run ID: scheduled__2023-08-09T00:00:00+00:00
         print(f"Duration: {task_instance.duration}")  # Duration: 0.972019
-        print(f"DAG Run queued at: {dag_run.queued_at}")  # 2023-08-10 00:00:01+02:20
+        print(f"Dag Run queued at: {dag_run.queued_at}")  # 2023-08-10 00:00:01+02:20
diff --git a/devel-common/src/sphinx_exts/includes/dag-definition.rst b/devel-common/src/sphinx_exts/includes/dag-definition.rst
index 3314ef161f495..f3643e19afc47 100644
--- a/devel-common/src/sphinx_exts/includes/dag-definition.rst
+++ b/devel-common/src/sphinx_exts/includes/dag-definition.rst
@@ -16,7 +16,7 @@
    under the License.

 .. dag-definition-start

-A DAG is a model that encapsulates everything needed to execute a workflow. Some DAG attributes include the following:
+A Dag is a model that encapsulates everything needed to execute a workflow. Some Dag attributes include the following:

 * **Schedule**: When the workflow should run.
 * **Tasks**: :doc:`tasks ` are discrete units of work that are run on workers.
@@ -28,5 +28,5 @@ A DAG is a model that encapsulates everything needed to execute a workflow. Some
 .. dag-etymology-start
 .. note::
-    The term "DAG" comes from the mathematical concept "directed acyclic graph", but the meaning in Airflow has evolved well beyond just the literal data structure associated with the mathematical DAG concept.
+    The term "DAG" comes from the mathematical concept "directed acyclic graph", but the meaning in Airflow has evolved well beyond just the literal data structure associated with the mathematical DAG concept. Therefore it was decided to use the term Dag in Airflow.
 .. dag-etymology-end
diff --git a/docker-stack-docs/build.rst b/docker-stack-docs/build.rst
index efbc93eac066b..eac338092a714 100644
--- a/docker-stack-docs/build.rst
+++ b/docker-stack-docs/build.rst
@@ -366,7 +366,7 @@ You should be aware, about a few things
   The dags in production image are in ``/opt/airflow/dags`` folder.

 * You can build your image without any need for Airflow sources. It is enough that you place the
-  ``Dockerfile`` and any files that are referred to (such as DAG files) in a separate directory and run
+  ``Dockerfile`` and any files that are referred to (such as Dag files) in a separate directory and run
   a command ``docker build . --pull --tag my-image:my-tag`` (where ``my-image`` is the name you want
   to name it and ``my-tag`` is the tag you want to tag the image with.
diff --git a/docker-stack-docs/entrypoint.rst b/docker-stack-docs/entrypoint.rst
index d4ab8a5d645f8..bda71b31d66d4 100644
--- a/docker-stack-docs/entrypoint.rst
+++ b/docker-stack-docs/entrypoint.rst
@@ -169,7 +169,7 @@ If there are any other arguments - they are simply passed to the "airflow" comma
     backfill       Manage backfills
     config         View configuration
     connections    Manage connections
-    dags           Manage DAGs
+    dags           Manage Dags
     db             Database operations
     jobs           Manage jobs
     pools          Manage pools
diff --git a/providers-summary-docs/core-extensions/notifications.rst b/providers-summary-docs/core-extensions/notifications.rst
index f16916c55bb3c..248b67f9ab124 100644
--- a/providers-summary-docs/core-extensions/notifications.rst
+++ b/providers-summary-docs/core-extensions/notifications.rst
@@ -19,7 +19,7 @@ Notifications
 -------------

 This is a summary of all Apache Airflow Community provided implementations of notifications.
-Notifications allow you to send messages to external systems when a task instance/DAG run changes state.
+Notifications allow you to send messages to external systems when a task instance/Dag run changes state.

 Notifications are explained in :doc:`apache-airflow:howto/notifications` and here you can also see all the
 notifications provided by the community-managed providers:
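For reference, the ``__main__`` block and ``dag.test()`` pattern that the quick-start and Dag testing documents touched above rely on looks roughly like the sketch below. The ``dag_id``, schedule, start date and task are illustrative assumptions rather than anything taken from the Airflow sources, and the imports follow the Airflow 2.x-style paths; adjust both to your Airflow version and environment before dropping such a file into ``/files/dags``.

.. code-block:: python

    # Minimal sketch of a Dag that can be run locally with ``dag.test()``.
    # The dag_id, start date and task below are illustrative assumptions.
    from __future__ import annotations

    import datetime

    from airflow.decorators import task
    from airflow.models.dag import DAG

    with DAG(
        dag_id="example_local_dag_test",
        schedule=None,
        start_date=datetime.datetime(2024, 1, 1),
        catchup=False,
    ) as dag:

        @task
        def say_hello() -> None:
            print("hello from dag.test()")

        say_hello()

    if __name__ == "__main__":
        # Runs the whole Dag in a single local process - no scheduler, api-server
        # or worker is needed, which is what makes this handy for debugging.
        dag.test()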