Use Airflow trademark as it has been registered (astronomer#1105)
pankajastro authored and dwreeves committed Jul 31, 2024
1 parent f39592f commit b88440b
Showing 8 changed files with 8 additions and 8 deletions.
README.rst (1 addition, 1 deletion)

@@ -20,7 +20,7 @@

 |fury| |ossrank| |downloads| |pre-commit|

-Run your dbt Core projects as `Apache Airflow <https://airflow.apache.org/>`_ DAGs and Task Groups with a few lines of code. Benefits include:
+Run your dbt Core projects as `Apache Airflow® <https://airflow.apache.org/>`_ DAGs and Task Groups with a few lines of code. Benefits include:

 - Run dbt projects against Airflow connections instead of dbt profiles
 - Native support for installing and running dbt in a virtual environment to avoid dependency conflicts with Airflow
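Context for the changed line above: the "few lines of code" the README promises look roughly like the sketch below. It is illustrative only; the project path, connection ID, and schema are assumptions, not part of this commit.

.. code-block:: python

    from datetime import datetime

    from cosmos import DbtDag, ProfileConfig, ProjectConfig
    from cosmos.profiles import PostgresUserPasswordProfileMapping

    # Build the dbt profile from an Airflow connection instead of profiles.yml.
    profile_config = ProfileConfig(
        profile_name="default",
        target_name="dev",
        profile_mapping=PostgresUserPasswordProfileMapping(
            conn_id="example_conn",  # assumed Airflow connection ID
            profile_args={"schema": "public"},
        ),
    )

    # Render the whole dbt project as one Airflow DAG.
    my_cosmos_dag = DbtDag(
        project_config=ProjectConfig("/path/to/my_dbt_project"),  # assumed path
        profile_config=profile_config,
        schedule_interval="@daily",
        start_date=datetime(2023, 1, 1),
        catchup=False,
        dag_id="my_cosmos_dag",
    )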
docs/configuration/hosting-docs.rst (1 addition, 1 deletion)

@@ -3,7 +3,7 @@
 Hosting Docs
 ============

-dbt docs can be served directly from the Apache Airflow webserver with the Cosmos Airflow plugin, without requiring the user to set up anything outside of Airflow. This page describes how to host docs in the Airflow webserver directly, although some users may opt to host docs externally.
+dbt docs can be served directly from the `Apache Airflow® <https://airflow.apache.org/>`_ webserver with the Cosmos Airflow plugin, without requiring the user to set up anything outside of Airflow. This page describes how to host docs in the Airflow webserver directly, although some users may opt to host docs externally.

 Overview
 ~~~~~~~~
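The plugin mentioned in the changed line is configured through Airflow's ``[cosmos]`` config section. A hedged sketch, assuming the plugin reads the ``dbt_docs_dir`` and ``dbt_docs_conn_id`` options; the bucket path and connection ID are placeholders:

.. code-block:: python

    import os

    # Airflow maps AIRFLOW__COSMOS__<OPTION> env vars onto the [cosmos] section.
    # Location where `dbt docs generate` wrote its output (assumed path):
    os.environ["AIRFLOW__COSMOS__DBT_DOCS_DIR"] = "s3://my-bucket/path/to/docs"
    # Airflow connection the webserver uses to read that location (assumed ID):
    os.environ["AIRFLOW__COSMOS__DBT_DOCS_CONN_ID"] = "aws_default"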
docs/configuration/scheduling.rst (1 addition, 1 deletion)

@@ -24,7 +24,7 @@ To schedule a dbt project on a time-based schedule, you can use Airflow's schedu
 Data-Aware Scheduling
 ---------------------

-Apache Airflow 2.4 introduced the concept of `scheduling based on Datasets <https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/datasets.html>`_.
+`Apache Airflow® <https://airflow.apache.org/>`_ 2.4 introduced the concept of `scheduling based on Datasets <https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/datasets.html>`_.

 By default, if Airflow 2.4 or higher is used, Cosmos emits `Airflow Datasets <https://airflow.apache.org/docs/apache-airflow/stable/concepts/datasets.html>`_ when running dbt projects. This allows you to use Airflow's data-aware scheduling capabilities to schedule your dbt projects. Cosmos emits datasets using the OpenLineage URI format, as detailed in the `OpenLineage Naming Convention <https://github.com/OpenLineage/OpenLineage/blob/main/spec/Naming.md>`_.
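To make the surrounding paragraphs concrete, a downstream DAG can subscribe to a dataset that Cosmos emits. A minimal sketch, assuming Airflow 2.4 or newer; the dataset URI is an illustrative placeholder in the OpenLineage style:

.. code-block:: python

    from datetime import datetime

    from airflow.datasets import Dataset
    from airflow.decorators import dag, task

    @dag(
        # Placeholder URI; Cosmos derives real ones from the target and model.
        schedule=[Dataset("postgres://host:5432/database.schema.my_model")],
        start_date=datetime(2023, 1, 1),
        catchup=False,
    )
    def downstream_of_dbt():
        @task
        def consume():
            print("Runs whenever the upstream dbt model's dataset is updated.")

        consume()

    downstream_of_dbt()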
docs/contributing.rst (1 addition, 1 deletion)

@@ -91,7 +91,7 @@ We currently use `hatch <https://github.com/pypa/hatch>`_ for building and distr

 The tool can also be used for local development. The `pyproject.toml <https://github.com/astronomer/astronomer-cosmos/blob/main/pyproject.toml>`_ file currently defines a matrix of supported versions of Python and Airflow for which a user can run the tests against.

-For instance, to run the tests using Python 3.10 and Apache Airflow 2.5, use the following:
+For instance, to run the tests using Python 3.10 and `Apache Airflow® <https://airflow.apache.org/>`_ 2.5, use the following:

 .. code-block:: bash
docs/getting_started/docker.rst (1 addition, 1 deletion)

@@ -22,7 +22,7 @@ Step-by-step instructions

 **Install Airflow and Cosmos**

-Create a python virtualenv, activate it, upgrade pip to the latest version and install apache airflow & astronomer-postgres
+Create a python virtualenv, activate it, upgrade pip to the latest version and install `Apache Airflow® <https://airflow.apache.org/>`_ & astronomer-postgres

 .. code-block:: bash
docs/getting_started/execution-modes-local-conflicts.rst (1 addition, 1 deletion)

@@ -4,7 +4,7 @@ Airflow and dbt dependencies conflicts
 ======================================

 When using the `Local Execution Mode <execution-modes.html#local>`__, users may face dependency conflicts between
-Apache Airflow and dbt. The conflicts may increase depending on the Airflow providers and dbt adapters being used.
+`Apache Airflow® <https://airflow.apache.org/>`_ and dbt. The conflicts may increase depending on the Airflow providers and dbt adapters being used.

 If you find errors, we recommend users look into using `alternative execution modes <execution-modes.html>`__.
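As a sketch of one such alternative (paths, the adapter package, and profile details are placeholder assumptions), the virtualenv execution mode installs dbt into a throwaway environment per task, so dbt's pins never meet Airflow's:

.. code-block:: python

    from datetime import datetime

    from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig
    from cosmos.constants import ExecutionMode

    profile_config = ProfileConfig(
        profile_name="default",
        target_name="dev",
        profiles_yml_filepath="/path/to/profiles.yml",  # assumed path
    )

    dag = DbtDag(
        project_config=ProjectConfig("/path/to/my_dbt_project"),  # assumed path
        profile_config=profile_config,
        execution_config=ExecutionConfig(execution_mode=ExecutionMode.VIRTUALENV),
        # dbt and its adapter install into a per-task virtualenv, not Airflow's:
        operator_args={"py_requirements": ["dbt-postgres"]},
        start_date=datetime(2023, 1, 1),
        dag_id="dbt_in_virtualenv",
    )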
docs/getting_started/mwaa.rst (1 addition, 1 deletion)

@@ -3,7 +3,7 @@
 Getting Started on MWAA
 =======================

-Users can face Python dependency issues when trying to use the Cosmos `Local Execution Mode <execution-modes.html#local>`_ in Amazon Managed Workflows for Apache Airflow (MWAA).
+Users can face Python dependency issues when trying to use the Cosmos `Local Execution Mode <execution-modes.html#local>`_ in Amazon Managed Workflows for `Apache Airflow® <https://airflow.apache.org/>`_ (MWAA).

 This step-by-step illustrates how to use the Local Execution Mode, together with the
 `MWAA's startup script <https://docs.aws.amazon.com/mwaa/latest/userguide/using-startup-script.html>`_ and
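The rest of the page is truncated here, but the usual shape of this approach is for the startup script to install dbt into its own virtualenv, with Cosmos pointed at that binary. A hedged sketch; the venv path is an assumption:

.. code-block:: python

    from cosmos import ExecutionConfig

    # Assumes the MWAA startup script created this venv and installed
    # dbt plus the required adapter into it.
    execution_config = ExecutionConfig(
        dbt_executable_path="/usr/local/airflow/dbt_venv/bin/dbt",
    )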
docs/index.rst (1 addition, 1 deletion)

@@ -28,7 +28,7 @@

 |fury| |ossrank| |downloads| |pre-commit|

-Run your dbt Core projects as `Apache Airflow <https://airflow.apache.org/>`_ DAGs and Task Groups with a few lines of code. Benefits include:
+Run your dbt Core projects as `Apache Airflow® <https://airflow.apache.org/>`_ DAGs and Task Groups with a few lines of code. Benefits include:

 - Run dbt projects against Airflow connections instead of dbt profiles
 - Native support for installing and running dbt in a virtual environment to avoid dependency conflicts with Airflow
