Migrate Qubole example DAGs to new design apache#22460
chethanuk committed Jun 3, 2022
1 parent f294a26 commit 03265d7
Showing 5 changed files with 40 additions and 33 deletions.
17 changes: 0 additions & 17 deletions airflow/providers/qubole/example_dags/__init__.py

This file was deleted.

2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-qubole/index.rst
@@ -38,7 +38,7 @@ Content
:maxdepth: 1
:caption: Resources

-Example DAGs <https://github.com/apache/airflow/tree/main/airflow/providers/qubole/example_dags>
+Example DAGs <https://github.com/apache/airflow/tree/main/tests/system/providers/qubole>
PyPI Repository <https://pypi.org/project/apache-airflow-providers-qubole/>
Installing from sources <installing-providers-from-sources>

2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-qubole/operators/index.rst
@@ -29,4 +29,4 @@ Qubole Operators

.. note::
You can learn how to use Qubole integrations by analyzing the
-`source code <https://github.com/apache/airflow/tree/main/airflow/providers/qubole/example_dags/>`_ of the particular example DAGs.
+`source code <https://github.com/apache/airflow/tree/main/tests/system/providers/qubole/>`_ of the particular example DAGs.
24 changes: 12 additions & 12 deletions docs/apache-airflow-providers-qubole/operators/qubole.rst
@@ -43,15 +43,15 @@ Run Hive command

To run a query that shows all tables, you can use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_hive_query]
:end-before: [END howto_operator_qubole_run_hive_query]
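The referenced example file is not shown in this diff. As a hedged sketch of how such a task might be parametrized — every argument name and value below is an illustrative assumption, not text from this commit:

```python
# Hypothetical kwargs for a Hive task run through QuboleOperator; all names
# and values here are illustrative assumptions, not copied from the commit.
hive_show_table_kwargs = {
    "task_id": "hive_show_table",
    "command_type": "hivecmd",   # route the command through Hive
    "query": "show tables",      # inline HiveQL statement to execute
    "cluster_label": "default",  # which Qubole cluster runs it
}
```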

You can also run a script located in a bucket by passing the path to the query file:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_hive_script]
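A hypothetical sketch of the script variant, where an object-storage path replaces the inline query (the bucket path and field names are assumptions):

```python
# Hypothetical: run a HiveQL script by storage path instead of an inline
# query; the path and field names are illustrative assumptions.
hive_s3_script_kwargs = {
    "task_id": "hive_s3_script",
    "command_type": "hivecmd",
    "script_location": "s3n://example-bucket/scripts/show_tables.hql",
    "cluster_label": "default",
}
```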
@@ -62,7 +62,7 @@ Run Hadoop command

To run a jar file on your Hadoop cluster, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_hadoop_jar]
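As an illustration only (all names, paths, and the ``sub_command`` field are assumptions), a Hadoop jar task might be parametrized like:

```python
# Hypothetical Hadoop jar invocation; "sub_command" carries what you would
# pass to the "hadoop" CLI (all names and paths assumed for illustration).
hadoop_jar_kwargs = {
    "task_id": "hadoop_jar_cmd",
    "command_type": "hadoopcmd",
    "sub_command": (
        "jar s3://example-bucket/jars/wordcount.jar "
        "s3://example-bucket/input s3://example-bucket/output"
    ),
    "cluster_label": "default",
}
```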
@@ -73,7 +73,7 @@ Run Pig command

To run a script written in *Pig Latin* on your Hadoop cluster, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_pig_script]
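A hedged sketch of a Pig task's parameters (the script path and the ``parameters`` format are assumptions, not taken from this commit):

```python
# Hypothetical Pig Latin script run; "parameters" are values substituted
# into the script at runtime (names and paths assumed).
pig_script_kwargs = {
    "task_id": "pig_cmd",
    "command_type": "pigcmd",
    "script_location": "s3://example-bucket/scripts/wordcount.pig",
    "parameters": "inpath=s3://example-bucket/input outpath=s3://example-bucket/output",
}
```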
@@ -84,7 +84,7 @@ Run Shell command

To run a shell script, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_shell_script]
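An illustrative guess at the shell task's parameters (field names assumed, not taken from the commit):

```python
# Hypothetical shell command executed on the cluster (names assumed).
shell_cmd_kwargs = {
    "task_id": "shell_cmd",
    "command_type": "shellcmd",
    "script": "hadoop fs -ls /tmp",  # inline shell to run on the cluster
    "cluster_label": "default",
}
```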
@@ -95,7 +95,7 @@ Run Presto command

To run a query using Presto, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_presto_query]
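A hypothetical parametrization for the Presto case (names assumed); in this sketch only the command type and cluster differ from the Hive query:

```python
# Hypothetical Presto query task (all names and values assumed).
presto_query_kwargs = {
    "task_id": "presto_cmd",
    "command_type": "prestocmd",
    "query": "show tables",
    "cluster_label": "presto",  # assumed label of a Presto cluster
}
```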
@@ -106,23 +106,23 @@ Run DB commands

To run a query as a `DbTap <https://docs.qubole.com/en/latest/rest-api/dbtap_api/index.html>`_, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_db_tap_query]
:end-before: [END howto_operator_qubole_run_db_tap_query]
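An assumed sketch of a DbTap query task; the ``db_tap_id`` below is a placeholder for a data store previously registered in Qubole:

```python
# Hypothetical DbTap query; "db_tap_id" identifies a registered data store
# (the id and field names are placeholders/assumptions).
db_query_kwargs = {
    "task_id": "db_query",
    "command_type": "dbtapquerycmd",
    "query": "show tables",
    "db_tap_id": 2064,  # placeholder data-store id
}
```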

To run a DB export command, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_db_export]
:end-before: [END howto_operator_qubole_run_db_export]
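A rough, assumption-laden sketch of a DB export task's fields (the ``mode`` semantics, table names, and id are all illustrative guesses):

```python
# Hypothetical DB export: copy a Hive table out to a registered data store.
# Every field name and value here is an assumption for illustration.
db_export_kwargs = {
    "task_id": "db_export",
    "command_type": "dbexportcmd",
    "mode": 1,                       # assumed: export from a Hive table
    "hive_table": "my_hive_table",
    "db_table": "my_exported_table",
    "db_tap_id": 2064,               # placeholder data-store id
}
```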

To run a DB import command, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_db_import]
@@ -133,7 +133,7 @@ Run Spark commands

To run a Scala script as a Spark job, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_operator_qubole_run_spark_scala]
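An illustrative sketch of an inline Scala Spark job (the field names and the tiny program are assumptions; the commit's own snippet is not shown here):

```python
# Hypothetical inline Scala Spark job; "program" holds the source text and
# "language" selects how it is compiled (names assumed for illustration).
spark_scala_kwargs = {
    "task_id": "spark_cmd",
    "command_type": "sparkcmd",
    "language": "scala",
    "program": (
        "object Hello {\n"
        '  def main(args: Array[String]): Unit = println("hello")\n'
        "}\n"
    ),
    "cluster_label": "spark",  # assumed label of a Spark cluster
}
```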
@@ -153,7 +153,7 @@ File or directory existence

To wait for a file or directory to exist in the cluster, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_sensor_qubole_run_file_sensor]
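A hedged sketch of the payload such a file sensor might poll with (the ``data`` shape and paths are assumptions):

```python
# Hypothetical QuboleFileSensor parameters: the sensor would succeed once
# every object listed under data["files"] exists (shape assumed).
file_sensor_kwargs = {
    "task_id": "check_s3_file",
    "poke_interval": 60,   # seconds between checks
    "timeout": 600,        # give up after 10 minutes of waiting
    "data": {"files": ["s3://example-bucket/data/_SUCCESS"]},
}
```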
@@ -173,7 +173,7 @@ Partition existence

To wait for a table partition to exist in the cluster, use:

-.. exampleinclude:: /../../airflow/providers/qubole/example_dags/example_qubole.py
+.. exampleinclude:: /../../tests/system/providers/qubole/example_qubole.py
:language: python
:dedent: 4
:start-after: [START howto_sensor_qubole_run_partition_sensor]
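A hypothetical payload for the partition sensor (the schema/table/columns shape is an assumption, not taken from this commit):

```python
# Hypothetical QubolePartitionSensor parameters: wait until the given Hive
# partition appears (all field names and values assumed).
partition_sensor_kwargs = {
    "task_id": "check_hive_partition",
    "poke_interval": 60,
    "data": {
        "schema": "default",
        "table": "my_partitioned_table",
        "columns": [{"column": "month", "values": ["05"]}],
    },
}
```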
@@ -16,6 +16,7 @@
# specific language governing permissions and limitations
# under the License.

+import os
import filecmp
import random
import textwrap
@@ -35,8 +36,12 @@

START_DATE = datetime(2021, 1, 1)

+ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
+DAG_ID = "example_qubole_operator"
+DAG2_ID = "example_qubole_sensor"

with DAG(
-    dag_id='example_qubole_operator',
+    dag_id=DAG_ID,
    schedule_interval=None,
    start_date=START_DATE,
    tags=['example'],
@@ -218,8 +223,14 @@ def main(args: Array[String]) {

branching >> db_import >> spark_cmd >> join

+from tests.system.utils.watcher import watcher
+
+# This test needs watcher in order to properly mark success/failure
+# when "tearDown" task with trigger rule is part of the DAG
+list(dag.tasks) >> watcher()

with DAG(
-    dag_id='example_qubole_sensor',
+    dag_id=DAG2_ID,
    schedule_interval=None,
    start_date=START_DATE,
    tags=['example'],
Expand Down Expand Up @@ -268,3 +279,16 @@ def main(args: Array[String]) {
# [END howto_sensor_qubole_run_partition_sensor]

check_s3_file >> check_hive_partition

+from tests.system.utils.watcher import watcher
+
+# This test needs watcher in order to properly mark success/failure
+# when "tearDown" task with trigger rule is part of the DAG
+list(dag.tasks) >> watcher()
+
+
+from tests.system.utils import get_test_run  # noqa: E402
+
+# Needed to run the example DAG with pytest (see: tests/system/README.md#run_via_pytest)
+test_run = get_test_run(dag)
+test_run_2 = get_test_run(dag2)
