no template rendering for LivyOperator #37357

Closed
LudvigLundberg opened this issue Feb 12, 2024 · 1 comment · Fixed by #37361

Comments

LudvigLundberg commented Feb 12, 2024

Apache Airflow Provider(s)

apache-livy

Versions of Apache Airflow Providers

apache-airflow-providers-apache-livy==3.7.1

Apache Airflow version

2.8.1

Operating System

macOS Sonoma 14.0 (23A344)

Deployment

docker compose

Deployment details

No response

What happened

Template rendering for the LivyOperator doesn't work (tested with args and files): nothing is shown in the Rendered Template tab of the task instance view, and the un-rendered string is sent to Livy, e.g.:

Submitting job {"file": "/spark-examples.jar", "className": "org.apache.spark.examples.SparkPi", "args": ["{{ task.task_id }}"], "numExecutors": 1, "conf": {"spark.shuffle.compress": "false"}}

I verified that template rendering works in other DAGs by running the included example_bash_operator DAG.

What you think should happen instead

After downgrading to apache-airflow-providers-apache-livy==3.7.0, the Rendered Template tab is populated with values and the rendered string is sent to Livy:

Submitting job {"file": "/spark-examples.jar", "className": "org.apache.spark.examples.SparkPi", "args": ["pi_java_task"], "numExecutors": 1, "conf": {"spark.shuffle.compress": "false"}}

How to reproduce

Following the instructions from https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html using docker compose and building the following Dockerfile:

FROM apache/airflow:2.8.1
ADD requirements.txt .
RUN pip install apache-airflow==2.8.1 -r requirements.txt

A custom requirements.txt:

apache-airflow-providers-apache-livy==3.7.1

Running the following example DAG:

from datetime import datetime

from airflow import DAG
from airflow.providers.apache.livy.operators.livy import LivyOperator

with DAG(
    dag_id='example_livy_operator',
    default_args={'args': [10]},
    schedule_interval='@daily',
    start_date=datetime(2021, 1, 1),
    catchup=False,
) as dag:

    # [START create_livy]
    livy_java_task = LivyOperator(
        task_id="pi_java_task",
        file='/spark-examples.jar',
        args=["{{ task.task_id }}"],
        num_executors=1,
        conf={
            'spark.shuffle.compress': 'false',
        },
        class_name='org.apache.spark.examples.SparkPi',
    )
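
As a quick way to confirm whether rendering happens at all (a workflow suggestion, not part of the original report), the rendered fields can also be inspected from inside the scheduler container with the Airflow CLI; the date below is arbitrary:

airflow tasks render example_livy_operator pi_java_task 2024-02-12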

To get template rendering working, change requirements.txt to the following and rebuild/redeploy Airflow:

apache-airflow-providers-apache-livy==3.7.0

Anything else

It seems the issue was introduced in #36490, which was meant to fix #36484.
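
For context on why this kind of regression can happen (a minimal sketch of the general pattern, not the provider's actual code; the class and attribute names below are illustrative): Airflow renders only the attributes listed in template_fields, directly on the operator instance, just before execute() runs. If an operator copies those attributes into a separate, non-templated structure already in __init__, that copy keeps the raw Jinja strings and is what ends up being submitted:

from airflow.models.baseoperator import BaseOperator


class ExampleSubmitOperator(BaseOperator):
    # Airflow renders these attributes on the task instance right before execute()
    template_fields = ("args",)

    def __init__(self, *, args=None, **kwargs):
        super().__init__(**kwargs)
        self.args = args or []
        # Anti-pattern: snapshotting the templated attribute at parse time.
        # self.payload keeps the raw "{{ ... }}" values, and because
        # "payload" is not in template_fields it is never rendered.
        self.payload = {"args": self.args}

    def execute(self, context):
        # Building the payload here instead picks up the rendered values,
        # since rendering has already replaced self.args by this point.
        payload = {"args": self.args}
        self.log.info("Submitting job %s", payload)
        return payload

If 3.7.1 assembles its Livy request this way, moving that construction into execute() would restore templating; that is an assumption based on the symptom, not a reading of the provider code.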

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

LudvigLundberg added the area:providers, kind:bug, and needs-triage labels on Feb 12, 2024

boring-cyborg bot commented Feb 12, 2024

Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.

Taragolis added the provider:apache-livy label and removed the needs-triage label on Feb 12, 2024