
SparkKubernetesOperator not rendering the templated kubernetes_conn_id after upgrading #42546

Closed
1 of 2 tasks
nikhilkarve opened this issue Sep 27, 2024 · 3 comments · Fixed by #42786
Labels
area:core area:providers kind:bug This is a clearly a bug provider:cncf-kubernetes Kubernetes provider related issues

Comments

@nikhilkarve

Apache Airflow version

Other Airflow 2 version (please specify below)

If "Other Airflow 2 version" selected, which one?

2.7

What happened?

I upgraded from Airflow 2.5.3 to 2.7.1, which also upgraded apache-airflow-providers-cncf-kubernetes from 5.2.2 to 7.5.0.

Previously I pulled the Kubernetes connection ID from XCom like this. I push it to XCom in the first task, where I load all the configs:
kubernetes_conn_id: "{% raw %}{{ ti.xcom_pull(key ='env_vars', task_ids='fetch_vars')['{% endraw %}{{root_key}}{% raw %}']['kubernetes_conn_id']}}{% endraw %}"

This was working fine, but after upgrading it stopped working and I get the following error:
The conn_id {{ ti.xcom_pull(key ='env_variables', task_ids='fetch_env')['cycle_process_client_2332']['kubernetes_conn_id']}} isn't defined

Also, if I check the Rendered Template tab in the UI, the connection ID is resolved correctly. I am confused; any help is appreciated.
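The symptom above (value renders correctly in the UI, yet the raw template string reaches the connection lookup) is consistent with the field being consumed before Jinja rendering runs. Below is a minimal, Airflow-free sketch of that failure mode; `EagerOperator`, `LazyOperator`, `CONNECTIONS`, and `render` are hypothetical stand-ins, not Airflow APIs:

```python
# Stand-in connection store and a crude stand-in for Jinja rendering.
CONNECTIONS = {"spark_k8s": {"host": "k8s://cluster"}}

def render(value, context):
    # Replaces the "{{ conn_id }}" placeholder with the real value,
    # mimicking template rendering that happens just before execute().
    return value.replace("{{ conn_id }}", context["conn_id"])

class EagerOperator:
    """Looks up the connection at construction time, before rendering."""
    def __init__(self, kubernetes_conn_id):
        self.kubernetes_conn_id = kubernetes_conn_id
        # The lookup sees the raw template string and finds nothing.
        self.conn = CONNECTIONS.get(kubernetes_conn_id)

class LazyOperator:
    """Defers the lookup to execute(), after rendering has happened."""
    def __init__(self, kubernetes_conn_id):
        self.kubernetes_conn_id = kubernetes_conn_id

    def execute(self, context):
        rendered = render(self.kubernetes_conn_id, context)
        return CONNECTIONS[rendered]

ctx = {"conn_id": "spark_k8s"}
eager = EagerOperator("{{ conn_id }}")
lazy = LazyOperator("{{ conn_id }}")
print(eager.conn)         # None: lookup ran on the raw template string
print(lazy.execute(ctx))  # {'host': 'k8s://cluster'}
```

This also explains why the Rendered Template tab looks correct: rendering does happen, but an eagerly computed lookup never sees the rendered value.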

What you think should happen instead?

It should pull the connection ID correctly from XCom.

How to reproduce

Create a DAG with a task using SparkKubernetesOperator, pass kubernetes_conn_id as a parameter, and assign it an XCom-templated value.

Operating System

linux

Versions of Apache Airflow Providers

apache-airflow-providers-cncf-kubernetes 7.5.0
apache-airflow 2.7.1

Deployment

Other Docker-based deployment

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@nikhilkarve nikhilkarve added area:core kind:bug This is a clearly a bug needs-triage label for new issues that we didn't triage yet labels Sep 27, 2024
@dosubot dosubot bot added area:providers provider:cncf-kubernetes Kubernetes provider related issues labels Sep 27, 2024
@gopidesupavan
Member

It would be better to share your DAG example and the full error you're getting.

@nikhilkarve
Author

nikhilkarve commented Sep 30, 2024

```python
submit_task = SparkKubernetesOperator(
    task_id="submit_task",
    file_source="URL",
    namespace="{{ ti.xcom_pull(key ='env_variables', task_ids='fetch_env')['dag_id_here']['namespace']}}",
    base_url="{{ ti.xcom_pull(key ='env_variables', task_ids='fetch_env')['dag_id_here']['spark_base_url']}}",
    endpoint="{{ ti.xcom_pull(key ='env_variables', task_ids='fetch_env')['dag_id_here']['deployment_file']}}",
    application_file="{{ ti.xcom_pull(key ='env_variables', task_ids='fetch_env')['dag_id_here']['application_file']}}",
    kubernetes_conn_id="{{ ti.xcom_pull(key ='env_variables', task_ids='fetch_env')['dag_id_here']['kubernetes_conn_id']}}",
    do_xcom_push=True,
    sla=timedelta(seconds=1200),
)
submit_task.set_upstream([prev_task])
```

I did not include the complete DAG because we use many custom operators and that might cause confusion. This is a boiled-down version of the same task, which succeeded on Airflow 2.5.3 but fails on 2.7 with the upgraded cncf-kubernetes provider.
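For context on why the templated fields in the snippet above normally do work: Airflow renders each attribute named in an operator's `template_fields` in place before `execute()` is called, so any attribute read inside `execute()` sees the rendered value. The sketch below mimics that mechanism in plain Python (no Airflow import); `FakeOperator` and `render_template_fields` are hypothetical stand-ins, and the placeholder substitution is a crude stand-in for Jinja:

```python
class FakeOperator:
    # Names of attributes that should be rendered before execution.
    template_fields = ("kubernetes_conn_id", "namespace")

    def __init__(self, kubernetes_conn_id, namespace):
        self.kubernetes_conn_id = kubernetes_conn_id
        self.namespace = namespace

    def execute(self):
        # Reads the attributes only at execution time, after rendering.
        return (self.kubernetes_conn_id, self.namespace)

def render_template_fields(op, context):
    # Crude Jinja stand-in: substitute "{{ key }}" placeholders in place,
    # mimicking what happens to template_fields before execute() runs.
    for field in op.template_fields:
        value = getattr(op, field)
        for key, repl in context.items():
            value = value.replace("{{ %s }}" % key, repl)
        setattr(op, field, value)

op = FakeOperator("{{ conn }}", "{{ ns }}")
render_template_fields(op, {"conn": "spark_k8s", "ns": "spark-jobs"})
print(op.execute())  # ('spark_k8s', 'spark-jobs')
```

Under this model, the bug report is consistent with the provider consuming `kubernetes_conn_id` somewhere outside this render-then-execute path, which is what the linked fix (#42786) would need to address.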

@gopidesupavan
Copy link
Member

Thanks.
