CloudBatchSubmitJobOperator does not forward the project_id in the hook #40559
Labels
area:providers, good first issue, kind:bug, provider:google
Apache Airflow Provider(s)
google
Versions of Apache Airflow Providers
All versions; even the latest release does not forward the project_id in the submit call.
Apache Airflow version
2.6.3
Operating System
composer-2.6.0
Deployment
Google Cloud Composer
Deployment details
Composer 2 version
composer-2.6.0-airflow-2.6.3
What happened
When using the CloudBatchSubmitJobOperator, project_id is a required parameter. However, it is ignored: it is not passed to the hook's submit_batch_job call. As a result, the job is always submitted to the runtime project (where Composer is deployed) and never to the target project (where the actual workload must run).
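For illustration, a simplified reconstruction of what the operator's execute() currently does (the exact provider code may differ slightly):

```python
# Simplified sketch of CloudBatchSubmitJobOperator.execute() as it behaves today.
hook: CloudBatchHook = CloudBatchHook(self.gcp_conn_id, self.impersonation_chain)
job = hook.submit_batch_job(
    job_name=self.job_name,
    job=self.job,
    region=self.region,
    # project_id is NOT passed here, so the hook falls back to the default
    # project of the connection / environment (the Composer project).
)
```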
What you think should happen instead
The project_id provided to the operator should be forwarded to the underlying hook call. All the other operations (get, list, delete) already do this correctly; only submit has this issue. A sketch of the expected change follows.
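A minimal sketch of the expected fix, assuming the hook's submit_batch_job accepts a project_id keyword like the other hook methods do:

```python
# Forward the operator's project_id to the hook, mirroring get/list/delete.
job = hook.submit_batch_job(
    job_name=self.job_name,
    job=self.job,
    region=self.region,
    project_id=self.project_id,
)
```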
How to reproduce
Create a Cloud Composer environment in one project and configure a CloudBatchSubmitJobOperator with an explicit project_id referencing another project. The Cloud Batch job is created in the Cloud Composer project instead of the target project. A minimal DAG like the one below reproduces it.
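Minimal reproduction DAG (project IDs, region, and job name are hypothetical placeholders):

```python
import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.cloud_batch import CloudBatchSubmitJobOperator
from google.cloud import batch_v1

# Hypothetical IDs: Composer runs in "composer-project",
# the workload is expected to land in "target-project".
TARGET_PROJECT_ID = "target-project"
REGION = "europe-west1"


def _create_job() -> batch_v1.Job:
    # Minimal Batch job: one task group running a busybox container.
    runnable = batch_v1.Runnable()
    runnable.container = batch_v1.Runnable.Container()
    runnable.container.image_uri = "gcr.io/google-containers/busybox"
    runnable.container.entrypoint = "/bin/sh"
    runnable.container.commands = ["-c", "echo hello"]

    task = batch_v1.TaskSpec()
    task.runnables = [runnable]

    group = batch_v1.TaskGroup()
    group.task_spec = task

    job = batch_v1.Job()
    job.task_groups = [group]
    job.logs_policy = batch_v1.LogsPolicy()
    job.logs_policy.destination = batch_v1.LogsPolicy.Destination.CLOUD_LOGGING
    return job


with DAG(
    "cloud_batch_cross_project_repro",
    start_date=datetime.datetime(2024, 1, 1),
    schedule=None,
) as dag:
    submit = CloudBatchSubmitJobOperator(
        task_id="submit_batch_job",
        project_id=TARGET_PROJECT_ID,  # explicitly points at another project...
        region=REGION,
        job_name="repro-job",
        job=_create_job(),
        # ...but the job shows up in the Composer project instead.
    )
```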
Anything else
I will submit the fix
Are you willing to submit PR?
Code of Conduct