Unable to specify Python version for AwsGlueJobOperator #20832
I think that this is because the GlueHook is pretty opinionated and hardcodes the value of `Command` (see `airflow/providers/amazon/aws/hooks/glue.py`, lines 181 to 225 at 2ab2ae8).

So when you provide your own `Command`, it clashes with the one the hook builds. Anyway, thoughts on just making the `Command` configurable? If y'all think that this is a satisfactory fix, feel free to assign this issue to me and I can put up a quick PR.

Also, for what it's worth, I think the shape of the `Command` parameter is documented in the AWS API reference: https://docs.aws.amazon.com/glue/latest/webapi/API_JobCommand.html
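For context, here is a minimal sketch of the kind of hardcoded `Command` the hook builds. The field names follow the AWS `JobCommand` API linked above, but the function name and exact values are assumptions, not a verbatim copy of `glue.py`:

```python
def build_default_command(script_location: str) -> dict:
    """Hypothetical sketch of a hook building a fixed Glue JobCommand.

    Field names follow the AWS JobCommand API. Because 'PythonVersion'
    is never set here, the Glue service falls back to its own default
    (Python 2 on older Glue versions), which would match the reported
    behavior.
    """
    return {
        "Name": "glueetl",                  # standard ETL job type
        "ScriptLocation": script_location,  # S3 path to the job script
    }


command = build_default_command("s3://my-bucket/scripts/etl.py")
```

Since the dict is constructed inside the hook, nothing the user passes can reach the `PythonVersion` field, which is the crux of the bug report.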
@SamWheating We can make them configurable. I'm just not 100% sure why we must create the command ourselves. Can't we just leave the `Command` up to the user?

There are other parameters that don't get this special treatment:
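One way the proposed fix could look, sketched as plain dict merging (the function and parameter names here are hypothetical, not the provider's actual API): let a user-supplied `Command` in `create_job_kwargs` override the hook's hardcoded default.

```python
def merge_job_config(defaults: dict, create_job_kwargs: dict) -> dict:
    """Merge user kwargs into the hook's defaults, letting a
    user-supplied 'Command' win over the hardcoded one."""
    kwargs = dict(create_job_kwargs)           # don't mutate the caller's dict
    user_command = kwargs.pop("Command", None)
    config = {**defaults, **kwargs}            # user kwargs override defaults
    command = dict(defaults.get("Command", {}))
    if user_command:
        command.update(user_command)           # user Command keys win
    config["Command"] = command
    return config


# Illustrative values only:
defaults = {
    "Role": "my-glue-role",
    "Command": {"Name": "glueetl", "ScriptLocation": "s3://my-bucket/etl.py"},
}
merged = merge_job_config(defaults, {"Command": {"PythonVersion": "3"}})
# merged["Command"] keeps Name/ScriptLocation and gains PythonVersion "3"
```

Merging (rather than replacing) the `Command` dict means a user who only wants to set `PythonVersion` doesn't have to re-specify the script location.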
Apache Airflow Provider(s)
amazon
Versions of Apache Airflow Providers
No response
Apache Airflow version
2.0.2
Operating System
Amazon Linux
Deployment
MWAA
Deployment details
No response
What happened
When a new Glue job is created using the `AwsGlueJobOperator`, the job defaults to Python 2. Setting the version in `create_job_kwargs` fails with a `KeyError`.
What you expected to happen
Expected the Glue job to be created with a Python 3 runtime. `create_job_kwargs` is passed to the boto3 Glue client's `create_job` method, which accepts a `Command` parameter: a dictionary containing the Python version.
How to reproduce
Create a DAG with an `AwsGlueJobOperator` and pass a `Command` parameter in the `create_job_kwargs` argument.
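A sketch of the failing usage (bucket, script path, and job names below are illustrative; the `Command` fields follow the AWS `JobCommand` API):

```python
# 'Command' is passed through create_job_kwargs, which is the call
# pattern that reportedly raises a KeyError on job creation.
create_job_kwargs = {
    "GlueVersion": "2.0",
    "Command": {
        "Name": "glueetl",
        "ScriptLocation": "s3://my-bucket/scripts/etl.py",
        "PythonVersion": "3",
    },
}

# In the DAG this would be passed to the operator, e.g.:
# AwsGlueJobOperator(
#     task_id="glue_job",
#     job_name="my_glue_job",
#     create_job_kwargs=create_job_kwargs,  # fails when creating a new job
# )
```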
Anything else
When a new job is being created.
Are you willing to submit PR?
Code of Conduct