Description
Apache Airflow version
2.6.2
What happened
I instantiate LambdaInvokeFunctionOperator in my DAG.
I want to call the Lambda function with a payload. Following the code example in the official documentation, I created a dict and passed its JSON string form to the operator:
import json

d = {'key': 'value'}
invoke_lambda_task = LambdaInvokeFunctionOperator(..., payload=json.dumps(d))
When I executed the DAG, this task failed. I got the following error message:
Invalid type for parameter Payload, value: {'key': 'value'}, type: <class 'dict'>, valid types: <class 'bytes'>, <class 'bytearray'>, file-like object
Then I checked the official boto3 documentation and confirmed that the Payload parameter must indeed be bytes, a bytearray, or a file-like object. See https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/lambda/client/invoke.html
What you think should happen instead
To preserve backward compatibility, the API should encode the payload argument when it is a str, but also accept bytes or a file-like object, in which case it should be passed through as-is.
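The proposed behavior could be sketched as a small normalization step inside the operator (the helper name here is hypothetical, not part of the provider's API):

```python
import io

def normalize_payload(payload):
    """Hypothetical sketch of the proposed fix: encode str payloads
    to bytes for backward compatibility, and pass bytes, bytearray,
    or file-like objects through unchanged."""
    if isinstance(payload, str):
        return payload.encode("utf-8")
    return payload
```

With this in place, existing DAGs that pass `json.dumps(d)` keep working, while callers that already supply bytes or a file object are unaffected.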
How to reproduce
- Create a Lambda function in AWS
- Create a simple DAG with LambdaInvokeFunctionOperator
- Pass a str value in the payload parameter; use json.dumps() with a simple dict, since the Lambda expects a JSON payload
- Run the DAG; the task is expected to fail
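Until the provider accepts str, a workaround (an assumption on my part, not documented behavior) is to encode the JSON string to bytes yourself, matching the types boto3's invoke() accepts:

```python
import json

d = {"key": "value"}
# Encode the JSON string to bytes before handing it to the operator,
# so boto3 receives one of its valid Payload types.
payload = json.dumps(d).encode("utf-8")
# invoke_lambda_task = LambdaInvokeFunctionOperator(..., payload=payload)
```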
Operating System
Ubuntu
Versions of Apache Airflow Providers
apache-airflow-providers-amazon==7.3.0
Deployment
Virtualenv installation
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct