Staging destinations (bigquery, etc): Handle S3 auth error as a ConfigException #20092
This error is, for some reason, also not caught in
I have been trying to reproduce this, and so far what I am seeing is that if
Notes from grooming: when validating the fix, we will also need to test the BigQuery (staging), GCS, and Snowflake (S3 and GCS staging) destinations. When the fix is done, publish: destination-gcs, destination-s3, destination-bigquery, destination-bigquery-denormalize, destination-snowflake
Was looking into this and was able to reproduce it by creating a service account with the required permissions, setting up a connector, and then removing the permissions. It looks like on a failed sync, Airbyte Cloud makes two attempts.
However, when the `check` method is not run, the error is more opaque.
I can put a PR together today with my proposed changes.
@jbfbell the main ask of this issue is that when destinations run into an S3 auth error, they should emit
@jbfbell could you please link the PR to the issue?
This is currently blocked by an issue with the GitHub publish action, but is otherwise ready to be deployed.
e.g. https://sentry.io/organizations/airbytehq/issues/3804018246/?project=6527718&referrer=slack
This code probably lives in base-java-s3, so we can roll it out for all our staging destinations (bigquery, snowflake, redshift, ...). This should get wrapped into a ConfigException so that it doesn't trigger alerting.
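As a rough sketch of what that wrapping might look like: the idea is to classify S3 failures that indicate bad or insufficient credentials and rethrow them as a config-error type, while letting genuine system errors propagate (and alert). Note this is illustrative only; `FakeS3Exception` and `ConfigErrorException` below are self-contained stand-ins for the AWS SDK's `AmazonS3Exception` and whatever config-error exception type the shared Java base actually uses, and the specific error codes checked are assumptions.

```java
// Hypothetical sketch: classify S3 auth failures as config errors so they
// surface to the user instead of paging on-call.

// Stand-in for the AWS SDK's AmazonS3Exception (assumption: real code would
// use com.amazonaws.services.s3.model.AmazonS3Exception).
class FakeS3Exception extends RuntimeException {
    final int statusCode;
    final String errorCode;

    FakeS3Exception(int statusCode, String errorCode, String message) {
        super(message);
        this.statusCode = statusCode;
        this.errorCode = errorCode;
    }
}

// Stand-in for Airbyte's config-error exception type.
class ConfigErrorException extends RuntimeException {
    ConfigErrorException(String message, Throwable cause) {
        super(message, cause);
    }
}

public class S3AuthSketch {
    // Returns true when the failure looks like an auth/permissions problem
    // rather than a transient or system error. (Error codes are assumptions.)
    static boolean isAuthError(FakeS3Exception e) {
        return e.statusCode == 403
            || "AccessDenied".equals(e.errorCode)
            || "InvalidAccessKeyId".equals(e.errorCode)
            || "SignatureDoesNotMatch".equals(e.errorCode);
    }

    // Wraps a staging operation, converting auth failures into config errors.
    static void runWithClassification(Runnable stagingOperation) {
        try {
            stagingOperation.run();
        } catch (FakeS3Exception e) {
            if (isAuthError(e)) {
                throw new ConfigErrorException(
                    "S3 credentials are invalid or lack permissions: " + e.getMessage(), e);
            }
            throw e; // genuine system error: let it alert
        }
    }

    public static void main(String[] args) {
        try {
            runWithClassification(() -> {
                throw new FakeS3Exception(403, "AccessDenied", "Access Denied");
            });
        } catch (ConfigErrorException e) {
            System.out.println("config-error: " + e.getMessage());
        }
    }
}
```

Because the helper lives next to the S3 upload path, every destination that stages through the shared S3 code would pick up the same classification without per-connector changes.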
I'm pretty sure we already detect this in `check`, but it would be good to confirm that as well. Example stacktrace: