🐛 Destination Snowflake | BigQuery: add part_size config to UI #9039
Conversation
/test connector=connectors/destination-snowflake
/test connector=connectors/destination-bigquery
/test connector=connectors/destination-bigquery-denormalized
type: "integer"
default: 5
examples:
- 5
\ A rule of thumb is to multiply the part size by 10 to get the\
\ memory requirement. Modify this with care."
title: "Stream Part Size"
order: 5
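The rule of thumb in this description can be illustrated with a small sketch. The helper below is hypothetical (not part of the connector code) and simply applies the 10x multiplier the spec states:

```python
def estimated_memory_mb(part_size_mb: int) -> int:
    """Rule of thumb from the spec: the memory requirement is
    roughly 10x the configured part size."""
    return part_size_mb * 10

# With the default 5 MB part size, expect roughly 50 MB of buffer memory.
print(estimated_memory_mb(5))    # -> 50
print(estimated_memory_mb(525))  # -> 5250
```

This is why the spec warns to "modify this with care": raising the part size raises the connector's memory footprint proportionally.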
Looks good. Approved from my side.
/test connector=connectors/destination-bigquery
"part_size_mb": {
  "title": "Block Size (MB) for GCS multipart upload",
  "description": "This is the size of a \"Part\" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB.",
  "type": "integer",
  "default": 5,
  "examples": [5]
},
[Optional] Please check whether we have an option to limit the input, so that only integers in the range 5-525 can be entered.
"part_size_mb": {
  "title": "Block Size (MB) for GCS multipart upload",
  "description": "This is the size of a \"Part\" being buffered in memory. It limits the memory usage when writing. Larger values allow uploading bigger files and improve speed, but consume more memory. Allowed values: min=5MB, max=525MB. Default: 5MB.",
  "type": "integer",
  "default": 5,
  "examples": [5]
},
The comment above applies to this spec as well.
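One way the requested range limit could be expressed is via JSON Schema's standard `minimum`/`maximum` keywords, since Airbyte connector specs are JSON Schema. This is a sketch of a possible approach, not the actual change made in this PR; `is_valid_part_size` is a hypothetical helper mirroring the constraint:

```python
import json

# Hypothetical spec fragment: JSON Schema "minimum"/"maximum" keywords
# could enforce the documented 5-525 range at validation time.
part_size_property = {
    "title": "Block Size (MB) for GCS multipart upload",
    "type": "integer",
    "default": 5,
    "minimum": 5,
    "maximum": 525,
    "examples": [5],
}

def is_valid_part_size(value) -> bool:
    """Range check mirroring the schema constraint above."""
    return isinstance(value, int) and 5 <= value <= 525

print(json.dumps(part_size_property, indent=2))
print(is_valid_part_size(5), is_valid_part_size(526))  # -> True False
```

Whether the UI renders these keywords as an enforced input constraint depends on the frontend's JSON Schema support.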
/test connector=connectors/destination-bigquery-denormalized
This PR covers only the UI part. Can the backends for Destination Snowflake and BigQuery already handle this new part_size feature, or should that be done in the scope of another task?
LGTM. We can proceed with publishing and merging without Airbyte review.
@yurii-bidiuk please check the GitHub webhook /test command for both connectors with the latest credentials version in Google Secret Manager (which includes "part_size") before merging.
/test connector=connectors/destination-snowflake
/test connector=connectors/destination-bigquery-denormalized
/test connector=connectors/destination-bigquery
/publish connector=connectors/destination-snowflake
/publish connector=connectors/destination-bigquery
/publish connector=connectors/destination-bigquery-denormalized
What
Fixes #5720 #6245 #8990
How
Add part_size config for the Snowflake S3 loading method;
Add part_size config for the BigQuery "GCS Staging" loading method.
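For context on why a configurable part size matters for both loading methods: S3-style multipart uploads typically cap the number of parts per object at 10,000, so the part size bounds the largest file a single upload can carry. The constant and helper below are illustrative assumptions, not values taken from this PR:

```python
MAX_PARTS = 10_000  # typical cap on parts per multipart upload (e.g. Amazon S3)

def max_object_size_gb(part_size_mb: int) -> float:
    """Largest object a multipart upload can transfer at a given part size,
    assuming the 10,000-part cap above."""
    return part_size_mb * MAX_PARTS / 1024

print(max_object_size_gb(5))    # default 5 MB parts -> ~48.8 GB
print(max_object_size_gb(525))  # 525 MB parts -> ~5127 GB
```

This is consistent with the spec text: larger parts allow uploading bigger files at the cost of more buffer memory.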
Recommended reading order
destination_specs.yaml
airbyte-integrations/connectors/destination-snowflake/src/main/resources/spec.json
airbyte-integrations/connectors/destination-bigquery/src/main/resources/spec.json
BigQueryUtils.java
🚨 User Impact 🚨
There are no breaking changes. New config properties were added to the UI.
Pre-merge Checklist
Expand the relevant checklist and delete the others.
New Connector

Community member or Airbyter
- Secrets in the connector's spec are annotated with airbyte_secret
- Integration tests are passing: ./gradlew :airbyte-integrations:connectors:<name>:integrationTest
- Documentation updated:
  - Connector's README.md
  - Connector's bootstrap.md. See description and examples
  - docs/SUMMARY.md
  - docs/integrations/<source or destination>/<name>.md including changelog. See changelog example
  - docs/integrations/README.md
  - airbyte-integrations/builds.md

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- The /test connector=connectors/<name> command is passing
- New connector version released by running the /publish command described here

Updating a connector

Community member or Airbyter
- Secrets in the connector's spec are annotated with airbyte_secret
- Integration tests are passing: ./gradlew :airbyte-integrations:connectors:<name>:integrationTest
- Documentation updated:
  - Connector's README.md
  - Connector's bootstrap.md. See description and examples
  - docs/integrations/<source or destination>/<name>.md including changelog. See changelog example

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- The /test connector=connectors/<name> command is passing
- New connector version released by running the /publish command described here

Connector Generator
- Scaffold templates (all templates with -scaffold in their name) have been updated with the latest scaffold by running ./gradlew :airbyte-integrations:connector-templates:generator:testScaffoldTemplates, then checking in your changes