🐛 Source Amazon Seller Partner: add integration tests #33996
Conversation
Before Merging a Connector Pull Request
Wow! What a great pull request you have here! 🎉 To merge this PR, ensure the following has been done/considered for each connector added or updated:
If the checklist is complete, but the CI check is failing, …
Adding a couple of observations. I like where it's going!
@HttpMocker()
def test_given_report_when_read_then_return_records(self, http_mocker: HttpMocker) -> None:
    for stream_name, params in STREAMS.items():
        with self.subTest(msg=stream_name):
I don't know much about subTest: is there a way with PyCharm (or whatever IDE you are using) to replay just one test? For example, if only one specific stream fails and I want to debug this one?
I didn't find any convenient way to achieve this with subTest, unfortunately, so I replaced it with pytest.mark.parametrize. What do you think?
I also don't like pytest.mark.parametrize because it is very hard to replay a single test. That being said, pytest.mark.parametrize does seem to have better capabilities (like skipping), so it seems better, yes.
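To make the replay concern concrete, here is a minimal, self-contained sketch contrasting the two approaches; the stream names and formats below are hypothetical stand-ins, not the connector's real STREAMS definition.

import unittest

import pytest

# Hypothetical stream parameters, for illustration only.
STREAMS = {"GET_SOME_REPORT": "csv", "GET_ANOTHER_REPORT": "xml"}


class TestAllStreamsWithSubTest(unittest.TestCase):
    def test_all_streams(self) -> None:
        # subTest keeps every stream inside a single test: a failing stream is
        # reported individually, but replaying just that stream from an IDE or
        # the pytest CLI is awkward.
        for stream_name, data_format in STREAMS.items():
            with self.subTest(msg=stream_name):
                self.assertIn(data_format, ("csv", "xml"))


@pytest.mark.parametrize(("stream_name", "data_format"), list(STREAMS.items()))
def test_per_stream(stream_name: str, data_format: str) -> None:
    # parametrize generates one collected test node per stream, so a single case
    # can be replayed with e.g. `pytest -k GET_SOME_REPORT`.
    assert data_format in ("csv", "xml")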
)

output = self._read(stream_name, config())
assert len(output.records) == 2
How do we know there are two records? From this test setup, it's not clear to me.
I agree it's not really clear, so I replaced it with a constant and added a comment stating that every test file contains 2 records.
…ests # Conflicts: # airbyte-integrations/connectors/source-amazon-seller-partner/setup.py
…ests # Conflicts: # airbyte-integrations/connectors/source-amazon-seller-partner/source_amazon_seller_partner/streams.py
This is very good. I like it a lot. You wrote complex tests (it takes 5 requests to fetch one record) and it is very readable so 👍 for that!
I have two small concerns:
- Assertions testing too much or not the right thing
- Using caplog instead of the entrypoint output
Does the first one make sense? Can you explain the reasoning for the second one? In any case, I'll approve because there is a lot of valuable work here but I would still like to discuss those points
@pytest.fixture
def http_mocker() -> None:
    """This fixture is needed to pass http_mocker parameter from the @HttpMocker decorator to a test"""
Why do we need this? I think we don't have this for stripe. The argument should be added like this
This is a sort of workaround. Since I use pytest to run the tests, when passing the http_mocker param to a test, pytest tries to find a fixture with that name instead of just accepting the param from @HttpMocker. I had to add this "empty" fixture because I didn't find a better solution to make it work.
pytest is so invasive :(
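For context, a minimal sketch of the workaround being discussed, assuming the CDK's HttpMocker lives in airbyte_cdk.test.mock_http and using a hypothetical stream name; the empty fixture exists only so pytest's signature inspection does not fail on the http_mocker parameter.

import pytest

from airbyte_cdk.test.mock_http import HttpMocker


@pytest.fixture
def http_mocker() -> None:
    """Empty fixture: pytest would otherwise look for a fixture named http_mocker
    instead of letting the @HttpMocker() decorator inject the argument."""


@pytest.mark.parametrize("stream_name", ["GET_SOME_REPORT"])  # hypothetical value
@HttpMocker()
def test_read(stream_name: str, http_mocker: HttpMocker) -> None:
    # http_mocker is supplied by the decorator at call time; the fixture above is
    # never used for anything, it only satisfies pytest's collection.
    ...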
    config_builder=config_, stream_name=stream_name, sync_mode=SyncMode.full_refresh, expecting_exception=expecting_exception
)

@pytest.mark.parametrize(("stream_name", "data_format"), STREAMS)
Every decision is a tradeoff. This one for me is very hard and I don't think we have a good solution to that. On one side, we don't want to duplicate tests that are testing the same thing. This is important in terms of maintenance because it probably means less code to change if there is a change that applies to multiple connectors. On the other hand, we would like tests to be independent. This is important because if we change the implementation of only one stream with the current solution, we are in a weird situation: do I update the test to add a specific case to this stream, or do I rely on the dev to copy paste the test elsewhere and fix it?

Plus, it generates more work if we want to do a transition in multiple steps (let's say we want to add the feature to stream A and B but not C and D). If the tests are coupled, this is a very annoying change, which means that we have effectively coupled tests. This can be crucial if shit hits the fan in production. There is another drawback which might be only for me, but I struggle to debug parametrized tests in PyCharm (so any tip here would help).

For stripe, we have decided to have one class per stream even though there was duplication. I'm not against having another source doing it differently as long as we learn from both at the same time and eventually converge on a set of conditions where one is better than the other.
I totally agree that tests should be independent and the decision you've made for Stripe works great, but, in this case, the set of tests for report-based streams covers the logic that is common to all of them (like creating a report, then checking its status and downloading it). If we update some specific streams, there should be separate tests for them (like we have for date formatting). A big plus of parametrisation here is that we can add more report-based streams in the future, and the only thing that needs to change in the tests is extending the list of params.
Regarding debugging: I just put a breakpoint before the set of params I need to test with and inside the test.
Again, it is a tradeoff and the cost is unclear to me, so I'm more than happy to go the way you have identified and learn from it. I trust your judgement more than mine as you are the expert for Amazon Seller Partner.
Thanks for the pointer on debugging! I didn't know that breaking before the set of params could have helped. I'll try that next time.
Sounds good, thanks!
mock_auth(http_mocker)

http_mocker.post(_create_report_request(stream_name).build(), _create_report_response())
http_mocker.get(_check_report_status_request(_REPORT_ID).build(), _check_report_status_response(stream_name))
Where does _REPORT_ID come from? Should we have _create_report_response().with_id(_REPORT_ID) so the reader of this test understands the link between the two?
I agree this is not clear, so I updated those functions to explicitly pass IDs and links. Thanks!
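A rough illustration of the change, with hypothetical builder helpers standing in for the real ones in the connector's integration test package (the names and payload fields are assumptions):

import json

# Sharing the identifiers explicitly makes the link between the "create report"
# response and the later status check visible to the reader of the test.
_REPORT_ID = "1234567890"
_REPORT_DOCUMENT_ID = "some-document-id"


def _create_report_response(report_id: str = _REPORT_ID) -> str:
    # The create-report endpoint answers with the report id that later calls reuse.
    return json.dumps({"reportId": report_id})


def _check_report_status_response(report_id: str = _REPORT_ID, document_id: str = _REPORT_DOCUMENT_ID) -> str:
    # The status payload points at the document to download once processing is done.
    return json.dumps({"reportId": report_id, "processingStatus": "DONE", "reportDocumentId": document_id})


assert json.loads(_create_report_response())["reportId"] == _REPORT_ID
assert json.loads(_check_report_status_response())["reportDocumentId"] == _REPORT_DOCUMENT_ID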
http_mocker.post(_create_report_request(stream_name).build(), _create_report_response())
http_mocker.get(_check_report_status_request(_REPORT_ID).build(), _check_report_status_response(stream_name))
http_mocker.get(_get_document_download_url_request(_REPORT_DOCUMENT_ID).build(), _get_document_download_url_response())
Where does _REPORT_DOCUMENT_ID come from? Why is it relevant to this test? I'm not sure I can understand that just by reading the test, but the fact that it is provided here makes me think it is important.
Updated
http_mocker.get(_check_report_status_request(_REPORT_ID).build(), _check_report_status_response(stream_name))
http_mocker.get(_get_document_download_url_request(_REPORT_DOCUMENT_ID).build(), _get_document_download_url_response())
http_mocker.get(
    _download_document_request(_DOCUMENT_DOWNLOAD_URL).build(),
Same question here for _DOCUMENT_DOWNLOAD_URL as the two questions above.
Updated
)

output = self._read(stream_name, config())
assert len(output.records) == DEFAULT_EXPECTED_NUMBER_OF_RECORDS
I'm wondering if it would be valuable to relax this assertion. In that case, we are not testing the record extraction per se (it has been tested in test_given_report_when_read_then_return_records and test_given_compressed_report_when_read_then_return_records). So it feels like just verifying that the HTTP requests defined in this test are called would be sufficient, right?
I added this assertion so the reader of this test knows that, after the retry, the records have still been read, even if the reader doesn't know much about how the framework works. If you think it's redundant, please let me know and I'll delete it.
Wouldn't the other HTTP requests not have been performed if it were the case? I'm fine with leaving the assertion. I'm just wary of this being impacted by other features later on. Let's say we add filtering, now this test could be impacted. I don't know where the balance of those concerns lands but I'll trust your judgement on that
If we change the logic of how these streams work, it will either impact the rest of the tests (if it's a change of core functionality, like retrieving reports), in which case we will have to align our tests, or these tests won't be impacted at all and new tests will need to be added (for the new functionality). I'd like to leave this assertion here just to emphasise that the records are still being read after an error received on the first attempt.
"This is most likely due to insufficient permissions on the credentials in use. " | ||
"Try to grant required permissions/scopes or re-authenticate." | ||
) | ||
assert_message_in_output(message_on_access_forbidden, caplog) |
This seems like a very specific test. If the message changes, it does not mean that the integration is broken. If there is custom code, I would expect it to be tested in unit tests. I would assume that in this case, we just want to have a stream status of incomplete or something like that.
It is a kind of availability check which is used specifically in these streams. If someone changes this message inside read_records, should it also be changed in this test?
If someone changes this message inside read_records, should it also be changed in this test?
If it were a unit test, I would say "yes" without hesitation. Now that it is an integration test, I'm a bit unsure. That being said, I'm fine with both and we can re-adjust if it creates more maintenance than it helps
http_mocker.post(RequestBuilder.auth_endpoint().build(), build_response(response_body, status_code=HTTPStatus.OK))


def assert_message_in_output(message: str, caplog: Any) -> None:
Should the typing be EntrypointOutput for caplog? Else, why not use EntrypointOutput?
It is a pytest fixture for collecting logs. I cannot use EntrypointOutput for this purpose as it doesn't have a warnings property for now. I believe this should be added; what do you think?
I think the solution to that is to use message_repository.log_message instead of the default Python logger. The thing we are learning more and more is that interacting with the logger is hard to test.
That being said, maybe we can agree that it is unrealistic to migrate everything to the message_repository and we should find a way to capture logs as part of the EntrypointWrapper. I'm willing to discuss that as well.
I'm sorry, I didn't notice that we actually have access to all log messages via EntrypointOutput. This has now been updated, thanks for the advice!
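For reference, a sketch of what the updated helper might look like, assuming EntrypointOutput from airbyte_cdk.test.entrypoint_wrapper exposes the emitted LOG messages via a logs property (the exact attribute names may differ between CDK versions):

from airbyte_cdk.test.entrypoint_wrapper import EntrypointOutput


def assert_message_in_output(message: str, output: EntrypointOutput) -> None:
    # Look for the expected text in the connector's emitted log messages instead
    # of relying on pytest's caplog fixture.
    assert any(message in log.log.message for log in output.logs), f"Expected a log message containing: {message}"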
output = self._read(stream_name, config())
assert_message_in_output(message_on_report_cancelled, caplog)
assert len(output.records) == 0
Should the outcome be a stream status "incomplete" here?
According to their docs, if we're getting the CANCELLED status for a report, it means the report is empty. So I believe the stream's sync status should be completed in this case.
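Purely to illustrate the behaviour described here (not the connector's actual implementation), a hedged sketch of treating a CANCELLED report as empty rather than as an error:

import logging

logger = logging.getLogger("airbyte")


def records_from_report_status(processing_status: str, downloaded_records: list) -> list:
    # Hypothetical helper: a CANCELLED report simply has no data, so the stream
    # completes successfully while the user is warned about the cancellation.
    if processing_status == "CANCELLED":
        logger.warning("The report was cancelled or there is no data to return.")
        return []
    return downloaded_records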
Oh shit! This is good. I think the test should be explicit about it. Maybe the then_error_logged threw me off guard. Should it be then_stream_completed_successfully_and_warn_user_about_potential_manual_cancellation? Having the name of the test aligned with the outcome and less with the output would be super helpful here.
Agree, updated
    ),
)
@HttpMocker()
def test_given_report_with_incorrect_date_format_when_read_then_formatted(
I would love the test to show what an incorrect date format looks like. I assume there are unit tests for that, but the test does not guarantee that the transformation occurs (at least, I can't see the invalid format in the test).
This will be possible once the framework supports a records builder for the .csv template format. Can we do this refactoring after the framework is updated, or should this test change be made before merge?
Yes! It just adds more weight to the priority on this one. Thanks for pointing that out!
…ests # Conflicts: # docs/integrations/sources/amazon-seller-partner.md
I really like the work you did there and the feedback you provide. Thanks a lot @askarpets
Please check the CAT config file and update the bypass reasons. Something like: "Data cannot be seeded, but integration tests are being conducted".
Otherwise, looks good.
…ests # Conflicts: # airbyte-integrations/connectors/source-amazon-seller-partner/metadata.yaml # docs/integrations/sources/amazon-seller-partner.md
…ests # Conflicts: # docs/integrations/sources/amazon-seller-partner.md
What
Resolves https://github.com/airbytehq/airbyte-internal-issues/issues/2406
How
Add integration tests for the streams that have no data in the sandbox account.
Recommended reading order
test_report_based_streams.py
test_vendor_direct_fulfillment_shipping.py
request_builder.py
response_builder.py
🚨 User Impact 🚨
No breaking changes