Enable full SAT for the Redshift source #19915

Merged · 10 commits · Dec 8, 2022
@@ -1,7 +1,35 @@
 # See [Source Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/source-acceptance-tests-reference)
 # for more information about how to configure these tests
 connector_image: airbyte/source-redshift:dev
-tests:
+acceptance_tests:
   spec:
-    - spec_path: "src/test-integration/resources/expected_spec.json"
-      config_path: "secrets/config.json"
+    tests:
+      - spec_path: "src/test-integration/resources/expected_spec.json"
+        timeout_seconds: "1200"
+        config_path: "secrets/config.json"
+  connection:
+    tests:
+      - config_path: "secrets/config.json"
+        timeout_seconds: "1200"
+        status: "succeed"
+  discovery:
+    tests:
+      - config_path: "secrets/config.json"
+        timeout_seconds: "1200"
+  basic_read:
+    tests:
+      - config_path: "secrets/config.json"
+        timeout_seconds: "1200"
+        configured_catalog_path: "integration_tests/configured_catalog.json"
+        expect_records:
+          path: "integration_tests/expected_records.json"
+  full_refresh:
+    tests:
+      - config_path: "secrets/config.json"
+        configured_catalog_path: "integration_tests/configured_catalog.json"
+        timeout_seconds: "1200"
+  incremental:
+    tests:
+      - config_path: "secrets/config.json"
+        configured_catalog_path: "integration_tests/configured_catalog_inc.json"
+        timeout_seconds: "1200"
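The migration above moves every test suite under a single `acceptance_tests` key, each with its own `tests` list. As a minimal sketch (the dict literal mirrors the YAML; the `missing_keys` checker is purely illustrative and is not part of SAT itself), the new shape can be sanity-checked like this:

```python
# The migrated config expressed as a plain dict (mirrors the YAML above).
acceptance_tests = {
    "spec": [{"spec_path": "src/test-integration/resources/expected_spec.json",
              "timeout_seconds": "1200",
              "config_path": "secrets/config.json"}],
    "connection": [{"config_path": "secrets/config.json",
                    "timeout_seconds": "1200",
                    "status": "succeed"}],
    "discovery": [{"config_path": "secrets/config.json",
                   "timeout_seconds": "1200"}],
    "basic_read": [{"config_path": "secrets/config.json",
                    "timeout_seconds": "1200",
                    "configured_catalog_path": "integration_tests/configured_catalog.json",
                    "expect_records": {"path": "integration_tests/expected_records.json"}}],
    "full_refresh": [{"config_path": "secrets/config.json",
                      "configured_catalog_path": "integration_tests/configured_catalog.json",
                      "timeout_seconds": "1200"}],
    "incremental": [{"config_path": "secrets/config.json",
                     "configured_catalog_path": "integration_tests/configured_catalog_inc.json",
                     "timeout_seconds": "1200"}],
}

def missing_keys(config):
    """Return (suite, index) pairs for tests lacking config_path or timeout_seconds.

    A hypothetical helper, not Airbyte's own validation.
    """
    problems = []
    for suite, tests in config.items():
        for i, test in enumerate(tests):
            if "config_path" not in test or "timeout_seconds" not in test:
                problems.append((suite, i))
    return problems

print(missing_keys(acceptance_tests))  # [] — every suite is fully configured
```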
@@ -0,0 +1,3 @@
# Seeding the dataset
This folder contains the SQL scripts you need to create or fix the SAT dataset.
For more instructions and information about valid scripts, please check this [doc](https://docs.google.com/document/d/1k5TvxaNhKdr44aJIHWWtLk14Tzd2gbNX-J8YNoTj8u0/edit#heading=h.ls9oiedt9wyy).
Comment on lines +2 to +3
@alafanechere (Contributor) commented on Dec 7, 2022:
I don't think we should link a Google Doc here; it's not our usual approach to documentation, and it can break accessibility.

Moreover, I would suggest adding the following to the README:

  • Share the Redshift setup requirements for running the tests.
  • Explain the role of each SQL script.
  • Explain, technically, how to seed the DB.

I think these instructions will differ for each source, so I'm not in favor of a central document. If there are common instructions that can be useful for all database sources, feel free to edit https://github.com/airbytehq/airbyte/blob/master/docs/connector-development/testing-connectors/source-acceptance-tests-reference.md#L1

The PR author (Contributor) replied:

That is the point: all the instructions are the same. The only differences are the credentials required for the connection and the SQL syntax. This is common to any SQL source.
@evantahler created the doc and said to put all related instructions there.

P.S. If there is a new demand to improve or rework something, it's better to create a new issue and discuss it at the planning meeting with the whole team. No need to hold the source's integration tests hostage to this discussion.

A third contributor, asked to tie-break, replied:

If I'm being asked to be a tie-breaker, then in the spirit of "done is better than perfect", I think the Google Doc is OK for now. We can move its contents into the public repo once we are done with all the Java sources and can see how similar or different the instructions turn out to be. Also, the doc currently links to secrets, which is helpful and which shouldn't be in this repo.

@@ -0,0 +1,20 @@
CREATE TABLE IF NOT EXISTS sat_test_dataset.sat_basic_dataset (
id INTEGER,
test_column_1 SMALLINT,
test_column_2 INTEGER,
test_column_3 BIGINT,
test_column_4 DECIMAL,
test_column_5 REAL,
test_column_6 DOUBLE PRECISION,
test_column_7 BOOLEAN,
test_column_8 CHAR,
test_column_9 VARCHAR,
test_column_10 DATE,
test_column_11 TIMESTAMP,
test_column_12 TIMESTAMPTZ,
test_column_13 TIME,
test_column_14 TIMETZ,
test_column_15 VARBYTE);

INSERT INTO sat_test_dataset.sat_basic_dataset VALUES (1, 1, 126, 1024, 555.666, 777.888, 999.000, true, 'q', 'some text', '2008-12-31', 'Jun 1,2008 09:59:59', 'Jun 1,2008 09:59:59 EST', '04:05:06', '04:05:06 EST', 'xxx'::varbyte);
INSERT INTO sat_test_dataset.sat_basic_dataset VALUES (2, -5, -126, -1024, -555.666, -777.888, -999.000, false, 'g', 'new text', '1987-10-10', 'Jun 21,2005 12:00:59', 'Oct 15,2003 09:59:59 EST', '04:05:00', '04:05:00 EST', 'yyy'::varbyte);
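The seeded TIMESTAMPTZ values reappear in expected_records.json as UTC ISO strings. A small sketch of that conversion (assumption: the `EST` abbreviation is treated as a fixed UTC-5 offset, with no DST, which is consistent with the expected records for both June and October dates):

```python
from datetime import datetime, timedelta, timezone

# Assumption: EST as a fixed -05:00 offset, never daylight-saving time.
EST = timezone(timedelta(hours=-5))

def to_expected_utc(local: str) -> str:
    """Render a seeded 'Mon D,YYYY HH:MM:SS' EST timestamp as the UTC string
    format seen in expected_records.json (illustrative helper)."""
    dt = datetime.strptime(local, "%b %d,%Y %H:%M:%S").replace(tzinfo=EST)
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")

print(to_expected_utc("Jun 1,2008 09:59:59"))   # 2008-06-01T14:59:59.000000Z
print(to_expected_utc("Oct 15,2003 09:59:59"))  # 2003-10-15T14:59:59.000000Z
```

Note also that `test_column_4` is declared as bare `DECIMAL`, i.e. with a default scale of 0, which is why 555.666 comes back as 556 in the expected records.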
@@ -0,0 +1,19 @@
{
  "streams": [
    {
      "stream": {
        "name": "sat_basic_dataset",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh"],
        "source_defined_cursor": null,
        "default_cursor_field": null,
        "source_defined_primary_key": [["id"]],
        "namespace": null
      },
      "sync_mode": "full_refresh",
      "cursor_field": null,
      "destination_sync_mode": "append",
      "primary_key": null
    }
  ]
}
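A configured catalog is only coherent if each stream's chosen `sync_mode` is among the stream's `supported_sync_modes`. A minimal illustrative check (an assumption sketched here, not Airbyte's own validator; the catalog dict is trimmed to the relevant fields):

```python
# Trimmed version of the configured catalog above.
configured_catalog = {
    "streams": [
        {
            "stream": {
                "name": "sat_basic_dataset",
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "append",
        }
    ]
}

def invalid_streams(catalog):
    """Names of configured streams whose sync_mode the stream does not support.

    Hypothetical helper for illustration only.
    """
    return [
        s["stream"]["name"]
        for s in catalog["streams"]
        if s["sync_mode"] not in s["stream"]["supported_sync_modes"]
    ]

print(invalid_streams(configured_catalog))  # [] — the catalog above is consistent
```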
@@ -0,0 +1,16 @@
{
  "streams": [
    {
      "stream": {
        "name": "sat_basic_dataset",
        "json_schema": {},
        "supported_sync_modes": ["incremental"],
        "source_defined_cursor": true,
        "default_cursor_field": ["id"]
      },
      "sync_mode": "incremental",
      "cursor_field": ["id"],
      "destination_sync_mode": "append"
    }
  ]
}
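The incremental catalog above pins the cursor to `id`, so a sync should only emit rows whose cursor value exceeds the saved state. A simplified sketch of those semantics (an illustration of the idea, not the connector's actual implementation):

```python
# Two rows mirroring the seeded dataset's ids.
rows = [{"id": 1, "test_column_9": "some text"},
        {"id": 2, "test_column_9": "new text"}]

def incremental_read(rows, state):
    """Emit rows past the saved cursor and return the advanced state.

    Hypothetical simplification of incremental sync on cursor field "id".
    """
    cursor = state.get("id", 0) if state else 0
    emitted = [r for r in rows if r["id"] > cursor]
    new_state = {"id": max((r["id"] for r in emitted), default=cursor)}
    return emitted, new_state

first, state = incremental_read(rows, None)    # first sync emits both rows
second, state = incremental_read(rows, state)  # second sync emits nothing new
```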
@@ -0,0 +1,2 @@
{"stream": "sat_basic_dataset", "data": {"id":1,"test_column_1":1,"test_column_2":126,"test_column_3":1024,"test_column_4":556,"test_column_5":777.888,"test_column_6":999.0,"test_column_7":true,"test_column_8":"q","test_column_9":"some text","test_column_10":"2008-12-31T00:00:00Z","test_column_11":"2008-06-01T09:59:59.000000Z","test_column_12":"2008-06-01T14:59:59.000000Z","test_column_13":"1970-01-01T04:05:06Z","test_column_14":"09:05:06+00","test_column_15":"787878"}, "emitted_at": 1669734903259 }
{"stream": "sat_basic_dataset", "data": {"id":2,"test_column_1":-5,"test_column_2":-126,"test_column_3":-1024,"test_column_4":-556,"test_column_5":-777.888,"test_column_6":-999.0,"test_column_7":false,"test_column_8":"g","test_column_9":"new text","test_column_10":"1987-10-10T00:00:00Z","test_column_11":"2005-06-21T12:00:59.000000Z","test_column_12":"2003-10-15T14:59:59.000000Z","test_column_13":"1970-01-01T04:05:00Z","test_column_14":"09:05:00+00","test_column_15":"797979"}, "emitted_at": 1669734903259 }