
Destination Redshift, Postgres and Snowflake: not able to handle - in namespace/stream name #6340

Closed · Tracked by #11287
marcosmarxm opened this issue Sep 21, 2021 · 5 comments · Fixed by #11729

Comments

@marcosmarxm (Member)

Environment

  • Airbyte version: 0.29.19-alpha
  • OS Version / Instance: MacOS
  • Deployment: Docker
  • Source Connector and version: Google Analytics v4 0.1.1
  • Destination Connector and version: Redshift 0.3.14
  • Severity: Very Low / Low / Medium / High / Critical
  • Step where error happened: Sync job


Current Behavior

The sync fails when the configured namespace contains a - (dash) or when a stream name has a dash.

2021-09-21 00:28:00 ERROR () LineGobbler(voidCall):85 - Exception in thread "main" java.sql.SQLException: [Amazon](500310) Invalid operation: syntax error at or near "-" 
2021-09-21 00:28:00 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) ~[?:?]
2021-09-21 00:28:00 ERROR () LineGobbler(voidCall):85 - Position: 37;
2021-09-21 00:28:00 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) ~[?:?]
2021-09-21 00:28:00 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - Caused by: com.amazon.support.exceptions.ErrorException: [Amazon](500310) Invalid operation: syntax error at or near "-" 
2021-09-21 00:28:00 ERROR () LineGobbler(voidCall):85 - 	at com.amazon.redshift.client.messages.inbound.ErrorResponse.toErrorException(Unknown Source)
2021-09-21 00:28:00 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - Position: 30;
2021-09-21 00:28:00 ERROR () LineGobbler(voidCall):85 - 	at com.amazon.redshift.client.PGMessagingContext.handleErrorResponse(Unknown Source)
2021-09-21 00:28:00 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	... 13 more
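
The log excerpt does not show the statement itself, but the failure is easy to reproduce. A minimal illustration, using the custom namespace from the reproduction steps below (the exact DDL Airbyte generates is an assumption here):

-- Unquoted, the parser reads the dash as a minus operator:
CREATE SCHEMA custom-namespace-redshift;
-- ERROR: syntax error at or near "-"

-- Double-quoting the identifier makes the same name valid:
CREATE SCHEMA "custom-namespace-redshift";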

Expected Behavior

If a destination doesn't support - or other special characters in identifiers, they should be handled automatically.
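
Two common ways a destination could handle this automatically (sketches only, not confirmed as the approach Airbyte took):

-- Option 1: emit the name as a quoted identifier, preserving the dash:
CREATE SCHEMA "custom-namespace-redshift";

-- Option 2: normalize unsupported characters (e.g. dash to underscore)
-- before generating any SQL:
CREATE SCHEMA custom_namespace_redshift;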

Logs


LOG

2021-09-21 00:27:54 INFO () WorkerRun(call):62 - Executing worker wrapper. Airbyte version: 0.29.19-alpha
2021-09-21 00:27:54 INFO () TemporalAttemptExecution(get):114 - Executing worker wrapper. Airbyte version: 0.29.19-alpha
2021-09-21 00:27:54 WARN () Databases(createPostgresDatabaseWithRetry):58 - Waiting for database to become available...
2021-09-21 00:27:54 INFO () JobsDatabaseInstance(lambda$static$2):45 - Testing if jobs database is ready...
2021-09-21 00:27:54 INFO () Databases(createPostgresDatabaseWithRetry):75 - Database available!
2021-09-21 00:27:54 INFO () DefaultReplicationWorker(run):102 - start sync worker. job id: 6 attempt id: 1
2021-09-21 00:27:54 INFO () DefaultReplicationWorker(run):111 - configured sync modes: {null.casual_first_funnel=full_refresh - append}
2021-09-21 00:27:54 INFO () DefaultAirbyteDestination(start):78 - Running destination...
2021-09-21 00:27:55 INFO () LineGobbler(voidCall):85 - Checking if airbyte/destination-redshift:0.3.14 exists...
2021-09-21 00:27:55 INFO () LineGobbler(voidCall):85 - airbyte/destination-redshift:0.3.14 was found locally.
2021-09-21 00:27:55 INFO () DockerProcessFactory(create):146 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/6/1 --network host --log-driver none airbyte/destination-redshift:0.3.14 write --config destination_config.json --catalog destination_catalog.json
2021-09-21 00:27:55 INFO () LineGobbler(voidCall):85 - Checking if airbyte/source-google-analytics-v4:0.1.1 exists...
2021-09-21 00:27:55 INFO () LineGobbler(voidCall):85 - airbyte/source-google-analytics-v4:0.1.1 was found locally.
2021-09-21 00:27:55 INFO () DockerProcessFactory(create):146 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/6/1 --network host --log-driver none airbyte/source-google-analytics-v4:0.1.1 read --config source_config.json --catalog source_catalog.json
2021-09-21 00:27:55 INFO () DefaultReplicationWorker(run):139 - Waiting for source thread to join.
2021-09-21 00:27:55 INFO () DefaultReplicationWorker(lambda$getReplicationRunnable$2):210 - Replication thread started.
2021-09-21 00:27:55 INFO () DefaultReplicationWorker(lambda$getDestinationOutputRunnable$3):246 - Destination output thread started.
2021-09-21 00:27:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-21 00:27:57 INFO i.a.i.d.r.RedshiftDestination(main):97 - {} - starting destination: class io.airbyte.integrations.destination.redshift.RedshiftDestination
2021-09-21 00:27:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-21 00:27:57 INFO i.a.i.b.IntegrationRunner(run):96 - {} - Running integration: io.airbyte.integrations.destination.redshift.RedshiftDestination
2021-09-21 00:27:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-21 00:27:57 INFO i.a.i.b.IntegrationCliParser(parseOptions):135 - {} - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2021-09-21 00:27:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-21 00:27:57 INFO i.a.i.b.IntegrationRunner(run):100 - {} - Command: WRITE

Steps to Reproduce

  1. set up a source
  2. set up a Redshift destination
  3. configure a sync with the custom namespace custom-namespace-redshift

Are you willing to submit a PR?

No

@marcosmarxm added the type/bug and area/connectors labels Sep 21, 2021
@marcosmarxm added the priority/high label Nov 10, 2021
@marcosmarxm changed the title from "Destination Redshift: not able to handle - in namespace/stream name" to "Destination Redshift & Postgres: not able to handle - in namespace/stream name" Nov 10, 2021
@tuliren (Contributor)

tuliren commented Nov 10, 2021

Postgres does allow dashes in table names. Putting the table name in quotes should fix this issue for Postgres.
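
For example, once every identifier is double-quoted, a dashed schema and table name work in Postgres (names here are illustrative):

CREATE SCHEMA "my-namespace";
CREATE TABLE "my-namespace"."my-stream" (id integer);
SELECT * FROM "my-namespace"."my-stream";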

@marcosmarxm changed the title from "Destination Redshift & Postgres: not able to handle - in namespace/stream name" to "Destination Redshift, Postgres and Snowflake: not able to handle - in namespace/stream name" Nov 10, 2021
@marcosmarxm (Member, Author)

I tested this for Redshift, Postgres and Snowflake.
@tuliren the problem is that we allow this in the UI.

@tuliren (Contributor)

tuliren commented Nov 10, 2021

@marcosmarxm, I think the root cause is on the backend. The backend should quote the table names when running the query. It's fine that the UI allows dashes.
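
A sketch of the standard SQL quoting rule the backend could apply: wrap each identifier in double quotes and double any embedded double-quote character (so weird"name becomes "weird""name"). Using the namespace from the reproduction steps and the stream name from the log above (the id column is illustrative):

CREATE SCHEMA "custom-namespace-redshift";
CREATE TABLE "custom-namespace-redshift"."casual_first_funnel" (id integer);
SELECT * FROM "custom-namespace-redshift"."casual_first_funnel";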

@marcosmarxm (Member, Author)

Agreed @tuliren. Is it possible to add this to the connector roadmap? It doesn't look like a very complex problem, and it's quite problematic for some users.

@misteryeo (Contributor)

This issue is being handled here: #9351

@VitaliiMaltsev self-assigned this Apr 1, 2022
@VitaliiMaltsev moved this from Backlog (unscoped) to Implementation in progress in GL Roadmap Apr 1, 2022
@VitaliiMaltsev moved this from Implementation in progress to In review (internal) in GL Roadmap Apr 6, 2022