
BigQuery Destination V2 - Number of arguments does not match for function ARRAY_CONCAT #29090

Closed
Labels
area/connectors Connector related issues needed-for-v2-release team/destinations Destinations team's backlog type/bug Something isn't working

Comments

haithem-souala commented Aug 4, 2023

Connector Name

destination-bigquery

Connector Version

1.7.2

What step the error happened?

During the sync

Relevant information

Normalization: raw data (json)
Source: mongodb

Relevant log output

Stack Trace: com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)
	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)
	at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)
	at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)
	at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
	at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
	at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
	at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)
	at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)
	at com.google.cloud.bigquery.Job$1.call(Job.java:338)
	at com.google.cloud.bigquery.Job$1.call(Job.java:335)
	at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
	at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
	at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
	at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
	at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
	at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)
	at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)
	at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)
	at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)
	at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
	at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
	at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
	at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)
	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)
	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
	at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/obf-project/queries/job_VVD1tFyAT-iDvP8vq7LlvcJHxbTg?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)
	... 25 more

Contribute

  • Yes, I want to contribute
@haithem-souala haithem-souala added type/bug Something isn't working area/connectors Connector related issues needs-triage labels Aug 4, 2023
@haithem-souala
Contributor Author

The generated array_concat call has no arguments: when the stream has zero columns (so zero possible cast errors), the SQL contains an empty array_concat(), which is invalid:

WITH intermediate_data AS (
  SELECT

  array_concat(
  ) as _airbyte_cast_errors,
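A minimal sketch of the templating bug, in Python. This is not Airbyte's actual code; the function name and the per-column cast-check SQL are illustrative. The point is that an expression built by joining per-column fragments must special-case the zero-column stream, since BigQuery rejects ARRAY_CONCAT with no arguments (its signature requires at least one array):

```python
# Hypothetical sketch: build the _airbyte_cast_errors expression from
# per-column cast checks. All names and the check SQL are illustrative.
def build_cast_errors_expr(columns):
    """Return a BigQuery SQL expression collecting cast errors per column."""
    checks = [
        f"CASE WHEN SAFE_CAST(`{c}` AS STRING) IS NULL AND `{c}` IS NOT NULL "
        f"THEN ['Problem with `{c}`'] ELSE [] END"
        for c in columns
    ]
    if not checks:
        # array_concat() with zero arguments is a syntax error in BigQuery;
        # emit an explicitly typed empty array instead.
        return "CAST([] AS ARRAY<STRING>)"
    return "array_concat(\n  " + ",\n  ".join(checks) + "\n)"

print(build_cast_errors_expr([]))      # the zero-column case that triggered this issue
print(build_cast_errors_expr(["id"]))  # normal case: one array argument
```

With zero columns the guard returns a typed empty array literal instead of an empty (invalid) array_concat call.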

@evantahler
Contributor

@haithem-souala can you confirm that the stream in question has no columns? e.g. are we making an empty table (or have you deselected all columns using column selection)?

@haithem-souala
Contributor Author

@evantahler, I confirm that the stream has no columns.

@evantahler
Contributor

evantahler commented Aug 8, 2023

Grooming:

  • If there is a stream with 0 columns, we should still make the final table with our _airbyte_ columns... but timebox this. If it would be difficult, we can also skip making the final table when there are 0 columns in the source.
  • Then, within T&D we should still copy over the _airbyte_ ids and timestamps, even though there will be no source-specified columns.
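The first grooming bullet could look something like the sketch below. This is a hypothetical illustration, not Airbyte's implementation: the helper name and the exact meta-column set are assumptions. It shows a final-table CREATE statement that still works when the stream contributes zero source columns, because the Airbyte meta columns are always present:

```python
# Hypothetical sketch of the grooming proposal: for a stream with zero
# source columns, still build a final table holding only Airbyte's meta
# columns. Column names/types here are illustrative assumptions.
AIRBYTE_META_COLUMNS = {
    "_airbyte_raw_id": "STRING",
    "_airbyte_extracted_at": "TIMESTAMP",
    "_airbyte_meta": "JSON",
}

def create_final_table_sql(dataset, table, source_columns):
    """Build a CREATE TABLE statement; source_columns may be empty."""
    cols = dict(AIRBYTE_META_COLUMNS)
    cols.update(source_columns)  # empty dict for a zero-column stream
    col_defs = ",\n  ".join(f"`{name}` {typ}" for name, typ in cols.items())
    return f"CREATE TABLE `{dataset}`.`{table}` (\n  {col_defs}\n)"

print(create_final_table_sql("mydataset", "users", {}))
```

During T&D, the same always-present meta columns would then be the copy list, so the INSERT ... SELECT never degenerates to an empty column set.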
