>> ATTEMPT 1/3
2023-08-04 15:56:35 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get state
2023-08-04 15:57:16 destination > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2023-08-04 15:57:16 destination > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2023-08-04 15:57:16 destination > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: WRITE
2023-08-04 15:57:16 destination > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2023-08-04 15:57:16 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:57:16 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:57:16 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):413 Selected loading method is set to: GCS
2023-08-04 15:57:16 destination > INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 S3 format config: {"format_type":"AVRO","flattening":"No flattening"}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):429 All tmp files will be removed from GCS when replication is finished
2023-08-04 15:56:35 INFO i.a.w.h.NormalizationInDestinationHelper(shouldNormalizeInDestination):52 - Requires Normalization: false, Normalization Supported: false, Feature Flag Enabled: false
2023-08-04 15:56:35 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to set attempt sync config
2023-08-04 15:56:36 INFO i.a.c.t.s.DefaultTaskQueueMapper(getTaskQueue):31 - Called DefaultTaskQueueMapper getTaskQueue for geography auto
2023-08-04 15:56:36 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/0/logs.log
2023-08-04 15:56:36 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.50.6
2023-08-04 15:56:36 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation
2023-08-04 15:56:36 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the source definition for feature flag checks
2023-08-04 15:56:36 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the source definition
2023-08-04 15:56:36 INFO i.a.w.g.ReplicationWorkerFactory(maybeEnableConcurrentStreamReads):166 - Concurrent stream read enabled? false
2023-08-04 15:56:36 INFO i.a.w.g.ReplicationWorkerFactory(create):127 - Setting up source...
2023-08-04 15:56:36 INFO i.a.w.g.ReplicationWorkerFactory(create):134 - Setting up destination...
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable METRIC_CLIENT: ''
2023-08-04 15:56:36 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to
2023-08-04 15:56:36 INFO i.a.w.g.ReplicationWorkerFactory(create):146 - Setting up replication worker...
2023-08-04 15:56:36 INFO i.a.w.g.DefaultReplicationWorker(run):124 - start sync worker. job id: 129 attempt id: 0
2023-08-04 15:56:36 INFO i.a.w.g.DefaultReplicationWorker(run):129 - configured sync modes: {pickup-points.pickuppoints=incremental - append, pickup-points.bookings=full_refresh - overwrite, pickup-points.bookingprocessingdatas=full_refresh - overwrite}
2023-08-04 15:56:36 INFO i.a.w.i.DefaultAirbyteDestination(start):88 - Running destination...
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:56:36 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:56:36 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START REPLICATION -----
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:56:36 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: ''
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: ''
2023-08-04 15:56:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
2023-08-04 15:56:37 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = destination-bigquery-write-129-0-jguvr for airbyte/destination-bigquery:1.7.2 with resources io.airbyte.config.ResourceRequirements@66f2a76c[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2023-08-04 15:56:37 INFO i.a.w.p.KubeProcessFactory(create):111 - destination-bigquery-write-129-0-jguvr stdoutLocalPort = 9012
2023-08-04 15:56:37 INFO i.a.w.p.KubeProcessFactory(create):114 - destination-bigquery-write-129-0-jguvr stderrLocalPort = 9013
2023-08-04 15:56:37 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server...
2023-08-04 15:56:37 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server...
2023-08-04 15:56:37 INFO i.a.w.p.KubePodProcess():584 - Creating pod destination-bigquery-write-129-0-jguvr...
2023-08-04 15:56:37 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files...
2023-08-04 15:56:37 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present..
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready..
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess():615 - Copying files...
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: destination_config.json
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/0f420f35-cf82-4456-a5d5-1e8fb116aedb/destination_config.json data/destination-bigquery-write-129-0-jguvr:/config/destination_config.json -c init
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: destination_catalog.json
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/6a402ce8-24c3-43c2-878a-e84cfe9ea5ae/destination_catalog.json data/destination-bigquery-write-129-0-jguvr:/config/destination_catalog.json -c init
2023-08-04 15:56:39 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:56:40 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:56:40 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
2023-08-04 15:56:40 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/8c820477-3f67-4b39-b2ac-0d4a9cfd3fd0/FINISHED_UPLOADING data/destination-bigquery-write-129-0-jguvr:/config/FINISHED_UPLOADING -c init
2023-08-04 15:56:40 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:56:41 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:56:41 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready...
2023-08-04 15:57:08 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
2023-08-04 15:57:08 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP...
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.42.7
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess():639 - Creating stdin socket...
2023-08-04 15:57:09 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):41 - Writing messages to protocol version 0.2.0
2023-08-04 15:57:09 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: ''
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: ''
2023-08-04 15:57:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
2023-08-04 15:57:09 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = source-mongodb-v2-read-129-0-qqkbh for airbyte/source-mongodb-v2:0.2.5 with resources io.airbyte.config.ResourceRequirements@41e4c842[cpuRequest=0.5,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2023-08-04 15:57:09 INFO i.a.w.p.KubeProcessFactory(create):111 - source-mongodb-v2-read-129-0-qqkbh stdoutLocalPort = 9014
2023-08-04 15:57:09 INFO i.a.w.p.KubeProcessFactory(create):114 - source-mongodb-v2-read-129-0-qqkbh stderrLocalPort = 9015
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server...
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server...
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess():584 - Creating pod source-mongodb-v2-read-129-0-qqkbh...
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files...
2023-08-04 15:57:09 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present..
2023-08-04 15:57:10 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready..
2023-08-04 15:57:10 INFO i.a.w.p.KubePodProcess():615 - Copying files...
2023-08-04 15:57:10 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json
2023-08-04 15:57:10 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/b0591350-7b76-424e-b13a-f286d0426511/source_config.json data/source-mongodb-v2-read-129-0-qqkbh:/config/source_config.json -c init
2023-08-04 15:57:10 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:57:11 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:57:11 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_catalog.json
2023-08-04 15:57:11 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/68860f68-a9ec-47ed-81df-65df24e9b1fa/source_catalog.json data/source-mongodb-v2-read-129-0-qqkbh:/config/source_catalog.json -c init
2023-08-04 15:57:11 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:57:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:57:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
2023-08-04 15:57:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/733c285d-6314-4eb1-8a1c-08ced74de763/FINISHED_UPLOADING data/source-mongodb-v2-read-129-0-qqkbh:/config/FINISHED_UPLOADING -c init
2023-08-04 15:57:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:57:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:57:12 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready...
2023-08-04 15:57:14 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
2023-08-04 15:57:15 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
2023-08-04 15:57:16 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP...
2023-08-04 15:57:16 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.24.119
2023-08-04 15:57:16 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream...
2023-08-04 15:57:16 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2023-08-04 15:57:16 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):224 - Destination output thread started.
2023-08-04 15:57:16 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):94 - Starting source heartbeat check. Will check every 1 minutes.
2023-08-04 15:57:16 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):268 - Replication thread started.
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):396 Creating BigQuery staging message consumer with staging ID 9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9 at 2023-08-04T15:57:11.496Z
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_bookings, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_csy_pickuppoints_bookings}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=overwrite, stagedFiles=[]]
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_bookingprocessingdatas, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_drl_pickuppoints_bookingprocessingdatas}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=overwrite, stagedFiles=[]]
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_pickuppoints, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_omj_pickuppoints_pickuppoints}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=append, stagedFiles=[]]
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):173 class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):156 Preparing airbyte_raw tables in destination started for 3 streams
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_pickuppoints, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, stage: pickuppoints_pickuppoints
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 Creating dataset airbyte_internal
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 Creating dataset airbyte
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_pickuppoints (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_bookings, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}, stage: pickuppoints_bookings
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_bookings (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_bookings/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(truncateTableIfExists):207 Truncating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} (dataset airbyte)
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):175 Deleting target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} (dataset airbyte)
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_bookingprocessingdatas, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}, stage: pickuppoints_bookingprocessingdatas
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_bookingprocessingdatas (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_bookingprocessingdatas/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(truncateTableIfExists):207 Truncating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} (dataset airbyte)
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):175 Deleting target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} (dataset airbyte)
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 15:57:16 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(prepareFinalTables):60 Preparing final tables
2023-08-04 15:57:16 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql fb851786-c1ce-4c3e-a552-7d3d64af7f4d: CREATE SCHEMA IF NOT EXISTS `eu_poow_ds` OPTIONS(location="EU"); CREATE OR REPLACE TABLE `eu_poow_ds`.`pickuppoints_bookings` ( _airbyte_raw_id STRING NOT NULL, _airbyte_extracted_at TIMESTAMP NOT NULL, _airbyte_meta JSON NOT NULL, ) PARTITION BY (DATE_TRUNC(_airbyte_extracted_at, DAY)) CLUSTER BY `_airbyte_extracted_at`;
2023-08-04 15:57:17 source > INFO i.a.i.s.m.MongoDbSource(main):55 starting source: class io.airbyte.integrations.source.mongodb.MongoDbSource
2023-08-04 15:57:17 source > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2023-08-04 15:57:17 source > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.mongodb.MongoDbSource
2023-08-04 15:57:17 source > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: READ
2023-08-04 15:57:17 source > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2023-08-04 15:57:17 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:57:17 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:57:17 source > INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):48 Legacy state manager selected to manage state object with type LEGACY.
2023-08-04 15:57:17 source > INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):191 No cursor field set in catalog but not present in state. Stream: pickup-points_bookingprocessingdatas, New Cursor Field: null. Resetting cursor value
2023-08-04 15:57:17 source > INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):191 No cursor field set in catalog but not present in state. Stream: pickup-points_bookings, New Cursor Field: null. Resetting cursor value
2023-08-04 15:57:17 source > INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):191 No cursor field set in catalog but not present in state. Stream: pickup-points_pickuppoints, New Cursor Field: updatedAt. Resetting cursor value
2023-08-04 15:57:17 source > INFO i.a.i.s.r.CdcStateManager():31 Initialized CDC state with: null
2023-08-04 15:57:17 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job fb851786-c1ce-4c3e-a552-7d3d64af7f4d completed in 1056 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:17 source > INFO c.m.d.l.SLF4JLogger(info):71 Cluster created with settings {hosts=[127.0.0.1:27017], srvHost=pickup-points.bwbt2.mongodb.net, mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', requiredReplicaSetName='atlas-plh3lm-shard-0'}
2023-08-04 15:57:18 source > INFO c.m.d.l.SLF4JLogger(info):71 Cluster description not yet available. Waiting for 30000 ms before timing out
2023-08-04 15:57:18 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-00.bwbt2.mongodb.net:27017 to client view of cluster
2023-08-04 15:57:18 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-01.bwbt2.mongodb.net:27017 to client view of cluster
2023-08-04 15:57:18 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-02.bwbt2.mongodb.net:27017 to client view of cluster
2023-08-04 15:57:18 source > INFO c.m.d.l.SLF4JLogger(info):71 No server chosen by com.mongodb.client.internal.MongoClientDelegate$1@693e4d19 from cluster description ClusterDescription{type=REPLICA_SET, connectionMode=MULTIPLE, serverDescriptions=[ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}]}. Waiting for 30000 ms before timing out
2023-08-04 15:57:18 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql CREATE SCHEMA IF NOT EXISTS `eu_poow_ds` OPTIONS(location="EU")... completed in 80 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:18 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql CREATE OR REPLACE TABLE `eu_poow_ds`.`pickuppoints_bookings` ( _airbyte_raw_id STRING NOT NULL, _air... completed in 89 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:18 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql 4fa7d7e7-d59e-488d-9db6-267690c54916: CREATE SCHEMA IF NOT EXISTS `eu_poow_ds` OPTIONS(location="EU"); CREATE OR REPLACE TABLE `eu_poow_ds`.`pickuppoints_bookingprocessingdatas` ( _airbyte_raw_id STRING NOT NULL, _airbyte_extracted_at TIMESTAMP NOT NULL, _airbyte_meta JSON NOT NULL, ) PARTITION BY (DATE_TRUNC(_airbyte_extracted_at, DAY)) CLUSTER BY `_airbyte_extracted_at`;
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:1, serverValue:107959}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:3, serverValue:269569}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:6, serverValue:269568}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:5, serverValue:107247}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:2, serverValue:107958}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:4, serverValue:107248}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=451292231, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b5839658963dafcb22081c, counter=4}, lastWriteDate=Fri Aug 04 15:57:15 UTC 2023, lastUpdateTimeNanos=5439997912670737}
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=451356512, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=7fffffff000000000000000a, setVersion=1, topologyVersion=TopologyVersion{processId=64b583ecebfb924922e6cfc1, counter=6}, lastWriteDate=Fri Aug 04 15:57:15 UTC 2023, lastUpdateTimeNanos=5439997912437156}
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=459273124, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b584404f6809297d3ea234, counter=3}, lastWriteDate=Fri Aug 04 15:57:15 UTC 2023, lastUpdateTimeNanos=5439997920722553}
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Setting max election id to 7fffffff000000000000000a from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Setting max set version to 1 from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Discovered replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:7, serverValue:269570}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:8, serverValue:269571}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:57:19 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job 4fa7d7e7-d59e-488d-9db6-267690c54916 completed in 1169 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:20 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql CREATE SCHEMA IF NOT EXISTS `eu_poow_ds` OPTIONS(location="EU")... completed in 72 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:20 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql CREATE OR REPLACE TABLE `eu_poow_ds`.`pickuppoints_bookingprocessingdatas` ( _airbyte_raw_id STRING ... completed in 137 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:20 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql 7648777e-3f82-4be9-8f7c-fbc08ed2a6e6: CREATE SCHEMA IF NOT EXISTS `eu_poow_ds` OPTIONS(location="EU"); CREATE OR REPLACE TABLE `eu_poow_ds`.`pickuppoints_pickuppoints` ( _airbyte_raw_id STRING NOT NULL, _airbyte_extracted_at TIMESTAMP NOT NULL, _airbyte_meta JSON NOT NULL, `address` JSON, `packageMaxWeight` JSON, `packageMaxCombinedLength` JSON, `legacyCategory` STRING, `maxCombined` JSON, `maxWeight` JSON, `type` STRING, `closingDates` JSON, `createdAt` STRING, `carrier` JSON, `finalClosingDate` STRING, `packageMaxDimension` JSON, `outdatedAt` STRING, `location` JSON, `openingHours` JSON, `maxPackageQuantity` NUMERIC, `id` STRING, `_id` STRING, `category` STRING, `openingDate` STRING, `maxPackagesQuantity` NUMERIC, `updatedAt` STRING ) PARTITION BY (DATE_TRUNC(_airbyte_extracted_at, DAY)) CLUSTER BY `_airbyte_extracted_at`;
2023-08-04 15:57:21 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job 7648777e-3f82-4be9-8f7c-fbc08ed2a6e6 completed in 1000 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:22 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql CREATE SCHEMA IF NOT EXISTS `eu_poow_ds` OPTIONS(location="EU")... completed in 77 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:22 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql CREATE OR REPLACE TABLE `eu_poow_ds`.`pickuppoints_pickuppoints` ( _airbyte_raw_id STRING NOT NULL, ... completed in 138 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:57:22 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):178 Preparing tables in destination completed.
2023-08-04 15:57:22 source > INFO i.a.c.u.CompositeIterator(lambda$emitStartStreamStatus$1):155 STARTING -> pickup-points_pickuppoints
2023-08-04 15:57:22 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$getOrCreateBuffer$0):109 Starting a new buffer for stream pickuppoints_pickuppoints (current state: 0 bytes in 0 buffers)
2023-08-04 15:57:22 destination > INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):27 Default schema.
2023-08-04 15:58:10 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2023-08-04 15:58:10 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):288 executing on success close procedure.
2023-08-04 15:58:10 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):133 Flushing all 1 current buffers (166 MB in total) 2023-08-04 15:58:10 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):137 Flushing buffer of stream pickuppoints_pickuppoints (166 MB) 2023-08-04 15:58:10 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$5):196 Flushing buffer for stream pickuppoints_pickuppoints (166 MB) to staging 2023-08-04 15:58:10 destination > INFO i.a.i.d.r.BaseSerializedBuffer(flush):172 Finished writing data to 5e7675e6-7fae-4c2e-90d6-93c08742c7735493485812983048938.avro (166 MB) 2023-08-04 15:58:10 destination > INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):116 Uploading records to staging for stream pickuppoints_pickuppoints (dataset airbyte_internal): data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/ 2023-08-04 15:58:10 destination > INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 Initiated multipart upload to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with full ID ABPnzm7lW4tvQuoYqVAKQq2ndodHAZMtjhVR3t-blo20swWwRC2r-hWFZbe1MkStO15xMsdq 2023-08-04 15:58:12 destination > INFO a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 3 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 2 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 8 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 4 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 6 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 5 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO 
a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 7 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 9 containing 10.01 MB] 2023-08-04 15:58:12 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 1 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 10 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 17 containing 6.71 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 13 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 14 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 12 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 11 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 15 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to 
poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Finished uploading [Part number 16 containing 10.01 MB] 2023-08-04 15:58:13 destination > INFO a.m.s.StreamTransferManager(complete):395 [Manager uploading to poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro with id ABPnzm7lW...tO15xMsdq]: Completed 2023-08-04 15:58:13 destination > INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):214 Uploaded buffer file to storage: 5e7675e6-7fae-4c2e-90d6-93c08742c7735493485812983048938.avro -> data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro (filename: 0.avro) 2023-08-04 15:58:13 destination > INFO i.a.i.d.s.S3StorageOperations(uploadRecordsToBucket):131 Successfully loaded records to stage data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/ with 0 re-attempt(s) 2023-08-04 15:58:13 destination > INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):61 Added staged file: 0.avro 2023-08-04 15:58:13 destination > INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTableFromStage):133 Uploading records from staging files to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} (dataset airbyte_internal): [0.avro] 2023-08-04 15:58:13 destination > INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTableFromStage$0):138 Uploading staged file: gs://poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro 2023-08-04 15:58:13 destination > INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTableFromStage$0):147 [JobId{project=lastmile-prod, job=9162fed1-7bc9-4390-bd7d-e8dc6b918afc, location=EU}] Created a new job to upload record(s) to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} (dataset airbyte_internal): Job{job=JobId{project=lastmile-prod, job=9162fed1-7bc9-4390-bd7d-e8dc6b918afc, location=EU}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1691164693712, endTime=null, startTime=1691164693816, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, transactionInfo=null, sessionInfo=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte-bq-gcs-sa@lastmile-prod.iam.gserviceaccount.com, etag=AUAylXHS6LDX8Zy1XbRayg==, generatedId=lastmile-prod:EU.9162fed1-7bc9-4390-bd7d-e8dc6b918afc, selfLink=https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/jobs/9162fed1-7bc9-4390-bd7d-e8dc6b918afc?location=EU, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, projectId=lastmile-prod, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=AvroOptions{type=AVRO, useAvroLogicalTypes=null}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, 
policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, ignoreUnknownValue=null, sourceUris=[gs://poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null, referenceFileSchemaUri=null, connectionProperties=null, createSession=null}} 2023-08-04 15:58:13 destination > INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):437 Waiting for job finish Job{job=JobId{project=lastmile-prod, job=9162fed1-7bc9-4390-bd7d-e8dc6b918afc, location=EU}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1691164693712, endTime=null, startTime=1691164693816, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, transactionInfo=null, sessionInfo=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte-bq-gcs-sa@lastmile-prod.iam.gserviceaccount.com, etag=AUAylXHS6LDX8Zy1XbRayg==, generatedId=lastmile-prod:EU.9162fed1-7bc9-4390-bd7d-e8dc6b918afc, selfLink=https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/jobs/9162fed1-7bc9-4390-bd7d-e8dc6b918afc?location=EU, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, projectId=lastmile-prod, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=AvroOptions{type=AVRO, useAvroLogicalTypes=null}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, ignoreUnknownValue=null, sourceUris=[gs://poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null, 
referenceFileSchemaUri=null, connectionProperties=null, createSession=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2023-08-04 15:58:09 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):111 - thread status... heartbeat thread: false , replication thread: true 2023-08-04 15:58:09 INFO i.a.w.g.DefaultReplicationWorker(replicate):195 - Waiting for source and destination threads to complete. 2023-08-04 15:58:09 INFO i.a.w.g.DefaultReplicationWorker(replicate):200 - One of source or destination thread complete. Waiting on the other. 2023-08-04 15:58:28 destination > INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):439 Job finish Job{job=JobId{project=lastmile-prod, job=9162fed1-7bc9-4390-bd7d-e8dc6b918afc, location=EU}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1691164693712, endTime=null, startTime=1691164693816, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, transactionInfo=null, sessionInfo=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte-bq-gcs-sa@lastmile-prod.iam.gserviceaccount.com, etag=AUAylXHS6LDX8Zy1XbRayg==, generatedId=lastmile-prod:EU.9162fed1-7bc9-4390-bd7d-e8dc6b918afc, selfLink=https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/jobs/9162fed1-7bc9-4390-bd7d-e8dc6b918afc?location=EU, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, projectId=lastmile-prod, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=AvroOptions{type=AVRO, useAvroLogicalTypes=null}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, ignoreUnknownValue=null, sourceUris=[gs://poow-data-staging/data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null, referenceFileSchemaUri=null, connectionProperties=null, createSession=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2023-08-04 15:58:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTableFromStage$0):152 [JobId{project=lastmile-prod, job=9162fed1-7bc9-4390-bd7d-e8dc6b918afc, location=EU}] Target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} (dataset airbyte_internal) is successfully appended with 
2023-08-04 15:58:28 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_pickuppoints
2023-08-04 15:58:28 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql a75a63d4-f6e2-4a12-ae2f-2d31380e941c: BEGIN TRANSACTION; INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at ) WITH intermediate_data AS ( SELECT CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END as `address`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END as `packageMaxWeight`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END as `packageMaxCombinedLength`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) as `legacyCategory`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END as `maxCombined`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END as `maxWeight`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) as `type`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END as `closingDates`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) as `createdAt`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END as `carrier`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) as `finalClosingDate`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END as `packageMaxDimension`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) as `outdatedAt`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END as `location`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END as `openingHours`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) as `maxPackageQuantity`,
SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) as `id`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) as `_id`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) as `category`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) as `openingDate`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) as `maxPackagesQuantity`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) as `updatedAt`, array_concat( CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.address') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END IS NULL) THEN ["Problem with `address`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END IS NULL) THEN ["Problem with `packageMaxWeight`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END IS NULL) THEN ["Problem with `packageMaxCombinedLength`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.legacyCategory') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.legacyCategory')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) IS NULL) THEN ["Problem with `legacyCategory`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END IS NULL) THEN ["Problem with `maxCombined`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END IS NULL) THEN ["Problem with `maxWeight`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.type') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.type')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) IS NULL) THEN ["Problem with `type`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END IS NULL) THEN ["Problem with `closingDates`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, 
'$.createdAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.createdAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) IS NULL) THEN ["Problem with `createdAt`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.carrier') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END IS NULL) THEN ["Problem with `carrier`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.finalClosingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.finalClosingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) IS NULL) THEN ["Problem with `finalClosingDate`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END IS NULL) THEN ["Problem with `packageMaxDimension`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.outdatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.outdatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) IS NULL) THEN ["Problem with `outdatedAt`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.location') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END IS NULL) THEN ["Problem with `location`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END IS NULL) THEN ["Problem with `openingHours`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackageQuantity`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) IS NULL) THEN ["Problem with `id`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$._id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$._id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) IS NULL) THEN ["Problem with `_id`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.category') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.category')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) IS NULL) THEN ["Problem with `category`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, 
'$.openingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) IS NULL) THEN ["Problem with `openingDate`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackagesQuantity`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.updatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.updatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) IS NULL) THEN ["Problem with `updatedAt`"] ELSE [] END ) as _airbyte_cast_errors, _airbyte_raw_id, _airbyte_extracted_at FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` WHERE _airbyte_loaded_at IS NULL ) SELECT `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data; UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL ; COMMIT TRANSACTION;
2023-08-04 15:58:40 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job a75a63d4-f6e2-4a12-ae2f-2d31380e941c completed in 11635 ms; processed 176071839 bytes; billed for 177209344 bytes
2023-08-04 15:58:41 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql BEGIN TRANSACTION completed in 255 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:58:41 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCom... completed in 5286 ms; processed 87857965 bytes; billed for 88080384 bytes
2023-08-04 15:58:41 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at... completed in 4332 ms; processed 88213874 bytes; billed for 89128960 bytes
2023-08-04 15:58:41 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql COMMIT TRANSACTION completed in 142 ms; processed 0 bytes; billed for 0 bytes
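The per-statement accounting above is worth reading closely: BigQuery bills scanned bytes rounded up, which is why the INSERT that processed 87,857,965 bytes was billed for 88,080,384, and each typing/deduping pass scans the raw table again. If you want to verify these numbers outside the connector logs, here is a sketch against INFORMATION_SCHEMA (project and region are taken from the log; the 24-hour window is an arbitrary choice for illustration):

    -- Sketch: audit bytes billed for the typing/deduping script and its child jobs.
    -- `region-eu` matches the EU job location shown in the log.
    SELECT job_id, parent_job_id, statement_type,
           total_bytes_processed, total_bytes_billed
    FROM `lastmile-prod`.`region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
      AND (job_id = 'a75a63d4-f6e2-4a12-ae2f-2d31380e941c'
           OR parent_job_id = 'a75a63d4-f6e2-4a12-ae2f-2d31380e941c');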
2023-08-04 15:58:41 destination > INFO i.a.i.d.r.FileBuffer(deleteFile):109 Deleting tempFile data 5e7675e6-7fae-4c2e-90d6-93c08742c7735493485812983048938.avro
2023-08-04 15:58:41 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):139 Flushing completed for pickuppoints_pickuppoints
2023-08-04 15:58:41 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(close):158 Closing buffer for stream pickuppoints_pickuppoints
2023-08-04 15:58:41 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$6):241 Cleaning up destination started for 3 streams
2023-08-04 15:58:41 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_pickuppoints
2023-08-04 15:58:41 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql f6780c77-63f6-45e4-97ee-ee208adc6335: BEGIN TRANSACTION; INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at ) WITH intermediate_data AS ( SELECT CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END as `address`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END as `packageMaxWeight`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END as `packageMaxCombinedLength`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) as `legacyCategory`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END as `maxCombined`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END as `maxWeight`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) as `type`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END as `closingDates`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) as `createdAt`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END as `carrier`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) as `finalClosingDate`, CASE
WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END as `packageMaxDimension`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) as `outdatedAt`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END as `location`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END as `openingHours`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) as `maxPackageQuantity`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) as `id`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) as `_id`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) as `category`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) as `openingDate`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) as `maxPackagesQuantity`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) as `updatedAt`, array_concat( CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.address') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END IS NULL) THEN ["Problem with `address`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END IS NULL) THEN ["Problem with `packageMaxWeight`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END IS NULL) THEN ["Problem with `packageMaxCombinedLength`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.legacyCategory') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.legacyCategory')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) IS NULL) THEN ["Problem with `legacyCategory`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END IS NULL) THEN ["Problem with `maxCombined`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR 
JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END IS NULL) THEN ["Problem with `maxWeight`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.type') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.type')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) IS NULL) THEN ["Problem with `type`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END IS NULL) THEN ["Problem with `closingDates`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.createdAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.createdAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) IS NULL) THEN ["Problem with `createdAt`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.carrier') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END IS NULL) THEN ["Problem with `carrier`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.finalClosingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.finalClosingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) IS NULL) THEN ["Problem with `finalClosingDate`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END IS NULL) THEN ["Problem with `packageMaxDimension`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.outdatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.outdatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) IS NULL) THEN ["Problem with `outdatedAt`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.location') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END IS NULL) THEN ["Problem with `location`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END IS NULL) THEN ["Problem with `openingHours`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) IS NULL) THEN ["Problem with 
`maxPackageQuantity`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) IS NULL) THEN ["Problem with `id`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$._id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$._id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) IS NULL) THEN ["Problem with `_id`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.category') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.category')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) IS NULL) THEN ["Problem with `category`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) IS NULL) THEN ["Problem with `openingDate`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackagesQuantity`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.updatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.updatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) IS NULL) THEN ["Problem with `updatedAt`"] ELSE [] END ) as _airbyte_cast_errors, _airbyte_raw_id, _airbyte_extracted_at FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` WHERE _airbyte_loaded_at IS NULL ) SELECT `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data; UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL ; COMMIT TRANSACTION; 2023-08-04 15:58:46 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job f6780c77-63f6-45e4-97ee-ee208adc6335 completed in 4884 ms; processed 0 bytes; billed for 0 bytes 2023-08-04 15:58:47 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql BEGIN TRANSACTION completed in 65 ms; processed 0 bytes; billed for 0 bytes 2023-08-04 15:58:47 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCom... completed in 1600 ms; processed 0 bytes; billed for 0 bytes 2023-08-04 15:58:47 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at... 
2023-08-04 15:58:47 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):186 Cleaning up staging path for stream pickuppoints_pickuppoints (dataset airbyte_internal): data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints
2023-08-04 15:58:47 destination > INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 Deleting object data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints/2023/08/04/15/9b8ac21c-7e8f-48d5-afa7-cb1587cbe1b9/0.avro
2023-08-04 15:58:47 destination > INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):306 Storage bucket data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints has been cleaned-up (1 objects were deleted)...
2023-08-04 15:58:47 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_bookings
2023-08-04 15:58:47 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql aa112385-a467-4458-82c5-97df9dfea5ab: BEGIN TRANSACTION; INSERT INTO `eu_poow_ds`.`pickuppoints_bookings` ( _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at ) WITH intermediate_data AS ( SELECT array_concat( ) as _airbyte_cast_errors, _airbyte_raw_id, _airbyte_extracted_at FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_bookings` WHERE _airbyte_loaded_at IS NULL ) SELECT to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data; UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_bookings` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL ; COMMIT TRANSACTION;
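This is the statement that brings the sync down. The pickuppoints_bookings stream contributes no typed columns, so the generated error-collection expression degenerates to array_concat( ) with zero arguments, and BigQuery requires at least one array argument, which is exactly what the error below reports at position [17:3]. A minimal reproduction, plus one possible shape of a fix (the fix is an illustration of the idea, not what any particular connector version implements):

    -- Reproduces the failure below:
    SELECT ARRAY_CONCAT() AS _airbyte_cast_errors;
    -- Error: Number of arguments does not match for function ARRAY_CONCAT.

    -- One way out: with no per-column checks, emit an empty array directly
    -- instead of concatenating zero arrays.
    SELECT CAST([] AS ARRAY<STRING>) AS _airbyte_cast_errors;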
2023-08-04 15:58:48 destination > ERROR i.a.i.d.b.BufferedStreamConsumer(close):318 Close failed. com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:338) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:335) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitFor(Job.java:244) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) ~[io.airbyte.airbyte-integrations.bases-base-typing-deduping-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_a5WOwfGWnnU9Xr9opaFDZlmpk7Uy?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) ~[google-http-client-1.43.1.jar:1.43.1] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] ... 25 more
2023-08-04 15:58:48 destination > ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:338) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:335) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitFor(Job.java:244) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at
io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) ~[io.airbyte.airbyte-integrations.bases-base-typing-deduping-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_a5WOwfGWnnU9Xr9opaFDZlmpk7Uy?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) ~[google-http-client-1.43.1.jar:1.43.1] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] ... 25 more
2023-08-04 15:58:48 destination > Destination process done (exit code 1)
2023-08-04 15:58:48 destination > Skipping in-connector normalization
2023-08-04 15:58:41 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromDestination):219 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@4671c204[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@511e92d6[type=LEGACY,stream=,global=,data={"cdc":false,"streams":[{"stream_name":"bookingprocessingdatas","stream_namespace":"pickup-points","cursor_field":[]},{"stream_name":"bookings","stream_namespace":"pickup-points","cursor_field":[]},{"stream_name":"pickuppoints","stream_namespace":"pickup-points","cursor_field":["updatedAt"],"cursor":"2023-08-04T05:37:21.594Z","cursor_record_count":100}]},additionalProperties={}],trace=,control=,additionalProperties={}]
2023-08-04 15:58:41 INFO i.a.w.i.s.SyncPersistenceImpl(startBackgroundFlushStateTask):180 - starting state flush thread for connectionId f83cbbcb-5223-42e7-a084-55e54599e99a
2023-08-04 15:58:48 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromDestination):219 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@2d01a533[type=TRACE,log=,spec=,connectionStatus=,catalog=,record=,state=,trace=io.airbyte.protocol.models.AirbyteTraceMessage@15bfe13f[type=ERROR,emittedAt=1.691164728724E12,error=io.airbyte.protocol.models.AirbyteErrorTraceMessage@1486f95[message=Something went wrong in the connector. See the logs for more details.,internalMessage=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3],stackTrace=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT.
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) at com.google.cloud.bigquery.Job$1.call(Job.java:338) at com.google.cloud.bigquery.Job$1.call(Job.java:335) at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) at com.google.cloud.bigquery.Job.waitFor(Job.java:244) at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_a5WOwfGWnnU9Xr9opaFDZlmpk7Uy?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ... 25 more ,failureType=system_error,streamDescriptor=,additionalProperties={}],estimate=,streamStatus=,additionalProperties={}],control=,additionalProperties={}]
2023-08-04 15:58:49 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / destination-bigquery-write-129-0-jguvr) - Closed all resources for pod
2023-08-04 15:58:49 ERROR i.a.w.g.DefaultReplicationWorker(replicate):211 - Sync worker failed. java.util.concurrent.ExecutionException: io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1 at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?] at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?] at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:201) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?] at java.lang.Thread.run(Thread.java:1589) ~[?:?] Suppressed: io.airbyte.workers.exception.WorkerException: Destination process exit with code 1. This warning is normal if the job was cancelled. at io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:138) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:159) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?] at java.lang.Thread.run(Thread.java:1589) ~[?:?]
Caused by: io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    ... 1 more
2023-08-04 15:58:49 INFO i.a.w.g.ReplicationWorkerHelper(getReplicationOutput):294 - sync summary: {
  "status" : "failed",
  "recordsSynced" : 139833,
  "bytesSynced" : 168450477,
  "startTime" : 1691164596985,
  "endTime" : 1691164729822,
  "totalStats" : {
    "bytesCommitted" : 168450477,
    "bytesEmitted" : 168450477,
    "destinationStateMessagesEmitted" : 1,
    "destinationWriteEndTime" : 0,
    "destinationWriteStartTime" : 1691164596985,
    "meanSecondsBeforeSourceStateMessageEmitted" : 47,
    "maxSecondsBeforeSourceStateMessageEmitted" : 47,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 32,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 32,
    "recordsEmitted" : 139833,
    "recordsCommitted" : 139833,
    "replicationEndTime" : 0,
    "replicationStartTime" : 1691164596985,
    "sourceReadEndTime" : 1691164689886,
    "sourceReadStartTime" : 1691164629373,
    "sourceStateMessagesEmitted" : 1
  },
  "streamStats" : [ {
    "streamName" : "pickuppoints_pickuppoints",
    "stats" : {
      "bytesCommitted" : 168450477,
      "bytesEmitted" : 168450477,
      "recordsEmitted" : 139833,
      "recordsCommitted" : 139833
    }
  } ]
}
2023-08-04 15:58:49 INFO i.a.w.g.ReplicationWorkerHelper(getReplicationOutput):295 - failures: [ {
  "failureOrigin" : "destination",
  "failureType" : "system_error",
  "internalMessage" : "com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 129,
    "from_trace_message" : true,
    "connector_command" : "write"
  },
  "stacktrace" : "com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT.
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)\n\tat com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)\n\tat com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)\n\tat com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)\n\tat com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)\n\tat com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)\n\tat com.google.cloud.bigquery.Job$1.call(Job.java:338)\n\tat com.google.cloud.bigquery.Job$1.call(Job.java:335)\n\tat com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)\n\tat com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)\n\tat com.google.cloud.bigquery.Job.waitFor(Job.java:244)\n\tat io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)\n\tat io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)\n\tat io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)\n\tat io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)\n\tat io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)\n\tat io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)\n\tat io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)\n\tat io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)\nCaused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request\nGET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_a5WOwfGWnnU9Xr9opaFDZlmpk7Uy?location=EU&maxResults=0&prettyPrint=false\n{\n \"code\": 400,\n \"errors\": [\n {\n \"domain\": \"global\",\n \"location\": \"q\",\n \"locationType\": \"parameter\",\n \"message\": \"Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\",\n \"reason\": \"invalidQuery\"\n }\n ],\n \"message\": \"Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\",\n \"status\": \"INVALID_ARGUMENT\"\n}\n\tat com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)\n\tat com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)\n\tat com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)\n\tat com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)\n\t... 25 more\n",
  "timestamp" : 1691164728724
}, {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process exited with non-zero exit code 1",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 129,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1589)\n",
  "timestamp" : 1691164729716
} ]
2023-08-04 15:58:49 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
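
Note on the failure above: every record was emitted and committed (139,833 per the summary), and the run only dies inside DefaultTyperDeduper.typeAndDedupe, i.e. in the typing/deduping SQL that destination-bigquery 1.7.2 executes when the connection closes. BigQuery rejects that generated query because, per the logged signature ARRAY_CONCAT(ARRAY, [ARRAY, ...]), ARRAY_CONCAT needs at least one ARRAY argument. The catalog in this job declares pickuppoints_bookings and pickuppoints_bookingprocessingdatas with empty JSON schemas ({"type":"object","properties":{}}), which would be consistent with the generator expanding a column list into a zero-argument ARRAY_CONCAT; the log never prints the query text, so treat that as an inference. A minimal sketch that reproduces the same class of rejection with the google-cloud-bigquery client (assumes application-default credentials and a billable project; the class name and queries are illustrative, not the connector's actual T+D SQL):

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;

public class ArrayConcatArityRepro {
  public static void main(String[] args) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    // Zero arguments: does not match ARRAY_CONCAT(ARRAY, [ARRAY, ...]),
    // so BigQuery rejects the query as invalidQuery / INVALID_ARGUMENT.
    try {
      bigquery.query(QueryJobConfiguration.newBuilder("SELECT ARRAY_CONCAT()").build());
    } catch (BigQueryException e) {
      System.out.println("rejected as expected: " + e.getMessage());
    }
    // One or more ARRAY arguments satisfies the signature.
    bigquery.query(QueryJobConfiguration.newBuilder(
            "SELECT ARRAY_CONCAT([1, 2], [3]) AS merged").build())
        .iterateAll()
        .forEach(row -> System.out.println(row.get("merged").getRepeatedValue()));
  }
}

If the empty-schema inference is right, refreshing the source schema so those streams carry columns, or moving to a destination-bigquery release with fixed typing/deduping SQL generation, are the plausible remedies; the log itself only shows the retry being scheduled.
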
2023-08-04 15:58:49 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):160 - sync summary: io.airbyte.config.StandardSyncOutput@5330ffa2[standardSyncSummary=io.airbyte.config.StandardSyncSummary@356c4f9d[status=failed,recordsSynced=139833,bytesSynced=168450477,startTime=1691164596985,endTime=1691164729822,totalStats=io.airbyte.config.SyncStats@676859a3[bytesCommitted=168450477,bytesEmitted=168450477,destinationStateMessagesEmitted=1,destinationWriteEndTime=0,destinationWriteStartTime=1691164596985,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=47,maxSecondsBeforeSourceStateMessageEmitted=47,maxSecondsBetweenStateMessageEmittedandCommitted=32,meanSecondsBetweenStateMessageEmittedandCommitted=32,recordsEmitted=139833,recordsCommitted=139833,replicationEndTime=0,replicationStartTime=1691164596985,sourceReadEndTime=1691164689886,sourceReadStartTime=1691164629373,sourceStateMessagesEmitted=1,additionalProperties={}],streamStats=[io.airbyte.config.StreamSyncStats@1a277846[streamName=pickuppoints_pickuppoints,streamNamespace=,stats=io.airbyte.config.SyncStats@4ebe494f[bytesCommitted=168450477,bytesEmitted=168450477,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=139833,recordsCommitted=139833,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}],additionalProperties={}]],performanceMetrics=,additionalProperties={}],normalizationSummary=,webhookOperationSummary=,state=,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@19f9e7a2[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@3c79107f[stream=io.airbyte.protocol.models.AirbyteStream@1d3daafe[name=pickuppoints_bookings,jsonSchema={"type":"object","properties":{}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@21f56c27[stream=io.airbyte.protocol.models.AirbyteStream@77675b4c[name=pickuppoints_bookingprocessingdatas,jsonSchema={"type":"object","properties":{}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@3e94737[stream=io.airbyte.protocol.models.AirbyteStream@737377ee[name=pickuppoints_pickuppoints,jsonSchema={"type":"object","properties":{"address":{"type":"object","properties":{"country":{"type":"string"},"city":{"type":"string"},"postalCode":{"type":"string"},"addressLine1":{"type":"string"},"addressLine2":{"type":"string"},"title":{"type":"string"}}},"packageMaxWeight":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"packageMaxCombinedLength":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"legacyCategory":{"type":"string"},"maxCombined":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"maxWeight":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"type":{"type":"string"},"closingDates":{"type":"array"},"createdAt":{"type":"string"},"carrier":{"type":"object","properties":{"code":{"type":"string"}}},"finalClosingDate":{"type":"string"},"packageMaxDimension":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"outdatedAt":{"type":"string"},"location":{"type":"object","properties":{"coordinates":{"type":"array"},"type":{"type":"string"}}},"openingHours":{"type":"object","properties":{"sunday":{"type":"array"},"saturday":{"type":"array"},"tuesday":{"type":"array"},"friday":{"type":"array"},"thursday":{"type":"array"},"wednesday":{"type":"array"},"monday":{"type":"array"}}},"maxPackageQuantity":{"type":"number"},"id":{"type":"string"},"_id":{"type":"string"},"category":{"type":"string"},"openingDate":{"type":"string"},"maxPackagesQuantity":{"type":"number"},"updatedAt":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[updatedAt],destinationSyncMode=append,primaryKey=[],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@796da9f3[failureOrigin=destination,failureType=system_error,internalMessage=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3],externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@349d20a9[additionalProperties={attemptNumber=0, jobId=129, from_trace_message=true, connector_command=write}],stacktrace=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) at com.google.cloud.bigquery.Job$1.call(Job.java:338) at com.google.cloud.bigquery.Job$1.call(Job.java:335) at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) at com.google.cloud.bigquery.Job.waitFor(Job.java:244) at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_a5WOwfGWnnU9Xr9opaFDZlmpk7Uy?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ... 25 more ,retryable=,timestamp=1691164728724,additionalProperties={}], io.airbyte.config.FailureReason@7d0409f2[failureOrigin=destination,failureType=,internalMessage=Destination process exited with non-zero exit code 1,externalMessage=Something went wrong within the destination connector,metadata=io.airbyte.config.Metadata@15a0cdbb[additionalProperties={attemptNumber=0, jobId=129, connector_command=write}],stacktrace=io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1 at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) at java.base/java.lang.Thread.run(Thread.java:1589) ,retryable=,timestamp=1691164729716,additionalProperties={}]],additionalProperties={}] 2023-08-04 15:58:49 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):165 - Sync summary length: 11053 2023-08-04 15:58:49 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):307 - Stopping temporal heartbeating... 2023-08-04 15:58:49 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:58:49 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END REPLICATION ----- 2023-08-04 15:58:49 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:27 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to Get a connection by connection Id 2023-08-04 15:59:27 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the most recent source actor catalog 2023-08-04 15:59:27 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to Retrieve Id of the workspace for the source 2023-08-04 15:59:28 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to Get a connection by connection Id >> ATTEMPT 2/3 2023-08-04 15:58:53 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/1/logs.log 2023-08-04 15:58:53 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. 
Airbyte version: 0.50.6 2023-08-04 15:58:53 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation 2023-08-04 15:59:40 destination > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json} 2023-08-04 15:59:40 source > INFO i.a.i.s.m.MongoDbSource(main):55 starting source: class io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 15:59:40 destination > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2023-08-04 15:59:40 destination > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: WRITE 2023-08-04 15:59:40 destination > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'} 2023-08-04 15:59:40 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 15:59:40 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 15:59:40 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):413 Selected loading method is set to: GCS 2023-08-04 15:59:40 destination > INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 S3 format config: {"format_type":"AVRO","flattening":"No flattening"} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):429 All tmp files will be removed from GCS when replication is finished 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):396 Creating BigQuery staging message consumer with staging ID 749440fb-d10c-4208-a4f4-64d5bb29eaf3 at 2023-08-04T15:59:34.650Z 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_bookings, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_lms_pickuppoints_bookings}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, 
precision=null, defaultValueExpression=null, collation=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_bookingprocessingdatas, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_zgw_pickuppoints_bookingprocessingdatas}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_pickuppoints, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_tjp_pickuppoints_pickuppoints}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=append, stagedFiles=[]] 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):173 class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started. 
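
Each BigQueryWriteConfig above targets the same four-column raw layout: _airbyte_raw_id STRING, _airbyte_extracted_at TIMESTAMP, _airbyte_loaded_at TIMESTAMP, and _airbyte_data JSON, written into the airbyte_internal dataset as eu_poow_ds_raw__stream_* tables. For orientation, here is roughly what those tableSchema dumps correspond to in client code (a sketch, not the connector's implementation; the table name is illustrative, the partitioning column is an assumption since the log confirms partitioning but not the field, and StandardSQLTypeName.JSON requires a reasonably recent google-cloud-bigquery):

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardSQLTypeName;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;
import com.google.cloud.bigquery.TimePartitioning;

public class RawTableSketch {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    // The four-column raw layout shown in each BigQueryWriteConfig above.
    Schema rawSchema = Schema.of(
        Field.of("_airbyte_raw_id", StandardSQLTypeName.STRING),
        Field.of("_airbyte_extracted_at", StandardSQLTypeName.TIMESTAMP),
        Field.of("_airbyte_loaded_at", StandardSQLTypeName.TIMESTAMP),
        Field.of("_airbyte_data", StandardSQLTypeName.JSON));
    StandardTableDefinition def = StandardTableDefinition.newBuilder()
        .setSchema(rawSchema)
        // The "Partitioned table" log lines confirm partitioning but not the
        // column, so this day-partitioning field choice is an assumption.
        .setTimePartitioning(TimePartitioning.newBuilder(TimePartitioning.Type.DAY)
            .setField("_airbyte_extracted_at").build())
        .build();
    // Illustrative table name; throws BigQueryException if it already exists.
    bigquery.create(TableInfo.of(TableId.of("airbyte_internal", "example_raw_table"), def));
    System.out.println("created example raw table");
  }
}

The JSON column is what lets these raw tables absorb any source schema; typed columns only materialize later, in the typing/deduping step that failed in attempt 1.
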
2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):156 Preparing airbyte_raw tables in destination started for 3 streams 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_pickuppoints, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, stage: pickuppoints_pickuppoints 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 Creating dataset airbyte_internal 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 Creating dataset airbyte 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table ALREADY EXISTS: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_pickuppoints (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_pickuppoints/2023/08/04/15/749440fb-d10c-4208-a4f4-64d5bb29eaf3/ 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_bookings, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}, stage: pickuppoints_bookings 2023-08-04 15:59:40 destination > INFO 
i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table ALREADY EXISTS: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_bookings (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_bookings/2023/08/04/15/749440fb-d10c-4208-a4f4-64d5bb29eaf3/ 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(truncateTableIfExists):207 Truncating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} (dataset airbyte) 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):175 Deleting target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} (dataset airbyte) 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_bookingprocessingdatas, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}, stage: pickuppoints_bookingprocessingdatas 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table ALREADY EXISTS: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, 
tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_bookingprocessingdatas (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_bookingprocessingdatas/2023/08/04/15/749440fb-d10c-4208-a4f4-64d5bb29eaf3/ 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(truncateTableIfExists):207 Truncating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} (dataset airbyte) 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):175 Deleting target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} (dataset airbyte) 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} 2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0' 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0' 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1' 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1' 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: '' 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: '' 2023-08-04 15:58:54 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: '' 2023-08-04 15:58:54 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = source-mongodb-v2-check-129-1-doagd for airbyte/source-mongodb-v2:0.2.5 with resources io.airbyte.config.ResourceRequirements@35611181[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null 2023-08-04 15:58:54 INFO i.a.w.p.KubeProcessFactory(create):111 - source-mongodb-v2-check-129-1-doagd stdoutLocalPort = 9016 2023-08-04 15:58:54 INFO i.a.w.p.KubeProcessFactory(create):114 - source-mongodb-v2-check-129-1-doagd stderrLocalPort = 9017 2023-08-04 15:58:54 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:58:54 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server... 2023-08-04 15:58:54 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START CHECK ----- 2023-08-04 15:58:54 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:58:54 INFO i.a.w.p.KubePodProcess():584 - Creating pod source-mongodb-v2-check-129-1-doagd... 
2023-08-04 15:58:54 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server... 2023-08-04 15:58:54 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files... 2023-08-04 15:58:54 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present.. 2023-08-04 15:58:55 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready.. 2023-08-04 15:58:55 INFO i.a.w.p.KubePodProcess():615 - Copying files... 2023-08-04 15:58:55 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json 2023-08-04 15:58:55 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/2454ba12-06ea-479d-82d2-5f1704fab8c6/source_config.json data/source-mongodb-v2-check-129-1-doagd:/config/source_config.json -c init 2023-08-04 15:58:55 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 15:58:56 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 15:58:56 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING 2023-08-04 15:58:56 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/e512aecd-a7d1-40b2-b94f-58e82788f22c/FINISHED_UPLOADING data/source-mongodb-v2-check-129-1-doagd:/config/FINISHED_UPLOADING -c init 2023-08-04 15:58:56 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 15:58:56 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 15:58:56 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready... 2023-08-04 15:58:58 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout... 2023-08-04 15:58:58 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr... 2023-08-04 15:58:58 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP... 2023-08-04 15:58:58 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.42.8 2023-08-04 15:58:58 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream... 
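
The file staging above is plain kubectl under the hood: the worker copies source_config.json into the pod's init container, then uploads a FINISHED_UPLOADING marker, waiting for each kubectl cp to exit before proceeding. A sketch of that pattern (hypothetical local path; assumes kubectl is on the PATH and pointed at the right cluster):

import java.io.IOException;
import java.nio.file.Path;

public class KubeCpSketch {
  // Copies a local file into a pod's init container, mirroring the
  // "kubectl cp ... -c init" invocations in the log above.
  static void copyToPod(Path localFile, String namespace, String pod, String destPath)
      throws IOException, InterruptedException {
    Process p = new ProcessBuilder(
            "kubectl", "cp", localFile.toString(),
            namespace + "/" + pod + ":" + destPath, "-c", "init")
        .inheritIO()
        .start();
    int exit = p.waitFor(); // the worker likewise waits for kubectl cp to complete
    if (exit != 0) {
      throw new IOException("kubectl cp exited with code " + exit);
    }
  }

  public static void main(String[] args) throws Exception {
    // Hypothetical local path; pod and namespace taken from the log above.
    copyToPod(Path.of("/tmp/source_config.json"), "data",
        "source-mongodb-v2-check-129-1-doagd", "/config/source_config.json");
  }
}
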
2023-08-04 15:58:58 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0 2023-08-04 15:58:59 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.s.m.MongoDbSource(main):55 starting source: class io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 15:58:59 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json} 2023-08-04 15:58:59 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 15:58:59 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK 2023-08-04 15:58:59 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2023-08-04 15:59:00 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 15:59:00 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 15:59:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Cluster created with settings {hosts=[127.0.0.1:27017], srvHost=pickup-points.bwbt2.mongodb.net, mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', requiredReplicaSetName='atlas-plh3lm-shard-0'} 2023-08-04 15:59:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-00.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 15:59:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Cluster description not yet available. Waiting for 30000 ms before timing out 2023-08-04 15:59:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-01.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 15:59:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-02.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 15:59:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 No server chosen by com.mongodb.client.internal.MongoClientDelegate$1@2bc12da from cluster description ClusterDescription{type=REPLICA_SET, connectionMode=MULTIPLE, serverDescriptions=[ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}]}. 
Waiting for 30000 ms before timing out 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:3, serverValue:269589}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:2, serverValue:107966}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:4, serverValue:269590}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:1, serverValue:107967}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:6, serverValue:107259}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:5, serverValue:107260}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=511692314, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=7fffffff000000000000000a, setVersion=1, topologyVersion=TopologyVersion{processId=64b583ecebfb924922e6cfc1, counter=6}, lastWriteDate=Fri Aug 04 15:58:55 UTC 2023, lastUpdateTimeNanos=552928273930} 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=521204292, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, 
Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b584404f6809297d3ea234, counter=3}, lastWriteDate=Fri Aug 04 15:58:55 UTC 2023, lastUpdateTimeNanos=552936812966} 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Setting max election id to 7fffffff000000000000000a from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Setting max set version to 1 from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=527243913, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b5839658963dafcb22081c, counter=4}, lastWriteDate=Fri Aug 04 15:58:55 UTC 2023, lastUpdateTimeNanos=552942915514} 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Discovered replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:7, serverValue:269591}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.s.m.MongoDbSource(lambda$getCheckOperations$0):89 The source passed the basic operation test! 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 15:59:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.s.m.MongoDbSource(main):57 completed source: class io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 15:59:02 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / source-mongodb-v2-check-129-1-doagd) - Closed all resources for pod 2023-08-04 15:59:02 INFO i.a.w.g.DefaultCheckConnectionWorker(run):117 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@79b4f370[status=succeeded,message=,additionalProperties={}] 2023-08-04 15:59:02 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling... 
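
The source CHECK above resolves the Atlas SRV record, discovers all three replica-set members, identifies pickup-points-shard-00-01 as primary, and then reports "The source passed the basic operation test!". A rough equivalent of such a connectivity probe with the MongoDB Java driver is below (placeholder credentials; the ping command is an assumption, not necessarily the exact operation MongoDbSource runs):

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import org.bson.Document;

public class MongoCheckSketch {
  public static void main(String[] args) {
    // Placeholder user/password; the log shows srvHost=pickup-points.bwbt2.mongodb.net.
    String uri = "mongodb+srv://user:password@pickup-points.bwbt2.mongodb.net/";
    try (MongoClient client = MongoClients.create(uri)) {
      // A ping round-trips to the server, exercising the same SRV resolution,
      // replica-set discovery, and authentication seen in the log above.
      Document reply = client.getDatabase("pickup-points").runCommand(new Document("ping", 1));
      System.out.println("basic operation test passed: " + reply.toJson());
    }
  }
}
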
2023-08-04 15:59:02 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:02 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK ----- 2023-08-04 15:59:02 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:02 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/1/logs.log 2023-08-04 15:59:02 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.50.6 2023-08-04 15:59:02 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation 2023-08-04 15:59:02 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:02 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0' 2023-08-04 15:59:02 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START CHECK ----- 2023-08-04 15:59:02 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0' 2023-08-04 15:59:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1' 2023-08-04 15:59:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1' 2023-08-04 15:59:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: '' 2023-08-04 15:59:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: '' 2023-08-04 15:59:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: '' 2023-08-04 15:59:03 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = destination-bigquery-check-129-1-jqxza for airbyte/destination-bigquery:1.7.2 with resources io.airbyte.config.ResourceRequirements@4f89fc08[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null 2023-08-04 15:59:03 INFO i.a.w.p.KubeProcessFactory(create):111 - destination-bigquery-check-129-1-jqxza stdoutLocalPort = 9018 2023-08-04 15:59:03 INFO i.a.w.p.KubeProcessFactory(create):114 - destination-bigquery-check-129-1-jqxza stderrLocalPort = 9019 2023-08-04 15:59:03 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server... 2023-08-04 15:59:03 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server... 2023-08-04 15:59:03 INFO i.a.w.p.KubePodProcess():584 - Creating pod destination-bigquery-check-129-1-jqxza... 2023-08-04 15:59:03 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files... 2023-08-04 15:59:03 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present.. 2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready.. 2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess():615 - Copying files... 
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/b98baf43-ea11-4ce1-aeb3-cff75ba8e57b/source_config.json data/destination-bigquery-check-129-1-jqxza:/config/source_config.json -c init
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/93d67057-e1a1-451c-9879-c560d6332edd/FINISHED_UPLOADING data/destination-bigquery-check-129-1-jqxza:/config/FINISHED_UPLOADING -c init
2023-08-04 15:59:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:06 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:06 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready...
2023-08-04 15:59:06 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
2023-08-04 15:59:06 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
2023-08-04 15:59:07 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP...
2023-08-04 15:59:07 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.0.14
2023-08-04 15:59:07 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream...
2023-08-04 15:59:07 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2023-08-04 15:59:15 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
2023-08-04 15:59:15 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2023-08-04 15:59:15 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK
2023-08-04 15:59:15 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2023-08-04 15:59:16 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:59:16 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:59:16 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:59:17 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):413 Selected loading method is set to: GCS
2023-08-04 15:59:23 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 S3 format config: {"format_type":"CSV","flattening":"No flattening"}
2023-08-04 15:59:23 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testSingleUpload):40 Started testing if all required credentials assigned to user for single file uploading
2023-08-04 15:59:24 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testSingleUpload):48 Finished checking for normal upload mode
2023-08-04 15:59:24 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testMultipartUpload):52 Started testing if all required credentials assigned to user for multipart upload
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 Initiated multipart upload to poow-data-staging/data_sync/ultifile/test_1691164764916 with full ID ABPnzm7guhicKC-FPJIF7O7wWeSbu5lElRHzeunPwcAaUp_sWd45VdmGhYljspZoiPoQvLDx
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000]
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000]
2023-08-04 15:59:25 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN a.m.s.MultiPartOutputStream(close):160 [MultipartOutputStream for parts 1 - 10000] is already closed
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(complete):367 [Manager uploading to poow-data-staging/data_sync/ultifile/test_1691164764916 with id ABPnzm7gu...oiPoQvLDx]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/test_1691164764916 with id ABPnzm7gu...oiPoQvLDx]: Finished uploading [Part number 1 containing 3.34 MB]
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(complete):395 [Manager uploading to poow-data-staging/data_sync/ultifile/test_1691164764916 with id ABPnzm7gu...oiPoQvLDx]: Completed
2023-08-04 15:59:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testMultipartUpload):74 Finished verification for multipart upload mode
2023-08-04 15:59:26 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2023-08-04 15:59:26 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - Destination process done (exit code 0)
2023-08-04 15:59:27 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / destination-bigquery-check-129-1-jqxza) - Closed all resources for pod
2023-08-04 15:59:27 INFO i.a.w.g.DefaultCheckConnectionWorker(run):117 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@5073fe21[status=succeeded,message=,additionalProperties={}]
2023-08-04 15:59:27 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
2023-08-04 15:59:27 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 15:59:27 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK -----
2023-08-04 15:59:27 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 15:59:27 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get state
2023-08-04 15:59:27 INFO i.a.w.h.NormalizationInDestinationHelper(shouldNormalizeInDestination):52 - Requires Normalization: false, Normalization Supported: false, Feature Flag Enabled: false
2023-08-04 15:59:27 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to set attempt sync config
2023-08-04 15:59:27 INFO i.a.c.t.s.DefaultTaskQueueMapper(getTaskQueue):31 - Called DefaultTaskQueueMapper getTaskQueue for geography auto
2023-08-04 15:59:28 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/1/logs.log
2023-08-04 15:59:28 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.50.6
2023-08-04 15:59:28 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation
2023-08-04 15:59:28 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the source definition for feature flag checks
2023-08-04 15:59:28 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the source definition
2023-08-04 15:59:28 INFO i.a.w.g.ReplicationWorkerFactory(maybeEnableConcurrentStreamReads):166 - Concurrent stream read enabled? false
2023-08-04 15:59:28 INFO i.a.w.g.ReplicationWorkerFactory(create):127 - Setting up source...
2023-08-04 15:59:28 INFO i.a.w.g.ReplicationWorkerFactory(create):134 - Setting up destination...
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable METRIC_CLIENT: ''
2023-08-04 15:59:28 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to
2023-08-04 15:59:28 INFO i.a.w.g.ReplicationWorkerFactory(create):146 - Setting up replication worker...
2023-08-04 15:59:28 INFO i.a.w.g.DefaultReplicationWorker(run):124 - start sync worker. job id: 129 attempt id: 1
2023-08-04 15:59:28 INFO i.a.w.g.DefaultReplicationWorker(run):129 - configured sync modes: {pickup-points.pickuppoints=incremental - append, pickup-points.bookings=full_refresh - overwrite, pickup-points.bookingprocessingdatas=full_refresh - overwrite}
2023-08-04 15:59:28 INFO i.a.w.i.DefaultAirbyteDestination(start):88 - Running destination...
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: ''
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: ''
2023-08-04 15:59:28 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
2023-08-04 15:59:28 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = destination-bigquery-write-129-1-iolfo for airbyte/destination-bigquery:1.7.2 with resources io.airbyte.config.ResourceRequirements@4cdc48a4[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2023-08-04 15:59:28 INFO i.a.w.p.KubeProcessFactory(create):111 - destination-bigquery-write-129-1-iolfo stdoutLocalPort = 9020
2023-08-04 15:59:28 INFO i.a.w.p.KubeProcessFactory(create):114 - destination-bigquery-write-129-1-iolfo stderrLocalPort = 9021
2023-08-04 15:59:28 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 15:59:28 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START REPLICATION -----
2023-08-04 15:59:28 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 15:59:28 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server...
2023-08-04 15:59:28 INFO i.a.w.p.KubePodProcess():584 - Creating pod destination-bigquery-write-129-1-iolfo...
2023-08-04 15:59:28 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server...
2023-08-04 15:59:28 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files...
2023-08-04 15:59:28 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present..
2023-08-04 15:59:29 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready..
2023-08-04 15:59:29 INFO i.a.w.p.KubePodProcess():615 - Copying files...
2023-08-04 15:59:29 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: destination_config.json
2023-08-04 15:59:29 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/7644e1c6-032d-4dd9-bc8d-2fc33966afbe/destination_config.json data/destination-bigquery-write-129-1-iolfo:/config/destination_config.json -c init
2023-08-04 15:59:29 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: destination_catalog.json
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/e1fb39d3-a19d-4b7e-aa64-d32d40cc4ac8/destination_catalog.json data/destination-bigquery-write-129-1-iolfo:/config/destination_catalog.json -c init
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/209740c4-e77f-4399-9e34-0f872f424279/FINISHED_UPLOADING data/destination-bigquery-write-129-1-iolfo:/config/FINISHED_UPLOADING -c init
2023-08-04 15:59:30 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:31 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:31 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready...
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP...
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.42.9
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess():639 - Creating stdin socket...
2023-08-04 15:59:32 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):41 - Writing messages to protocol version 0.2.0
2023-08-04 15:59:32 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: ''
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: ''
2023-08-04 15:59:32 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
2023-08-04 15:59:32 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = source-mongodb-v2-read-129-1-tzwws for airbyte/source-mongodb-v2:0.2.5 with resources io.airbyte.config.ResourceRequirements@33650f65[cpuRequest=0.5,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2023-08-04 15:59:32 INFO i.a.w.p.KubeProcessFactory(create):111 - source-mongodb-v2-read-129-1-tzwws stdoutLocalPort = 9022
2023-08-04 15:59:32 INFO i.a.w.p.KubeProcessFactory(create):114 - source-mongodb-v2-read-129-1-tzwws stderrLocalPort = 9023
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server...
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server...
2023-08-04 15:59:32 INFO i.a.w.p.KubePodProcess():584 - Creating pod source-mongodb-v2-read-129-1-tzwws...
2023-08-04 15:59:33 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files...
2023-08-04 15:59:33 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present..
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready..
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess():615 - Copying files...
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: input_state.json
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/8fab90b6-0fc7-410f-9daf-9d7e68b3a8cd/input_state.json data/source-mongodb-v2-read-129-1-tzwws:/config/input_state.json -c init
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/b42c5501-8d5b-40f2-b2f1-b84afa44bacf/source_config.json data/source-mongodb-v2-read-129-1-tzwws:/config/source_config.json -c init
2023-08-04 15:59:35 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_catalog.json
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/9ee20e92-5a74-4c21-bfd7-2623d5e567e8/source_catalog.json data/source-mongodb-v2-read-129-1-tzwws:/config/source_catalog.json -c init
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/1d30132c-1c40-424a-92d8-3f5288be642a/FINISHED_UPLOADING data/source-mongodb-v2-read-129-1-tzwws:/config/FINISHED_UPLOADING -c init
2023-08-04 15:59:36 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 15:59:37 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 15:59:37 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready...
2023-08-04 15:59:38 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
2023-08-04 15:59:38 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
2023-08-04 15:59:40 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP...
2023-08-04 15:59:40 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.24.120
2023-08-04 15:59:40 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream...
2023-08-04 15:59:40 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2023-08-04 15:59:40 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):224 - Destination output thread started.
2023-08-04 15:59:40 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):94 - Starting source heartbeat check. Will check every 1 minutes.
2023-08-04 15:59:40 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):268 - Replication thread started.
2023-08-04 15:59:46 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):111 - thread status... heartbeat thread: false , replication thread: true
2023-08-04 15:59:46 INFO i.a.w.g.DefaultReplicationWorker(replicate):195 - Waiting for source and destination threads to complete.
2023-08-04 15:59:46 INFO i.a.w.g.DefaultReplicationWorker(replicate):200 - One of source or destination thread complete. Waiting on the other.
2023-08-04 15:59:46 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromDestination):219 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@1566177e[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@a061887[type=LEGACY,stream=,global=,data={"cdc":false,"streams":[{"stream_name":"bookingprocessingdatas","stream_namespace":"pickup-points","cursor_field":[]},{"stream_name":"bookings","stream_namespace":"pickup-points","cursor_field":[]},{"stream_name":"pickuppoints","stream_namespace":"pickup-points","cursor_field":["updatedAt"],"cursor":"2023-08-04T05:37:21.594Z","cursor_record_count":100}]},additionalProperties={}],trace=,control=,additionalProperties={}]
2023-08-04 15:59:46 INFO i.a.w.i.s.SyncPersistenceImpl(startBackgroundFlushStateTask):180 - starting state flush thread for connectionId f83cbbcb-5223-42e7-a084-55e54599e99a
2023-08-04 15:59:53 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromDestination):219 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@777806fc[type=TRACE,log=,spec=,connectionStatus=,catalog=,record=,state=,trace=io.airbyte.protocol.models.AirbyteTraceMessage@219668f8[type=ERROR,emittedAt=1.69116479387E12,error=io.airbyte.protocol.models.AirbyteErrorTraceMessage@33e1fa26[message=Something went wrong in the connector. See the logs for more details.,internalMessage=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3],stackTrace=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)
    at com.google.cloud.bigquery.Job$1.call(Job.java:338)
    at com.google.cloud.bigquery.Job$1.call(Job.java:335)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)
    ... 25 more
,failureType=system_error,streamDescriptor=,additionalProperties={}],estimate=,streamStatus=,additionalProperties={}],control=,additionalProperties={}]
2023-08-04 15:59:40 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(prepareFinalTables):60 Preparing final tables
2023-08-04 15:59:40 destination > INFO i.a.i.d.b.t.BigQuerySqlGenerator(existingSchemaMatchesStreamConfig):245 Alter Table Report [] [] []; Clustering true; Partitioning true
2023-08-04 15:59:40 destination > INFO i.a.i.d.b.t.BigQuerySqlGenerator(existingSchemaMatchesStreamConfig):245 Alter Table Report [] [] []; Clustering true; Partitioning true
2023-08-04 15:59:40 destination > INFO i.a.i.d.b.t.BigQuerySqlGenerator(existingSchemaMatchesStreamConfig):245 Alter Table Report [] [] []; Clustering true; Partitioning true
2023-08-04 15:59:40 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):178 Preparing tables in destination completed.
2023-08-04 15:59:40 source > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {read=null, catalog=source_catalog.json, state=input_state.json, config=source_config.json}
2023-08-04 15:59:40 source > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.mongodb.MongoDbSource
2023-08-04 15:59:40 source > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: READ
2023-08-04 15:59:40 source > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='input_state.json'}
2023-08-04 15:59:40 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:59:40 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-08-04 15:59:40 source > INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):48 Legacy state manager selected to manage state object with type LEGACY.
2023-08-04 15:59:40 source > INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):178 Found matching cursor in state. Stream: pickup-points_pickuppoints. Cursor Field: updatedAt Value: 2023-08-04T05:37:21.594Z Count: 100
2023-08-04 15:59:40 source > INFO i.a.i.s.r.CdcStateManager():31 Initialized CDC state with: null
2023-08-04 15:59:41 source > INFO c.m.d.l.SLF4JLogger(info):71 Cluster created with settings {hosts=[127.0.0.1:27017], srvHost=pickup-points.bwbt2.mongodb.net, mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', requiredReplicaSetName='atlas-plh3lm-shard-0'}
2023-08-04 15:59:41 source > INFO c.m.d.l.SLF4JLogger(info):71 Cluster description not yet available. Waiting for 30000 ms before timing out
2023-08-04 15:59:41 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-00.bwbt2.mongodb.net:27017 to client view of cluster
2023-08-04 15:59:41 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-01.bwbt2.mongodb.net:27017 to client view of cluster
2023-08-04 15:59:41 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-02.bwbt2.mongodb.net:27017 to client view of cluster
2023-08-04 15:59:41 source > INFO c.m.d.l.SLF4JLogger(info):71 No server chosen by com.mongodb.client.internal.MongoClientDelegate$1@773bd77b from cluster description ClusterDescription{type=REPLICA_SET, connectionMode=MULTIPLE, serverDescriptions=[ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}]}. Waiting for 30000 ms before timing out
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:4, serverValue:107264}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:3, serverValue:269599}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:2, serverValue:107972}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:1, serverValue:107971}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:6, serverValue:107265}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:5, serverValue:269600}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=438717121, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b584404f6809297d3ea234, counter=3}, lastWriteDate=Fri Aug 04 15:59:35 UTC 2023, lastUpdateTimeNanos=5440140927087831}
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=439241202, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=7fffffff000000000000000a, setVersion=1, topologyVersion=TopologyVersion{processId=64b583ecebfb924922e6cfc1, counter=6}, lastWriteDate=Fri Aug 04 15:59:35 UTC 2023, lastUpdateTimeNanos=5440140927087827}
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=437455825, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b5839658963dafcb22081c, counter=4}, lastWriteDate=Fri Aug 04 15:59:35 UTC 2023, lastUpdateTimeNanos=5440140925390936}
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Setting max election id to 7fffffff000000000000000a from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Setting max set version to 1 from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Discovered replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:7, serverValue:269601}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:42 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:8, serverValue:269602}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 15:59:45 source > INFO i.a.i.s.r.StateDecoratingIterator(createStateMessage):207 State report for stream pickup-points_pickuppoints - original: updatedAt = 2023-08-04T05:37:21.594Z (count 100) -> latest: updatedAt = 2023-08-04T05:37:21.594Z (count 100)
2023-08-04 15:59:45 source > INFO i.a.c.u.CompositeIterator(lambda$emitStartStreamStatus$1):155 STARTING -> pickup-points_pickuppoints
2023-08-04 15:59:46 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2023-08-04 15:59:46 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):288 executing on success close procedure.
2023-08-04 15:59:46 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):133 Flushing all 0 current buffers (0 bytes in total)
2023-08-04 15:59:46 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$6):241 Cleaning up destination started for 3 streams
2023-08-04 15:59:46 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_pickuppoints
2023-08-04 15:59:46 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql 9c8ad50c-a6aa-46d8-823d-c97a09004258:
BEGIN TRANSACTION;
INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` (
  `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at
)
WITH intermediate_data AS (
  SELECT
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END as `address`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END as `packageMaxWeight`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END as `packageMaxCombinedLength`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) as `legacyCategory`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END as `maxCombined`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END as `maxWeight`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) as `type`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END as `closingDates`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) as `createdAt`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END as `carrier`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) as `finalClosingDate`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END as `packageMaxDimension`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) as `outdatedAt`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END as `location`,
    CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END as `openingHours`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) as `maxPackageQuantity`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) as `id`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) as `_id`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) as `category`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) as `openingDate`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) as `maxPackagesQuantity`,
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) as `updatedAt`,
    array_concat(
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.address') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END IS NULL) THEN ["Problem with `address`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END IS NULL) THEN ["Problem with `packageMaxWeight`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END IS NULL) THEN ["Problem with `packageMaxCombinedLength`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.legacyCategory') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.legacyCategory')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) IS NULL) THEN ["Problem with `legacyCategory`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END IS NULL) THEN ["Problem with `maxCombined`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END IS NULL) THEN ["Problem with `maxWeight`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.type') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.type')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) IS NULL) THEN ["Problem with `type`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END IS NULL) THEN ["Problem with `closingDates`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.createdAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.createdAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) IS NULL) THEN ["Problem with `createdAt`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.carrier') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END IS NULL) THEN ["Problem with `carrier`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.finalClosingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.finalClosingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) IS NULL) THEN ["Problem with `finalClosingDate`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END IS NULL) THEN ["Problem with `packageMaxDimension`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.outdatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.outdatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) IS NULL) THEN ["Problem with `outdatedAt`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.location') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END IS NULL) THEN ["Problem with `location`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END IS NULL) THEN ["Problem with `openingHours`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackageQuantity`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) IS NULL) THEN ["Problem with `id`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$._id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$._id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) IS NULL) THEN ["Problem with `_id`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.category') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.category')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) IS NULL) THEN ["Problem with `category`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) IS NULL) THEN ["Problem with `openingDate`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackagesQuantity`"] ELSE [] END,
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.updatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.updatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) IS NULL) THEN ["Problem with `updatedAt`"] ELSE [] END
    ) as _airbyte_cast_errors,
    _airbyte_raw_id,
    _airbyte_extracted_at
  FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints`
  WHERE _airbyte_loaded_at IS NULL
)
SELECT `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data;
UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL;
COMMIT TRANSACTION;
2023-08-04 15:59:51 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job 9c8ad50c-a6aa-46d8-823d-c97a09004258 completed in 4833 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:59:52 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql BEGIN TRANSACTION completed in 47 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:59:52 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCom... completed in 1754 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:59:52 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at... completed in 1614 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 15:59:52 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql COMMIT TRANSACTION completed in 106 ms; processed 0 bytes; billed for 0 bytes
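The long statement above follows one pattern per column: the typing/deduping step extracts each value from the raw `_airbyte_data` JSON with JSON_QUERY/JSON_VALUE plus SAFE_CAST, and in parallel builds a per-column error array that is non-empty exactly when a value was present but failed the cast; array_concat then merges those arrays into _airbyte_cast_errors, which is wrapped into _airbyte_meta. A compact sketch of that shape with a single column, in BigQuery Standard SQL (`my_dataset`, `my_dataset_internal`, `final_table`, `raw_table`, and `name` are placeholder names, not the generated identifiers):

-- Sketch of the typing/deduping query pattern above, reduced to one column.
INSERT INTO `my_dataset`.`final_table` (`name`, _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at)
WITH intermediate_data AS (
  SELECT
    -- typed value: NULL when absent or uncastable
    SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.name') as STRING) as `name`,
    -- per-column error arrays, merged into one array
    array_concat(
      CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.name') IS NOT NULL)
            AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.name')) != 'null')
            AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.name') as STRING) IS NULL)
           THEN ['Problem with `name`'] ELSE [] END
    ) as _airbyte_cast_errors,
    _airbyte_raw_id,
    _airbyte_extracted_at
  FROM `my_dataset_internal`.`raw_table`
  WHERE _airbyte_loaded_at IS NULL
)
SELECT `name`, to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at
FROM intermediate_data;

Note that array_concat receives one argument per typed column; a stream with no typed columns would leave it with no arguments at all, which is what happens to the bookings stream below.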
2023-08-04 15:59:52 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):186 Cleaning up staging path for stream pickuppoints_pickuppoints (dataset airbyte_internal): data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints
2023-08-04 15:59:52 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_bookings
2023-08-04 15:59:52 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql b4d1b93c-9fd0-4e4b-8e53-37b761811444:
BEGIN TRANSACTION;
INSERT INTO `eu_poow_ds`.`pickuppoints_bookings` (
  _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at
)
WITH intermediate_data AS (
  SELECT
    array_concat(
    ) as _airbyte_cast_errors,
    _airbyte_raw_id,
    _airbyte_extracted_at
  FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_bookings`
  WHERE _airbyte_loaded_at IS NULL
)
SELECT to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data;
UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_bookings` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL;
COMMIT TRANSACTION;
2023-08-04 15:59:53 destination > ERROR i.a.i.d.b.BufferedStreamConsumer(close):318 Close failed.
com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1]
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job$1.call(Job.java:338) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job$1.call(Job.java:335) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1]
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?]
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) ~[io.airbyte.airbyte-integrations.bases-base-typing-deduping-24.0.2.jar:?]
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?]
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?]
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) ~[google-http-client-1.43.1.jar:1.43.1]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    ... 25 more
Stack Trace: com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)
    at com.google.cloud.bigquery.Job$1.call(Job.java:338)
    at com.google.cloud.bigquery.Job$1.call(Job.java:335)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)
    ... 25 more
2023-08-04 15:59:53 destination > ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details.
com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1]
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job$1.call(Job.java:338) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job$1.call(Job.java:335) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1]
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?]
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) ~[io.airbyte.airbyte-integrations.bases-base-typing-deduping-24.0.2.jar:?]
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?]
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?]
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?]
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) ~[google-http-client-1.43.1.jar:1.43.1]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) ~[google-api-client-1.31.5.jar:1.31.5]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ~[google-cloud-bigquery-2.27.0.jar:2.27.0]
    ... 25 more
Stack Trace: com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)
    at com.google.cloud.bigquery.Job$1.call(Job.java:338)
    at com.google.cloud.bigquery.Job$1.call(Job.java:335)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT.
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ... 25 more 2023-08-04 15:59:53 destination > Destination process done (exit code 1) 2023-08-04 15:59:53 destination > Skipping in-connector normalization 2023-08-04 15:59:55 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / destination-bigquery-write-129-1-iolfo) - Closed all resources for pod 2023-08-04 15:59:55 ERROR i.a.w.g.DefaultReplicationWorker(replicate):211 - Sync worker failed. java.util.concurrent.ExecutionException: io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1 at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?] at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?] at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:201) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?] at java.lang.Thread.run(Thread.java:1589) ~[?:?] Suppressed: io.airbyte.workers.exception.WorkerException: Destination process exit with code 1. This warning is normal if the job was cancelled. at io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:138) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:159) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?] at java.lang.Thread.run(Thread.java:1589) ~[?:?] 
Caused by: io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	... 1 more
2023-08-04 15:59:55 INFO i.a.w.g.ReplicationWorkerHelper(getReplicationOutput):294 - sync summary: {
  "status" : "failed",
  "recordsSynced" : 0,
  "bytesSynced" : 0,
  "startTime" : 1691164768192,
  "endTime" : 1691164795068,
  "totalStats" : {
    "bytesCommitted" : 0,
    "bytesEmitted" : 0,
    "destinationStateMessagesEmitted" : 1,
    "destinationWriteEndTime" : 0,
    "destinationWriteStartTime" : 1691164768192,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 1,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 1,
    "recordsEmitted" : 0,
    "recordsCommitted" : 0,
    "replicationEndTime" : 0,
    "replicationStartTime" : 1691164768192,
    "sourceReadEndTime" : 1691164786228,
    "sourceReadStartTime" : 1691164772906,
    "sourceStateMessagesEmitted" : 1
  },
  "streamStats" : [ ]
}
2023-08-04 15:59:55 INFO i.a.w.g.ReplicationWorkerHelper(getReplicationOutput):295 - failures: [ {
  "failureOrigin" : "destination",
  "failureType" : "system_error",
  "internalMessage" : "com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
  "metadata" : {
    "attemptNumber" : 1,
    "jobId" : 129,
    "from_trace_message" : true,
    "connector_command" : "write"
  },
  "stacktrace" : "com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)\n\tat com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)\n\tat com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)\n\tat com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)\n\tat com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)\n\tat com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)\n\tat com.google.cloud.bigquery.Job$1.call(Job.java:338)\n\tat com.google.cloud.bigquery.Job$1.call(Job.java:335)\n\tat com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)\n\tat com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)\n\tat com.google.cloud.bigquery.Job.waitFor(Job.java:244)\n\tat io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)\n\tat io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)\n\tat io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)\n\tat io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)\n\tat io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)\n\tat io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)\n\tat io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)\n\tat io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)\nCaused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request\nGET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false\n{\n \"code\": 400,\n \"errors\": [\n {\n \"domain\": \"global\",\n \"location\": \"q\",\n \"locationType\": \"parameter\",\n \"message\": \"Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\",\n \"reason\": \"invalidQuery\"\n }\n ],\n \"message\": \"Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\",\n \"status\": \"INVALID_ARGUMENT\"\n}\n\tat com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)\n\tat com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)\n\tat com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)\n\tat com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)\n\t... 25 more\n",
  "timestamp" : 1691164793870
}, {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process exited with non-zero exit code 1",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 1,
    "jobId" : 129,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1589)\n",
  "timestamp" : 1691164795009
} ]
2023-08-04 15:59:55 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
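What actually failed is the SQL that the destination's typing/deduping step generates and runs at the end of the sync (BigQueryDestinationHandler.execute, called from DefaultTyperDeduper.typeAndDedupe): BigQuery rejected an ARRAY_CONCAT call whose argument list does not match the only supported signature, ARRAY_CONCAT(ARRAY, [ARRAY, ...]), i.e. at least one ARRAY argument (and all arguments arrays of the same type). The generated statement is not in the log, but the job id in the request URL (job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I) can be looked up in the lastmile-prod project's BigQuery job history, and [17:3] points at line 17, column 3 of that query. Since this SQL is connector-generated rather than user-written, checking whether a destination-bigquery release newer than 1.7.2 fixes the generated query is a better lever than retrying. The error class itself is easy to reproduce; a minimal sketch with the google-cloud-bigquery Python client (hypothetical project id, default credentials assumed):

```python
# Minimal sketch, assuming `google-cloud-bigquery` is installed and default
# credentials are configured; the project id below is a placeholder.
from google.api_core.exceptions import BadRequest
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id

# Valid: ARRAY_CONCAT takes one or more ARRAY arguments of the same type.
rows = client.query("SELECT ARRAY_CONCAT([1, 2], [3]) AS a").result()
print([row.a for row in rows])  # [[1, 2, 3]]

# Invalid: an empty argument list trips the same "Number of arguments does
# not match for function ARRAY_CONCAT" error seen in the log. (Whether the
# connector emitted zero arguments or a non-array argument is not visible
# from the log alone.)
try:
    client.query("SELECT ARRAY_CONCAT() AS a").result()
except BadRequest as exc:
    print(exc.errors[0]["message"])
```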
2023-08-04 15:59:55 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):160 - sync summary: io.airbyte.config.StandardSyncOutput@75b6b84f[standardSyncSummary=io.airbyte.config.StandardSyncSummary@73ae3585[status=failed,recordsSynced=0,bytesSynced=0,startTime=1691164768192,endTime=1691164795068,totalStats=io.airbyte.config.SyncStats@4f5d8232[bytesCommitted=0,bytesEmitted=0,destinationStateMessagesEmitted=1,destinationWriteEndTime=0,destinationWriteStartTime=1691164768192,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=1,meanSecondsBetweenStateMessageEmittedandCommitted=1,recordsEmitted=0,recordsCommitted=0,replicationEndTime=0,replicationStartTime=1691164768192,sourceReadEndTime=1691164786228,sourceReadStartTime=1691164772906,sourceStateMessagesEmitted=1,additionalProperties={}],streamStats=[],performanceMetrics=,additionalProperties={}],normalizationSummary=,webhookOperationSummary=,state=,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@30797e94[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@704dcc14[stream=io.airbyte.protocol.models.AirbyteStream@72cddd85[name=pickuppoints_bookings,jsonSchema={"type":"object","properties":{}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@51229dfc[stream=io.airbyte.protocol.models.AirbyteStream@2a54c492[name=pickuppoints_bookingprocessingdatas,jsonSchema={"type":"object","properties":{}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@346790a0[stream=io.airbyte.protocol.models.AirbyteStream@28da525c[name=pickuppoints_pickuppoints,jsonSchema={"type":"object","properties":{"address":{"type":"object","properties":{"country":{"type":"string"},"city":{"type":"string"},"postalCode":{"type":"string"},"addressLine1":{"type":"string"},"addressLine2":{"type":"string"},"title":{"type":"string"}}},"packageMaxWeight":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"packageMaxCombinedLength":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"legacyCategory":{"type":"string"},"maxCombined":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"maxWeight":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"type":{"type":"string"},"closingDates":{"type":"array"},"createdAt":{"type":"string"},"carrier":{"type":"object","properties":{"code":{"type":"string"}}},"finalClosingDate":{"type":"string"},"packageMaxDimension":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"outdatedAt":{"type":"string"},"location":{"type":"object","properties":{"coordinates":{"type":"array"},"type":{"type":"string"}}},"openingHours":{"type":"object","properties":{"sunday":{"type":"array"},"saturday":{"type":"array"},"tuesday":{"type":"array"},"friday":{"type":"array"},"thursday":{"type":"array"},"wednesday":{"type":"array"},"monday":{"type":"array"}}},"maxPackageQuantity":{"type":"number"},"id":{"type":"string"},"_id":{"type":"string"},"category":{"type":"string"},"openingDate":{"type":"string"},"maxPackagesQuantity":{"type":"number"},"updatedAt":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[updatedAt],destinationSyncMode=append,primaryKey=[],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@6d52e7c6[failureOrigin=destination,failureType=system_error,internalMessage=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3],externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@612eceaa[additionalProperties={attemptNumber=1, jobId=129, from_trace_message=true, connector_command=write}],stacktrace=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) at com.google.cloud.bigquery.Job$1.call(Job.java:338) at com.google.cloud.bigquery.Job$1.call(Job.java:335) at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) at com.google.cloud.bigquery.Job.waitFor(Job.java:244) at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_yzJ7XYyQsCJVfiPN-_TNTIZd_j4I?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ... 25 more ,retryable=,timestamp=1691164793870,additionalProperties={}], io.airbyte.config.FailureReason@6944b0bf[failureOrigin=destination,failureType=,internalMessage=Destination process exited with non-zero exit code 1,externalMessage=Something went wrong within the destination connector,metadata=io.airbyte.config.Metadata@13f65779[additionalProperties={attemptNumber=1, jobId=129, connector_command=write}],stacktrace=io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1 at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) at java.base/java.lang.Thread.run(Thread.java:1589) ,retryable=,timestamp=1691164795009,additionalProperties={}]],additionalProperties={}] 2023-08-04 15:59:55 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):165 - Sync summary length: 10221 2023-08-04 15:59:55 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):307 - Stopping temporal heartbeating... 2023-08-04 15:59:55 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:55 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END REPLICATION ----- 2023-08-04 15:59:55 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:21 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to Get a connection by connection Id 2023-08-04 16:00:21 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the most recent source actor catalog 2023-08-04 16:00:21 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to Retrieve Id of the workspace for the source 2023-08-04 16:00:21 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to Get a connection by connection Id >> ATTEMPT 3/3 2023-08-04 15:59:56 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/2/logs.log 2023-08-04 15:59:56 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. 
Airbyte version: 0.50.6 2023-08-04 15:59:56 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation 2023-08-04 15:59:56 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0' 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0' 2023-08-04 15:59:56 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START CHECK ----- 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1' 2023-08-04 15:59:56 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1' 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: '' 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: '' 2023-08-04 15:59:56 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: '' 2023-08-04 15:59:56 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = source-mongodb-v2-check-129-2-cketk for airbyte/source-mongodb-v2:0.2.5 with resources io.airbyte.config.ResourceRequirements@420441ef[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null 2023-08-04 15:59:56 INFO i.a.w.p.KubeProcessFactory(create):111 - source-mongodb-v2-check-129-2-cketk stdoutLocalPort = 9026 2023-08-04 15:59:56 INFO i.a.w.p.KubeProcessFactory(create):114 - source-mongodb-v2-check-129-2-cketk stderrLocalPort = 9027 2023-08-04 15:59:56 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server... 2023-08-04 15:59:56 INFO i.a.w.p.KubePodProcess():584 - Creating pod source-mongodb-v2-check-129-2-cketk... 2023-08-04 15:59:56 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server... 2023-08-04 15:59:56 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files... 2023-08-04 15:59:56 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present.. 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready.. 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess():615 - Copying files... 
2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/1be61457-32ce-4114-bbc9-0444a14877f0/source_config.json data/source-mongodb-v2-check-129-2-cketk:/config/source_config.json -c init 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/5c51be29-4e75-49d4-b791-3e6bc9c6a39b/FINISHED_UPLOADING data/source-mongodb-v2-check-129-2-cketk:/config/FINISHED_UPLOADING -c init 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 15:59:58 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready... 2023-08-04 15:59:59 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout... 2023-08-04 15:59:59 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr... 2023-08-04 16:00:00 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP... 2023-08-04 16:00:00 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.42.10 2023-08-04 16:00:00 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream... 2023-08-04 16:00:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0 2023-08-04 16:00:00 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.s.m.MongoDbSource(main):55 starting source: class io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json} 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2023-08-04 16:00:01 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:00:01 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Cluster created with settings {hosts=[127.0.0.1:27017], srvHost=pickup-points.bwbt2.mongodb.net, mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', requiredReplicaSetName='atlas-plh3lm-shard-0'} 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Cluster description not yet available. Waiting for 30000 ms before timing out 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-00.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-01.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-02.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 16:00:01 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 No server chosen by com.mongodb.client.internal.MongoClientDelegate$1@2bc12da from cluster description ClusterDescription{type=REPLICA_SET, connectionMode=MULTIPLE, serverDescriptions=[ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}]}. 
Waiting for 30000 ms before timing out 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:5, serverValue:107266}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:6, serverValue:107267}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:3, serverValue:269605}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:1, serverValue:269606}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:2, serverValue:107977}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:4, serverValue:107976}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=317526899, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=7fffffff000000000000000a, setVersion=1, topologyVersion=TopologyVersion{processId=64b583ecebfb924922e6cfc1, counter=6}, lastWriteDate=Fri Aug 04 15:59:57 UTC 2023, lastUpdateTimeNanos=613675773846} 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=317506860, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, 
Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b584404f6809297d3ea234, counter=3}, lastWriteDate=Fri Aug 04 15:59:57 UTC 2023, lastUpdateTimeNanos=613675773849} 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=319314247, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b5839658963dafcb22081c, counter=4}, lastWriteDate=Fri Aug 04 15:59:57 UTC 2023, lastUpdateTimeNanos=613677401919} 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Setting max election id to 7fffffff000000000000000a from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Setting max set version to 1 from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Discovered replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:7, serverValue:269607}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.s.m.MongoDbSource(lambda$getCheckOperations$0):89 The source passed the basic operation test! 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 16:00:02 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.s.m.MongoDbSource(main):57 completed source: class io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 16:00:03 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / source-mongodb-v2-check-129-2-cketk) - Closed all resources for pod 2023-08-04 16:00:03 INFO i.a.w.g.DefaultCheckConnectionWorker(run):117 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@4a11d8[status=succeeded,message=,additionalProperties={}] 2023-08-04 16:00:03 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling... 
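For reference, the source CHECK that just succeeded is essentially a driver connect plus a trivial command against the replica set; the options in the "Cluster created with settings" line above (the srv host, the required replica set atlas-plh3lm-shard-0, the 30000 ms server selection timeout) map one-to-one onto pymongo options. A rough equivalent, with placeholder credentials and a ping standing in for the connector's "basic operation test":

```python
# Rough pymongo equivalent of the CHECK above; USER/PASSWORD are placeholders.
# mongodb+srv URIs additionally require the `dnspython` package.
from pymongo import MongoClient

client = MongoClient(
    "mongodb+srv://USER:PASSWORD@pickup-points.bwbt2.mongodb.net/",
    replicaSet="atlas-plh3lm-shard-0",       # from the log
    serverSelectionTimeoutMS=30_000,          # the 30000 ms timeout in the log
)
# Succeeds once the primary is discovered, mirroring
# "The source passed the basic operation test!" above.
print(client.admin.command("ping"))  # {'ok': 1.0, ...}
```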
2023-08-04 16:00:03 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:03 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK ----- 2023-08-04 16:00:03 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:03 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/2/logs.log 2023-08-04 16:00:03 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.50.6 2023-08-04 16:00:03 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0' 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0' 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1' 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1' 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: '' 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: '' 2023-08-04 16:00:03 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: '' 2023-08-04 16:00:03 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = destination-bigquery-check-129-2-ddhce for airbyte/destination-bigquery:1.7.2 with resources io.airbyte.config.ResourceRequirements@a0ba3da[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null 2023-08-04 16:00:03 INFO i.a.w.p.KubeProcessFactory(create):111 - destination-bigquery-check-129-2-ddhce stdoutLocalPort = 9024 2023-08-04 16:00:03 INFO i.a.w.p.KubeProcessFactory(create):114 - destination-bigquery-check-129-2-ddhce stderrLocalPort = 9025 2023-08-04 16:00:03 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:03 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server... 2023-08-04 16:00:03 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START CHECK ----- 2023-08-04 16:00:03 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:03 INFO i.a.w.p.KubePodProcess():584 - Creating pod destination-bigquery-check-129-2-ddhce... 2023-08-04 16:00:03 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server... 2023-08-04 16:00:03 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files... 2023-08-04 16:00:03 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present.. 2023-08-04 16:00:05 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready.. 2023-08-04 16:00:05 INFO i.a.w.p.KubePodProcess():615 - Copying files... 
2023-08-04 16:00:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json 2023-08-04 16:00:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/d2c35cf1-37ec-495c-bdbb-9847750e9440/source_config.json data/destination-bigquery-check-129-2-ddhce:/config/source_config.json -c init 2023-08-04 16:00:05 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 16:00:06 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 16:00:06 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING 2023-08-04 16:00:06 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/0757cbdb-0080-4509-9506-f5265a5960a5/FINISHED_UPLOADING data/destination-bigquery-check-129-2-ddhce:/config/FINISHED_UPLOADING -c init 2023-08-04 16:00:06 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 16:00:06 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 16:00:06 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready... 2023-08-04 16:00:08 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout... 2023-08-04 16:00:08 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr... 2023-08-04 16:00:09 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP... 2023-08-04 16:00:09 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.0.15 2023-08-04 16:00:09 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream... 2023-08-04 16:00:09 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0 2023-08-04 16:00:11 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json} 2023-08-04 16:00:11 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2023-08-04 16:00:11 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK 2023-08-04 16:00:11 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2023-08-04 16:00:11 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:00:11 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:00:11 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:00:12 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):413 Selected loading method is set to: GCS 2023-08-04 16:00:17 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 S3 format config: {"format_type":"CSV","flattening":"No flattening"} 2023-08-04 16:00:17 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testSingleUpload):40 Started testing if all required credentials assigned to user for single file uploading 2023-08-04 16:00:17 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testSingleUpload):48 Finished checking for normal upload mode 2023-08-04 16:00:17 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testMultipartUpload):52 Started testing if all required credentials assigned to user for multipart upload 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 Initiated multipart upload to poow-data-staging/data_sync/ultifile/test_1691164817992 with full ID ABPnzm4Anoch0h3rXQAP95kc11M-AexdkuwMUqkxaBf6KhhnjzRVWNh2igymJoY5AL8Czvk1 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000] 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000] 2023-08-04 16:00:18 WARN i.a.w.i.VersionedAirbyteStreamFactory(internalLog):309 - WARN a.m.s.MultiPartOutputStream(close):160 [MultipartOutputStream for parts 1 - 10000] is already closed 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(complete):367 [Manager uploading to poow-data-staging/data_sync/ultifile/test_1691164817992 with id ABPnzm4An...5AL8Czvk1]: Uploading leftover stream [Part number 1 containing 3.34 MB] 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(uploadStreamPart):558 [Manager uploading to poow-data-staging/data_sync/ultifile/test_1691164817992 with id ABPnzm4An...5AL8Czvk1]: Finished uploading [Part number 1 containing 3.34 MB] 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO a.m.s.StreamTransferManager(complete):395 [Manager uploading to poow-data-staging/data_sync/ultifile/test_1691164817992 with id ABPnzm4An...5AL8Czvk1]: Completed 2023-08-04 16:00:18 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.d.s.S3BaseChecks(testMultipartUpload):74 Finished verification for multipart upload mode 2023-08-04 16:00:19 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2023-08-04 16:00:19 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - Destination process done (exit code 0) 2023-08-04 16:00:20 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / destination-bigquery-check-129-2-ddhce) - Closed all resources for pod 2023-08-04 16:00:20 INFO i.a.w.g.DefaultCheckConnectionWorker(run):117 - Check connection job 
received output: io.airbyte.config.StandardCheckConnectionOutput@33cce866[status=succeeded,message=,additionalProperties={}] 2023-08-04 16:00:20 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:20 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling... 2023-08-04 16:00:20 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK ----- 2023-08-04 16:00:20 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:20 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get state 2023-08-04 16:00:20 INFO i.a.w.h.NormalizationInDestinationHelper(shouldNormalizeInDestination):52 - Requires Normalization: false, Normalization Supported: false, Feature Flag Enabled: false 2023-08-04 16:00:20 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to set attempt sync config 2023-08-04 16:00:21 INFO i.a.c.t.s.DefaultTaskQueueMapper(getTaskQueue):31 - Called DefaultTaskQueueMapper getTaskQueue for geography auto 2023-08-04 16:00:22 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/129/2/logs.log 2023-08-04 16:00:22 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.50.6 2023-08-04 16:00:22 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation 2023-08-04 16:00:22 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the source definition for feature flag checks 2023-08-04 16:00:22 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to get the source definition 2023-08-04 16:00:22 INFO i.a.w.g.ReplicationWorkerFactory(maybeEnableConcurrentStreamReads):166 - Concurrent stream read enabled? false 2023-08-04 16:00:22 INFO i.a.w.g.ReplicationWorkerFactory(create):127 - Setting up source... 2023-08-04 16:00:22 INFO i.a.w.g.ReplicationWorkerFactory(create):134 - Setting up destination... 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable METRIC_CLIENT: '' 2023-08-04 16:00:22 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to 2023-08-04 16:00:22 INFO i.a.w.g.ReplicationWorkerFactory(create):146 - Setting up replication worker... 2023-08-04 16:00:22 INFO i.a.w.g.DefaultReplicationWorker(run):124 - start sync worker. job id: 129 attempt id: 2 2023-08-04 16:00:22 INFO i.a.w.g.DefaultReplicationWorker(run):129 - configured sync modes: {pickup-points.pickuppoints=incremental - append, pickup-points.bookings=full_refresh - overwrite, pickup-points.bookingprocessingdatas=full_refresh - overwrite} 2023-08-04 16:00:22 INFO i.a.w.i.DefaultAirbyteDestination(start):88 - Running destination... 
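The destination CHECK above is mostly a staging-bucket permissions test: S3BaseChecks writes one small object ("normal upload mode") and then a multipart upload ("multipart upload mode") to the GCS staging bucket over its S3-compatible API before declaring the connection good. A minimal sketch of those two checks with boto3 (GCS HMAC keys assumed in the standard AWS environment variables; the bucket and prefix are the ones in the log, the object name is made up):

```python
# Sketch of the staging checks, assuming GCS HMAC keys in AWS_ACCESS_KEY_ID /
# AWS_SECRET_ACCESS_KEY. Bucket/prefix come from the log; the key is made up.
import boto3

s3 = boto3.client("s3", endpoint_url="https://storage.googleapis.com")
bucket = "poow-data-staging"
key = "data_sync/ultifile/test_upload"

# "normal upload mode": one small PUT
s3.put_object(Bucket=bucket, Key=key, Body=b"check")

# "multipart upload mode": init, upload one part, complete
# (only non-final parts must be >= 5 MB, so a single small part is fine)
mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
part = s3.upload_part(Bucket=bucket, Key=key, PartNumber=1,
                      UploadId=mpu["UploadId"], Body=b"x" * 1024)
s3.complete_multipart_upload(
    Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
    MultipartUpload={"Parts": [{"ETag": part["ETag"], "PartNumber": 1}]},
)
s3.delete_object(Bucket=bucket, Key=key)  # clean up the test object
```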
2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0' 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0' 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1' 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1' 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: '' 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: '' 2023-08-04 16:00:22 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: '' 2023-08-04 16:00:22 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = destination-bigquery-write-129-2-iccwk for airbyte/destination-bigquery:1.7.2 with resources io.airbyte.config.ResourceRequirements@37c5cf77[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null 2023-08-04 16:00:22 INFO i.a.w.p.KubeProcessFactory(create):111 - destination-bigquery-write-129-2-iccwk stdoutLocalPort = 9028 2023-08-04 16:00:22 INFO i.a.w.p.KubeProcessFactory(create):114 - destination-bigquery-write-129-2-iccwk stderrLocalPort = 9029 2023-08-04 16:00:22 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:22 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START REPLICATION ----- 2023-08-04 16:00:22 INFO i.a.c.i.LineGobbler(voidCall):149 - 2023-08-04 16:00:22 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server... 2023-08-04 16:00:22 INFO i.a.w.p.KubePodProcess():584 - Creating pod destination-bigquery-write-129-2-iccwk... 2023-08-04 16:00:22 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server... 2023-08-04 16:00:22 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files... 2023-08-04 16:00:22 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present.. 2023-08-04 16:00:24 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready.. 2023-08-04 16:00:24 INFO i.a.w.p.KubePodProcess():615 - Copying files... 
2023-08-04 16:00:24 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: destination_config.json 2023-08-04 16:00:24 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/e6bc0278-e71b-4e46-a80c-3ddf4b3cea68/destination_config.json data/destination-bigquery-write-129-2-iccwk:/config/destination_config.json -c init 2023-08-04 16:00:24 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: destination_catalog.json 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/7b59eaaf-95ab-45f3-89f9-e6f68770b29a/destination_catalog.json data/destination-bigquery-write-129-2-iccwk:/config/destination_catalog.json -c init 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/e194ae55-28b2-4734-a610-c1c8a26915b5/FINISHED_UPLOADING data/destination-bigquery-write-129-2-iccwk:/config/FINISHED_UPLOADING -c init 2023-08-04 16:00:25 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete 2023-08-04 16:00:26 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process 2023-08-04 16:00:26 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready... 2023-08-04 16:01:25 destination > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json} 2023-08-04 16:01:25 destination > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2023-08-04 16:01:25 destination > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: WRITE 2023-08-04 16:01:25 destination > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'} 2023-08-04 16:01:25 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:01:25 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:01:25 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):413 Selected loading method is set to: GCS 2023-08-04 16:01:16 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout... 
2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr... 2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP... 2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.7.217 2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess():639 - Creating stdin socket... 2023-08-04 16:01:17 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):41 - Writing messages to protocol version 0.2.0 2023-08-04 16:01:17 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0' 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0' 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1' 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1' 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable LAUNCHDARKLY_KEY: '' 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable FEATURE_FLAG_CLIENT: '' 2023-08-04 16:01:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: '' 2023-08-04 16:01:17 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = source-mongodb-v2-read-129-2-gtild for airbyte/source-mongodb-v2:0.2.5 with resources io.airbyte.config.ResourceRequirements@6b7cea59[cpuRequest=0.5,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null 2023-08-04 16:01:17 INFO i.a.w.p.KubeProcessFactory(create):111 - source-mongodb-v2-read-129-2-gtild stdoutLocalPort = 9034 2023-08-04 16:01:17 INFO i.a.w.p.KubeProcessFactory(create):114 - source-mongodb-v2-read-129-2-gtild stderrLocalPort = 9035 2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server... 2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server... 2023-08-04 16:01:17 INFO i.a.w.p.KubePodProcess():584 - Creating pod source-mongodb-v2-read-129-2-gtild... 2023-08-04 16:01:18 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files... 2023-08-04 16:01:18 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present.. 2023-08-04 16:01:20 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready.. 2023-08-04 16:01:20 INFO i.a.w.p.KubePodProcess():615 - Copying files... 
2023-08-04 16:01:20 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: input_state.json
2023-08-04 16:01:20 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/4b48cf7e-781c-45a6-a97e-15a20e4a9339/input_state.json data/source-mongodb-v2-read-129-2-gtild:/config/input_state.json -c init
2023-08-04 16:01:20 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/ac6ba5a2-a3e7-4e00-944d-83903b572874/source_config.json data/source-mongodb-v2-read-129-2-gtild:/config/source_config.json -c init
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_catalog.json
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/508f186a-59b5-47ba-b737-125f3b2e67ba/source_catalog.json data/source-mongodb-v2-read-129-2-gtild:/config/source_catalog.json -c init
2023-08-04 16:01:21 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 16:01:22 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 16:01:22 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
2023-08-04 16:01:22 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/33acac90-9d0d-48c4-9496-4399b4c9bf27/FINISHED_UPLOADING data/source-mongodb-v2-read-129-2-gtild:/config/FINISHED_UPLOADING -c init
2023-08-04 16:01:22 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
2023-08-04 16:01:23 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
2023-08-04 16:01:23 INFO i.a.w.p.KubePodProcess():618 - Waiting until pod is ready...
2023-08-04 16:01:24 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
2023-08-04 16:01:24 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
2023-08-04 16:01:25 INFO i.a.w.p.KubePodProcess():634 - Reading pod IP...
2023-08-04 16:01:25 INFO i.a.w.p.KubePodProcess():636 - Pod IP: 10.28.24.122
2023-08-04 16:01:25 INFO i.a.w.p.KubePodProcess():643 - Using null stdin output stream...
2023-08-04 16:01:25 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2023-08-04 16:01:25 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):94 - Starting source heartbeat check. Will check every 1 minutes.
2023-08-04 16:01:25 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):224 - Destination output thread started.
2023-08-04 16:01:25 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):268 - Replication thread started.
2023-08-04 16:01:34 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):111 - thread status...
heartbeat thread: false , replication thread: true
2023-08-04 16:01:34 INFO i.a.w.g.DefaultReplicationWorker(replicate):195 - Waiting for source and destination threads to complete.
2023-08-04 16:01:34 INFO i.a.w.g.DefaultReplicationWorker(replicate):200 - One of source or destination thread complete. Waiting on the other.
2023-08-04 16:01:34 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromDestination):219 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@41245cea[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@e355f90[type=LEGACY,stream=,global=,data={"cdc":false,"streams":[{"stream_name":"bookingprocessingdatas","stream_namespace":"pickup-points","cursor_field":[]},{"stream_name":"bookings","stream_namespace":"pickup-points","cursor_field":[]},{"stream_name":"pickuppoints","stream_namespace":"pickup-points","cursor_field":["updatedAt"],"cursor":"2023-08-04T05:37:21.594Z","cursor_record_count":100}]},additionalProperties={}],trace=,control=,additionalProperties={}]
2023-08-04 16:01:34 INFO i.a.w.i.s.SyncPersistenceImpl(startBackgroundFlushStateTask):180 - starting state flush thread for connectionId f83cbbcb-5223-42e7-a084-55e54599e99a
2023-08-04 16:01:42 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromDestination):219 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@4e6c8677[type=TRACE,log=,spec=,connectionStatus=,catalog=,record=,state=,trace=io.airbyte.protocol.models.AirbyteTraceMessage@7d17c5fb[type=ERROR,emittedAt=1.691164902897E12,error=io.airbyte.protocol.models.AirbyteErrorTraceMessage@46acc8ec[message=Something went wrong in the connector. See the logs for more details.,internalMessage=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3],stackTrace=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)
    at com.google.cloud.bigquery.Job$1.call(Job.java:338)
    at com.google.cloud.bigquery.Job$1.call(Job.java:335)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_VVD1tFyAT-iDvP8vq7LlvcJHxbTg?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)
    ... 25 more
,failureType=system_error,streamDescriptor=,additionalProperties={}],estimate=,streamStatus=,additionalProperties={}],control=,additionalProperties={}]
2023-08-04 16:01:44 INFO i.a.w.p.KubePodProcess(close):799 - (pod: data / destination-bigquery-write-129-2-iccwk) - Closed all resources for pod
2023-08-04 16:01:44 ERROR i.a.w.g.DefaultReplicationWorker(replicate):211 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:201) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?]
    at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Suppressed: io.airbyte.workers.exception.WorkerException: Destination process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:138) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:159) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?]
        at java.lang.Thread.run(Thread.java:1589) ~[?:?]
Caused by: io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    ... 1 more
2023-08-04 16:01:44 INFO i.a.w.g.ReplicationWorkerHelper(getReplicationOutput):294 - sync summary: {
  "status" : "failed",
  "recordsSynced" : 0,
  "bytesSynced" : 0,
  "startTime" : 1691164822241,
  "endTime" : 1691164904185,
  "totalStats" : {
    "bytesCommitted" : 0,
    "bytesEmitted" : 0,
    "destinationStateMessagesEmitted" : 1,
    "destinationWriteEndTime" : 0,
    "destinationWriteStartTime" : 1691164822241,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 2,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 2,
    "recordsEmitted" : 0,
    "recordsCommitted" : 0,
    "replicationEndTime" : 0,
    "replicationStartTime" : 1691164822241,
    "sourceReadEndTime" : 1691164894047,
    "sourceReadStartTime" : 1691164877979,
    "sourceStateMessagesEmitted" : 1
  },
  "streamStats" : [ ]
}
2023-08-04 16:01:44 INFO i.a.w.g.ReplicationWorkerHelper(getReplicationOutput):295 - failures: [ {
  "failureOrigin" : "destination",
  "failureType" : "system_error",
  "internalMessage" : "com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
  "metadata" : {
    "attemptNumber" : 2,
    "jobId" : 129,
    "from_trace_message" : true,
    "connector_command" : "write"
  },
  "stacktrace" : "com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)\n\tat com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)\n\tat com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)\n\tat com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)\n\tat com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)\n\tat com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)\n\tat com.google.cloud.bigquery.Job$1.call(Job.java:338)\n\tat com.google.cloud.bigquery.Job$1.call(Job.java:335)\n\tat com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)\n\tat com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)\n\tat com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)\n\tat com.google.cloud.bigquery.Job.waitFor(Job.java:244)\n\tat io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)\n\tat io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)\n\tat io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)\n\tat io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)\n\tat io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)\n\tat io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)\n\tat io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)\n\tat io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)\nCaused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request\nGET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_VVD1tFyAT-iDvP8vq7LlvcJHxbTg?location=EU&maxResults=0&prettyPrint=false\n{\n \"code\": 400,\n \"errors\": [\n {\n \"domain\": \"global\",\n \"location\": \"q\",\n \"locationType\": \"parameter\",\n \"message\": \"Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\",\n \"reason\": \"invalidQuery\"\n }\n ],\n \"message\": \"Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]\",\n \"status\": \"INVALID_ARGUMENT\"\n}\n\tat com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)\n\tat com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)\n\tat com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)\n\tat com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)\n\tat com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)\n\tat com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)\n\t... 25 more\n",
  "timestamp" : 1691164902897
}, {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process exited with non-zero exit code 1",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 2,
    "jobId" : 129,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1589)\n",
  "timestamp" : 1691164904152
} ]
2023-08-04 16:01:44 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
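Note on the failure above: BigQuery rejected the typing/deduping query because ARRAY_CONCAT was called with an argument count that does not match its signature, ARRAY_CONCAT(ARRAY, [ARRAY, ...]), i.e. it needs at least one ARRAY argument. The position [17:3] is line 17, column 3 of the generated statement (most likely the INSERT logged in full at 16:01:34 further down). A minimal sketch that reproduces the same class of error in BigQuery Standard SQL; these queries are illustrative only and are not the connector's generated SQL:

  -- Illustrative only; reproduces the "Number of arguments does not match
  -- for function ARRAY_CONCAT" error shown in the log above.
  SELECT ARRAY_CONCAT();                 -- fails: zero arguments
  SELECT ARRAY_CONCAT([1, 2]);           -- ok: one array argument is the minimum
  SELECT ARRAY_CONCAT([1, 2], [3], [4]); -- ok: additional arrays are appended in order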
2023-08-04 16:01:44 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 16:01:44 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END REPLICATION -----
2023-08-04 16:01:44 INFO i.a.c.i.LineGobbler(voidCall):149 -
2023-08-04 16:01:44 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):160 - sync summary: io.airbyte.config.StandardSyncOutput@6ae4cd2d[standardSyncSummary=io.airbyte.config.StandardSyncSummary@c9593d2[status=failed,recordsSynced=0,bytesSynced=0,startTime=1691164822241,endTime=1691164904185,totalStats=io.airbyte.config.SyncStats@a0f555b[bytesCommitted=0,bytesEmitted=0,destinationStateMessagesEmitted=1,destinationWriteEndTime=0,destinationWriteStartTime=1691164822241,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=2,meanSecondsBetweenStateMessageEmittedandCommitted=2,recordsEmitted=0,recordsCommitted=0,replicationEndTime=0,replicationStartTime=1691164822241,sourceReadEndTime=1691164894047,sourceReadStartTime=1691164877979,sourceStateMessagesEmitted=1,additionalProperties={}],streamStats=[],performanceMetrics=,additionalProperties={}],normalizationSummary=,webhookOperationSummary=,state=,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@2ca7125b[streams=[
io.airbyte.protocol.models.ConfiguredAirbyteStream@60662f47[stream=io.airbyte.protocol.models.AirbyteStream@40cd3b9a[name=pickuppoints_bookings,jsonSchema={"type":"object","properties":{}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[],additionalProperties={}],
io.airbyte.protocol.models.ConfiguredAirbyteStream@30181b55[stream=io.airbyte.protocol.models.AirbyteStream@24b9e2c7[name=pickuppoints_bookingprocessingdatas,jsonSchema={"type":"object","properties":{}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[],additionalProperties={}],
io.airbyte.protocol.models.ConfiguredAirbyteStream@7574fd00[stream=io.airbyte.protocol.models.AirbyteStream@5c4c09df[name=pickuppoints_pickuppoints,jsonSchema={"type":"object","properties":{"address":{"type":"object","properties":{"country":{"type":"string"},"city":{"type":"string"},"postalCode":{"type":"string"},"addressLine1":{"type":"string"},"addressLine2":{"type":"string"},"title":{"type":"string"}}},"packageMaxWeight":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"packageMaxCombinedLength":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"legacyCategory":{"type":"string"},"maxCombined":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"maxWeight":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"type":{"type":"string"},"closingDates":{"type":"array"},"createdAt":{"type":"string"},"carrier":{"type":"object","properties":{"code":{"type":"string"}}},"finalClosingDate":{"type":"string"},"packageMaxDimension":{"type":"object","properties":{"unit":{"type":"string"},"value":{"type":"number"}}},"outdatedAt":{"type":"string"},"location":{"type":"object","properties":{"coordinates":{"type":"array"},"type":{"type":"string"}}},"openingHours":{"type":"object","properties":{"sunday":{"type":"array"},"saturday":{"type":"array"},"tuesday":{"type":"array"},"friday":{"type":"array"},"thursday":{"type":"array"},"wednesday":{"type":"array"},"monday":{"type":"array"}}},"maxPackageQuantity":{"type":"number"},"id":{"type":"string"},"_id":{"type":"string"},"category":{"type":"string"},"openingDate":{"type":"string"},"maxPackagesQuantity":{"type":"number"},"updatedAt":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[updatedAt],destinationSyncMode=append,primaryKey=[],additionalProperties={}]],additionalProperties={}],failures=[
io.airbyte.config.FailureReason@2c7e73b0[failureOrigin=destination,failureType=system_error,internalMessage=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3],externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@47ba1322[additionalProperties={attemptNumber=2, jobId=129, from_trace_message=true, connector_command=write}],stacktrace=com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437)
    at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431)
    at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415)
    at com.google.cloud.bigquery.Job$1.call(Job.java:338)
    at com.google.cloud.bigquery.Job$1.call(Job.java:335)
    at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103)
    at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
    at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
    at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63)
    at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100)
    at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243)
    at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306)
    at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
    at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
    at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272)
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157)
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
    at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_VVD1tFyAT-iDvP8vq7LlvcJHxbTg?location=EU&maxResults=0&prettyPrint=false
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]",
  "status": "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692)
    ... 25 more
,retryable=,timestamp=1691164902897,additionalProperties={}],
io.airbyte.config.FailureReason@15f46a28[failureOrigin=destination,failureType=,internalMessage=Destination process exited with non-zero exit code 1,externalMessage=Something went wrong within the destination connector,metadata=io.airbyte.config.Metadata@29ca9bb8[additionalProperties={attemptNumber=2, jobId=129, connector_command=write}],stacktrace=io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:238)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    at java.base/java.lang.Thread.run(Thread.java:1589)
,retryable=,timestamp=1691164904152,additionalProperties={}]],additionalProperties={}]
2023-08-04 16:01:44 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):165 - Sync summary length: 10219
2023-08-04 16:01:44 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):307 - Stopping temporal heartbeating...
2023-08-04 16:01:25 destination > INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 S3 format config: {"format_type":"AVRO","flattening":"No flattening"} 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):429 All tmp files will be removed from GCS when replication is finished 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):396 Creating BigQuery staging message consumer with staging ID e974dd65-43df-4a17-8d7e-19f09689a573 at 2023-08-04T16:01:24.136Z 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_bookings, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_cbh_pickuppoints_bookings}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_bookingprocessingdatas, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_vmw_pickuppoints_bookingprocessingdatas}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$2):133 BigQuery write config: BigQueryWriteConfig[streamName=pickuppoints_pickuppoints, namespace=eu_poow_ds, datasetId=airbyte_internal, datasetLocation=EU, 
tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=_airbyte_tmp_okx_pickuppoints_pickuppoints}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, tableSchema=Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, syncMode=append, stagedFiles=[]] 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):173 class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started. 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):156 Preparing airbyte_raw tables in destination started for 3 streams 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_pickuppoints, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}}, stage: pickuppoints_pickuppoints 2023-08-04 16:01:25 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 Creating dataset airbyte_internal 2023-08-04 16:01:26 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 Creating dataset airbyte 2023-08-04 16:01:26 source > INFO i.a.i.s.m.MongoDbSource(main):55 starting source: class io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 16:01:26 source > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {read=null, catalog=source_catalog.json, state=input_state.json, config=source_config.json} 2023-08-04 16:01:26 source > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.mongodb.MongoDbSource 2023-08-04 16:01:27 source > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: READ 2023-08-04 16:01:27 source > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=READ, 
configPath='source_config.json', catalogPath='source_catalog.json', statePath='input_state.json'} 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} 2023-08-04 16:01:27 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:01:27 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-08-04 16:01:27 source > INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):48 Legacy state manager selected to manage state object with type LEGACY. 2023-08-04 16:01:27 source > INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):178 Found matching cursor in state. Stream: pickup-points_pickuppoints. Cursor Field: updatedAt Value: 2023-08-04T05:37:21.594Z Count: 100 2023-08-04 16:01:27 source > INFO i.a.i.s.r.CdcStateManager():31 Initialized CDC state with: null 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table ALREADY EXISTS: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_pickuppoints}} 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_pickuppoints (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_pickuppoints/2023/08/04/16/e974dd65-43df-4a17-8d7e-19f09689a573/ 2023-08-04 16:01:27 source > INFO c.m.d.l.SLF4JLogger(info):71 Cluster created with settings {hosts=[127.0.0.1:27017], srvHost=pickup-points.bwbt2.mongodb.net, mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', requiredReplicaSetName='atlas-plh3lm-shard-0'} 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_bookings, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}}, stage: pickuppoints_bookings 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 16:01:27 source > INFO 
c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-00.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 16:01:27 source > INFO c.m.d.l.SLF4JLogger(info):71 Cluster description not yet available. Waiting for 30000 ms before timing out 2023-08-04 16:01:27 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-01.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 16:01:27 source > INFO c.m.d.l.SLF4JLogger(info):71 Adding discovered server pickup-points-shard-00-02.bwbt2.mongodb.net:27017 to client view of cluster 2023-08-04 16:01:27 source > INFO c.m.d.l.SLF4JLogger(info):71 No server chosen by com.mongodb.client.internal.MongoClientDelegate$1@773bd77b from cluster description ClusterDescription{type=REPLICA_SET, connectionMode=MULTIPLE, serverDescriptions=[ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}, ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=UNKNOWN, state=CONNECTING}]}. Waiting for 30000 ms before timing out 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table ALREADY EXISTS: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_bookings (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_bookings/2023/08/04/16/e974dd65-43df-4a17-8d7e-19f09689a573/ 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryGcsOperations(truncateTableIfExists):207 Truncating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} (dataset airbyte) 2023-08-04 16:01:27 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):175 Deleting target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} (dataset airbyte) 2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookings}} 2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):158 Preparing staging are in destination for schema: Schema{fields=[Field{name=_airbyte_raw_id, type=STRING, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_extracted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}, Field{name=_airbyte_loaded_at, type=TIMESTAMP, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, 
defaultValueExpression=null, collation=null}, Field{name=_airbyte_data, type=JSON, mode=null, description=null, policyTags=null, maxLength=null, scale=null, precision=null, defaultValueExpression=null, collation=null}]}, stream: pickuppoints_bookingprocessingdatas, target table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}, stage: pickuppoints_bookingprocessingdatas
2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table ALREADY EXISTS: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):109 Creating staging path for stream pickuppoints_bookingprocessingdatas (dataset airbyte): data_sync/ultifile/airbyte_pickuppoints_bookingprocessingdatas/2023/08/04/16/e974dd65-43df-4a17-8d7e-19f09689a573/
2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(truncateTableIfExists):207 Truncating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} (dataset airbyte)
2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):175 Deleting target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}} (dataset airbyte)
2023-08-04 16:01:28 destination > INFO i.a.i.d.b.BigQueryGcsOperations(createTableIfNotExists):102 Creating target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:1, serverValue:269623}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:4, serverValue:107984}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:6, serverValue:107274}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:3, serverValue:269624}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:5, serverValue:107275}] to pickup-points-shard-00-02.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:2, serverValue:107985}] to pickup-points-shard-00-00.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=638298589, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-02.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b584404f6809297d3ea234, counter=3}, lastWriteDate=Fri Aug 04 16:01:25 UTC 2023, lastUpdateTimeNanos=5440247879452214}
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=645589624, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-01.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=7fffffff000000000000000a, setVersion=1, topologyVersion=TopologyVersion{processId=64b583ecebfb924922e6cfc1, counter=6}, lastWriteDate=Fri Aug 04 16:01:25 UTC 2023, lastUpdateTimeNanos=5440247882417611}
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Monitor thread successfully connected to server with description ServerDescription{address=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=13, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=641177637, setName='atlas-plh3lm-shard-0', canonicalAddress=pickup-points-shard-00-00.bwbt2.mongodb.net:27017, hosts=[pickup-points-shard-00-00.bwbt2.mongodb.net:27017, pickup-points-shard-00-01.bwbt2.mongodb.net:27017, pickup-points-shard-00-02.bwbt2.mongodb.net:27017], passives=[], arbiters=[], primary='pickup-points-shard-00-01.bwbt2.mongodb.net:27017', tagSet=TagSet{[Tag{name='nodeType', value='ELECTABLE'}, Tag{name='provider', value='GCP'}, Tag{name='region', value='WESTERN_EUROPE'}, Tag{name='workloadType', value='OPERATIONAL'}]}, electionId=null, setVersion=1, topologyVersion=TopologyVersion{processId=64b5839658963dafcb22081c, counter=4}, lastWriteDate=Fri Aug 04 16:01:25 UTC 2023, lastUpdateTimeNanos=5440247880910839}
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Setting max election id to 7fffffff000000000000000a from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Setting max set version to 1 from replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Discovered replica set primary pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):230 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=airbyte_internal, tableId=eu_poow_ds_raw__stream_pickuppoints_bookingprocessingdatas}}
2023-08-04 16:01:29 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(prepareFinalTables):60 Preparing final tables
2023-08-04 16:01:29 destination > INFO i.a.i.d.b.t.BigQuerySqlGenerator(existingSchemaMatchesStreamConfig):245 Alter Table Report [] [] []; Clustering true; Partitioning true
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:7, serverValue:269625}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 source > INFO c.m.d.l.SLF4JLogger(info):71 Opened connection [connectionId{localValue:8, serverValue:269626}] to pickup-points-shard-00-01.bwbt2.mongodb.net:27017
2023-08-04 16:01:29 destination > INFO i.a.i.d.b.t.BigQuerySqlGenerator(existingSchemaMatchesStreamConfig):245 Alter Table Report [] [] []; Clustering true; Partitioning true
2023-08-04 16:01:29 destination > INFO i.a.i.d.b.t.BigQuerySqlGenerator(existingSchemaMatchesStreamConfig):245 Alter Table Report [] [] []; Clustering true; Partitioning true
2023-08-04 16:01:29 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$4):178 Preparing tables in destination completed.
2023-08-04 16:01:32 source > INFO i.a.i.s.r.StateDecoratingIterator(createStateMessage):207 State report for stream pickup-points_pickuppoints - original: updatedAt = 2023-08-04T05:37:21.594Z (count 100) -> latest: updatedAt = 2023-08-04T05:37:21.594Z (count 100)
2023-08-04 16:01:32 source > INFO i.a.c.u.CompositeIterator(lambda$emitStartStreamStatus$1):155 STARTING -> pickup-points_pickuppoints
2023-08-04 16:01:34 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2023-08-04 16:01:34 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):288 executing on success close procedure.
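Note on the "Alter Table Report [] [] []" lines above: for each stream the generator compares the existing final table against the configured stream schema; the three empty lists appear to be the columns it would add, remove, and retype, so no schema migration is needed before loading. Had the report been non-empty, the connector would have had to reconcile the table first, roughly along these lines (BigQuery Standard SQL sketch; the column names are hypothetical, not taken from this sync):

ALTER TABLE `eu_poow_ds`.`pickuppoints_pickuppoints`
  ADD COLUMN `newField` STRING;      -- hypothetical: column in the stream schema but missing from the table
ALTER TABLE `eu_poow_ds`.`pickuppoints_pickuppoints`
  DROP COLUMN `removedField`;        -- hypothetical: column in the table but no longer in the stream schema

With all three reports empty and clustering/partitioning both matching, the destination proceeds straight to staging and loading.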
2023-08-04 16:01:34 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):133 Flushing all 0 current buffers (0 bytes in total)
2023-08-04 16:01:34 destination > INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$6):241 Cleaning up destination started for 3 streams
2023-08-04 16:01:34 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_pickuppoints
2023-08-04 16:01:34 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql 8ded425b-d0d6-4813-b301-f225998fc7b7: BEGIN TRANSACTION; INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at ) WITH intermediate_data AS ( SELECT CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END as `address`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END as `packageMaxWeight`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END as `packageMaxCombinedLength`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) as `legacyCategory`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END as `maxCombined`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END as `maxWeight`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) as `type`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END as `closingDates`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) as `createdAt`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END as `carrier`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) as `finalClosingDate`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END as `packageMaxDimension`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) as `outdatedAt`, CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END as `location`, CASE WHEN
JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END as `openingHours`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) as `maxPackageQuantity`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) as `id`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) as `_id`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) as `category`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) as `openingDate`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) as `maxPackagesQuantity`, SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) as `updatedAt`, array_concat( CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.address') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.address') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.address')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.address') END IS NULL) THEN ["Problem with `address`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxWeight') END IS NULL) THEN ["Problem with `packageMaxWeight`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxCombinedLength') END IS NULL) THEN ["Problem with `packageMaxCombinedLength`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.legacyCategory') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.legacyCategory')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.legacyCategory') as STRING) IS NULL) THEN ["Problem with `legacyCategory`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxCombined') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxCombined')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxCombined') END IS NULL) THEN ["Problem with `maxCombined`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.maxWeight') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxWeight')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.maxWeight') END IS NULL) THEN ["Problem with `maxWeight`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.type') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.type')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.type') as STRING) IS NULL) THEN ["Problem with `type`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'null') AND 
(CASE WHEN JSON_QUERY(`_airbyte_data`, '$.closingDates') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.closingDates')) != 'array' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.closingDates') END IS NULL) THEN ["Problem with `closingDates`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.createdAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.createdAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.createdAt') as STRING) IS NULL) THEN ["Problem with `createdAt`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.carrier') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.carrier') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.carrier')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.carrier') END IS NULL) THEN ["Problem with `carrier`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.finalClosingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.finalClosingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.finalClosingDate') as STRING) IS NULL) THEN ["Problem with `finalClosingDate`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.packageMaxDimension') END IS NULL) THEN ["Problem with `packageMaxDimension`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.outdatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.outdatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.outdatedAt') as STRING) IS NULL) THEN ["Problem with `outdatedAt`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.location') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.location') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.location')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.location') END IS NULL) THEN ["Problem with `location`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'null') AND (CASE WHEN JSON_QUERY(`_airbyte_data`, '$.openingHours') IS NULL OR JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingHours')) != 'object' THEN NULL ELSE JSON_QUERY(`_airbyte_data`, '$.openingHours') END IS NULL) THEN ["Problem with `openingHours`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackageQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackageQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackageQuantity`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.id') as STRING) IS NULL) THEN ["Problem with `id`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$._id') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$._id')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$._id') as STRING) IS NULL) THEN ["Problem with `_id`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.category') IS NOT NULL) AND 
(JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.category')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.category') as STRING) IS NULL) THEN ["Problem with `category`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.openingDate') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.openingDate')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.openingDate') as STRING) IS NULL) THEN ["Problem with `openingDate`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.maxPackagesQuantity')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.maxPackagesQuantity') as NUMERIC) IS NULL) THEN ["Problem with `maxPackagesQuantity`"] ELSE [] END, CASE WHEN (JSON_QUERY(`_airbyte_data`, '$.updatedAt') IS NOT NULL) AND (JSON_TYPE(JSON_QUERY(`_airbyte_data`, '$.updatedAt')) != 'null') AND (SAFE_CAST(JSON_VALUE(`_airbyte_data`, '$.updatedAt') as STRING) IS NULL) THEN ["Problem with `updatedAt`"] ELSE [] END ) as _airbyte_cast_errors, _airbyte_raw_id, _airbyte_extracted_at FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` WHERE _airbyte_loaded_at IS NULL ) SELECT `address`, `packageMaxWeight`, `packageMaxCombinedLength`, `legacyCategory`, `maxCombined`, `maxWeight`, `type`, `closingDates`, `createdAt`, `carrier`, `finalClosingDate`, `packageMaxDimension`, `outdatedAt`, `location`, `openingHours`, `maxPackageQuantity`, `id`, `_id`, `category`, `openingDate`, `maxPackagesQuantity`, `updatedAt`, to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data; UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL ; COMMIT TRANSACTION;
2023-08-04 16:01:40 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):70 Root-level job 8ded425b-d0d6-4813-b301-f225998fc7b7 completed in 5327 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 16:01:40 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql BEGIN TRANSACTION completed in 50 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 16:01:40 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql INSERT INTO `eu_poow_ds`.`pickuppoints_pickuppoints` ( `address`, `packageMaxWeight`, `packageMaxCom... completed in 1734 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 16:01:40 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_pickuppoints` SET `_airbyte_loaded_at... completed in 1582 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 16:01:40 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(lambda$execute$1):92 Child sql COMMIT TRANSACTION completed in 134 ms; processed 0 bytes; billed for 0 bytes
2023-08-04 16:01:40 destination > INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):186 Cleaning up staging path for stream pickuppoints_pickuppoints (dataset airbyte_internal): data_sync/ultifile/airbyte_internal_pickuppoints_pickuppoints
2023-08-04 16:01:41 destination > INFO i.a.i.b.d.t.DefaultTyperDeduper(typeAndDedupe):96 Attempting typing and deduping for eu_poow_ds.pickuppoints_bookings
2023-08-04 16:01:41 destination > INFO i.a.i.d.b.t.BigQueryDestinationHandler(execute):56 Executing sql 7b1115eb-705b-471a-8d2a-545e92852303: BEGIN TRANSACTION; INSERT INTO `eu_poow_ds`.`pickuppoints_bookings` ( _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at ) WITH intermediate_data AS ( SELECT array_concat( ) as _airbyte_cast_errors, _airbyte_raw_id, _airbyte_extracted_at FROM `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_bookings` WHERE _airbyte_loaded_at IS NULL ) SELECT to_json(struct(_airbyte_cast_errors AS errors)) AS _airbyte_meta, _airbyte_raw_id, _airbyte_extracted_at FROM intermediate_data; UPDATE `airbyte_internal`.`eu_poow_ds_raw__stream_pickuppoints_bookings` SET `_airbyte_loaded_at` = CURRENT_TIMESTAMP() WHERE `_airbyte_loaded_at` IS NULL ; COMMIT TRANSACTION;
2023-08-04 16:01:42 destination > ERROR i.a.i.d.b.BufferedStreamConsumer(close):318 Close failed. com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:338) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:335) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at
com.google.cloud.bigquery.Job.waitFor(Job.java:244) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) ~[io.airbyte.airbyte-integrations.bases-base-typing-deduping-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_VVD1tFyAT-iDvP8vq7LlvcJHxbTg?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. 
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) ~[google-http-client-1.43.1.jar:1.43.1] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] ... 25 more
2023-08-04 16:01:42 destination > ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. com.google.cloud.bigquery.BigQueryException: Query error: Number of arguments does not match for function ARRAY_CONCAT.
Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:114) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:694) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1437) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1432) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1431) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1415) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:338) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job$1.call(Job.java:335) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:103) ~[gax-2.28.1.jar:2.28.1] at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at com.google.cloud.bigquery.Job.waitFor(Job.java:244) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] at io.airbyte.integrations.destination.bigquery.typing_deduping.BigQueryDestinationHandler.execute(BigQueryDestinationHandler.java:63) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.base.destination.typing_deduping.DefaultTyperDeduper.typeAndDedupe(DefaultTyperDeduper.java:100) ~[io.airbyte.airbyte-integrations.bases-base-typing-deduping-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$6(BigQueryStagingConsumerFactory.java:243) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:306) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:272) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] 
at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:157) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-24.0.2.jar:?] at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:455) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-24.0.2.jar:?] Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request GET https://www.googleapis.com/bigquery/v2/projects/lastmile-prod/queries/job_VVD1tFyAT-iDvP8vq7LlvcJHxbTg?location=EU&maxResults=0&prettyPrint=false { "code": 400, "errors": [ { "domain": "global", "location": "q", "locationType": "parameter", "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "reason": "invalidQuery" } ], "message": "Query error: Number of arguments does not match for function ARRAY_CONCAT. Supported signature: ARRAY_CONCAT(ARRAY, [ARRAY, ...]) at [17:3]", "status": "INVALID_ARGUMENT" } at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111) ~[google-http-client-1.43.1.jar:1.43.1] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565) ~[google-api-client-1.31.5.jar:1.31.5] at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:692) ~[google-cloud-bigquery-2.27.0.jar:2.27.0] ... 25 more
2023-08-04 16:01:42 destination > Destination process done (exit code 1)
2023-08-04 16:01:43 destination > Skipping in-connector normalization
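Reading of the failure, based on this log alone: typing-and-deduping statement 7b1115eb for `eu_poow_ds`.`pickuppoints_bookings` selects `array_concat( ) as _airbyte_cast_errors` with zero arguments. The generator builds one error-check array per typed column and concatenates them, so a stream whose catalog schema declares no typed top-level columns, which appears to be the case for `pickup-points.bookings`, yields an empty argument list that BigQuery rejects; the reported position `[17:3]` appears to point at that `array_concat( )` call. A minimal BigQuery Standard SQL sketch of the problem, and of the kind of output a column-less stream would need instead (illustrative only, not the connector's actual fix):

-- Should reproduce the error above: ARRAY_CONCAT requires at least one array argument.
SELECT ARRAY_CONCAT() AS _airbyte_cast_errors;
-- Query error: Number of arguments does not match for function ARRAY_CONCAT.

-- A typed empty array literal is a valid way to express "no cast errors":
SELECT ARRAY<STRING>[] AS _airbyte_cast_errors;

Likely remediations, roughly in order of effort: refresh the source schema so the `bookings` stream actually discovers its fields, deselect the column-less stream from the connection, or upgrade airbyte/destination-bigquery beyond 1.7.2, where this typing/deduping edge case has presumably been addressed in a later release.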