Environment

JIRA 0.3.3

Logs
File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 180, in _read_stream for record in record_iterator: File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 290, in _read_full_refresh for record_data_or_message in record_data_or_messages: File "/airbyte/integration_code/source_jira/streams.py", line 192, in read_records for board in super().read_records(**kwargs): File "/airbyte/integration_code/source_jira/streams.py", line 90, in read_records yield from super().read_records(**kwargs) File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 413, in read_records yield from self._read_pages( File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 430, in _read_pages yield from records_generator_fn(request, response, stream_state, stream_slice) File "/airbyte/integration_code/source_jira/streams.py", line 81, in parse_response yield self.transform(record=record, **kwargs) File "/airbyte/integration_code/source_jira/streams.py", line 197, in transform record["projectId"] = str(record["location"]["projectId"]) KeyError: 'projectId' ,retryable=<null>,timestamp=1674202970240], io.airbyte.config.FailureReason@1da8c66[failureOrigin=source,failureType=<null>,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@5f7edc3f[additionalProperties={attemptNumber=2, jobId=165, connector_command=read}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped! at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) at java.base/java.lang.Thread.run(Thread.java:1589) Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped! at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$6(DefaultReplicationWorker.java:385) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ... 3 more Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled. at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:151) at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$6(DefaultReplicationWorker.java:383) ... 4 more ,retryable=<null>,timestamp=1674202970589]]] 2023-01-20 08:23:11 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):212 - Sync summary length: 14473 2023-01-20 08:23:11 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal heartbeating... 
2023-01-20 08:23:11 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to get normalization statuses 2023-01-20 08:23:11 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/165/2/logs.log 2023-01-20 08:23:11 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.26 2023-01-20 08:23:11 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation 2023-01-20 08:23:11 INFO i.a.c.i.LineGobbler(voidCall):114 - 2023-01-20 08:23:11 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):125 - Running with normalization version: airbyte/normalization:0.2.25 2023-01-20 08:23:11 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START DEFAULT NORMALIZATION ----- 2023-01-20 08:23:11 INFO i.a.w.p.KubeProcessFactory(create):96 - Attempting to start pod = normalization-normalize-165-2-oduty for airbyte/normalization:0.2.25 with resources io.airbyte.config.ResourceRequirements@2a675161[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] 2023-01-20 08:23:11 INFO i.a.c.i.LineGobbler(voidCall):114 - 2023-01-20 08:23:11 INFO i.a.w.p.KubeProcessFactory(create):99 - normalization-normalize-165-2-oduty stdoutLocalPort = 9012 2023-01-20 08:23:11 INFO i.a.w.p.KubeProcessFactory(create):102 - normalization-normalize-165-2-oduty stderrLocalPort = 9013 2023-01-20 08:23:11 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$10):602 - Creating stdout socket server... 2023-01-20 08:23:11 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):620 - Creating stderr socket server... 2023-01-20 08:23:11 INFO i.a.w.p.KubePodProcess(<init>):533 - Creating pod normalization-normalize-165-2-oduty... 2023-01-20 08:23:11 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):313 - Waiting for init container to be ready before copying files... 2023-01-20 08:23:11 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):317 - Init container present.. 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):320 - Init container ready.. 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(<init>):564 - Copying files... 
2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):262 - Uploading file: destination_config.json 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):270 - kubectl cp /tmp/0211a3fe-e9e5-490f-adab-ff279e695ef8/destination_config.json airbyte/normalization-normalize-165-2-oduty:/config/destination_config.json -c init 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):273 - Waiting for kubectl cp to complete 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):287 - kubectl cp complete, closing process 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):262 - Uploading file: destination_catalog.json 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):270 - kubectl cp /tmp/6a9978cc-7e03-4b93-ac40-4561251f6a8b/destination_catalog.json airbyte/normalization-normalize-165-2-oduty:/config/destination_catalog.json -c init 2023-01-20 08:23:12 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):273 - Waiting for kubectl cp to complete 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):287 - kubectl cp complete, closing process 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):262 - Uploading file: FINISHED_UPLOADING 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):270 - kubectl cp /tmp/0e3508b0-0b1e-4803-897e-d68f047f9bbf/FINISHED_UPLOADING airbyte/normalization-normalize-165-2-oduty:/config/FINISHED_UPLOADING -c init 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):273 - Waiting for kubectl cp to complete 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):287 - kubectl cp complete, closing process 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(<init>):567 - Waiting until pod is ready... 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$10):611 - Setting stdout... 2023-01-20 08:23:13 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):623 - Setting stderr... 2023-01-20 08:23:14 INFO i.a.w.p.KubePodProcess(<init>):583 - Reading pod IP... 2023-01-20 08:23:14 INFO i.a.w.p.KubePodProcess(<init>):585 - Pod IP: 10.10.3.123 2023-01-20 08:23:14 INFO i.a.w.p.KubePodProcess(<init>):592 - Using null stdin output stream... 2023-01-20 08:23:17 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 - 2023-01-20 08:23:17 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 - 2023-01-20 08:22:49 source > Syncing stream: boards 2023-01-20 08:22:49 destination > Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2023-01-20 08:22:49 destination > Command: WRITE 2023-01-20 08:22:49 destination > Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'} 2023-01-20 08:22:49 destination > Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-01-20 08:22:49 destination > Unknown keyword airbyte_secret - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-01-20 08:22:49 destination > Selected loading method is set to: GCS 2023-01-20 08:22:49 destination > S3 format config: {"format_type":"AVRO","flattening":"No flattening"} 2023-01-20 08:22:49 destination > All tmp files GCS will be kept in bucket when replication is finished 2023-01-20 08:22:49 destination > Creating BigQuery staging message consumer with staging ID 3b01e032-7d00-472d-82b9-baa3fa7f7245 at 2023-01-20T08:22:47.587Z 2023-01-20 08:22:49 destination > BigQuery write config: BigQueryWriteConfig[streamName=boards, namespace=ing_jira, datasetId=ing_jira, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_wjb_boards}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_raw_boards}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-01-20 08:22:49 destination > BigQuery write config: BigQueryWriteConfig[streamName=board_issues, namespace=ing_jira, datasetId=ing_jira, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_hek_board_issues}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_raw_board_issues}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2023-01-20 08:22:49 destination > BigQuery write config: BigQueryWriteConfig[streamName=projects, namespace=ing_jira, datasetId=ing_jira, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_zok_projects}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_raw_projects}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-01-20 08:22:49 destination > BigQuery write config: BigQueryWriteConfig[streamName=sprints, namespace=ing_jira, datasetId=ing_jira, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_brb_sprints}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_raw_sprints}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=overwrite, stagedFiles=[]] 2023-01-20 08:22:49 destination > BigQuery write config: 
BigQueryWriteConfig[streamName=sprint_issues, namespace=ing_jira, datasetId=ing_jira, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_kuy_sprint_issues}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_raw_sprint_issues}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2023-01-20 08:22:49 destination > class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started. 2023-01-20 08:22:49 destination > Preparing tmp tables in destination started for 5 streams 2023-01-20 08:22:49 destination > Creating dataset ing_jira 2023-01-20 08:22:49 destination > Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_zok_projects}} 2023-01-20 08:22:49 destination > Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=ing_jira, tableId=_airbyte_tmp_zok_projects}} 2023-01-20 08:22:49 destination > Creating staging path for stream projects (dataset ing_jira): ingestion/ing_jira_projects/2023/01/20/08/3b01e032-7d00-472d-82b9-baa3fa7f7245/ 2023-01-20 08:22:50 destination > Storage Object airbyte-ingestion-production/ingestion/ing_jira_projects/2023/01/20/08/3b01e032-7d00-472d-82b9-baa3fa7f7245/ does not exist in bucket; creating... 2023-01-20 08:22:50 source > Encountered an exception while reading stream boards Traceback (most recent call last): File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 111, in read yield from self._read_stream( File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 180, in _read_stream for record in record_iterator: File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 290, in _read_full_refresh for record_data_or_message in record_data_or_messages: File "/airbyte/integration_code/source_jira/streams.py", line 192, in read_records for board in super().read_records(**kwargs): File "/airbyte/integration_code/source_jira/streams.py", line 90, in read_records yield from super().read_records(**kwargs) File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 413, in read_records yield from self._read_pages( File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 430, in _read_pages yield from records_generator_fn(request, response, stream_state, stream_slice) File "/airbyte/integration_code/source_jira/streams.py", line 81, in parse_response yield self.transform(record=record, **kwargs) File "/airbyte/integration_code/source_jira/streams.py", line 197, in transform record["projectId"] = str(record["location"]["projectId"]) KeyError: 'projectId'
Steps to Reproduce

My config

Are you willing to submit a PR?

No
Fix made in #21802
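For anyone hitting this before the fix lands: the failing line in `source_jira/streams.py` indexes `record["location"]["projectId"]` unconditionally, but the Jira Agile API only includes a `projectId` under `location` for boards that live in a project, so other boards (user-located boards, for example) raise the `KeyError` shown in the logs. Below is a minimal sketch of a guarded transform, written as a standalone function for illustration; the actual change in #21802 may differ.

```python
from typing import Any, MutableMapping


def transform(record: MutableMapping[str, Any]) -> MutableMapping[str, Any]:
    """Copy location.projectId to a top-level projectId field, if present.

    Boards that are not located under a project come back without a
    location.projectId, so guard the lookup instead of indexing blindly.
    """
    project_id = record.get("location", {}).get("projectId")
    if project_id is not None:
        record["projectId"] = str(project_id)
    return record


# A board record with no projectId in its location now passes through unchanged:
print(transform({"id": 1, "location": {"type": "user"}}))
```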