
Throw TrinoException on AWS errors in GlueHiveMetastore.updateTable #7528

GitHub Actions / Test Report failed Aug 26, 2024 in 0s

104361 tests run, 4262 skipped, 6 failed.
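The change under test translates AWS SDK failures in `GlueHiveMetastore.updateTable` into `TrinoException` instead of leaking the raw client error. A minimal sketch of that wrap-and-rethrow pattern, using stand-in exception classes rather than the real Trino SPI and AWS SDK types:

```java
// Hedged sketch of the wrap-and-rethrow pattern named in the PR title.
// AwsServiceException and TrinoLikeException are simplified stand-ins,
// not the real AWS SDK or io.trino.spi.TrinoException classes.
public class UpdateTableErrorTranslation {
    static class AwsServiceException extends RuntimeException {
        AwsServiceException(String message) { super(message); }
    }

    static class TrinoLikeException extends RuntimeException {
        TrinoLikeException(String message, Throwable cause) { super(message, cause); }
    }

    static void updateTable(Runnable glueCall) {
        try {
            glueCall.run();
        }
        catch (AwsServiceException e) {
            // Wrap the AWS failure so the engine sees a classified Trino error
            // instead of an opaque client exception
            throw new TrinoLikeException("Table update failed: " + e.getMessage(), e);
        }
    }

    public static void main(String[] args) {
        try {
            updateTable(() -> { throw new AwsServiceException("Glue throttled the request"); });
        }
        catch (TrinoLikeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In the real metastore code the wrapped exception would also carry a Trino error code (for example `HIVE_METASTORE_ERROR`) so that failures are classified rather than surfaced as raw AWS stack traces.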

Annotations

Check failure on line 1 in io/trino/plugin/bigquery/TestBigQueryAvroConnectorTest

TestBigQueryAvroConnectorTest.testCommentTable

Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
Raw output
io.trino.testing.QueryFailedException: Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
	at io.trino.testing.AbstractTestingTrinoClient.execute(AbstractTestingTrinoClient.java:134)
	at io.trino.testing.DistributedQueryRunner.executeInternal(DistributedQueryRunner.java:558)
	at io.trino.testing.DistributedQueryRunner.execute(DistributedQueryRunner.java:541)
	at io.trino.testing.QueryRunner.execute(QueryRunner.java:82)
	at io.trino.testing.sql.TestTable.close(TestTable.java:134)
	at io.trino.testing.BaseConnectorTest.testCommentTable(BaseConnectorTest.java:4152)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:194)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:507)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1489)
	at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:2071)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:2033)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:187)
	Suppressed: java.lang.Exception: SQL: DROP TABLE test_comment_m6ej1lktv7
		at io.trino.testing.DistributedQueryRunner.executeInternal(DistributedQueryRunner.java:565)
		... 11 more
Caused by: com.google.cloud.bigquery.BigQueryException: Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:116)
	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.deleteTable(HttpBigQueryRpc.java:383)
	at com.google.cloud.bigquery.BigQueryImpl$10.call(BigQueryImpl.java:601)
	at com.google.cloud.bigquery.BigQueryImpl$10.call(BigQueryImpl.java:598)
	at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:102)
	at com.google.cloud.RetryHelper.run(RetryHelper.java:76)
	at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
	at com.google.cloud.bigquery.BigQueryImpl.delete(BigQueryImpl.java:597)
	at io.trino.plugin.bigquery.BigQueryClient.dropTable(BigQueryClient.java:376)
	at io.trino.plugin.bigquery.BigQueryMetadata.dropTable(BigQueryMetadata.java:625)
	at io.trino.plugin.base.classloader.ClassLoaderSafeConnectorMetadata.dropTable(ClassLoaderSafeConnectorMetadata.java:458)
	at io.trino.tracing.TracingConnectorMetadata.dropTable(TracingConnectorMetadata.java:395)
	at io.trino.metadata.MetadataManager.dropTable(MetadataManager.java:1025)
	at io.trino.tracing.TracingMetadata.dropTable(TracingMetadata.java:563)
	at io.trino.execution.DropTableTask.execute(DropTableTask.java:88)
	at io.trino.execution.DropTableTask.execute(DropTableTask.java:36)
	at io.trino.execution.DataDefinitionExecution.start(DataDefinitionExecution.java:146)
	at io.trino.execution.SqlQueryManager.createQuery(SqlQueryManager.java:272)
	at io.trino.dispatcher.LocalDispatchQuery.startExecution(LocalDispatchQuery.java:150)
	at io.trino.dispatcher.LocalDispatchQuery.lambda$waitForMinimumWorkers$2(LocalDispatchQuery.java:134)
	at io.airlift.concurrent.MoreFutures.lambda$addSuccessCallback$12(MoreFutures.java:570)
	at io.airlift.concurrent.MoreFutures$3.onSuccess(MoreFutures.java:545)
	at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1137)
	at io.trino.$gen.Trino_testversion____20240826_222627_71.run(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1570)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
DELETE https://bigquery.googleapis.com/bigquery/v2/projects/sep-bq-cicd/datasets/tpch/tables/test_comment_m6ej1lktv7
{
  "code": 403,
  "errors": [
    {
      "domain": "usageLimits",
      "message": "Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas",
      "reason": "rateLimitExceeded"
    }
  ],
  "message": "Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas",
  "status": "PERMISSION_DENIED"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$3.interceptResponse(AbstractGoogleClientRequest.java:479)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:565)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:506)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:616)
	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.deleteTable(HttpBigQueryRpc.java:380)
	... 25 more
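The BigQuery failure above is a quota error (HTTP 403, reason `rateLimitExceeded`) on a `DROP TABLE`, not a code defect. Tests that issue many table-metadata operations typically survive this with retries and exponential backoff; the following is a self-contained sketch of that pattern (the exception class and delays here are stand-ins, not the google-cloud-bigquery API):

```java
// Hedged sketch: retrying a rate-limited call with exponential backoff.
// RateLimitException is a stand-in for a 403 rateLimitExceeded response.
public class RateLimitRetry {
    static class RateLimitException extends RuntimeException {}

    static <T> T retry(java.util.function.Supplier<T> call, int maxAttempts)
            throws InterruptedException {
        long backoffMillis = 10; // short for the demo; real callers start much higher
        for (int attempt = 1; ; attempt++) {
            try {
                return call.get();
            }
            catch (RateLimitException e) {
                if (attempt >= maxAttempts) {
                    throw e; // give up after the last attempt
                }
                Thread.sleep(backoffMillis);
                backoffMillis *= 2; // double the wait before the next attempt
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        int[] calls = {0};
        String result = retry(() -> {
            calls[0]++;
            if (calls[0] < 3) {
                throw new RateLimitException(); // first two attempts are throttled
            }
            return "dropped";
        }, 5);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```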

Check failure on line 79 in io/trino/plugin/hive/TestHiveAlluxioCacheFileOperations

TestHiveAlluxioCacheFileOperations.testCacheFileOperations

Could not read table schema
Raw output
io.trino.testing.QueryFailedException: Could not read table schema
	at io.trino.testing.AbstractTestingTrinoClient.execute(AbstractTestingTrinoClient.java:134)
	at io.trino.testing.DistributedQueryRunner.executeInternal(DistributedQueryRunner.java:558)
	at io.trino.testing.DistributedQueryRunner.executeWithPlan(DistributedQueryRunner.java:547)
	at io.trino.testing.QueryAssertions.assertDistributedUpdate(QueryAssertions.java:108)
	at io.trino.testing.QueryAssertions.assertUpdate(QueryAssertions.java:62)
	at io.trino.testing.AbstractTestQueryFramework.assertUpdate(AbstractTestQueryFramework.java:420)
	at io.trino.testing.AbstractTestQueryFramework.assertUpdate(AbstractTestQueryFramework.java:415)
	at io.trino.plugin.hive.TestHiveAlluxioCacheFileOperations.testCacheFileOperations(TestHiveAlluxioCacheFileOperations.java:79)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:194)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:507)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1489)
	at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:2071)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:2033)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:187)
	Suppressed: java.lang.Exception: SQL: INSERT INTO test_cache_file_operations VALUES ('1-abc', 'p1')
		at io.trino.testing.DistributedQueryRunner.executeInternal(DistributedQueryRunner.java:565)
		... 13 more
Caused by: io.trino.spi.TrinoException: Could not read table schema
	at io.trino.plugin.hive.metastore.file.FileHiveMetastore.readFile(FileHiveMetastore.java:1406)
	at io.trino.plugin.hive.metastore.file.FileHiveMetastore.readSchemaFile(FileHiveMetastore.java:1391)
	at io.trino.plugin.hive.metastore.file.FileHiveMetastore.getTable(FileHiveMetastore.java:412)
	at io.trino.metastore.tracing.TracingHiveMetastore.lambda$getTable$2(TracingHiveMetastore.java:99)
	at io.trino.metastore.tracing.Tracing.withTracing(Tracing.java:39)
	at io.trino.metastore.tracing.TracingHiveMetastore.getTable(TracingHiveMetastore.java:99)
	at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.loadTable(CachingHiveMetastore.java:436)
	at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169)
	at io.trino.cache.EmptyCache.lambda$get$0(EmptyCache.java:58)
	at io.trino.cache.EmptyCache.get(EmptyCache.java:94)
	at io.trino.cache.EmptyCache.get(EmptyCache.java:58)
	at com.google.common.cache.AbstractLoadingCache.getUnchecked(AbstractLoadingCache.java:53)
	at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.getOptional(CachingHiveMetastore.java:285)
	at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.getTable(CachingHiveMetastore.java:431)
	at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.loadTable(CachingHiveMetastore.java:436)
	at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:169)
	at io.trino.cache.EvictableCache$TokenCacheLoader.load(EvictableCache.java:466)
	at io.trino.cache.EvictableCache$TokenCacheLoader.load(EvictableCache.java:452)
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3574)
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2316)
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189)
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079)
	at com.google.common.cache.LocalCache.get(LocalCache.java:4017)
	at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4040)
	at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4989)
	at io.trino.cache.EvictableCache.get(EvictableCache.java:147)
	at com.google.common.cache.AbstractLoadingCache.getUnchecked(AbstractLoadingCache.java:53)
	at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.getOptional(CachingHiveMetastore.java:285)
	at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.getTable(CachingHiveMetastore.java:431)
	at io.trino.plugin.hive.metastore.SemiTransactionalHiveMetastore.getTable(SemiTransactionalHiveMetastore.java:276)
	at io.trino.plugin.hive.HiveMetadata.getView(HiveMetadata.java:2893)
	at io.trino.spi.connector.ConnectorMetadata.isView(ConnectorMetadata.java:1057)
	at io.trino.plugin.base.classloader.ClassLoaderSafeConnectorMetadata.isView(ClassLoaderSafeConnectorMetadata.java:752)
	at io.trino.tracing.TracingConnectorMetadata.isView(TracingConnectorMetadata.java:888)
	at io.trino.metadata.MetadataManager.lambda$isView$37(MetadataManager.java:1492)
	at java.base/java.util.Optional.map(Optional.java:260)
	at io.trino.metadata.MetadataManager.isView(MetadataManager.java:1488)
	at io.trino.tracing.TracingMetadata.isView(TracingMetadata.java:886)
	at io.trino.sql.analyzer.StatementAnalyzer$Visitor.visitInsert(StatementAnalyzer.java:571)
	at io.trino.sql.analyzer.StatementAnalyzer$Visitor.visitInsert(StatementAnalyzer.java:520)
	at io.trino.sql.tree.Insert.accept(Insert.java:68)
	at io.trino.sql.tree.AstVisitor.process(AstVisitor.java:27)
	at io.trino.sql.analyzer.StatementAnalyzer$Visitor.process(StatementAnalyzer.java:539)
	at io.trino.sql.analyzer.StatementAnalyzer.analyze(StatementAnalyzer.java:499)
	at io.trino.sql.analyzer.StatementAnalyzer.analyze(StatementAnalyzer.java:488)
	at io.trino.sql.analyzer.Analyzer.analyze(Analyzer.java:98)
	at io.trino.sql.analyzer.Analyzer.analyze(Analyzer.java:87)
	at io.trino.execution.SqlQueryExecution.analyze(SqlQueryExecution.java:285)
	at io.trino.execution.SqlQueryExecution.<init>(SqlQueryExecution.java:218)
	at io.trino.execution.SqlQueryExecution$SqlQueryExecutionFactory.createQueryExecution(SqlQueryExecution.java:884)
	at io.trino.dispatcher.LocalDispatchQueryFactory.lambda$createDispatchQuery$0(LocalDispatchQueryFactory.java:153)
	at io.trino.$gen.Trino_testversion____20240826_223537_19079.call(Unknown Source)
	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:76)
	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1570)
Caused by: java.lang.IllegalArgumentException: Invalid JSON bytes for [simple type, class io.trino.plugin.hive.metastore.file.TableMetadata]
	at io.airlift.json.JsonCodec.fromJson(JsonCodec.java:202)
	at io.trino.plugin.hive.metastore.file.FileHiveMetastore.readFile(FileHiveMetastore.java:1399)
	... 57 more
Caused by: com.fasterxml.jackson.core.io.JsonEOFException: Unexpected end-of-input in field name
 at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 24, column: 9]
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportInvalidEOF(ParserMinimalBase.java:585)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.parseEscapedName(UTF8StreamJsonParser.java:2062)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.slowParseName(UTF8StreamJsonParser.java:1969)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._parseName(UTF8StreamJsonParser.java:1757)
	at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:757)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:430)
	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1493)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:348)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:185)
	at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:342)
	at com.fasterxml.jackson.databind.ObjectReader._bind(ObjectReader.java:2099)
	at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1249)
	at io.airlift.json.JsonCodec.fromJson(JsonCodec.java:197)
	... 58 more
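The root cause here is a table schema file that ends in the middle of a field name (`JsonEOFException` at line 24, column 9), i.e. a document truncated before the writer finished. A crude brace-and-quote balance check (not a real JSON parser) illustrates why such a document can never deserialize:

```java
// Hedged sketch: a rough completeness check for a JSON document, illustrating
// the truncation behind the JsonEOFException above. This is a heuristic for
// demonstration, not how Jackson or FileHiveMetastore validate input.
public class TruncationCheck {
    static boolean looksComplete(String json) {
        int depth = 0;
        boolean inString = false;
        for (int i = 0; i < json.length(); i++) {
            char c = json.charAt(i);
            if (inString) {
                if (c == '\\') i++;            // skip the escaped character
                else if (c == '"') inString = false;
            }
            else if (c == '"') inString = true;
            else if (c == '{' || c == '[') depth++;
            else if (c == '}' || c == ']') depth--;
        }
        // Complete only if every container closed and no string is left open
        return depth == 0 && !inString;
    }

    public static void main(String[] args) {
        System.out.println(looksComplete("{\"owner\":\"hive\",\"parameters\":{}}"));
        // Cut off inside a field name, as in the exception above
        System.out.println(looksComplete("{\"owner\":\"hive\",\"par"));
    }
}
```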

Check failure on line 1 in io/trino/plugin/iceberg/catalog/glue/TestIcebergGlueCatalogConnectorSmokeTest

TestIcebergGlueCatalogConnectorSmokeTest.testDeleteRowsConcurrently

Task 0 did not complete in time
Raw output
java.lang.IllegalStateException: Task 0 did not complete in time
	at com.google.common.base.Preconditions.checkState(Preconditions.java:589)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.lambda$testDeleteRowsConcurrently$2(BaseIcebergConnectorSmokeTest.java:160)
	at com.google.common.collect.Streams$1Splitr.tryAdvance(Streams.java:500)
	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:556)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:546)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:265)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:702)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.testDeleteRowsConcurrently(BaseIcebergConnectorSmokeTest.java:164)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1570)
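The four Iceberg timeouts all come from the same assertion: the smoke test runs concurrent DELETE queries and requires each task's future to finish within a deadline, failing with `checkState` when one does not. A simplified, self-contained sketch of that pattern (names are illustrative, not the `BaseIcebergConnectorSmokeTest` code):

```java
import java.util.concurrent.*;

// Hedged sketch of the "Task N did not complete in time" assertion pattern:
// submit concurrent tasks, then poll each future with a bounded wait.
public class TimedTaskCheck {
    public static void main(String[] args) throws Exception {
        ExecutorService executor = Executors.newFixedThreadPool(2);
        try {
            Future<String> fast = executor.submit(() -> "row deleted");
            Future<String> slow = executor.submit(() -> {
                Thread.sleep(5_000); // deliberately exceeds the deadline below
                return "row deleted";
            });
            System.out.println(resultWithin(fast, 0, 1));
            System.out.println(resultWithin(slow, 1, 1));
        }
        finally {
            executor.shutdownNow();
        }
    }

    static String resultWithin(Future<String> task, int taskId, long timeoutSeconds) {
        try {
            return task.get(timeoutSeconds, TimeUnit.SECONDS);
        }
        catch (TimeoutException e) {
            // Mirrors the Preconditions.checkState failure in the report
            return "Task " + taskId + " did not complete in time";
        }
        catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        }
    }
}
```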

Check failure on line 1 in io/trino/plugin/iceberg/catalog/glue/TestIcebergGlueCatalogConnectorSmokeTest

TestIcebergGlueCatalogConnectorSmokeTest.testDeleteRowsConcurrently

Task 1 did not complete in time
Raw output
java.lang.IllegalStateException: Task 1 did not complete in time
	at com.google.common.base.Preconditions.checkState(Preconditions.java:589)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.lambda$testDeleteRowsConcurrently$2(BaseIcebergConnectorSmokeTest.java:160)
	at com.google.common.collect.Streams$1Splitr.tryAdvance(Streams.java:500)
	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:556)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:546)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:265)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:702)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.testDeleteRowsConcurrently(BaseIcebergConnectorSmokeTest.java:164)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1570)

Check failure on line 1 in io/trino/plugin/iceberg/catalog/glue/TestIcebergGlueCatalogConnectorSmokeTest

TestIcebergGlueCatalogConnectorSmokeTest.testDeleteRowsConcurrently

Task 1 did not complete in time
Raw output
java.lang.IllegalStateException: Task 1 did not complete in time
	at com.google.common.base.Preconditions.checkState(Preconditions.java:589)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.lambda$testDeleteRowsConcurrently$2(BaseIcebergConnectorSmokeTest.java:160)
	at com.google.common.collect.Streams$1Splitr.tryAdvance(Streams.java:500)
	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:556)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:546)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:265)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:702)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.testDeleteRowsConcurrently(BaseIcebergConnectorSmokeTest.java:164)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1570)

Check failure on line 1 in io/trino/plugin/iceberg/catalog/glue/TestIcebergGlueCatalogConnectorSmokeTest

TestIcebergGlueCatalogConnectorSmokeTest.testDeleteRowsConcurrently

Task 0 did not complete in time
Raw output
java.lang.IllegalStateException: Task 0 did not complete in time
	at com.google.common.base.Preconditions.checkState(Preconditions.java:589)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.lambda$testDeleteRowsConcurrently$2(BaseIcebergConnectorSmokeTest.java:160)
	at com.google.common.collect.Streams$1Splitr.tryAdvance(Streams.java:500)
	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:556)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:546)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:265)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:702)
	at io.trino.plugin.iceberg.BaseIcebergConnectorSmokeTest.testDeleteRowsConcurrently(BaseIcebergConnectorSmokeTest.java:164)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1570)