
chore(gradle): bump com.google.protobuf:protobuf-java #1267

Status: Failure
Total duration: 3h 31m 28s
Artifacts: 20
Matrix: nightly

Annotations

7 errors
nightly (check, 17)
Process completed with exit code 1.
nightly (check, 21)
Process completed with exit code 1.
nightly (check, 23)
Process completed with exit code 1.
nightly (check, 11)
Process completed with exit code 1.
tests/test_kafka_consumer.py.test_protobuf_spec [regular]: tests/test_kafka_consumer/KafkaConsumerTestCase#L1
failed to consume a Kafka stream. : RuntimeError: java.lang.VerifyError: Bad type on operand stack
Traceback (most recent call last):
  File "/python/deephaven/stream/kafka/consumer.py", line 258, in _consume
    j_table=_JKafkaTools.consumeToTable(
RuntimeError: java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    io/confluent/kafka/schemaregistry/protobuf/ProtobufSchema.toEnum(Lcom/google/protobuf/DescriptorProtos$EnumDescriptorProto;)Lcom/squareup/wire/schema/internal/parser/EnumElement; @161: invokestatic
  Reason:
    Type 'com/google/protobuf/DescriptorProtos$EnumValueOptions' (current frame, stack[1]) is not assignable to 'com/google/protobuf/GeneratedMessageV3$ExtendableMessage'
  Current Frame:
    bci: @161
    flags: { }
    locals: { 'com/google/protobuf/DescriptorProtos$EnumDescriptorProto', 'java/lang/String', 'com/google/common/collect/ImmutableList$Builder', 'java/util/Iterator', 'com/google/protobuf/DescriptorProtos$EnumValueDescriptorProto', 'com/google/common/collect/ImmutableList$Builder' }
    stack: { 'com/google/common/collect/ImmutableList$Builder', 'com/google/protobuf/DescriptorProtos$EnumValueOptions' }
  Bytecode:
    0000000: 2ab6 014b 4cb2 001e 1301 4c2b b901 0f03
    0000010: 00b8 0063 4d2a b601 4db9 0053 0100 4e2d
    0000020: b900 5401 0099 00a6 2db9 0055 0100 c001
    0000030: 4e3a 04b8 0063 3a05 1904 b601 4fb6 0150
    0000040: 9900 25bb 0079 5912 a0b2 0083 1904 b601
    0000050: 4fb6 0151 b800 8503 b700 7d3a 0619 0519
    0000060: 06b6 0067 5719 04b6 014f b201 52b6 0153
    0000070: 9900 2a19 04b6 014f b201 52b6 0154 c000
    0000080: bd3a 0613 0155 1906 b800 bf3a 0719 07c6
    0000090: 000b 1905 1907 b600 6757 1905 1904 b601
    00000a0: 4fb8 00c0 b600 c157 2cbb 0156 59b2 0043
    00000b0: 1904 b601 5719 04b6 0158 12d3 1905 b600
    00000c0: c5b7 0159 b600 6757 a7ff 57b8 0063 4e2a
    00000d0: b601 5ab9 0053 0100 3a04 1904 b900 5401
    00000e0: 0099 0020 1904 b900 5501 00c0 015b 3a05
    00000f0: 1905 b801 5c3a 062d 1906 b600 6757 a7ff
    0000100: dc2a b601 5db9 0121 0100 3a04 1904 b900
    0000110: 5401 0099 002c 1904 b900 5501 00c0 0074
    0000120: 3a05 bb01 2259 b200 4312 d319 05b8 0123
    0000130: b701 243a 062d 1906 b600 6757 a7ff d0b8
    0000140: 0063 3a04 2ab6 015e b601 5f99 0025 bb00
    0000150: 7959 1301 60b2 0083 2ab6 015e b601 61b8
    0000160: 0085 03b7 007d 3a05 1904 1905 b600 6757
    0000170: 2ab6 015e b601 6299 0024 bb00 7959 12a0
    0000180: b200 832a b601 5eb6 0163 b800 8503 b700
    0000190: 7d3a 0519 0419 05b6 0067 572a b601 5eb2
    00001a0: 0164 b601 6599 0029 2ab6 015e b201 64b6
    00001b0: 0166 c000 bd3a 0513 0167 1905 b800 bf3a
    00001c0: 0619 06c6 000b 1904 1906 b600 6757 1904
    00001d0: 2ab6 015e b800 c0b6 00c1 57bb 0168 59b2
    00001e0: 0043 2b12 d319 04b6 00c5 2cb6 00c5 2db6
    00001f0: 00c5 b701 69b0
  Stackmap Table:
    append_frame(@31,Object[#1106],Object[#1219],Object[#1170])
    append_frame(@101,Object[#1336],Object[#1219])
    same_frame(@154)
    chop_frame(@203,3)
    append_frame(@218,Object[#1219],Object[#1170])
    chop_frame(@257,1)
    append_frame(@268,Object[#1170])
    chop_frame(@319,1)
    append_frame(@368,Object[#1219])
    same_frame(@411)
    same_frame(@462)
  at io.confluent.kafka.schemaregistry.protobuf.ProtobufSchemaProvider.parseSchemaOrElseThrow(ProtobufSchemaProvider.java:38)
  at io.confluent.kafka.schemaregistry.SchemaProvider.parseSchema(SchemaProvider.java:75)
  at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.parseSchema(CachedSchemaRegistryClient.java:301)
  at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:340)
  at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaBySubjectAndId(CachedSchemaRegistryClient.java:464)
  at io.deephaven.kafka.ProtobufImpl.descriptor(ProtobufImpl.java:230)
  at io.deephaven.kafka.ProtobufImpl$ProtobufConsumeImpl.setDescriptor(ProtobufImpl.java:117)
  at io.deephaven.kafka.ProtobufImpl$ProtobufConsumeImpl.getDeserializer(ProtobufImpl.java:101)
  at io.deephaven.kafka.KafkaTools.getConsumeStruct(KafkaTools.java:1257)
  at io.deephaven.kafka.KafkaTools.consume(KafkaTools.java:1347)
  at io.deephaven.kafka.KafkaTools.consumeToTable(KafkaTools.java:1020)
tests/test_kafka_consumer.py.test_protobuf_spec [include /foo /bar]: tests/test_kafka_consumer/KafkaConsumerTestCase#L1
failed to consume a Kafka stream. : RuntimeError: java.lang.VerifyError: Bad type on operand stack
(exception details and stack trace identical to the [regular] failure above)
tests/test_kafka_consumer.py.test_protobuf_spec [include /ts /sub/*]: tests/test_kafka_consumer/KafkaConsumerTestCase#L1
failed to consume a Kafka stream. : RuntimeError: java.lang.VerifyError: Bad type on operand stack
(exception details and stack trace identical to the [regular] failure above)
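Reading of the failure: the VerifyError looks like a binary incompatibility introduced by this bump rather than a flaky test. The verifier message says that, under the bumped protobuf-java runtime, DescriptorProtos$EnumValueOptions is no longer assignable to GeneratedMessageV3$ExtendableMessage, the supertype Confluent's schema-registry client (ProtobufSchema.toEnum) was presumably compiled against on the protobuf-java 3.x line; in the 4.x line, generated messages extend GeneratedMessage rather than GeneratedMessageV3. Below is a minimal, hypothetical Java sketch of that failing shape, not the actual Confluent source: VerifyErrorSketch and inspectOptions are illustrative names, and the scenario assumes the class is compiled against protobuf-java 3.x but run against the bumped runtime.

// Hypothetical sketch: compile against protobuf-java 3.x, run against the bumped runtime.
import com.google.protobuf.DescriptorProtos.EnumValueOptions;
import com.google.protobuf.GeneratedMessageV3;

public class VerifyErrorSketch {

    // Fine at compile time with protobuf-java 3.x, where EnumValueOptions
    // extends GeneratedMessageV3.ExtendableMessage<EnumValueOptions>.
    static void inspectOptions(GeneratedMessageV3.ExtendableMessage<?> options) {
        System.out.println(options.getUnknownFields());
    }

    public static void main(String[] args) {
        // Under a protobuf-java 4.x runtime, EnumValueOptions extends
        // GeneratedMessage.ExtendableMessage instead, so the bytecode verifier
        // rejects this method with the same message seen in the annotations:
        // "Type '...EnumValueOptions' ... is not assignable to
        //  '...GeneratedMessageV3$ExtendableMessage'".
        inspectOptions(EnumValueOptions.getDefaultInstance());
    }
}

If that reading is right, the bump cannot land until the schema-registry client on the test classpath is built against the same protobuf-java major version, or protobuf-java is held back to the 3.x line.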

Artifacts

Produced during runtime
Name                                     Size
nightly-check-java11-ci-jvm-err          46.4 KB
nightly-check-java11-ci-results          3.24 MB
nightly-check-java17-ci-jvm-err          46.4 KB
nightly-check-java17-ci-results          3.25 MB
nightly-check-java21-ci-jvm-err          46.4 KB
nightly-check-java21-ci-results          3.24 MB
nightly-check-java23-ci-jvm-err          46.4 KB
nightly-check-java23-ci-results          3.25 MB
nightly-testOutOfBand-java11-ci-results  1.13 MB
nightly-testOutOfBand-java17-ci-results  1.13 MB
nightly-testOutOfBand-java21-ci-results  1.13 MB
nightly-testOutOfBand-java23-ci-results  1.13 MB
nightly-testParallel-java11-ci-results   157 KB
nightly-testParallel-java17-ci-results   158 KB
nightly-testParallel-java21-ci-results   158 KB
nightly-testParallel-java23-ci-results   158 KB
nightly-testSerial-java11-ci-results     894 KB
nightly-testSerial-java17-ci-results     893 KB
nightly-testSerial-java21-ci-results     896 KB
nightly-testSerial-java23-ci-results     894 KB