
Bump of Avro from 1.11.3 to 1.12.0 causes deserialization of messages with array of elements using logicalType to fail #43084

Closed
bechhansen opened this issue Sep 6, 2024 · 9 comments
Labels
kind/bug-thirdparty Bugs that are caused by third-party components and not causing a major dysfunction of core Quarkus.

Comments

@bechhansen

bechhansen commented Sep 6, 2024

Describe the bug

After updating from Quarkus 3.13.x to 3.14.x, we are no longer able to deserialize messages containing arrays of elements that use a logicalType.

This is an example of an Avro schema causing issues:

{
  "type": "record",
  "name": "TestObject",
  "doc": "Test",
  "namespace": "com.test.avro",
  "fields": [
    {
      "name": "Timestamps",
      "type": {
        "type": "array",
        "items": {
          "type": "long",
          "logicalType": "timestamp-millis"
        },
        "default": []
      }
    }
  ]
}

It looks like the issue might have been introduced by this PR: apache/avro#2389
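
Here is a minimal, Kafka-free sketch that should reproduce it against plain Avro 1.12.0 (I have not verified this exact snippet; it assumes the schema above and the standard timestamp-millis conversion registered on a GenericData instance). On Avro 1.11.3 the same round trip completes:

import java.io.ByteArrayOutputStream;
import java.time.Instant;
import java.util.List;

import org.apache.avro.Schema;
import org.apache.avro.data.TimeConversions;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class ArrayLogicalTypeRepro {

    public static void main(String[] args) throws Exception {
        // The schema from this report: an array of long/timestamp-millis.
        Schema schema = new Schema.Parser().parse("""
            {
              "type": "record",
              "name": "TestObject",
              "namespace": "com.test.avro",
              "fields": [
                { "name": "Timestamps",
                  "type": { "type": "array",
                            "items": { "type": "long", "logicalType": "timestamp-millis" } },
                  "default": [] }
              ]
            }
            """);

        // Register the standard conversion so timestamp-millis values are handled as Instant.
        GenericData data = new GenericData();
        data.addLogicalTypeConversion(new TimeConversions.TimestampMillisConversion());

        GenericRecord record = new GenericData.Record(schema);
        record.put("Timestamps", List.of(Instant.now()));

        // Serialize the record to a byte array.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema, data).write(record, encoder);
        encoder.flush();

        // Deserialize it again. With Avro 1.12.0 this is where the
        // "class java.time.Instant cannot be cast to class java.lang.Long" error
        // is expected, thrown from PrimitivesArrays$LongArray.add; with 1.11.3 it succeeds.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord copy =
            new GenericDatumReader<GenericRecord>(schema, schema, data).read(null, decoder);
        System.out.println(copy);
    }
}

If that holds, the regression is in Avro's new primitive-backed arrays rather than in anything Quarkus or Apicurio specific.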

Expected behavior

We expect to be able to deserialize the same message with both Quarkus 3.13.x and 3.14.x.

Actual behavior

Our Avro messages cannot be deserialized. We get this error:

2024-09-06 14:14:44,674 ERROR [io.sma.rea.mes.kafka] (smallrye-kafka-consumer-thread-0) SRMSG18249: Unable to recover from the deserialization failure (topic: testobject), configure a DeserializationFailureHandler to recover from errors.: java.lang.ClassCastException: class java.time.Instant cannot be cast to class java.lang.Long (java.time.Instant and java.lang.Long are in module java.base of loader 'bootstrap')
        at org.apache.avro.generic.PrimitivesArrays$LongArray.add(PrimitivesArrays.java:132)
        at java.base/java.util.AbstractList.add(AbstractList.java:113)
        at org.apache.avro.generic.GenericDatumReader.addToArray(GenericDatumReader.java:333)
        at org.apache.avro.generic.GenericDatumReader.readArray(GenericDatumReader.java:294)
        at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:184)
        at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:181)
        at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:248)
        at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:168)
        at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
        at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
        at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:154)
        at io.apicurio.registry.serde.avro.AvroKafkaDeserializer.readData(AvroKafkaDeserializer.java:117)
        at io.apicurio.registry.serde.AbstractKafkaDeserializer.readData(AbstractKafkaDeserializer.java:142)
        at io.apicurio.registry.serde.AbstractKafkaDeserializer.deserialize(AbstractKafkaDeserializer.java:122)
        at io.smallrye.reactive.messaging.kafka.fault.DeserializerWrapper.lambda$deserialize$1(DeserializerWrapper.java:77)
        at io.smallrye.reactive.messaging.kafka.fault.DeserializerWrapper.wrapDeserialize(DeserializerWrapper.java:109)
        at io.smallrye.reactive.messaging.kafka.fault.DeserializerWrapper.deserialize(DeserializerWrapper.java:77)
        at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:73)
        at org.apache.kafka.clients.consumer.internals.CompletedFetch.parseRecord(CompletedFetch.java:321)
        at org.apache.kafka.clients.consumer.internals.CompletedFetch.fetchRecords(CompletedFetch.java:283)
        at org.apache.kafka.clients.consumer.internals.FetchCollector.fetchRecords(FetchCollector.java:168)
        at org.apache.kafka.clients.consumer.internals.FetchCollector.collectFetch(FetchCollector.java:134)
        at org.apache.kafka.clients.consumer.internals.Fetcher.collectFetch(Fetcher.java:145)
        at org.apache.kafka.clients.consumer.internals.LegacyKafkaConsumer.pollForFetches(LegacyKafkaConsumer.java:693)
        at org.apache.kafka.clients.consumer.internals.LegacyKafkaConsumer.poll(LegacyKafkaConsumer.java:617)
        at org.apache.kafka.clients.consumer.internals.LegacyKafkaConsumer.poll(LegacyKafkaConsumer.java:590)
        at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:874)
        at io.smallrye.reactive.messaging.kafka.impl.ReactiveKafkaConsumer.lambda$poll$4(ReactiveKafkaConsumer.java:199)
        at io.smallrye.context.impl.wrappers.SlowContextualFunction.apply(SlowContextualFunction.java:21)
        at io.smallrye.mutiny.operators.uni.UniOnItemTransform$UniOnItemTransformProcessor.onItem(UniOnItemTransform.java:36)
        at io.smallrye.mutiny.operators.uni.UniOperatorProcessor.onItem(UniOperatorProcessor.java:47)
        at io.smallrye.mutiny.operators.uni.UniMemoizeOp.forwardTo(UniMemoizeOp.java:123)
        at io.smallrye.mutiny.operators.uni.UniMemoizeOp.subscribe(UniMemoizeOp.java:67)
        at io.smallrye.mutiny.operators.AbstractUni.subscribe(AbstractUni.java:36)
        at io.smallrye.mutiny.operators.uni.UniRunSubscribeOn.lambda$subscribe$0(UniRunSubscribeOn.java:27)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        at java.base/java.lang.Thread.run(Thread.java:1583)

How to Reproduce?

Unzip this example and run mvn test. Observe that the test fails.
kafka-avro-schema-quickstart.zip

Change the Quarkus version to 3.13.1 and rerun to see the test pass.

Output of uname -a or ver

MINGW64_NT-10.0-19045 xxxxxxxx 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys

Output of java -version

openjdk version "21.0.2" 2024-01-16 LTS
OpenJDK Runtime Environment Zulu21.32+17-CA (build 21.0.2+13-LTS)
OpenJDK 64-Bit Server VM Zulu21.32+17-CA (build 21.0.2+13-LTS, mixed mode, sharing)

Quarkus version or git rev

3.14.1

Build tool (ie. output of mvnw --version or gradlew --version)

Apache Maven 3.9.2 (c9616018c7a021c1c39be70fb2843d6f5f9b8a1c)

Additional information

No response

@bechhansen bechhansen added the kind/bug Something isn't working label Sep 6, 2024
@cescoffier
Member

My guess is that the Apicurio SERDE might need to be updated.

@alesj @carlesarnal any idea?

@cescoffier
Member

Unless there is a new configuration to set, I don't think we will be able to fix it or work around it in Quarkus.

@cescoffier cescoffier added triage/upstream and removed kind/bug Something isn't working labels Sep 6, 2024
@geoand geoand added kind/bug Something isn't working and removed triage/needs-triage labels Sep 8, 2024
@carlesarnal
Contributor

Sorry, I've been on PTO, looking into this now to see what can be done.

@scherrsasrf

We are also suffering from this and are currently stuck on 3.13.3. Are there any possible workarounds?
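
(One stopgap we are considering, entirely unverified: pinning Avro back to 1.11.3 from our own pom via dependencyManagement, which should take precedence over the version managed by the Quarkus BOM. Whether the Apicurio serde shipped with 3.14.x tolerates the downgrade is an open question.)

<dependencyManagement>
  <dependencies>
    <!-- Unverified stopgap: pin Avro back to the version used by Quarkus 3.13.x -->
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>1.11.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>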

@gsmet gsmet added kind/bug-thirdparty Bugs that are caused by third-party components and not causing a major dysfunction of core Quarkus. and removed triage/upstream kind/bug Something isn't working labels Nov 26, 2024
@scherrsasrf

This seems to be the relevant ticket from avro side: https://issues.apache.org/jira/browse/AVRO-4039

@carlesarnal
Contributor

The upcoming Apicurio Registry release will fix this issue. I will comment here once it's done with the details.

@carlesarnal
Contributor

Quarkus has been upgraded on main to the new Apicurio Registry release 2.6.6.Final.
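
For anyone who needs the fix before it reaches a released Quarkus version, it may be possible to force the newer serde from the application pom. This is only a sketch: the coordinates below assume the Apicurio Registry 2.x Avro serde artifact, and compatibility with a given Quarkus release is not guaranteed.

<dependencyManagement>
  <dependencies>
    <!-- Sketch: override the Apicurio Registry Avro serde to the release containing the fix -->
    <dependency>
      <groupId>io.apicurio</groupId>
      <artifactId>apicurio-registry-serdes-avro-serde</artifactId>
      <version>2.6.6.Final</version>
    </dependency>
  </dependencies>
</dependencyManagement>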

@cescoffier
Member

@carlesarnal do you think it's something we could backport to the 3.15 branch of Quarkus (it's the LTS branch)?

BTW, should we close this issue?

@carlesarnal
Contributor

I think so, yes. The configuration values are the same, and the options mentioned in the related threads were basically workarounds, so I think backporting the upgrade makes sense.

As for closing the issue, yes, I think it can be closed since the upgrade is on main.
