Avro schema incompatible with schema upload vs python producer #9571
This might be a duplicate of #8510. I ran into the same issue using the Java client.

Primary issue: I believe that the above two schema versions are not identical; they likely contain a whitespace difference. Unfortunately, Pulsar seems to consider two schema definitions that differ only by whitespace to be different, and incompatible. If aligning the whitespace of the schema definitions resolves your problem, please mark this ticket as a duplicate of #8510.

You are right to disable auto updates. If the above worked, you don't have to read the text below. For your reference, I'm sharing how I set the various schema policies. I've not had anyone review these policy settings, so if you see something dubious, feel free to ask me to elaborate.
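To illustrate the whitespace point above, here is a minimal sketch (the schema strings are hypothetical stand-ins for the two versions in this thread): two Avro schema definitions can differ byte-for-byte while being structurally identical once parsed, which is exactly the kind of difference a naive string comparison would flag.

```python
import json

# Two hypothetical Avro schema definitions that differ only in whitespace.
uploaded = '{"name": "Validation", "type": "record", "fields": [{"name": "messagetimestamp", "type": "string"}]}'
from_client = '{"name":"Validation","type":"record","fields":[{"name":"messagetimestamp","type":"string"}]}'

# As raw strings they differ, so a byte-wise comparison fails.
print(uploaded == from_client)  # False

# After parsing, the two definitions are structurally identical.
print(json.loads(uploaded) == json.loads(from_client))  # True
```

This is only a demonstration of the mismatch, not Pulsar's actual compatibility check, which involves more than string or JSON equality.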
The schema related policies for my namespace look like:
I'm not working on Pulsar this sprint, but I've added a task to the sprint to do this testing. I'll post an update here once the testing is done.
I downloaded Pulsar 2.7.1 and checked one case. I uploaded an Avro schema through pulsar-admin; the schema had an extra newline in it. In Pulsar 2.7.0, that newline made it impossible for the producer application to publish to the topic with that schema registered. The client-side error I got was:
In Pulsar 2.7.1, the extra newline made no difference: producing with a schema without that extra newline succeeded!
Retracting the previous comment. I got the Python client to work only against required float fields; I was unable to get it to work against required string fields. While testing, I fixed two bugs locally, and there seems to be another bug concerning string fields. All of the bugs produce the "IncompatibleSchema" error. It's late today, and I'll resume testing tomorrow.
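One plausible source of string-field mismatches (an assumption on my part, not something confirmed in this thread) is that Avro's JSON encoding allows a primitive field type to be spelled either as a bare string ("string") or as a wrapped object ({"type": "string"}), and different clients may emit different spellings. A sketch of normalizing both forms before comparing:

```python
import json

def normalize_type(t):
    # Collapse the wrapped form {"type": "string"} to the bare form "string".
    # Avro's JSON encoding permits both spellings for primitive types; a
    # structural comparison should treat them as equal. (Illustration only;
    # real schema compatibility checking is much richer than this.)
    if isinstance(t, dict) and set(t) == {"type"}:
        return normalize_type(t["type"])
    if isinstance(t, list):  # union types, e.g. ["null", "string"]
        return [normalize_type(x) for x in t]
    return t

def normalize_schema(schema_json):
    schema = json.loads(schema_json)
    for field in schema.get("fields", []):
        field["type"] = normalize_type(field["type"])
    return schema

# Same record, two spellings of the string type.
a = '{"name":"Validation","type":"record","fields":[{"name":"messagetimestamp","type":"string"}]}'
b = '{"name":"Validation","type":"record","fields":[{"name":"messagetimestamp","type":{"type":"string"}}]}'
print(normalize_schema(a) == normalize_schema(b))  # True
```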
I went deeper today and now believe that the original issue, in its exact form, was resolved by #9612. Here are all the cases I got working, including workarounds where I encountered bugs. All schemas were uploaded through pulsar-admin, and producers and consumers were disallowed from updating schemas. CAVEAT: Pulsar does not support
@codelipenghui @congbobo184 While the original issue was fixed, I encountered three other bugs that also resulted in an IncompatibleSchema error. Consider the workarounds that I implemented locally:
which overrides
So this ticket can be used to track
Filed
The patch posted in #9571 (comment) is a superset of the patch posted in #10174.
This issue has had no activity for 30 days; marking it with the Stale label.
Describe the bug
Hi, I just want to know whether I'm doing something wrong, or whether someone has identified the following behavior (and found a solution for it).
I create a topic and upload the Avro schema as YAML.
Content of the Avro YAML schema:
{"type": "AVRO","schema":"{\"name\":\"Validation\",\"type\":\"record\",\"fields\":[{\"name\":\"messagetimestamp\",\"type\":\"string\"}]}","properties": {}}
The schema looks like this when fetched with schemas get:
set-is-allow-auto-update-schema is set to disabled (!) to ensure that no one is "destroying" our topic,
but if I now use a Python producer, it fails with "incompatible schema".
Python code:
The same behavior occurs with Python Pulsar Functions.
If I turn on auto schema update, a new schema is created, but with exactly the same structure:
Any idea why this happens, and how can I avoid multiple producers (written in Java and Python) changing the schema all the time?
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The schema version shouldn't be updated if the Python schema is the same as the one given in the schema upload (could it be that there is some hidden character involved?).
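One quick way to check the hidden-character hypothesis (the strings below are hypothetical; substitute the actual uploaded and client-generated schema definitions) is to print both with repr(), which makes invisible characters such as trailing newlines visible:

```python
# repr() exposes characters that are invisible when printed normally,
# so a stray newline or tab in the uploaded schema shows up immediately.
uploaded = '{"name":"Validation","type":"record"}\n'   # trailing newline slipped in
produced = '{"name":"Validation","type":"record"}'

print(repr(uploaded))   # the \n is visible here
print(repr(produced))

# Once surrounding whitespace is stripped, the two strings match.
print(uploaded.strip() == produced)  # True
```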