Table column type bigint is converted to AVRO decimal logical type but should be converted to long type #4420
Hi, according to https://avro.apache.org/docs/1.11.0/spec.html#schema_primitive, Avro's `long` is a 64-bit signed integer, so I don't think this works with TiDB's `BIGINT UNSIGNED`. Looking at the code, this should work fine for plain `BIGINT`, which maps to `long`. In the code there is this:

and

This is in
Hi @dveeden, thanks for the update. Indeed, I didn't notice that it is unsigned bigint. In that case, I think using Avro decimal is a legitimate option. May I ask whether TiCDC has a configuration similar to Debezium's, where the user can configure whether Avro
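For comparison, Debezium's MySQL connector exposes a `bigint.unsigned.handling.mode` option (with values `long` and `precise`) that controls this exact trade-off. A hedged connector-config sketch is below; the connector name, hostnames, and ports are placeholders, and the option values should be verified against the Debezium documentation for the version in use:

```json
{
  "name": "mysql-connector-example",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "mysql.example.internal",
    "database.port": "3306",
    "bigint.unsigned.handling.mode": "precise"
  }
}
```

With `long`, unsigned bigint values are emitted as Avro `long` (lossy above 2^63 - 1); with `precise`, they are emitted as a decimal-style type that preserves the full range.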
Shall I close this ticket?
@keweishang I think it's OK to keep this issue open because we haven't made a decision about the configuration yet.
@lonng thanks for the update. Regarding the default behavior of using the Avro decimal type, I was also wondering if the precision should be
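For reference on precision: the largest `BIGINT UNSIGNED` value, 18446744073709551615 (2^64 - 1), has 20 decimal digits, so a lossless decimal mapping needs at least precision 20 with scale 0. A sketch of what such an Avro record field could look like (the field name `id` is taken from this issue; the exact shape TiCDC emits may differ):

```json
{
  "name": "id",
  "type": {
    "type": "bytes",
    "logicalType": "decimal",
    "precision": 20,
    "scale": 0
  }
}
```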
What did you do?

Replicated a table `employees` whose `id` column is `bigint`.

What did you expect to see?

The AVRO type in Kafka Schema Registry for the `id` column is the `long` AVRO type.

What did you see instead?

The AVRO type in Kafka Schema Registry for the `id` column is the `decimal` AVRO logical type; it should be `long` instead.

Versions of the cluster
Upstream TiDB cluster version (execute `SELECT tidb_version();` in a MySQL client):

TiCDC version (execute `cdc version`): 5.3.0