sink/codec(cdc): add message size check and support schema default #5451
Conversation
[REVIEW NOTIFICATION] This pull request has been approved by:
To complete the pull request process, please ask the reviewers in the list to review. The full list of commands accepted by this bot can be found here. Reviewers can indicate their review by submitting an approval review.
/run-all-tests
/run-all-tests
/run-all-tests
LGTM.
Avro doesn't fit into the current IT framework, which should definitely be improved in the future. But I have done some manual e2e checks.
Co-authored-by: zhaoxinyu <zhaoxinyu512@gmail.com>
/merge
This pull request has been accepted and is ready to merge. Commit hash: 7a80b61
What problem does this PR solve?
Issue Number: ref #5338
What's changed and why?
- There is a `max-message-bytes` limitation when using the kafka sink, and avro needs to respect it (see the size-check sketch after this list).
- Avro supports a `default` property for record fields; add support for it in TiCDC. For TiCDC this doesn't matter for DML changes, since TiCDC calls `getDefaultValueOrZeroValue` for DML changes, so we always have a value. But it is important for DDL (schema changes): optional and non-optional columns give different results when the schema is registered with the Confluent Schema Registry (see https://docs.confluent.io/platform/current/schema-registry/avro.html#summary), and losing the default definition might fail the compatibility check (see the schema sketch after this list).
- `goavro` doesn't support setting a default value for logical types, so there is a known limitation that the `DECIMAL` type will lose its default definition.
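As a rough illustration of the size check, here is a minimal Go sketch that rejects an encoded payload larger than the configured `max-message-bytes`. This is not TiCDC's actual code: the helper `checkMessageSize`, the sentinel error, and the 1 MiB limit are invented for the example.

```go
package main

import (
	"errors"
	"fmt"
)

// errMessageTooLarge is a hypothetical sentinel error for payloads that
// exceed the configured max-message-bytes limit.
var errMessageTooLarge = errors.New("encoded message exceeds max-message-bytes")

// checkMessageSize is a hypothetical helper: it rejects any encoded payload
// that would not fit into a single Kafka message of maxMessageBytes bytes.
func checkMessageSize(payload []byte, maxMessageBytes int) error {
	if len(payload) > maxMessageBytes {
		return fmt.Errorf("%w: size %d, limit %d",
			errMessageTooLarge, len(payload), maxMessageBytes)
	}
	return nil
}

func main() {
	encoded := make([]byte, 2<<20) // stand-in for an Avro-encoded change event (2 MiB)
	maxMessageBytes := 1 << 20     // assume a 1 MiB limit configured on the sink

	if err := checkMessageSize(encoded, maxMessageBytes); err != nil {
		fmt.Println("refusing to produce:", err)
		return
	}
	fmt.Println("message fits, safe to send to Kafka")
}
```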
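To illustrate the `default` property itself, here is a hypothetical Avro schema for a table with a nullable column that carries an explicit default, validated with `goavro`. The record and field names are invented and this is not the schema TiCDC generates; note that, per the limitation above, a logical type such as `DECIMAL` could not carry a default this way.

```go
package main

import (
	"fmt"

	"github.com/linkedin/goavro/v2"
)

func main() {
	// Hypothetical schema: "c" is an optional (nullable) column with an
	// explicit default, so schema changes that add or drop it can stay
	// backward compatible in Confluent Schema Registry.
	schema := `{
		"type": "record",
		"name": "example_table",
		"fields": [
			{"name": "id", "type": "long"},
			{"name": "c", "type": ["null", "string"], "default": null}
		]
	}`

	// goavro parses and validates the schema, including the default clause.
	codec, err := goavro.NewCodec(schema)
	if err != nil {
		panic(err)
	}
	fmt.Println("schema accepted:", codec.Schema())
}
```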
Release note