Kafka Connect sink cannot process with empty array #457
Comments
Hi @phongbui81062, are you still running into this?
Yes, I'm still stuck on this.
@phongbui81062 Could you run this and paste the output here? I want to check something about the definition of the table...
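(The command itself isn't quoted above; a typical way to check a ClickHouse table definition, with a hypothetical table name, would be:)

```sql
-- Hypothetical: dump the table DDL and column types (table name assumed).
SHOW CREATE TABLE shipments;
DESCRIBE TABLE shipments;
```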
Here is my table:
Postgres table:
Could you try inserting data into the Postgres table with different array lengths? For example, column aftership_id with values {'asd','asd'} and carrier_17_id with an empty array.
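Something along these lines, assuming a hypothetical shipments table with text[] columns; only the array column names come from this thread:

```sql
-- Hypothetical Postgres test row: one array populated, the other empty.
INSERT INTO shipments (id, aftership_id, carrier_17_id)
VALUES (1, ARRAY['asd', 'asd'], ARRAY[]::text[]);
```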
@phongbui81062 I was able to replicate the issue; let me discuss it and see. Thanks!
Thank you.
Hi @Paultagoras, can I ask one question? What happens if I add the async_insert and wait_for_async_insert settings for the Kafka Connect sink?
If I remember right, we had a bug with RowBinaryWithDefaults: essentially it hiccups when using Avro or Protobuf. I may have figured out the issue; could you please try setting this transform instead? "transforms": "flatten",
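A complete flatten configuration would presumably look something like the following; only the "transforms": "flatten" line appears above, and the transform type and delimiter value are assumptions based on the standard Kafka Connect Flatten SMT:

```json
"transforms": "flatten",
"transforms.flatten.type": "org.apache.kafka.connect.transforms.Flatten$Value",
"transforms.flatten.delimiter": "_"
```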
Hi @Paultagoras
Let me test. Thanks a lot.
I had one problem: one of my tables (which uses the Kafka Connect sink as its pipeline) triggers a materialized view, and that MV triggers another one. When I tested with a basic insert query, the insert appeared to wait for the MV processing to finish, which took too long, around 8 minutes; after I added the async_insert and wait_for_async_insert settings, it dropped to about 10 seconds. I'm worried that if I deploy the Kafka Connect sink to production with my current design, performance will be affected. Can you suggest an alternative solution that doesn't rely on these settings?
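For reference, a sketch of the two insert modes being compared here, using a hypothetical table and row:

```sql
-- Synchronous insert: the statement only returns after the insert and
-- every dependent (chained) materialized view has been processed.
INSERT INTO shipments VALUES (1, ['asd', 'asd'], []);

-- Asynchronous insert: the data goes into a server-side buffer, and with
-- wait_for_async_insert = 0 the client returns before the buffer is
-- flushed and before the materialized views run.
INSERT INTO shipments
SETTINGS async_insert = 1, wait_for_async_insert = 0
VALUES (1, ['asd', 'asd'], []);
```

Note that with wait_for_async_insert = 0 the client gets no confirmation that the data was actually written, so delivery guarantees are weaker.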
Describe the bug
Hi everyone,
I have a case where I don't know what the problem is.
I created a Kafka Connect sink that reads CDC data from Postgres in Avro format. However, when I try to sink data into a table that contains Array(String) columns (it worked well with Array(UInt32)), I get the error SIZES_OF_ARRAYS_DONT_MATCH; it seems like ClickHouse tried to compare the lengths of the array columns.
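For illustration, a minimal sketch of the kind of target table involved, with a hypothetical table name and id column; the Array(String) column names are the ones mentioned earlier in the thread:

```sql
-- Minimal sketch: two independent Array(String) columns whose per-row
-- lengths can differ (e.g. one populated, one empty).
CREATE TABLE shipments
(
    id            UInt64,
    aftership_id  Array(String),
    carrier_17_id Array(String)
)
ENGINE = MergeTree
ORDER BY id;

-- A row of this shape (arrays of differing lengths, one of them empty)
-- is the kind of data the sink reportedly fails on.
INSERT INTO shipments VALUES (1, ['asd', 'asd'], []);
```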
Steps to reproduce
Error log:
Can you help me resolve this problem?