
Updated message table creates cdc fail #77

Open

dartartem opened this issue Nov 3, 2020 · 24 comments
@dartartem
Contributor

dartartem commented Nov 3, 2020

Problem appeared when the eventuate-tram-examples-customers-orders project was tested with migration to database id support.

How it was found (only MySQL is currently tested):

  1. Run all docker services
  2. Run the e2e tests
  3. Configure the application services (customer-service, order-service, order-history-service) to use database ID generation.
  4. Apply the migration script to the database: https://github.com/eventuate-foundation/eventuate-common/blob/wip-db-id-gen/mysql/4.initialize-database-db-id.sql
  5. Restart the application services
  6. Rerun the e2e tests

Error in cdc logs:

cdc-service_1            | java.lang.RuntimeException: java.lang.IllegalArgumentException: Unexpected type class [B of column published, should be int or stringified int
cdc-service_1            | 	at io.eventuate.local.common.BinlogEntryReader.handleProcessingFailException(BinlogEntryReader.java:134) ~[eventuate-local-java-cdc-connector-common-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.publish(MySqlBinaryLogClient.java:337) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$null$5(MySqlBinaryLogClient.java:314) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:79) ~[micrometer-core-1.1.1.jar!/:1.1.1]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleWriteRowsEvent$6(MySqlBinaryLogClient.java:313) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_252]
cdc-service_1            | 	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485) ~[na:1.8.0_252]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleWriteRowsEvent(MySqlBinaryLogClient.java:312) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEvent(MySqlBinaryLogClient.java:232) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleBinlogEventWithErrorHandling$1(MySqlBinaryLogClient.java:184) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:79) ~[micrometer-core-1.1.1.jar!/:1.1.1]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEventWithErrorHandling(MySqlBinaryLogClient.java:183) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$start$0(MySqlBinaryLogClient.java:163) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1055) ~[mysql-binlog-connector-java-0.16.1.jar!/:0.16.1]
cdc-service_1            | 	at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:913) ~[mysql-binlog-connector-java-0.16.1.jar!/:0.16.1]
cdc-service_1            | 	at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559) ~[mysql-binlog-connector-java-0.16.1.jar!/:0.16.1]
cdc-service_1            | 	at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793) ~[mysql-binlog-connector-java-0.16.1.jar!/:0.16.1]
cdc-service_1            | 	at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_252]
cdc-service_1            | Caused by: java.lang.IllegalArgumentException: Unexpected type class [B of column published, should be int or stringified int
cdc-service_1            | 	at io.eventuate.local.common.BinlogEntry.getBooleanColumn(BinlogEntry.java:19) ~[eventuate-local-java-cdc-connector-common-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.tram.cdc.connector.BinlogEntryToMessageConverter.convert(BinlogEntryToMessageConverter.java:24) ~[eventuate-tram-cdc-connector-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.common.BinlogEntryHandler.publish(BinlogEntryHandler.java:36) ~[eventuate-local-java-cdc-connector-common-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.publish(MySqlBinaryLogClient.java:335) [eventuate-local-java-cdc-connector-mysql-binlog-0.10.0-SNAPSHOT.jar!/:na]
cdc-service_1            | 	... 23 common frames omitted

Reason: recreation of the message table breaks the expected column order here:

https://github.com/eventuate-foundation/eventuate-cdc/blob/master/eventuate-local-java-cdc-connector-mysql-binlog/src/main/java/io/eventuate/local/mysql/binlog/AbstractMySqlBinlogExtractor.java#L23

In this case, the column was "payload" instead of "published".
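For context, the expected column order is refreshed from the database itself (as discussed later in this thread). A minimal sketch of such a lookup, illustrative only and not the project's actual code, could query INFORMATION_SCHEMA:

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class ColumnOrderQuery {

  // Reads the current column order of a table from the database catalog.
  public static List<String> columnNamesInOrder(Connection connection, String schema, String table)
          throws SQLException {
    String sql = "SELECT column_name FROM information_schema.columns "
            + "WHERE table_schema = ? AND table_name = ? ORDER BY ordinal_position";
    List<String> names = new ArrayList<>();
    try (PreparedStatement ps = connection.prepareStatement(sql)) {
      ps.setString(1, schema);
      ps.setString(2, table);
      try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
          names.add(rs.getString(1));
        }
      }
    }
    return names;
  }
}
```

If the table has been dropped and recreated with a different column order, this query returns the new order, while binlog events written before the migration still carry values in the old order.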

dartartem added a commit to dartartem/eventuate-cdc-1 that referenced this issue Nov 3, 2020
@dartartem
Contributor Author

Fixed. Tested with the example application.
For now, fixed without CDC tests.
Usually we try to test as much as possible within the specific project; on the other hand:

  1. We use the examples for additional testing.
  2. The case is too specific.
  3. Testing this behaviour requires a separate script, with a separate runtime, to isolate such significant changes from the other tests.

@cer
Contributor

cer commented Nov 3, 2020

How often do TABLE_MAP events occur?
Is it only when the schema changes?

@dartartem
Contributor Author

dartartem commented Nov 3, 2020

@cer Chris, unfortunately I cannot answer that question right now.
It was a long time ago that I worked with this; I need to spend some time to research/remember.

@cer
Contributor

cer commented Nov 3, 2020

If TABLE_MAP occurs for every insert then refreshing the column order would be quite expensive.

Also, it looks like the TABLE_MAP event contains the schema so it might not be necessary to query the database.

@dartartem
Contributor Author

@cer I will investigate it too

@dartartem
Contributor Author

@cer Chris,

According to the documentation:
https://github.com/shyiko/mysql-binlog-connector-java/blob/master/src/main/java/com/github/shyiko/mysql/binlog/event/EventType.java#L113-L120
TABLE_MAP occurs for every insert.
I debugged the CDC and confirmed it experimentally.

While debugging I did not find any information about column names.
But I found this: shyiko/mysql-binlog-connector-java#24 (comment)
As I recall, I used this information to retrieve the column names.

I was thinking about optimization, and I think that we can refresh the column order only if something in columnMetadata and columnTypes has changed. Please see these screenshots:

https://gist.github.com/dartartem/c3d81f5de7252c583b7c77273422cfaf

I did not find accurate information, but it seems columnMetadata describes column attributes: PK, NOT NULL, etc.
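A minimal sketch of that check (hypothetical class and method names, not the actual extractor code): refresh the cached column order only when the TABLE_MAP column types or metadata differ from the last values seen.

```java
import java.util.Arrays;

public class TableColumnCache {

  private byte[] lastColumnTypes;
  private int[] lastColumnMetadata;

  // Returns true when the TABLE_MAP column types/metadata differ from the last seen values,
  // i.e. the column order should be re-queried from the database.
  public synchronized boolean needsRefresh(byte[] columnTypes, int[] columnMetadata) {
    boolean changed = !Arrays.equals(lastColumnTypes, columnTypes)
            || !Arrays.equals(lastColumnMetadata, columnMetadata);
    if (changed) {
      lastColumnTypes = columnTypes.clone();
      lastColumnMetadata = columnMetadata.clone();
    }
    return changed;
  }
}
```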

@cer
Contributor

cer commented Nov 5, 2020

I was thinking about optimization, and I think that we can refresh the column order only if something in columnMetadata and columnTypes has changed. Please see these screenshots:

I think this would be a good approach - I don't think knowing the details of columnMetadata matters. All that matters is that the array length is unchanged and that the contents are the same.

@dartartem
Contributor Author

Will do, thank you

dartartem added a commit to dartartem/eventuate-cdc-1 that referenced this issue Nov 6, 2020
@dartartem
Contributor Author

While fixing this I found another case:

The MySQL extractor caches columns using schema and table as the key.

After the migration, the column types/metadata were not considered changed, because the message table was assigned a new id (the migration drops and then recreates the message table). In other words, the data was not compared because the tables are considered to be different.

To make everything work I added an additional column refresh when a new table id is passed via the TABLE_MAP event.
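A rough sketch of that additional refresh (hypothetical names; the real change lives in the MySQL extractor): track the last table id seen per schema.table and treat a new id as a signal to re-query the column order.

```java
import java.util.HashMap;
import java.util.Map;

public class TableIdTracker {

  private final Map<String, Long> tableIdBySchemaAndTable = new HashMap<>();

  // Returns true when a TABLE_MAP event carries a table id we have not yet seen for this
  // schema.table, e.g. after the table was dropped and recreated by a migration,
  // in which case the cached column order should be refreshed.
  public synchronized boolean isNewTableId(String schema, String table, long tableId) {
    String key = schema + "." + table;
    Long previous = tableIdBySchemaAndTable.put(key, tableId);
    return previous == null || previous != tableId;
  }
}
```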

@cer
Contributor

cer commented Nov 6, 2020

To make everything work I added an additional column refresh when a new table id is passed via the TABLE_MAP event.

Can you link to the lines of code that implement this?

@dartartem
Contributor Author

While working on the CDC schema migration tests I found a new failure.

It seems that if a table is altered (not recreated by drop/create), the MySQL connector does not see the changes in the schema.

But the column order can be refreshed via the datasource, so the columns the CDC expects != the columns the CDC receives.

And the error is exactly the same one I found during the db id migration.

A simple way to reproduce:

We have a test class that checks message handling: https://github.com/eventuate-foundation/eventuate-cdc/blob/master/eventuate-local-java-cdc-connector-test-util/src/main/java/io/eventuate/local/test/util/AbstractBinlogEntryReaderMessageTableTest.java

The version for MySQL: https://github.com/eventuate-foundation/eventuate-cdc/blob/master/eventuate-local-java-cdc-connector-mysql-binlog/src/test/java/io/eventuate/local/mysql/binlog/MySqlBinlogEntryReaderMessageTableTest.java

  1. Run ./gradlew mysqlComposeUp
  2. Run test
  3. Execute the following in adminer:
    ALTER TABLE eventuate.message drop destination;
    ALTER TABLE eventuate.message add destination longtext;
  4. Run test

Some notes:

The CDC starts/stops on each test run, which means there is no cache in the CDC MySQL client.
Stopping/starting MySQL does not help.

I am not sure what to do. I see at least 3 options:

  1. It is not critical; presumably any migration can be done by recreating the table - ignore for now.
  2. Investigate the shyiko connector source code and debug it.
  3. Try to replace the shyiko connector with (a fork of?) https://github.com/osheroff/mysql-binlog-connector-java

@dartartem
Contributor Author

Stack trace (similar to the trace of the initial failure):

java.lang.IllegalArgumentException: Unexpected type class [B of column published, should be int or stringified int
	at io.eventuate.local.common.BinlogEntry.getBooleanColumn(BinlogEntry.java:19)
	at io.eventuate.tram.cdc.connector.BinlogEntryToMessageConverter.convert(BinlogEntryToMessageConverter.java:24)
	at io.eventuate.local.common.BinlogEntryHandler.publish(BinlogEntryHandler.java:36)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.publish(MySqlBinaryLogClient.java:336)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$null$5(MySqlBinaryLogClient.java:315)
	at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:79)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleWriteRowsEvent$6(MySqlBinaryLogClient.java:314)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleWriteRowsEvent(MySqlBinaryLogClient.java:313)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEvent(MySqlBinaryLogClient.java:234)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleBinlogEventWithErrorHandling$1(MySqlBinaryLogClient.java:185)
	at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:79)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEventWithErrorHandling(MySqlBinaryLogClient.java:184)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$start$0(MySqlBinaryLogClient.java:164)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1055)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:913)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559)
	at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793)
	at java.lang.Thread.run(Thread.java:748)
14:59:47.255 [Thread-11] INFO  i.e.l.m.binlog.MySqlBinaryLogClient - MySqlBinaryLogClient finished processing
14:59:47.256 [blc-172.17.0.1:3306] ERROR i.e.l.m.binlog.MySqlBinaryLogClient - Restarting due to exception
java.lang.RuntimeException: java.lang.IllegalArgumentException: Unexpected type class [B of column published, should be int or stringified int
	at io.eventuate.local.common.BinlogEntryReader.handleProcessingFailException(BinlogEntryReader.java:134)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.publish(MySqlBinaryLogClient.java:338)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$null$5(MySqlBinaryLogClient.java:315)
	at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:79)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleWriteRowsEvent$6(MySqlBinaryLogClient.java:314)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleWriteRowsEvent(MySqlBinaryLogClient.java:313)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEvent(MySqlBinaryLogClient.java:234)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleBinlogEventWithErrorHandling$1(MySqlBinaryLogClient.java:185)
	at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:79)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEventWithErrorHandling(MySqlBinaryLogClient.java:184)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$start$0(MySqlBinaryLogClient.java:164)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1055)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:913)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559)
	at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Unexpected type class [B of column published, should be int or stringified int
	at io.eventuate.local.common.BinlogEntry.getBooleanColumn(BinlogEntry.java:19)
	at io.eventuate.tram.cdc.connector.BinlogEntryToMessageConverter.convert(BinlogEntryToMessageConverter.java:24)
	at io.eventuate.local.common.BinlogEntryHandler.publish(BinlogEntryHandler.java:36)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.publish(MySqlBinaryLogClient.java:336)
	... 23 common frames omitted
14:59:47.265 [blc-172.17.0.1:3306] WARN  c.g.s.mysql.binlog.BinaryLogClient - io.eventuate.local.mysql.binlog.MySqlBinaryLogClient$$Lambda$756/2083949197@6084281d choked on Event{header=EventHeaderV4{timestamp=1604922689000, eventType=EXT_WRITE_ROWS, serverId=1, headerLength=19, dataLength=190, nextPosition=582, flags=0}, data=WriteRowsEventData{tableId=108, includedColumns={0, 1, 2, 3, 4, 5}, rows=[
    [[B@7ad7937c, [B@75ab1315, [B@287a981e, [B@2f3ada00, 0, 1604922689738]
]}}
java.lang.IllegalArgumentException: Restart callback is not specified, but restart is requsted
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$handleRestart$2(MySqlBinaryLogClient.java:196)
	at java.util.Optional.orElseThrow(Optional.java:290)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleRestart(MySqlBinaryLogClient.java:196)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.handleBinlogEventWithErrorHandling(MySqlBinaryLogClient.java:188)
	at io.eventuate.local.mysql.binlog.MySqlBinaryLogClient.lambda$start$0(MySqlBinaryLogClient.java:164)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1055)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:913)
	at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559)
	at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793)
	at java.lang.Thread.run(Thread.java:748)

@cer
Contributor

cer commented Nov 9, 2020

The CDC starts/stops on each test run.

Steps 1 and 3 are "Run test", but I don't think you mean to say that they restart the CDC.

@dartartem
Contributor Author

Steps 1 and 3 are "Run test", but I don't think you mean to say that they restart the CDC.

Chris, it is an integration test, so yes, it starts/stops the CDC.

@cer
Contributor

cer commented Nov 9, 2020

Please replace

Run ./gradlew mysqlComposeUp
Run test
Execute the following in adminer:
ALTER TABLE eventuate.message drop destination;
ALTER TABLE eventuate.message add destination longtext;
Run test

With a more precise sequence of steps that actually describes what is happening at the database level and the starting and stopping of the CDC.

e.g.

Start CDC 
INSERT INTO MESSAGES ...
ALTER TABLE eventuate.message drop destination;
ALTER TABLE eventuate.message add destination longtext;
INSERT INTO MESSAGES ....

@dartartem
Contributor Author

Actual steps:

  1. Start MySQL, Zookeeper, Kafka
  2. Start the CDC
  3. Insert a new message into the message table
  4. Stop the CDC

Execute ALTER TABLE eventuate.message drop destination;
Execute ALTER TABLE eventuate.message add destination longtext;

  1. Start the CDC

  2. Insert a new message into the message table

  3. The CDC fails with the described stack trace.

@dartartem
Contributor Author

dartartem commented Nov 9, 2020

@cer

I realized what is wrong.

The integration test uses a mocked offset store.

My bad.

But it reveals another issue to think about:

  1. Multiple messages are inserted into the message table.
  2. The CDC has handled only part of them.
  3. The schema is changed.
  4. For the old messages the CDC uses the updated schema and fails.

So, before changing anything in the schema, we need to make sure that the CDC has finished processing.

We have a status service; I will check.

@dartartem
Contributor Author

@cer

I checked the status service. It shows whether processing is finished: it compares the offset of the last handled event with the current binlog position. So, checking the status service should be sufficient.
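For illustration, roughly what such a "caught up" check amounts to (a simplified sketch with hypothetical parameters, not the actual status service code): compare the last processed binlog file/position with the server's current position from SHOW MASTER STATUS.

```java
import java.sql.*;

public class BinlogCatchUpCheck {

  // Returns true if the last processed offset has reached the server's current binlog position.
  // lastFile/lastPosition are hypothetical values taken from wherever the offset store keeps them.
  public static boolean caughtUp(Connection connection, String lastFile, long lastPosition)
          throws SQLException {
    try (Statement statement = connection.createStatement();
         ResultSet rs = statement.executeQuery("SHOW MASTER STATUS")) {
      if (!rs.next()) {
        throw new IllegalStateException("Binary logging is not enabled");
      }
      String currentFile = rs.getString("File");
      long currentPosition = rs.getLong("Position");
      return currentFile.equals(lastFile) && lastPosition >= currentPosition;
    }
  }
}
```

Only once this returns true would it be safe to run a schema migration, since all pending binlog events written under the old schema have already been handled.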

@dartartem
Contributor Author

@cer

One more thing: currently the CDC cannot re-read an already processed binlog.
But can we be sure that this will never be necessary at some point?
Or maybe we will need something else, but similar?
How about persisting the column names, using the column types and metadata as a key?
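A rough illustration of that idea (hypothetical classes and an in-memory map; the persistence layer is omitted): key the remembered column names by a fingerprint of the TABLE_MAP column types and metadata, so that historical events could be decoded with the schema they were written under.

```java
import java.util.*;

public class ColumnNameStore {

  // Key derived from the TABLE_MAP event's column types and metadata arrays.
  static String fingerprint(byte[] columnTypes, int[] columnMetadata) {
    return Arrays.toString(columnTypes) + "|" + Arrays.toString(columnMetadata);
  }

  private final Map<String, List<String>> columnNamesByFingerprint = new HashMap<>();

  // Called when the column order is (re)loaded from the database for the current schema.
  public void remember(byte[] columnTypes, int[] columnMetadata, List<String> columnNames) {
    columnNamesByFingerprint.put(fingerprint(columnTypes, columnMetadata), new ArrayList<>(columnNames));
  }

  // Looks up the column names that were in effect when a row with these types/metadata was written.
  public Optional<List<String>> lookup(byte[] columnTypes, int[] columnMetadata) {
    return Optional.ofNullable(columnNamesByFingerprint.get(fingerprint(columnTypes, columnMetadata)));
  }
}
```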

@cer
Contributor

cer commented Nov 9, 2020

So, before changing anything in the schema, we need to make sure that the CDC has finished processing.

Won't the CDC reprocess the TABLE_MAP events and therefore know what the schema is?

@cer
Contributor

cer commented Nov 9, 2020

  3. The CDC fails with the described stack trace.

Also, given that the CDC is restarted, what is the actual problem? What is the sequence of events that it reads that causes it to fail?

@dartartem
Contributor Author

Summary:

I used an integration test, which means the CDC is started and stopped inside the test.
The problem is that the test uses a mocked offset store, so offsets are not persisted between runs.

Start MySQL first; the further steps happen inside the test:

  1. Start the CDC.
  2. Save a message.
  3. The CDC processes the message.
  4. Stop the CDC.
  5. Execute via adminer:
    ALTER TABLE eventuate.message drop destination;
    ALTER TABLE eventuate.message add destination longtext;

The last steps are executed inside the same test:

  6. Start the CDC.
  7. The message from step 2, processed in step 3, was not recorded as processed because the mocked offset store is used, so the CDC tries to process it again.
  8. The CDC requests information about the column order from the actual database.
  9. The column names/order are received from the actual schema, but the message being processed is from the previous schema, which has a different column order.
  10. The CDC fails because it tries to parse a string as a boolean: it thinks it is processing the "published" field, but it is actually processing the "payload" field.

Example:

The initial column order:

  1. id,
  2. destination,
  3. headers,
  4. payload,
  5. published,
  6. creation_time

The message from step 2 is saved with that column order.
Then, in step 5, the order is changed to:

  1. id,
  2. headers,
  3. payload,
  4. published,
  5. creation_time,
  6. destination

In step 3 that message is processed, but because the mocked offset store is used, the binlog offset is not saved and the CDC reprocesses the message after the restart in step 6.

What the data event looks like:

[some_id, some_destination, some_headers, some_payload, some_published_flag, some_creation_time]

(For convenience, let's assume all indexes start from 1.)

TABLE_MAP_EVENT does not contain column names (discussed above), so the CDC needs to work out which columns the data corresponds to. It queries the column order via SQL, as recommended by the author of the MySQL connector.

The CDC tries to process the "published" column (step 10); in the current schema its position is 4, but in the received event the corresponding cell is "some_payload", because the event's order corresponds to the initial schema.
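To make the mismatch concrete, here is a tiny self-contained illustration (hypothetical values, not project code) of decoding the old-order row with the new-order column names:

```java
import java.util.*;

public class ColumnOrderMismatchDemo {

  public static void main(String[] args) {
    // Column order the row was written under (initial schema).
    List<String> oldOrder = Arrays.asList("id", "destination", "headers", "payload", "published", "creation_time");
    // Column order after the ALTERs (destination moved to the end).
    List<String> newOrder = Arrays.asList("id", "headers", "payload", "published", "creation_time", "destination");

    // Binlog row values are positional and follow the old order.
    List<String> row = Arrays.asList("some_id", "some_destination", "some_headers", "some_payload", "1", "1604922689738");

    // Decoding the positional row with the new column names mislabels the values.
    Map<String, String> decoded = new LinkedHashMap<>();
    for (int i = 0; i < newOrder.size(); i++) {
      decoded.put(newOrder.get(i), row.get(i));
    }

    // Prints "some_payload": the CDC then fails trying to read it as a boolean.
    System.out.println(decoded.get("published"));
  }
}
```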

dartartem added a commit to dartartem/eventuate-cdc-1 that referenced this issue Nov 10, 2020
dartartem added a commit to dartartem/eventuate-cdc-1 that referenced this issue Nov 11, 2020
dartartem added a commit to dartartem/eventuate-cdc-1 that referenced this issue Nov 11, 2020
…ved separate test script for message table schema migration.
cer added a commit that referenced this issue Nov 13, 2020
#77: Updated message table creates cdc fail.