Verify whether all MQ output protocols handle time zones correctly #1639

Open · 2 of 5 tasks

liuzix opened this issue Apr 12, 2021 · 3 comments
Labels: component/sink (Sink component.) · help wanted (Denotes an issue that needs help from a contributor; must meet "help wanted" guidelines.)

Comments

liuzix (Contributor) commented Apr 12, 2021

Recently it has come to our attention that our Avro output does not correctly handle the situation where the --tz option used in TiCDC differs from the local time zone. We need to verify whether all our protocols handle time zones correctly and, if not, under what conditions things go wrong and whether there is any workaround.
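As a hedged illustration of the failure mode (plain Go, not TiCDC code): a MySQL DATETIME string carries no zone information, so the zone chosen when the string is parsed determines the epoch value that ends up in an encoded payload such as Avro. If the zone implied by --tz and the zone assumed elsewhere disagree, every timestamp shifts by the offset between them:

package main

import (
	"fmt"
	"time"
)

func main() {
	const dt = "2021-11-01 17:43:24" // DATETIME string as read from TiDB; it carries no zone

	// Example zones only: the zone TiCDC was told to use vs. another zone the
	// machine or consumer might assume.
	shanghai, err := time.LoadLocation("Asia/Shanghai")
	if err != nil {
		panic(err)
	}

	asUTC, _ := time.ParseInLocation("2006-01-02 15:04:05", dt, time.UTC)
	asCST, _ := time.ParseInLocation("2006-01-02 15:04:05", dt, shanghai)

	// The epoch values differ by the zone offset (8 hours here). If the sink
	// encodes with one zone and the consumer decodes assuming another, every
	// timestamp in the payload is shifted by that amount.
	fmt.Println(asUTC.UnixMilli(), asCST.UnixMilli(), asUTC.Sub(asCST))
}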

Checklist

@liuzix liuzix added question Further information is requested. help wanted Denotes an issue that needs help from a contributor. Must meet "help wanted" guidelines. labels Apr 12, 2021
@overvenus overvenus added component/sink Sink component. and removed question Further information is requested. labels Jul 8, 2021
Rustin170506 (Member) commented:
/assign

I am working on the Canal JSON protocol.

Rustin170506 (Member) commented:
The Canal JSON protocol should not have this problem; it was tested as follows:

  1. Start a TiDB cluster.
  2. Start a TiCDC cluster.
  3. Create a changefeed with the canal-json protocol.
  4. Create a table:

create table if not exists test.table_name
(
	column_1 int not null,
	column_2 datetime null,
	column_3 timestamp null,
	constraint table_name_column_1_uindex
		unique (column_1)
);

alter table test.table_name
	add primary key (column_1);

  5. Insert a row: 23, 2021-11-01 17:43:24, 2021-11-01 17:43:33.
  6. The Kafka output (a decoding sketch follows the message):
{"id":0,"database":"test","table":"table_name","pkNames":["column_1"],"isDdl":false,"type":"INSERT","es":1635759848635,"ts":0,"sql":"","sqlType":{"column_1":-5,"column_2":93,"column_3":93},"mysqlType":{"column_1":"int","column_2":"datetime","column_3":"timestamp"},"data":[{"column_1":"23","column_2":"2021-11-01 17:43:24","column_3":"2021-11-01 17:43:33"}],"old":[null]}

Technically, we generate the string directly, so as long as the user's upstream and Kafka consumer programs handle the time zone consistently, there is no problem.
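To make "handling the time zone consistently" concrete, here is a hedged Go sketch of the consumer side; the "Asia/Shanghai" location is only an example and stands for whatever zone the upstream actually used when the DATETIME value was written:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05"

	// Example only: use the same zone as the upstream deployment so the
	// zone-less string from the Canal JSON payload is interpreted correctly.
	loc, err := time.LoadLocation("Asia/Shanghai")
	if err != nil {
		panic(err)
	}

	t, err := time.ParseInLocation(layout, "2021-11-01 17:43:24", loc)
	if err != nil {
		panic(err)
	}
	fmt.Println(t.UTC()) // unambiguous once a zone is pinned
}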

@Rustin170506 Rustin170506 removed their assignment Jan 3, 2024
asddongmen (Contributor) commented:
@3AceShowHand Could you please take a look at this issue?
