tools: replace generate binlog position with binlogctl #517

Merged
merged 2 commits on Jul 6, 2018
8 changes: 4 additions & 4 deletions tools/tidb-binlog-kafka.md
@@ -73,13 +73,13 @@ cd tidb-binlog-latest-linux-amd64

To guarantee the integrity of data, perform the following operations 10 minutes after Pump is started:

- Use the `generate_binlog_position` tool of the [tidb-tools](https://github.com/pingcap/tidb-tools) project to generate the Drainer savepoint file. Use `make generate_binlog_position` to compile this tool. See the [README description](https://github.com/pingcap/tidb-tools/blob/master/generate_binlog_position/README.md) for usage. You can also download this tool from [generate_binlog_position](https://download.pingcap.org/generate_binlog_position-latest-linux-amd64.tar.gz) and use `sha256sum` to verify the [sha256](https://download.pingcap.org/generate_binlog_position-latest-linux-amd64.sha256) file.
- Do a full backup. For example, back up TiDB using mydumper.
- Use [binlogctl](https://github.com/pingcap/tidb-tools/tree/master/tidb_binlog/binlogctl) of the [tidb-tools](https://github.com/pingcap/tidb-tools) project to generate the `position` for the initial start of Drainer.
- Do a full backup. For example, back up TiDB using Mydumper.
- Import the full backup to the target system.
- The savepoint file started by the Kafka version of Drainer is stored in the checkpoint table of the downstream database tidb_binlog by default. If no valid data exists in the checkpoint table, configure `initial-commit-ts` to make Drainer work from a specified position when it is started:
- The savepoint file started by the Kafka version of Drainer is stored in the `checkpoint` table of the downstream database `tidb_binlog` by default. If no valid data exists in the `checkpoint` table, configure `initial-commit-ts` to make Drainer work from a specified position when it is started:
Contributor

The savepoint file? Is "the savepoint metadata" better?

Contributor Author

Thanks. Fixed in #523


```
bin/drainer --config=conf/drainer.toml --data-dir=${drainer_savepoint_dir}
bin/drainer --config=conf/drainer.toml --initial-commit-ts=${position}
```
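
For reference, the `position` passed to `--initial-commit-ts` above is what binlogctl's `generate_meta` command produces. A minimal sketch, assuming the flag names from the binlogctl README (they are not part of this diff):

```
# minimal sketch -- flag names assumed from the binlogctl README
# ask PD for a safe commit timestamp and write it to a meta (savepoint) file
bin/binlogctl -pd-urls=http://127.0.0.1:2379 -cmd generate_meta -data-dir=./binlog_position
# the reported commitTS is the value used as ${position} above
```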

- The drainer outputs `pb` and you need to set the following parameters in the configuration file:
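
The concrete parameters sit in the part of the file collapsed here. As a loose illustration only (section and key names are assumptions, not taken from this change), a `pb` output block in `drainer.toml` could look roughly like:

```
# loose sketch -- key names are assumptions, not from this diff
[syncer]
db-type = "pb"        # write binlog as local protobuf files instead of syncing to a database

[syncer.to]
dir = "/path/pb-dir"  # directory that receives the pb files
```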