
Commit

op-guide: update tikv rolling update policy (#592)
LinuxGit authored and lilin90 committed Aug 31, 2018
1 parent 9c458c6 commit 965cd1d
Showing 2 changed files with 4 additions and 6 deletions.
2 changes: 1 addition & 1 deletion op-guide/ansible-deployment-rolling-update.md
@@ -65,7 +65,7 @@ wget http://download.pingcap.org/tidb-v2.0.3-linux-amd64-unportable.tar.gz
```
$ ansible-playbook rolling_update.yml --tags=tikv
```
-When you apply a rolling update to the TiKV instance, Ansible migrates the Region leader to other nodes. The concrete logic is as follows: Call the PD API to add the `evict leader scheduler` -> Inspect the `leader_count` of this TiKV instance every 10 seconds -> Wait the `leader_count` to reduce to below 10, or until the times of inspecting the `leader_count` is more than 12 -> Start closing the rolling update of TiKV after two minutes of timeout -> Delete the `evict leader scheduler` after successful start. The operations are executed serially.
+When you apply a rolling update to the TiKV instance, Ansible migrates the Region leader to other nodes. The concrete logic is as follows: Call the PD API to add the `evict leader scheduler` -> Inspect the `leader_count` of this TiKV instance every 10 seconds -> Wait the `leader_count` to reduce to below 1, or until the times of inspecting the `leader_count` is more than 18 -> Start closing the rolling update of TiKV after three minutes of timeout -> Delete the `evict leader scheduler` after successful start. The operations are executed serially.
If the rolling update fails in the process, log in to `pd-ctl` to execute `scheduler show` and check whether `evict-leader-scheduler` exists. If it does exist, delete it manually. Replace `{PD_IP}` and `{STORE_ID}` with your PD IP and the `store_id` of the TiKV instance:
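The updated wait policy described above (poll every 10 seconds, proceed once `leader_count` drops below 1, give up after 18 checks, i.e. a roughly three-minute timeout) can be sketched as follows. This is an editorial illustration, not code from this commit or from Ansible's actual tasks; the function and parameter names are assumptions, and `get_leader_count` stands in for the PD API call.

```python
import time

# Hedged sketch of the documented wait policy: poll `leader_count` every
# `interval` seconds and proceed once it drops below 1, or give up after
# `max_checks` polls (18 * 10 s, about three minutes). `sleep` is injectable
# so the loop can be exercised without real waiting.
def wait_for_leader_eviction(get_leader_count, interval=10, max_checks=18,
                             sleep=time.sleep):
    for _ in range(max_checks):
        if get_leader_count() < 1:
            return True   # all leaders evicted; safe to restart this TiKV
        sleep(interval)
    return False          # timed out; the rolling update proceeds anyway
```

In the documented flow, a `False` return corresponds to starting the TiKV restart after the three-minute timeout; in either case the `evict leader scheduler` is deleted once the instance restarts successfully.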
8 changes: 3 additions & 5 deletions tispark/tispark-quick-start-guide.md
@@ -6,17 +6,17 @@ category: User Guide

# TiSpark Quick Start Guide

-To make it easy to [try TiSpark](tispark-user-guide.md), the TiDB cluster integrates Spark, TiSpark jar package and TiSpark sample data by default, in both the Pre-GA and master versions installed using TiDB-Ansible.
+To make it easy to [try TiSpark](tispark-user-guide.md), the TiDB cluster installed using TiDB-Ansible integrates Spark, TiSpark jar package and TiSpark sample data by default.

## Deployment information

- Spark is deployed by default in the `spark` folder in the TiDB instance deployment directory.
- The TiSpark jar package is deployed by default in the `jars` folder in the Spark deployment directory.

```
-spark/jars/tispark-0.1.0-beta-SNAPSHOT-jar-with-dependencies.jar
+spark/jars/tispark-SNAPSHOT-jar-with-dependencies.jar
```
- TiSpark sample data and import scripts are deployed by default in the TiDB-Ansible directory.
@@ -108,8 +108,6 @@ MySQL [TPCH_001]> show tables;

## Use example

-Assume that the IP of your PD node is `192.168.0.2`, and the port is `2379`.

First start the spark-shell in the spark deployment directory:

```
$ cd spark
$ bin/spark-shell
```
