
Destroy Command Does Not Clean TiFlash Data Directories #865

Closed
birdstorm opened this issue Oct 27, 2020 · 3 comments · Fixed by #871
Labels
status/TODO Categorizes issue as we will do it. type/bug Categorizes issue as related to a bug.
birdstorm commented Oct 27, 2020

Bug Report

Please answer these questions before submitting your issue. Thanks!

  1. What did you do?

Used the tiup cluster destroy command to destroy a cluster with TiFlash:

tiflash_servers:
  - host: 172.16.5.81
    tcp_port: 5020
    http_port: 5030
    flash_service_port: 7333
    flash_proxy_port: 13170
    flash_proxy_status_port: 13292
    metrics_port: 12234
    data_dir: "/home/tidb/birdstorm/data1,/home/tidb/birdstorm/data3"
    config:
      capacity: "10737418240,10737418240"
      logger:
        level: info
  2. What did you expect to see?

     The TiFlash data directories should be deleted (a sketch of the expected path expansion follows this list).

  3. What did you see instead?

+ [ Serial ] - DestroyCluster
Destroying component tiflash
Destroying instance 172.16.5.81
Deleting paths on 172.16.5.81: /home/tidb/birdstorm/deploy/tiflash-5020/log /home/tidb/birdstorm/deploy/tiflash-5020 /etc/systemd/system/tiflash-5020.service
Destroy 172.16.5.81 success
- Destroy tiflash paths: [/home/tidb/birdstorm/deploy/tiflash-5020/log /home/tidb/birdstorm/deploy/tiflash-5020 /etc/systemd/system/tiflash-5020.service]
  4. What version of TiUP are you using (tiup --version)?

     v1.2.1
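
For reference, a minimal Go sketch of the expected path expansion (TiUP is written in Go; the function name, signature, and structure here are hypothetical, not TiUP's actual implementation): each comma-separated entry in data_dir should be split out and appended to the delete list alongside the deploy directory and systemd unit.

```go
package main

import (
	"fmt"
	"strings"
)

// destroyPaths collects every path that should be removed when destroying
// a TiFlash instance. data_dir may hold several comma-separated
// directories (as in the topology above), so each entry is split out
// individually. Hypothetical sketch only; not TiUP's actual code.
func destroyPaths(deployDir, dataDir, service string) []string {
	paths := []string{
		deployDir + "/log",
		deployDir,
		service,
	}
	for _, d := range strings.Split(dataDir, ",") {
		if d = strings.TrimSpace(d); d != "" {
			paths = append(paths, d)
		}
	}
	return paths
}

func main() {
	// With the topology above, the delete list should include both
	// /home/tidb/birdstorm/data1 and /home/tidb/birdstorm/data3.
	fmt.Println(destroyPaths(
		"/home/tidb/birdstorm/deploy/tiflash-5020",
		"/home/tidb/birdstorm/data1,/home/tidb/birdstorm/data3",
		"/etc/systemd/system/tiflash-5020.service",
	))
}
```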
@birdstorm birdstorm added the type/bug Categorizes issue as related to a bug. label Oct 27, 2020
@lucklove lucklove added the status/investigating Indicates that an issue is under investigation. label Oct 28, 2020
lucklove commented:

Thanks for reporting this issue.

lucklove commented Oct 29, 2020

[Screenshot: 2020-10-29, 8:21 PM]

It does not seem to reproduce with a single data directory.

lucklove commented:

OK, when there are multiple data directories, it does not work.
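
Given that diagnosis, a regression check along these lines (reusing the hypothetical destroyPaths sketch above, in the same package; this is not TiUP's actual test suite) would fail on the v1.2.1 behavior, where the comma-separated directories never reach the delete list:

```go
package main

import (
	"strings"
	"testing"
)

// TestDestroyPathsMultiDir asserts that both comma-separated data
// directories end up in the delete list. Hypothetical test against
// the destroyPaths sketch above.
func TestDestroyPathsMultiDir(t *testing.T) {
	paths := destroyPaths(
		"/home/tidb/birdstorm/deploy/tiflash-5020",
		"/home/tidb/birdstorm/data1,/home/tidb/birdstorm/data3",
		"/etc/systemd/system/tiflash-5020.service",
	)
	joined := strings.Join(paths, " ")
	for _, want := range []string{
		"/home/tidb/birdstorm/data1",
		"/home/tidb/birdstorm/data3",
	} {
		if !strings.Contains(joined, want) {
			t.Errorf("data dir %q missing from delete list %v", want, paths)
		}
	}
}
```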

@lucklove lucklove added status/TODO Categorizes issue as we will do it. and removed status/investigating Indicates that an issue is under investigation. labels Oct 29, 2020
@lucklove lucklove added this to the v1.3.2 milestone Oct 29, 2020
@lucklove lucklove modified the milestones: v1.3.2, v1.2.4 Dec 15, 2020