
fix(#1359): add option to keep missing clusters in config #2213

Merged
1 commit merged into derailed:master on Nov 12, 2023

Conversation

@kalioz (Contributor) commented on Sep 4, 2023

Hey!

First of all I love this project, thanks for maintaining it :)

This PR adds an option to keep "missing" clusters in the k9s config, which is useful when the user works with multiple kubeconfig files.
The default behavior is unchanged: clusters missing from the loaded kubeconfig are still removed from the k9s config.

Link to issue: #1359
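
Example config, as a minimal sketch: the key name `keepMissingClusters` and its placement under `k9s:` are inferred from this PR's description, so check the merged diff for the exact spelling, and note that the config file location varies by OS:

```yaml
# ~/.config/k9s/config.yml (typical Linux location; see the k9s README
# for per-OS paths and the K9S_CONFIG_DIR override)
k9s:
  # Assumed key name, based on this PR's description: when true, clusters
  # that are absent from the currently loaded kubeconfig are kept in the
  # k9s config instead of being pruned. Pruning remains the default.
  keepMissingClusters: true
```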

@kalioz marked this pull request as ready for review on September 4, 2023, 13:11
@kalioz changed the title to "Draft: fix(#1359): add option to keep missing clusters in config" on Sep 7, 2023
@kalioz changed the title back to "fix(#1359): add option to keep missing clusters in config" on Oct 11, 2023
@kalioz (Contributor, Author) commented on Oct 11, 2023

I've been using this for the past month and haven't found any issues (I own 5 clusters).
Kinda forgot I put this PR on draft ^^'

@igreg commented on Nov 8, 2023

That would be a great addition; I chose to use multiple kubeconfig files and was wondering why k9s kept removing clusters from its configuration. It would be great if we could have the choice to "maintain" these clusters ourselves.

@derailed added the enhancement (New feature or request) and needs-review (PR needs to be reviewed) labels on Nov 12, 2023
@derailed (Owner) left a comment

@kalioz Good feature! Thank you for this update, Clement!!

@derailed merged commit 24e244c into derailed:master on Nov 12, 2023
1 check passed
rm-hull added a commit to rm-hull/k9s that referenced this pull request Nov 12, 2023
* 'master' of github.com:derailed/k9s: (130 commits)
  added flux suspended resources retrieval plugin (derailed#1584)
  Provide white blur so images work in dark modes (derailed#1597)
  Add context to get-all (derailed#1701)
  fix brew command in the readme (derailed#2012)
  Add support for using custom kubeconfig with log_full plugin (derailed#2014)
  feat: allow for multiple plugin files in $XDG_DATA_DIRS/k9s/plugins (derailed#2029)
  Clean up issues introduced by derailed#2125 (derailed#2289)
  Pod view resembles more the output of kubectl get pods -o wide (derailed#2125)
  Update README.md with snap install (derailed#2262)
  Add snapcraft config (derailed#2123)
  storageclasses view keeps the same output as kubectl get sc (derailed#2132)
  Fix merge issues with PR derailed#2168 (derailed#2288)
  Add colour config for container picker (derailed#2140)
  Add env var to disable node pod counts (derailed#2168)
  Use current k9s NS if new context has no default NS (derailed#2197)
  Bump actions/setup-go from 4.0.1 to 4.1.0 (derailed#2200)
  fix: trigger a single log refresh after changing 'since' (derailed#2202)
  Add crossplane plugin (derailed#2204)
  fix(derailed#1359): add option to keep missing clusters in config (derailed#2213)
  K9s release v0.28.2
  ...
thejoeejoee pushed a commit to thejoeejoee/k9s that referenced this pull request Feb 23, 2024
fix(derailed#1359): add option to keep missing clusters in config (derailed#2213)

Co-authored-by: Clément Loiselet <clement.loiselet@cbp-group.com>
placintaalexandru pushed a commit to placintaalexandru/k9s that referenced this pull request Apr 3, 2024
fix(derailed#1359): add option to keep missing clusters in config (derailed#2213)

Co-authored-by: Clément Loiselet <clement.loiselet@cbp-group.com>
Labels: enhancement (New feature or request), needs-review (PR needs to be reviewed)
3 participants