Merge pull request #104 from 2i2c-org/ssh

Add docs on kubectl & helm access to pilot hubs

yuvipanda authored Nov 25, 2020
2 parents f0506d2 + 8216148 · commit e0f94ed

docs/operate.md: 57 additions, 20 deletions

Information about operating the hubs, debugging problems, and performing common actions.

## Gain `kubectl` & `helm` access to a hub

Each of the hubs in the 2i2c Pilot runs on Google Cloud Platform and Kubernetes.
To access the Kubernetes objects (in order to inspect them or make changes), use
the `kubectl` command line tool. You can also use `helm` to inspect and modify
the hub deployments themselves.

### Project Access

First, you'll need access to the Google Cloud projects on which the hubs run. The authoritative name
of each project can be gleaned from `hubs.yaml` (under `gcp.project` for each cluster entry). Currently,
the projects are:

| Cluster | Project Name |
| - | - |
| *.pilot.2i2c.cloud | `two-eye-two-see` |
| *.cloudbank.2i2c.cloud | `cb-1003-1696` |

If you don't have access to these, please get in touch with 2i2c staff.
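
Once your account has been granted access (and you have logged in to `gcloud` as described below), a quick way to confirm it is to ask `gcloud` about the project directly. This is just a sanity check, using the project IDs from the table above:

```bash
# List all projects your Google account can currently see.
gcloud projects list

# Or look up one specific project, e.g. the pilot hubs project.
gcloud projects describe two-eye-two-see
```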

### Commandline tools installation

You can do all of this via [Google Cloud Shell](https://cloud.google.com/shell),
but it might be easier to do it on your local machine. You'll need the following
tools installed (see the installation sketch after the list):

1. [gcloud](https://cloud.google.com/sdk)
2. [kubectl](https://kubernetes.io/docs/tasks/tools/install-kubectl/)
3. [helm](https://helm.sh/)
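
For example, on a local machine you might install `kubectl` and `helm` roughly like this. This is only a sketch; the exact steps depend on your operating system, so follow the linked installation guides if anything differs:

```bash
# Install the Google Cloud SDK first (see https://cloud.google.com/sdk),
# then add kubectl as a gcloud component.
gcloud components install kubectl

# Install helm 3 with the official installer script.
curl -fsSL https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3 | bash
```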

### Authenticating

First, run [gcloud auth login](https://cloud.google.com/sdk/docs/authorizing#authorizing_with_a_user_account)
so you can perform `gcloud` operations. Next, run [gcloud auth application-default login](https://cloud.google.com/sdk/gcloud/reference/auth/application-default/login)
so that `kubectl` and `helm` can use your credentials as well.
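
Concretely, that amounts to the two commands below; each one opens a browser window where you complete the Google login:

```bash
# Authenticate the gcloud CLI itself.
gcloud auth login

# Provide Application Default Credentials, which kubectl and helm use
# when talking to the cluster.
gcloud auth application-default login
```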

### Fetch Cluster credentials

For each cluster, you'll need to fetch credentials at least once with [gcloud container clusters get-credentials](https://cloud.google.com/sdk/gcloud/reference/container/clusters/get-credentials).

```bash
gcloud container clusters get-credentials <cluster-name> --region <region> --project <project-name>

# For example, for the pilot hubs cluster:
gcloud container clusters get-credentials low-touch-hubs-cluster --region us-central1 --project two-eye-two-see
```

You can get authoritative information for `<cluster-name>`, `<region>` and `<project-name>` from
`hubs.yaml`.

With that, `kubectl` and `helm` should now work!
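
As a quick sanity check that both tools are wired up, you can list a few cluster objects:

```bash
# Confirm kubectl can reach the cluster by listing its nodes.
kubectl get nodes

# List the namespaces on the cluster.
kubectl get namespaces

# Confirm helm works too by listing releases in all namespaces.
helm list --all-namespaces
```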

### (Optional) Access via Google Cloud Shell

Instead of setting up the tools and authenticating locally, you can do all of this via
[Google Cloud Shell](https://cloud.google.com/shell). It comes with all the tools installed
and authentication already set up. Instead of running `gcloud container clusters get-credentials`
yourself, you can:

1. Go to the [Google Cloud Kubernetes Engine page](https://console.cloud.google.com/kubernetes/list) (for the appropriate project)

2. Click on `Connect`, as seen in the figure below.

````{panels}
```{figure} images/gcp-k8s-dashboard.png
```
---
```{figure} images/gcp-run-in-shell.png
```
````

3. This will spin up an interactive cloud shell where you have `kubectl` access.

## Delete a hub
