
Adding Helm Chart #14

Merged
merged 2 commits into cvmfs-contrib:master on May 18, 2020

Conversation

almahmoud
Contributor

A generic version of the chart used by the Galaxy Project: https://github.com/CloudVE/galaxy-cvmfs-csi-helm/

Replaces #3

@almahmoud almahmoud changed the title from "Helm chart added" to "Adding Helm Chart" Mar 12, 2020
@almahmoud almahmoud mentioned this pull request Mar 12, 2020
@@ -0,0 +1,5 @@
apiVersion: v1
appVersion: "1.0.1"
Collaborator

The latest tag/release is 1.1.0; maybe we should set that here.

appVersion: "1.0.1"
description: A Helm chart to deploy the CVMFS-CSI Plugin
name: cvmfs-csi
version: 1.3.2
Collaborator

Should we start with something like 0.1.0?
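
For illustration, a Chart.yaml reflecting both suggestions (appVersion matching the 1.1.0 release and the chart version starting at 0.1.0) might look like the sketch below; the name and description come from the proposed chart, while the two version numbers are only the reviewer's suggestions:

apiVersion: v1
appVersion: "1.1.0"
description: A Helm chart to deploy the CVMFS-CSI Plugin
name: cvmfs-csi
version: 0.1.0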

@@ -0,0 +1,57 @@
kind: Service
Collaborator

We don't use (need) the CVMFS attacher, and I'm not sure if you rely on it - in principle it shouldn't be needed for this CSI driver. In case you do need it for some reason, could you make it optional with an 'enabled' flag in the values?
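
A minimal sketch of how the attacher could be gated behind such a flag; the attacher.enabled key name and the file name are only illustrative, not part of the proposed chart:

# values.yaml (hypothetical key)
attacher:
  enabled: false

# templates/csi-cvmfsplugin-attacher.yaml - wrap the whole manifest
{{- if .Values.attacher.enabled }}
kind: Service
# ... rest of the attacher manifest unchanged ...
{{- end }}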

@@ -0,0 +1,53 @@
apiVersion: v1
Collaborator

Same as below: either drop this or make it optional.

csiProvisioner:
  image: quay.io/k8scsi/csi-provisioner:v1.0.1
  args:
    - "--provisioner=csi-cvmfsplugin"
Collaborator

We've updated the provisioner name to match the driver: cvmfs.csi.cern.ch. Not sure if you're relying on this being csi-cvmfsplugin; if so, could we make it configurable? We would need to update it in all the templates as well.

Here's the internal validation test we're using at CERN:
https://github.com/cernops/cvmfs-csi/blob/64110dd66c5f1825b33c35e67c444b5c6e957c5f/test/01-validation-cern.yaml#L5
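
A hedged sketch of how the name could be made configurable across the templates; the driverName value key is hypothetical, with the CERN name from the comment above used as the default:

# values.yaml (hypothetical key)
driverName: cvmfs.csi.cern.ch

# csi-provisioner sidecar args
- "--provisioner={{ .Values.driverName }}"

# csi-cvmfsplugin container args
- "--drivername={{ .Values.driverName }}"

The StorageClass provisioner field would then reference the same value.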

- "--nodeid=$(NODE_ID)"
- "--endpoint=$(CSI_ENDPOINT)"
- "--v=5"
- "--drivername=csi-cvmfsplugin"

@rochaporto
Collaborator

Hi @almahmoud.

I've done a first try. A few comments in the review, but it looks really good, thanks!

# the cache PVs are defined manually for all pods of the DaemonSet.
# If local cache is enabled, alien cache must be disabled.
localCache:
  enabled: false
Collaborator

Should we invert these defaults? That is, use localCache instead of alienCache? As far as I'm aware, most sites run a local cache.
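
If the defaults were inverted as suggested, the values sketch might read as follows; this assumes the chart also exposes an alienCache block mirroring localCache:

localCache:
  enabled: true
alienCache:
  enabled: false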

@rochaporto
Collaborator

@almahmoud here are a couple of changes I've made:
733a034

Let me know; I'll probably merge this soon with the above.

@rochaporto rochaporto merged commit 9c6ca02 into cvmfs-contrib:master May 18, 2020
@almahmoud
Contributor Author

All the changes look good to me, sorry for not getting back to this earlier... it’s been a rough few months (our PI passed on top of everything happening in the world)...
Thanks for all the help and getting it merged!

@rochaporto
Collaborator

Thanks for the contribution, hope all is well.
