This repository has been archived by the owner on Jul 15, 2024. It is now read-only.

Sanitize application names #306

Closed
alvarogonzalez-packlink opened this issue Jul 22, 2021 · 0 comments · Fixed by #436
Labels
enhancement New feature or request

Comments

@alvarogonzalez-packlink

Summary: automatically generated application names (e.g. from a template variable like {{path.basename}}) should be sanitized to avoid problematic characters.

Background: We have almost a hundred microservices, with all their kustomize templates in a single repository, so we've created an ApplicationSet using the git generator with this name template:

[...]
  template:
    metadata:
      name: '{{path.basename}}'
      namespace: argocd
[...]

Everything seemed to work properly until we noticed that the number of apps deployed by the ApplicationSet didn't match. After some debugging, we found that the applications not deployed were the ones located in directories with an underscore in their name.

Argo ingested the ApplicationSet seemingly without problems, and there weren't any errors in the Argo CD UI, but we finally found this in the argocd-applicationset-controller logs:

2021-07-22T12:08:42.461Z        ERROR   controller-runtime.manager.controller.applicationset    Reconciler error    {"reconciler group": "argoproj.io", "reconciler kind": "ApplicationSet", "name": "production-apps-set", "namespace": "argocd", "error": "Application.argoproj.io \"office_back\" is invalid: metadata.name: Invalid value: \"office_back\": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')"}
github.com/go-logr/zapr.(*zapLogger).Error
        /go/pkg/mod/github.com/go-logr/zapr@v0.2.0/zapr.go:132
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler
        /go/pkg/mod/sigs.k8s.io/controller-runtime@v0.7.0/pkg/internal/controller/controller.go:267
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem
        /go/pkg/mod/sigs.k8s.io/controller-runtime@v0.7.0/pkg/internal/controller/controller.go:235
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func1.1
        /go/pkg/mod/sigs.k8s.io/controller-runtime@v0.7.0/pkg/internal/controller/controller.go:198
k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1
        /go/pkg/mod/k8s.io/apimachinery@v0.19.2/pkg/util/wait/wait.go:185
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1
        /go/pkg/mod/k8s.io/apimachinery@v0.19.2/pkg/util/wait/wait.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil
        /go/pkg/mod/k8s.io/apimachinery@v0.19.2/pkg/util/wait/wait.go:156
k8s.io/apimachinery/pkg/util/wait.JitterUntil
        /go/pkg/mod/k8s.io/apimachinery@v0.19.2/pkg/util/wait/wait.go:133
k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext
        /go/pkg/mod/k8s.io/apimachinery@v0.19.2/pkg/util/wait/wait.go:185
k8s.io/apimachinery/pkg/util/wait.UntilWithContext
        /go/pkg/mod/k8s.io/apimachinery@v0.19.2/pkg/util/wait/wait.go:99
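The failure is easy to reproduce in isolation. This is a minimal sketch (function name `isValidAppName` is ours, not Argo CD's) that applies the exact validation regex quoted in the error message above to the directory name from our repository:

```go
package main

import (
	"fmt"
	"regexp"
)

// dns1123Subdomain is the validation regex quoted verbatim in the
// Kubernetes error message above (DNS-1123 subdomain rule).
var dns1123Subdomain = regexp.MustCompile(
	`^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$`)

// isValidAppName reports whether a generated Application name would
// pass the apiserver's metadata.name validation.
func isValidAppName(name string) bool {
	return dns1123Subdomain.MatchString(name)
}

func main() {
	fmt.Println(isValidAppName("office_back")) // false: '_' is not allowed
	fmt.Println(isValidAppName("office-back")) // true
}
```

Any `{{path.basename}}` containing an underscore (or an uppercase letter) will fail this check, and the resulting Application is silently never created.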

We only caught it because we're very early in our deployment process; if 50 or 100 apps were already deployed, the failure could easily slip through unless we compared the number of deployed apps or went through all the logs.

Suggested features/solutions:
There are a couple of things that would make this way easier:

The second feature is probably more difficult until ApplicationSet is more tightly coupled to Argo CD.
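The first suggestion could look something like the following sketch. This is a hypothetical illustration (the function `sanitizeName` and its exact rules are assumptions, not Argo CD's actual implementation): lowercase the generated name, replace characters forbidden by DNS-1123 with a hyphen, and trim any non-alphanumeric leading/trailing characters.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// invalidChars matches every character that is not allowed anywhere
// in a DNS-1123 subdomain.
var invalidChars = regexp.MustCompile(`[^a-z0-9.-]`)

// sanitizeName is a hypothetical sketch of the suggested feature:
// turn a template-generated name into a valid DNS-1123 subdomain.
func sanitizeName(name string) string {
	name = strings.ToLower(name)                     // DNS-1123 is lowercase only
	name = invalidChars.ReplaceAllString(name, "-")  // e.g. '_' becomes '-'
	name = strings.Trim(name, "-.")                  // must start/end alphanumeric
	return name
}

func main() {
	fmt.Println(sanitizeName("office_back")) // office-back
}
```

With a transformation along these lines applied to `{{path.basename}}`, a directory like `office_back` would yield a deployable Application instead of a silent reconcile error.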
