# 'archive already exist' issue when multiple apps use the same helm chart #8008
## Comments
Has anyone figured out any workarounds for this? The error eventually goes away, but it is frustrating that it slows down the sync cycle. I have an ApplicationSet that creates an Application for every directory in a git repository, roughly as sketched below, and those directories all import some shared Helm charts. The error is very intermittent, but when it occurs, Argo doesn't seem to make another attempt at building the kustomization for a while (5-10 minutes).
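A minimal sketch of that kind of ApplicationSet, assuming a git directory generator; the repo URL and paths are illustrative, not from the original comment:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: ApplicationSet
metadata:
  name: per-directory-apps
spec:
  generators:
  - git:
      repoURL: https://github.com/example/deployments.git  # illustrative
      revision: HEAD
      directories:
      - path: apps/*        # one Application per directory
  template:
    metadata:
      name: '{{path.basename}}'
    spec:
      project: default
      source:
        repoURL: https://github.com/example/deployments.git
        targetRevision: HEAD
        path: '{{path}}'    # each directory's kustomization pulls shared Helm charts
      destination:
        server: https://kubernetes.default.svc
        namespace: '{{path.basename}}'
```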
Committing the entire chart to the repo seems to avoid it. It seems like maybe this bug is more of an upstream issue in kustomize, or even Helm. I have a repro of the bug here: https://github.com/adrianlyjak/kustomize-bugs
This affects me too and is really annoying.
Combined with Argo CD notifications on apps in an Unknown state, this is a huge pain, as it gets triggered several times per hour.
Having the same issue (Argo CD 2.7.9, Kustomize v5). Storing the Helm chart files in Git is not an option for me; it would be painful to manage every time there's an update to the Helm chart.
I guess this is a concurrency issue: if apps were "built" one by one, there would be no issue?
I've worked around this by adding a `helmGlobals` override (sketched below) to my Kustomize file. The issue is that the chart is locally cached under a location shared by all apps, so concurrent builds collide.
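A minimal sketch of that kind of override, assuming the goal is to give each app its own chart cache directory; the `chartHome` value here is purely illustrative:

```yaml
# kustomization.yaml (sketch; the chartHome path is illustrative)
helmGlobals:
  # Cache pulled charts under an app-specific directory instead of the
  # default shared location, so concurrent builds don't race on one path.
  chartHome: charts/my-app
```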
To give a concrete example of a situation where this can happen and for which no workaround has been identified so far: a Kustomize base that pulls a Helm chart via `helmCharts`, plus two overlays, each referring to the base as usual with Kustomize. Overlay1 and overlay2 are each referenced by an ArgoCD Application (via an ApplicationSet and git generator in my case, but I don't think that matters). A sketch of the layout follows.
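A sketch of that layout, with all names illustrative:

```yaml
# Layout (illustrative):
#   base/kustomization.yaml               <- pulls the shared chart via helmCharts
#   overlays/overlay1/kustomization.yaml  <- referenced by ArgoCD Application 1
#   overlays/overlay2/kustomization.yaml  <- referenced by ArgoCD Application 2
#
# overlays/overlay1/kustomization.yaml (overlay2 is identical):
resources:
- ../../base
```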
We ran into this issue as well. It's a pure race condition. The solution, at this time, is to vendor the chart locally in a centralized location next to your kustomization. You can actually recreate the error locally if you catch it fast enough while running concurrent builds. As mentioned above, we vendored our chart locally and pointed our base kustomization to it with:

```yaml
helmGlobals:
  chartHome: ../../.helm/<chart>
```

Now, since the chart exists in the repo, ArgoCD won't completely clear it out, and the kustomize build process won't have to pull it. It's also easy to manage your vendored charts with something like vendir.
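For completeness, a minimal vendir config for vendoring a chart into that location might look like this; the chart name, version, and repository URL are all illustrative, not from the original comment:

```yaml
# vendir.yml (sketch) -- run `vendir sync` to pull the chart into .helm/
apiVersion: vendir.k14s.io/v1alpha1
kind: Config
directories:
- path: .helm
  contents:
  - path: podinfo                       # illustrative chart name
    helmChart:
      name: podinfo
      version: "6.5.4"                  # illustrative version
      repository:
        url: https://stefanprodan.github.io/podinfo   # illustrative repo
```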
I've opened an issue at Kustomize to track this: kubernetes-sigs/kustomize#5271. Feel free to comment there to raise awareness. From my experience, it takes a very long time to get things moving in Kustomize, though. I would love it if ArgoCD could somehow work around it, maybe by adding some small delay between apps.
FYI, in my case the workaround is to use the new "progressive sync" feature of ApplicationSets, so that there's no concurrency for these cases (sketched below). EDIT: surprisingly, even with a progressive sync of one by one, I still get the error from time to time :'(
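A sketch of what that can look like, assuming the generated Applications carry an `env` label used to group them into sync waves; the label key and values are illustrative:

```yaml
# ApplicationSet progressive sync (RollingSync) sketch
apiVersion: argoproj.io/v1alpha1
kind: ApplicationSet
metadata:
  name: example
spec:
  strategy:
    type: RollingSync
    rollingSync:
      steps:
      - matchExpressions:        # first wave
        - key: env
          operator: In
          values: [dev]
      - matchExpressions:        # second wave, after the first is synced
        - key: env
          operator: In
          values: [prod]
  # generators and template omitted for brevity
```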
Encountered this recently, but it turned out to be inconsequential, a red herring. The real issue was invalid YAML in our Helm values.
## Original issue
Checklist:

- [x] `argocd version`
**Describe the bug**
We have several different apps deployed from the same Helm chart and are regularly getting `archive already exist` errors like the one quoted in the title.
**To Reproduce**
Create several kustomize-based Applications that use the same Helm chart via `helmCharts`, as in the sketch below.
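A minimal kustomization using `helmCharts`; the chart name, repo, and version are illustrative:

```yaml
# kustomization.yaml (sketch) -- several apps each build a file like this,
# all pulling the same chart
helmCharts:
- name: podinfo                                   # illustrative
  repo: https://stefanprodan.github.io/podinfo    # illustrative
  version: 6.5.4
  releaseName: podinfo
```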
**Expected behavior**
No warnings during apply.
**Version**

**Logs**