Persisted Folder(s) #1517
Are you sure your config is valid and works for other apps? Could you share some more details? I don't know that much about Kubernetes, but I think other people have managed to install it successfully. |
Yeah, I think this sounds like something with your Kubernetes config, because it should work okay. |
Kubernetes is not my area of expertise, so please someone jump in and correct me if I'm wrong here. But I would think that for MicroK8s you'd need something roughly like:

PersistentVolume

```yaml
apiVersion: v1
kind: PersistentVolume
metadata:
  name: dashy-pv
spec:
  storageClassName: manual
  capacity:
    storage: 1Gi
  accessModes:
    - ReadWriteOnce
  hostPath:
    path: "/mnt/microk8s-data/dashy"
```

PersistentVolumeClaim

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: dashy-pvc
spec:
  storageClassName: manual
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi
```

Deployment

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: dashy
spec:
  replicas: 1
  selector:
    matchLabels:
      app: dashy
  template:
    metadata:
      labels:
        app: dashy
    spec:
      containers:
        - name: dashy
          image: lissy93/dashy:latest
          ports:
            - containerPort: 8080
          volumeMounts:
            - name: config-volume
              mountPath: /app/user-data
      volumes:
        - name: config-volume
          persistentVolumeClaim:
            claimName: dashy-pvc
```
|
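As a side note (an untested sketch, not from the thread): MicroK8s ships its own dynamic hostpath provisioner, so the manual PV above could likely be skipped by requesting the `microk8s-hostpath` storage class (the same class Portainer uses later in this thread) directly in the PVC:

```yaml
# Untested sketch: let MicroK8s's built-in hostpath provisioner create
# the backing volume, instead of defining a manual PersistentVolume.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: dashy-pvc
spec:
  storageClassName: microk8s-hostpath  # MicroK8s's dynamic hostpath class
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi
```

The Deployment would stay the same either way, since it only references the claim by name.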
Below is the application as created with the Portainer form. For the persisted folder in this configuration I used the following values:
No matter what I choose for the path in the container, and unlike my other apps, nothing ever goes in there, and Dashy always goes back to default after a reboot. I see my changes go into /app/user-data when saving to disk, but it will not deploy if I use that for my persisted folder. Hope that helps! :)

```yaml
apiVersion: apps/v1
kind: StatefulSet
metadata:
  annotations:
    io.portainer.kubernetes.application.note: ""
  creationTimestamp: 2024-05-11T20:17:16Z
  generation: 1
  labels:
    io.portainer.kubernetes.application.name: dashy
    io.portainer.kubernetes.application.owner: admin
    io.portainer.kubernetes.application.stack: dashy
  name: dashy
  namespace: default
  uid: 0b77fa93-6116-44bf-879f-906b25ce9c9a
spec:
  podManagementPolicy: OrderedReady
  replicas: 1
  revisionHistoryLimit: 10
  selector:
    matchLabels:
      app: dashy
  serviceName: headless-dashy
  template:
    metadata:
      creationTimestamp: null
      labels:
        app: dashy
        io.portainer.kubernetes.application.name: dashy
    spec:
      containers:
        - env:
            - name: TZ
              value: America/Montreal
            - name: MODE_ENV
              value: production
          image: lissy93/dashy:latest
          imagePullPolicy: Always
          name: dashy
          resources: {}
          terminationMessagePath: /dev/termination-log
          terminationMessagePolicy: File
          volumeMounts:
            - mountPath: /config
              name: dashy-2a785c16-af22-45ec-8077-40ab81c3e6e6
      dnsPolicy: ClusterFirst
      restartPolicy: Always
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
      volumes:
        - name: dashy-2a785c16-af22-45ec-8077-40ab81c3e6e6
          persistentVolumeClaim:
            claimName: dashy-2a785c16-af22-45ec-8077-40ab81c3e6e6
  updateStrategy:
    rollingUpdate:
      partition: 0
    type: RollingUpdate
  volumeClaimTemplates:
    - apiVersion: v1
      kind: PersistentVolumeClaim
      metadata:
        creationTimestamp: null
        labels:
          app: dashy
          io.portainer.kubernetes.application.name: dashy
          io.portainer.kubernetes.application.owner: admin
        name: dashy-2a785c16-af22-45ec-8077-40ab81c3e6e6
        namespace: default
      spec:
        accessModes:
          - ReadWriteOnce
        resources:
          requests:
            storage: 5G
        storageClassName: microk8s-hostpath
        volumeMode: Filesystem
      status:
        phase: Pending
status:
  availableReplicas: 1
  collisionCount: 0
  currentReplicas: 1
  currentRevision: dashy-56f4949bdf
  observedGeneration: 1
  readyReplicas: 1
  replicas: 1
  updateRevision: dashy-56f4949bdf
  updatedReplicas: 1
---
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: 2024-05-11T20:17:16Z
  labels:
    io.portainer.kubernetes.application.name: dashy
    io.portainer.kubernetes.application.owner: admin
    io.portainer.kubernetes.application.stack: dashy
  name: headless-dashy
  namespace: default
  uid: 79cce4b7-872d-4904-ab85-a2e75890f6c8
spec:
  clusterIP: None
  clusterIPs:
    - None
  internalTrafficPolicy: Cluster
  ipFamilies:
    - IPv4
  ipFamilyPolicy: SingleStack
  selector:
    app: dashy
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
---
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: 2024-05-11T20:17:16Z
  labels:
    io.portainer.kubernetes.application.name: dashy
    io.portainer.kubernetes.application.owner: admin
    io.portainer.kubernetes.application.stack: dashy
  name: dashy
  namespace: default
  uid: b31e4430-9b01-44eb-9511-13dc46f6d41b
spec:
  clusterIP: 10.152.183.65
  clusterIPs:
    - 10.152.183.65
  externalTrafficPolicy: Cluster
  internalTrafficPolicy: Cluster
  ipFamilies:
    - IPv4
  ipFamilyPolicy: SingleStack
  ports:
    - name: port-0
      nodePort: 31080
      port: 8080
      protocol: TCP
      targetPort: 8080
  selector:
    app: dashy
  sessionAffinity: None
  type: NodePort
status:
  loadBalancer: {}
```
|
This needs to be /app/user-data. Dashy doesn't serve up any other locations (well, except /app/public).
Hmm, that sounds like a setup issue. Is that path deffo there, as a directory, at the specified location, with the right permissions? Do you have an error message for when it doesn't deploy? |
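To make that concrete against the StatefulSet posted above (a sketch only; the volume name is the Portainer-generated one from that manifest), the fix would be changing the container's volumeMounts entry:

```yaml
# Sketch: mount the existing PVC where Dashy actually reads and
# writes its config (/app/user-data), instead of /config.
volumeMounts:
  - mountPath: /app/user-data
    name: dashy-2a785c16-af22-45ec-8077-40ab81c3e6e6
```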
Please see the attached deployment log: dashy-0_logs.txt. If I use /app/user-data as the persisted folder, installing Dashy deletes what is normally held in there, which, per the error in the log, includes the missing conf.yml. If I specify any other path, it remains empty, so there is nothing to restore after a reboot. All the best! |
Does the config exist on the host path when mounting to the folder /app/user-data? Could you create the config file by copying the default config here on GitHub, located in /app/user-data/config.yml, to your host (for example to /mnt/dockerdata/dashy) and then mount /mnt/dockerdata/dashy into the container at /app/user-data? And the path inside the container really has to be /app/user-data; Dashy will expect a config there and not in /config or any other location. |
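A minimal sketch of that suggestion, assuming the manual-PV approach from earlier in the thread and the /mnt/dockerdata/dashy host path named in the comment above (copy config.yml into that directory on the node before deploying):

```yaml
# Sketch: hostPath PV pointing at a directory pre-seeded with config.yml.
apiVersion: v1
kind: PersistentVolume
metadata:
  name: dashy-pv
spec:
  storageClassName: manual
  capacity:
    storage: 1Gi
  accessModes:
    - ReadWriteOnce
  hostPath:
    path: "/mnt/dockerdata/dashy"  # must already contain config.yml
```

The Deployment from the earlier example would then mount the matching claim at /app/user-data unchanged.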
Any updates on this? Two weeks ago or so I realized I was way out of date and updated from 2.1.1 to this 3.1.1. On the first run it wiped out my /app/config folder and created the /app/user-data folder, so I had to re-import my config.yml. Nothing on my deployment has changed. Ever since, every time the container recreates, it overwrites(?) /app/user-data/config.yml, giving me a stock webpage with none of my links. This is after the logs say the config is valid. Fortunately, the cloud backup portion is not being wiped out. It's not a write/permissions issue on the PVC, since I can import manually or restore from the cloud and then save to config, and the cloud ID is retained despite the config missing. After the container recreates, you can see the timestamp of the file is May 25 (vs the clock) and it is only 1482 bytes. After a cloud restore, the timestamp changes to current and the size increases.

K8s Details

In case you were wondering: Debian 12.7 with 6.10.11 kernel

Dashy Logs
```
k logs -n dashy-ns dashy-8658754949-s7c2h
Checking config file against schema...
SSL Not Enabled: Public key not present
[Dashy ASCII-art banner]
Welcome to Dashy! 🚀
Using Dashy V-3.1.1. Update Check Complete
Browserslist: caniuse-lite is outdated. Please run:
File                               Size         Gzipped
dist/js/chunk-vendors.c2e0a379.js  7157.57 KiB  2621.08 KiB
Images and other types of assets omitted.
DONE  Build complete. The dist directory is ready to be deployed.
```
|
Please post your manifest; without it, it's nearly impossible to tell anything. |
Le Sigh. nvm, I just answered my own stupid question. /app/user-data is not inside of /app/public. I will correct my mount point and try again. |
Environment
MicroK8s Kubernetes
System
No response
Version
2.1.2
Describe the problem
To function with MicroK8s Kubernetes, Dashy requires at least one persisted folder for conf.yml and any other files that need to be preserved across a redeployment or reboot. With Portainer, it can be installed using a kubectl shell as well as with Helm. However, upon a reboot it reverts to its original installation, and any changes are lost.
Additional info
No response
Update: With version 3.0 this still doesn't work! Once installed using Portainer, it always redeploys from scratch upon a reboot. Also, if I try to make /app/user-data the persisted folder, the installation fails.