Proxy and Redis issues #291
Comments
Logs from the server pod:
Logs from the worker pod:
Redis is a requirement (https://docs.goauthentik.io/docs/core/architecture#redis); I can only suggest deploying a Redis server yourself.
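For readers hitting the same problem, the suggestion above usually translates to disabling the bundled Redis subchart and pointing authentik at a self-managed instance through the chart values. This is only a minimal sketch, assuming the 2024.8.x chart still exposes authentik.redis.host / authentik.redis.password, and using a hypothetical service name for the self-managed Redis:

    # values.yaml (sketch, not a verified configuration)
    redis:
      enabled: false          # skip the bundled bitnami/redis deployment
    authentik:
      redis:
        host: my-redis-master.redis.svc.cluster.local   # hypothetical address of the self-managed Redis
        password: ""                                    # set if that Redis requires auth

The same settings can also be supplied to the pods as environment variables (AUTHENTIK_REDIS__HOST, AUTHENTIK_REDIS__PASSWORD) if the values path differs in a given chart version.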
Hi all!
I was trying to deploy Authentik on a K8s cluster made of Raspberry Pi 5s (home lab) and I'm running into some issues that I haven't been able to fix...
I'm using the following config
And so far I have found the following errors:
With redis.enabled: true, the redis pod was constantly failing with the error <jemalloc>: Unsupported system page size. I found some issues related to this, like "[bitnami/redis] container crashed when docker run on arm64" (bitnami/containers#26062). It seems a fix was provided in the latest images, and when I started a container with bitnami/redis:latest I at least didn't get the error and the pod was up and running, so probably authentik just needs to update to a newer container tag / chart version? (A possible values override for this is sketched below, after the environment details.)
With redis.enabled: false:
-- (1) In the logs for both the server and worker pods there was a
{"event": "Redis Connection failed, retrying... (Timeout connecting to server)", "level": "info", "logger": "authentik.lib.config", "timestamp": 1729369834.256011}
error and the pods restarted. So either there's an issue in the config that is not propagating redis.enabled: false everywhere it needs to go, or Redis is simply required?
-- (2) I was getting the error
{"error":"authentik starting","event":"failed to proxy to backend","level":"warning","logger":"authentik.router","timestamp":"2024-10-19T20:31:39Z"}
on the server pod. The pods were terminated automatically and replaced by new ones, but always with the same errors.
Helm chart version 2024.8.3 from https://artifacthub.io/packages/helm/goauthentik/authentik
Running on a k3s cluster on Raspberry Pi 5s with version v1.30.5+k3s1
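As a rough illustration of the image-tag workaround mentioned in the description: Raspberry Pi OS on the Pi 5 commonly runs a 16K-page kernel, which jemalloc builds configured for 4K pages reject with exactly this error, and the authentik chart ships Redis as the Bitnami subchart, so its image can normally be overridden from the parent values. This is only a sketch under those assumptions; which tag actually contains the arm64 page-size fix would need to be checked against bitnami/containers#26062.

    # values.yaml (sketch, assuming the Bitnami redis subchart image override)
    redis:
      enabled: true
      image:
        repository: bitnami/redis
        tag: latest   # assumption: pick (and ideally pin) a tag that includes the arm64/16K page-size fix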
I'm adding the logs from the pods in the comments to avoid an even bigger description.
Any insight on how to get those issues fixed and get Authentik running would be appreciated.
And thank you for the support!