
Pod becomes CrashLoopBackOff after runner finished #2782

Closed
7 tasks done
int128 opened this issue Jul 28, 2023 · 0 comments · Fixed by #2787
Assignees
Labels
bug Something isn't working gha-runner-scale-set Related to the gha-runner-scale-set mode

Comments

@int128
Contributor

int128 commented Jul 28, 2023

Controller Version

0.4.0

Helm Chart Version

0.4.0

CertManager Version

No response

Deployment Method

ArgoCD

cert-manager installation

Yes.

Checks

  • This isn't a question or user support case (for Q&A and community support, go to Discussions; it might also be a good idea to contact any of the contributors and maintainers if your business is critical and you need priority support)
  • I've read the release notes before submitting this issue and I'm sure it's not due to any recently introduced backward-incompatible changes
  • My actions-runner-controller version (v0.x.y) does support the feature
  • I've already upgraded ARC (including the CRDs, see charts/actions-runner-controller/docs/UPGRADING.md for details) to the latest and it didn't fix the issue
  • I've migrated to the workflow job webhook event (if you are using webhook-driven scaling)

Resource Definitions

apiVersion: actions.github.com/v1alpha1
kind: AutoscalingRunnerSet
metadata:
  name: example-amd64
  namespace: arc-runners
  labels:
    app.kubernetes.io/component: "autoscaling-runner-set"
    helm.sh/chart: gha-runner-scale-set-0.4.0
    app.kubernetes.io/name: gha-runner-scale-set
    app.kubernetes.io/instance: example-amd64
    app.kubernetes.io/version: "0.4.0"
    app.kubernetes.io/managed-by: Helm
    app.kubernetes.io/part-of: gha-runner-scale-set
    actions.github.com/scale-set-name: example-amd64
    actions.github.com/scale-set-namespace: arc-runners
  annotations:
    actions.github.com/cleanup-manager-role-binding: example-amd64-gha-runner-scale-set-manager-role-binding
    actions.github.com/cleanup-manager-role-name: example-amd64-gha-runner-scale-set-manager-role
    actions.github.com/cleanup-no-permission-service-account-name: example-amd64-gha-runner-scale-set-no-permission-service-account
spec:
  githubConfigUrl: https://github.com/example/example
  githubConfigSecret: my-secret
  maxRunners: 400

  template:
    spec:
      serviceAccountName: example-amd64-gha-runner-scale-set-no-permission-service-account
      containers:
      - name: runner
        image: ghcr.io/quipper/actions-runner:2.306.0
        resources:
          limits:
            memory: 16Gi
          requests:
            cpu: "2"
            memory: 4Gi
        securityContext:
          privileged: true
        env:
        volumeMounts:

To Reproduce

1. Start a job in GitHub Actions
2. Check the status of the pod using kubectl

Describe the bug

A pod of an EphemeralRunner is restarted even after the runner has completed successfully.
I observed the following pod lifecycle:

  1. Pod starts
  2. Runner container starts
  3. Runner container completes successfully
  4. Pod is restarted
  5. Runner container starts again
  6. Runner container crashes, because the corresponding GitHub Actions job has already completed
  7. Finally, the pod is deleted by the controller

The pod is restarted due to the default value of the restartPolicy field, which is Always:
https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/#restart-policy

I noticed this because our Kubernetes running cost increased after migrating to the new ARC.
This restart behavior wastes compute cost.

Describe the expected behavior

Do not restart the pod if the runner completes successfully.

I set restartPolicy to OnFailure as follows:

apiVersion: actions.github.com/v1alpha1
kind: AutoscalingRunnerSet
# (snip)
spec:
  template:
    spec:
      restartPolicy: OnFailure

and confirmed that the runner pod no longer restarts.

It would be nice if ARC set restartPolicy to OnFailure by default.

Whole Controller Logs

2023-07-27T04:58:05Z INFO EphemeralRunnerSet Created new ephemeral runner {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "runner": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:13Z INFO EphemeralRunner Adding runner registration finalizer {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:13Z INFO EphemeralRunner Successfully added runner registration finalizer {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:13Z INFO EphemeralRunner Adding finalizer {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:13Z INFO EphemeralRunner Successfully added finalizer {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:16Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:23Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:24Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:24Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:24Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is not registered yet {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:25Z INFO EphemeralRunner Creating new ephemeral runner registration and updating status with runner config {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:25Z INFO EphemeralRunner Creating ephemeral runner JIT config {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:25Z INFO EphemeralRunner Created ephemeral runner JIT config {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm", "runnerId": 3914644}
2023-07-27T04:58:25Z INFO EphemeralRunner Updating ephemeral runner status with runnerId and runnerJITConfig {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:25Z INFO EphemeralRunner Updated ephemeral runner status with runnerId and runnerJITConfig {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:50Z INFO EphemeralRunner Creating new ephemeral runner secret for jitconfig. {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:50Z INFO EphemeralRunner Creating new secret for ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:50Z INFO EphemeralRunner Created new secret spec for ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:58:50Z INFO EphemeralRunner Created ephemeral runner secret {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm", "secretName": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:05Z INFO EphemeralRunner Creating new EphemeralRunner pod. {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:05Z INFO EphemeralRunner Creating new pod for ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:05Z INFO EphemeralRunner Created new pod spec for ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:06Z INFO EphemeralRunner Created ephemeral runner pod {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm", "runnerScaleSetId": 1, "runnerName": "example-amd64-c7b8z-runner-8whgm", "runnerId": 3914644, "configUrl": "https://github.com/example/example", "podName": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:17Z INFO service update job info for runner {"runnerName": "example-amd64-c7b8z-runner-8whgm", "ownerName": "example", "repoName": "example", "workflowRef": "****", "workflowRunId": 5676667485, "jobDisplayName": "****", "requestId": 4900641}
2023-07-27T04:59:18Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is running a job {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm", "jobRequestId": 4900641}
2023-07-27T04:59:22Z INFO EphemeralRunnerSet Skipping ephemeral runner since it is running a job {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm", "jobRequestId": 4900641}
2023-07-27T04:59:25Z INFO EphemeralRunner Ephemeral runner container is still running {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:25Z INFO EphemeralRunner Updating ephemeral runner status with pod phase {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm", "phase": "Running", "reason": "", "message": ""}
2023-07-27T04:59:25Z INFO EphemeralRunner Updated ephemeral runner status with pod phase {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T04:59:37Z INFO EphemeralRunner Ephemeral runner container is still running {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:05:24Z INFO service job completed message received. {"RequestId": 4900641, "Result": "succeeded", "RunnerId": 3914644, "RunnerName": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:05:39Z INFO EphemeralRunner Ephemeral runner container is still running {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:05:59Z INFO EphemeralRunner Ephemeral runner container is still running {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:06:37Z INFO EphemeralRunner Ephemeral runner container is still running {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:07:28Z INFO EphemeralRunner Ephemeral runner container is still running {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:08:45Z INFO EphemeralRunner Checking if runner exists in GitHub service {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm", "runnerId": 3914644}
2023-07-27T05:08:45Z INFO EphemeralRunner Runner does not exist in GitHub service {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm", "runnerId": 3914644}
2023-07-27T05:08:45Z INFO EphemeralRunner Ephemeral runner has finished since it does not exist in the service anymore {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:08:45Z INFO EphemeralRunner Updating ephemeral runner status to Finished {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:08:45Z INFO EphemeralRunnerSet Deleting finished ephemeral runner {"ephemeralrunnerset": "arc-runners/example-amd64-c7b8z", "name": "example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:08:45Z INFO EphemeralRunner EphemeralRunner status is marked as Finished {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:08Z INFO EphemeralRunner Successfully removed runner registration finalizer {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:24Z INFO EphemeralRunner Finalizing ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:24Z INFO EphemeralRunner Cleaning up the runner pod {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:24Z INFO EphemeralRunner Deleting the runner pod {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:24Z INFO EphemeralRunner Waiting for ephemeral runner owned resources to be deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:37Z INFO EphemeralRunner Finalizing ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:37Z INFO EphemeralRunner Cleaning up the runner pod {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:37Z INFO EphemeralRunner Pod is deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:37Z INFO EphemeralRunner Cleaning up the runner jitconfig secret {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:37Z INFO EphemeralRunner Deleting the jitconfig secret {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:37Z INFO EphemeralRunner Waiting for ephemeral runner owned resources to be deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Finalizing ephemeral runner {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Cleaning up the runner pod {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Pod is deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Cleaning up the runner jitconfig secret {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Secret is deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Cleaning up runner linked pods {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Runner-linked pods are deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Cleaning up runner linked secrets {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Runner-linked secrets are deleted {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Removing finalizer {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}
2023-07-27T05:09:55Z INFO EphemeralRunner Successfully removed finalizer after cleanup {"ephemeralrunner": "arc-runners/example-amd64-c7b8z-runner-8whgm"}

Whole Runner Pod Logs

time="2023-07-28T00:21:22.539456285Z" level=info msg="Starting up"
time="2023-07-28T00:21:22.540221494Z" level=info msg="libcontainerd: started new containerd process" pid=29
time="2023-07-28T00:21:22.540247521Z" level=info msg="parsed scheme: \"unix\"" module=grpc
time="2023-07-28T00:21:22.540253683Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc
time="2023-07-28T00:21:22.540268677Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/containerd/containerd.sock  <nil> 0 <nil>}] <nil> <nil>}" module=grpc
time="2023-07-28T00:21:22.540281018Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc
time="2023-07-28T00:21:22Z" level=warning msg="containerd config version `1` has been deprecated and will be removed in containerd v2.0, please switch to version `2`, see https://github.com/containerd/containerd/blob/main/docs/PLUGINS.md#version-header"
time="2023-07-28T00:21:22.551939640Z" level=info msg="starting containerd" revision=5b842e528e99d4d4c1686467debf2bd4b88ecd86 version=v1.6.15
time="2023-07-28T00:21:22.560590472Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
time="2023-07-28T00:21:22.560662475Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.560751173Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exec: \"modprobe\": executable file not found in $PATH \"\"): skip plugin" type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.560793064Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.560933473Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (xfs) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.560947663Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.560957794Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured"
time="2023-07-28T00:21:22.560969361Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.561001668Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.561099039Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.561196927Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
time="2023-07-28T00:21:22.561211505Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
time="2023-07-28T00:21:22.561237341Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured"
time="2023-07-28T00:21:22.561248896Z" level=info msg="metadata content store policy set" policy=shared
time="2023-07-28T00:21:22.561857202Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
time="2023-07-28T00:21:22.561874595Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
time="2023-07-28T00:21:22.561889323Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
time="2023-07-28T00:21:22.561921771Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.561942903Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.561960715Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.561977531Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.561995797Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.562164751Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.562177206Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.562187308Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.562204642Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
time="2023-07-28T00:21:22.562291581Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
time="2023-07-28T00:21:22.562357110Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
time="2023-07-28T00:21:22.562628932Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
time="2023-07-28T00:21:22.562648651Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562661496Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
time="2023-07-28T00:21:22.562706222Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562719267Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562733234Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562747893Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562764108Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562780516Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562806612Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562821967Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562834656Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
time="2023-07-28T00:21:22.562935589Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562957642Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562971471Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
time="2023-07-28T00:21:22.562981483Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
time="2023-07-28T00:21:22.562998692Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1
time="2023-07-28T00:21:22.563011279Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
time="2023-07-28T00:21:22.563024518Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin"
time="2023-07-28T00:21:22.563209799Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
time="2023-07-28T00:21:22.563390472Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
time="2023-07-28T00:21:22.563480239Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
time="2023-07-28T00:21:22.563510991Z" level=info msg="containerd successfully booted in 0.012120s"
time="2023-07-28T00:21:22.572241992Z" level=info msg="parsed scheme: \"unix\"" module=grpc
time="2023-07-28T00:21:22.572262184Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc
time="2023-07-28T00:21:22.572275656Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/containerd/containerd.sock  <nil> 0 <nil>}] <nil> <nil>}" module=grpc
time="2023-07-28T00:21:22.572288590Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc
time="2023-07-28T00:21:22.572940016Z" level=info msg="parsed scheme: \"unix\"" module=grpc
time="2023-07-28T00:21:22.572952081Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc
time="2023-07-28T00:21:22.572960804Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/containerd/containerd.sock  <nil> 0 <nil>}] <nil> <nil>}" module=grpc
time="2023-07-28T00:21:22.572965849Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc
time="2023-07-28T00:21:22.575505618Z" level=warning msg="Your kernel does not support cgroup blkio weight"
time="2023-07-28T00:21:22.575521196Z" level=warning msg="Your kernel does not support cgroup blkio weight_device"
time="2023-07-28T00:21:22.575623267Z" level=info msg="Loading containers: start."
time="2023-07-28T00:21:22.670831641Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
time="2023-07-28T00:21:22.727510830Z" level=info msg="Loading containers: done."
time="2023-07-28T00:21:22.732456862Z" level=info msg="Docker daemon" commit=6051f14 graphdriver(s)=overlay2 version=20.10.23
time="2023-07-28T00:21:22.732545954Z" level=info msg="Daemon has completed initialization"
time="2023-07-28T00:21:22.742676124Z" level=info msg="API listen on /var/run/docker.sock"

√ Connected to GitHub

Failed to create a session. The runner registration has been deleted from the server, please re-configure.
Runner listener exit with terminated error, stop the service, no retry needed.
Exiting runner...

Additional Context

It seems possible to set the default value of restartPolicy here:

newPod.Spec = runner.Spec.PodTemplateSpec.Spec

I'm happy to create a pull request.
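The proposed defaulting could look roughly like the sketch below. The types and the helper name `defaultRestartPolicy` are hypothetical stand-ins for illustration; the real controller copies the user's PodTemplateSpec into a corev1.PodSpec, and the change would be to fill in restartPolicy only when the template leaves it empty, so an explicit user setting is still respected.

```go
package main

import "fmt"

// PodSpec is a minimal stand-in for corev1.PodSpec; only the field
// relevant to this issue is modeled here.
type PodSpec struct {
	RestartPolicy string
}

// defaultRestartPolicy (hypothetical helper) applies the proposed
// default: if the user's pod template leaves restartPolicy empty,
// set it to OnFailure instead of letting Kubernetes default it to
// Always. An explicit user value is left untouched.
func defaultRestartPolicy(spec PodSpec) PodSpec {
	if spec.RestartPolicy == "" {
		spec.RestartPolicy = "OnFailure"
	}
	return spec
}

func main() {
	fmt.Println(defaultRestartPolicy(PodSpec{}).RestartPolicy)                       // defaulted to OnFailure
	fmt.Println(defaultRestartPolicy(PodSpec{RestartPolicy: "Never"}).RestartPolicy) // user value kept
}
```

With this defaulting in place, a runner container that exits 0 would not be restarted, avoiding the crash loop described above.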

@int128 int128 added bug Something isn't working needs triage Requires review from the maintainers labels Jul 28, 2023
@int128 int128 changed the title Pod is restarting after completion of runner Pod becomes CrashLoopBackOff after runner finished Jul 31, 2023
@nikola-jokic nikola-jokic added gha-runner-scale-set Related to the gha-runner-scale-set mode and removed needs triage Requires review from the maintainers labels Jul 31, 2023
@nikola-jokic nikola-jokic self-assigned this Aug 1, 2023