
ScaledJob Paused Condition doesn't change to False when removing annotation #5251

Closed · nappelson opened this issue Dec 4, 2023 · 11 comments
Labels: bug (Something isn't working), stale (All issues that are marked as stale due to inactivity)

@nappelson (Contributor)

### Report

When I set the ScaledJob paused annotation

```yaml
autoscaling.keda.sh/paused: "true"
```

confirm that the job is paused, and then remove that annotation, the `Paused` condition from `kubectl get ScaledJob` still shows `True`.

Interestingly, the KEDA operator does begin scaling the Job again, so the operator itself does not honor the stale condition.

### Expected Behavior

`kubectl get ScaledJob` should show `Paused = False`

### Actual Behavior

`kubectl get ScaledJob` shows `Paused = True` even though the operator is in fact scaling the job.

### Steps to Reproduce the Problem

1. Create a ScaledJob with `autoscaling.keda.sh/paused: "true"`
2. Confirm the ScaledJob is created with `Paused: True` by calling `kubectl get ScaledJob`
3. Remove the annotation and apply the changes
4. Observe that `kubectl get ScaledJob` still shows `Paused: True` even though the job is scaling


### Logs from KEDA operator

The KEDA operator does in fact log `INFO ScaledJob is paused, stopping scaling loop.` after removing the annotation, even though the ScaledJob resource condition shows that it is still paused.

### KEDA Version

2.12.0

### Kubernetes Version

1.27

### Platform

Google Cloud

### Scaler Details

_No response_

### Anything else?

I found this bug while testing code for https://github.com/kedacore/keda/issues/5215
@nappelson added the bug label Dec 4, 2023
@nappelson (Contributor, Author)

I think I have found the bug...

```go
err = r.requestScaleLoop(ctx, logger, scaledJob)
```

Essentially, `requestScaleLoop` is being called with the old `scaledJob` object. The conditions haven't been updated yet here:

```go
if err := kedastatus.SetStatusConditions(ctx, r.Client, reqLogger, scaledJob, conditions); err != nil {
```

`requestScaleLoop` needs to be called after we call `SetStatusConditions`.

I'm happy to fix this in #5215 or open two separate PRs. What would the maintainers prefer?
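
For illustration, here is a minimal, self-contained sketch of the reordering proposed above; the types and helpers below are toy stand-ins for KEDA's real reconciler APIs (`r.requestScaleLoop`, `kedastatus.SetStatusConditions`), not the actual implementation:

```go
// Sketch only: illustrative stand-ins, not real KEDA code.
package main

import (
	"context"
	"fmt"
)

// scaledJob is a toy stand-in for the ScaledJob resource.
type scaledJob struct {
	annotations map[string]string
	conditions  map[string]bool // e.g. conditions["Paused"]
}

// setStatusConditions stands in for kedastatus.SetStatusConditions: it persists
// the freshly computed conditions onto the object before anything consumes them.
func setStatusConditions(_ context.Context, sj *scaledJob, conditions map[string]bool) error {
	sj.conditions = conditions
	return nil
}

// requestScaleLoop stands in for r.requestScaleLoop: it reads the object's
// conditions, so under the proposed ordering it must run after they are updated.
func requestScaleLoop(_ context.Context, sj *scaledJob) error {
	fmt.Printf("scale loop sees Paused=%v\n", sj.conditions["Paused"])
	return nil
}

func reconcile(ctx context.Context, sj *scaledJob) error {
	paused := sj.annotations["autoscaling.keda.sh/paused"] == "true"
	conditions := map[string]bool{"Paused": paused}

	// Proposed order: persist the status conditions first...
	if err := setStatusConditions(ctx, sj, conditions); err != nil {
		return err
	}
	// ...then kick off the scale loop with the up-to-date object.
	return requestScaleLoop(ctx, sj)
}

func main() {
	// The annotation has just been removed, so the loop should see Paused=false.
	sj := &scaledJob{annotations: map[string]string{}}
	_ = reconcile(context.Background(), sj)
}
```

With the original ordering (scale loop first, conditions second), the loop would read whatever conditions were stored on the previous reconcile, which is the stale-object behavior described above.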

@zroubalik (Member)

@nappelson thanks! Yes, please use 2 different PRs.

@nappelson (Contributor, Author)

> I think I have found the bug... Essentially, `requestScaleLoop` is being called with the old `scaledJob` object. The conditions haven't been updated yet... `requestScaleLoop` needs to be called after we call `SetStatusConditions`.

Actually, this doesn't appear to be "the bug": ScaledObjects call `requestScaleLoop` before `SetStatusConditions` as well...

I'm wondering if this is due to how the respective `scale_executors` are implemented for ScaledObjects and ScaledJobs, and whether there is a bug in the executor for jobs.

@zroubalik (Member)

Yeah, looking briefly at the code: we should probably store the paused condition right away; it is most likely a special case. For the other conditions it makes sense to wait for the result of requestScaleLoop() or other functions/validations.

Or do you see an easy way we can handle this in the executor? I haven't seen the code for some time, so I don't remember the details 😄 Will try to dig in more later on.
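
To make that suggestion concrete, here is a rough sketch of treating Paused as a special case that is derived from the annotation alone and stored right away, in both directions; all names below are illustrative, not KEDA's actual API:

```go
// Sketch only: illustrative names, not KEDA's actual types or helpers.
package main

import "fmt"

type conditionStatus string

const (
	conditionTrue  conditionStatus = "True"
	conditionFalse conditionStatus = "False"
)

type scaledJob struct {
	annotations map[string]string
	paused      conditionStatus
}

// reconcilePausedCondition derives Paused from the annotation alone and stores
// it immediately, before the scale loop or any other validation runs. The key
// point is the False branch: it resets the condition when the annotation is
// removed, which is exactly the transition the reported bug never performs.
func reconcilePausedCondition(sj *scaledJob) {
	if sj.annotations["autoscaling.keda.sh/paused"] == "true" {
		sj.paused = conditionTrue
		return
	}
	sj.paused = conditionFalse
}

func main() {
	sj := &scaledJob{annotations: map[string]string{"autoscaling.keda.sh/paused": "true"}}
	reconcilePausedCondition(sj)
	fmt.Println("with annotation:", sj.paused) // True

	delete(sj.annotations, "autoscaling.keda.sh/paused")
	reconcilePausedCondition(sj)
	fmt.Println("without annotation:", sj.paused) // False
}
```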

stale bot commented Feb 19, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Feb 19, 2024
stale bot commented Mar 1, 2024

This issue has been automatically closed due to inactivity.

stale bot closed this as completed Mar 1, 2024
@zroubalik reopened this Mar 2, 2024
stale bot commented Mar 11, 2024

This issue has been automatically closed due to inactivity.

stale bot closed this as completed Mar 11, 2024
@zroubalik removed the stale label Mar 11, 2024
@zroubalik reopened this Mar 11, 2024
stale bot commented May 10, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label May 10, 2024
stale bot commented May 18, 2024

This issue has been automatically closed due to inactivity.

stale bot closed this as completed May 18, 2024
@zroubalik removed the stale label May 20, 2024
@zroubalik reopened this May 20, 2024
stale bot commented Jul 20, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jul 20, 2024
stale bot commented Jul 30, 2024

This issue has been automatically closed due to inactivity.

stale bot closed this as completed Jul 30, 2024