Intermittent "InvalidTaskResultReference" when running tasks with big results #4529
Comments
I think it's very likely possible, and I am surprised to see this happening only intermittently. The relevant code is at lines 109 to 115 in 6cb0f4c, where the status is updated to completed, but more updates are done after that: in the end, the taskRun status is only updated with the results at line 138 in 6cb0f4c. So if it's possible for the pipelineRun controller to pick up the completed status before the task results are populated, it's also possible to run into this issue.
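To picture the ordering problem, here is a minimal, self-contained sketch of the suspected interleaving. This is not Tekton code; the types, field names, and timings are made up purely for illustration of a status that gets its condition and its results in two separate updates, with a watcher reacting in between:

```go
// race_sketch.go
//
// Illustrative only: a toy model of a status that is updated in two steps
// (condition first, results later) and a watcher that reacts as soon as it
// sees the condition. None of these names come from Tekton.
package main

import (
	"fmt"
	"sync"
	"time"
)

type taskRunStatus struct {
	mu        sync.Mutex
	succeeded bool
	results   map[string]string
}

func main() {
	status := &taskRunStatus{results: map[string]string{}}

	// "TaskRun reconciler": marks the run succeeded, then fills in the
	// result a moment later, mirroring the two separate status updates.
	go func() {
		status.mu.Lock()
		status.succeeded = true
		status.mu.Unlock()

		time.Sleep(10 * time.Millisecond) // window in which the race can occur

		status.mu.Lock()
		status.results["digest"] = "some-large-ish-value"
		status.mu.Unlock()
	}()

	// "PipelineRun reconciler": acts as soon as it observes succeeded.
	for {
		status.mu.Lock()
		done := status.succeeded
		_, hasResult := status.results["digest"]
		status.mu.Unlock()

		if done {
			if !hasResult {
				fmt.Println("saw succeeded but the result is missing -> looks like InvalidTaskResultReference")
			} else {
				fmt.Println("saw succeeded with the result present")
			}
			return
		}
		time.Sleep(time.Millisecond)
	}
}
```

With the sleep in place the watcher reliably observes the "succeeded but no result" state; in the real controllers the window would be much narrower, which would fit the intermittent (1 in 40) behaviour reported above.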
/priority important-soon
@lbernick thanks for adding the priority 👍 Let me know if you want to take a stab at it; otherwise I am happy to try fixing it 🙏 /assign
/assign
@skaegi @pritidesai if you manage to reproduce, can you look at the container's termination message? We do have some similar failures that happen because, in the failing case, the termination message content (JSON) is cut in the middle (and is thus invalid).
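To make that failure mode concrete: results travel in the container's termination message as JSON, and the kubelet caps that message (4096 bytes per container, if I recall correctly), so a large-ish result can get cut off mid-stream and stop being parseable. A small sketch of what the decoder then sees; the struct here is a stand-in, not Tekton's actual result type:

```go
// truncation_sketch.go
//
// Illustrative only: shows why a termination message that is cut off in the
// middle is no longer valid JSON and cannot be decoded into results.
package main

import (
	"encoding/json"
	"fmt"
)

type result struct {
	Key   string `json:"key"`
	Value string `json:"value"`
}

func main() {
	full := `[{"key":"digest","value":"sha256:abcdef"},{"key":"url","value":"registry.example/app"}]`

	// Simulate the message being cut in the middle, as seen in the failures.
	truncated := full[:len(full)-20]

	var results []result
	if err := json.Unmarshal([]byte(truncated), &results); err != nil {
		fmt.Println("truncated termination message does not parse:", err)
	}
}
```

If the reproduced failures show a half-written JSON array in the termination message, that would point at the size limit rather than (or in addition to) the controller-side race.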
/assign
Closing in favor of #4808, which tracks larger results in general.
We've been seeing an increasing number of teams reporting InvalidTaskResultReference problems in pipelines that otherwise run successfully. We see this infrequently (roughly 1 in 40 runs), but we've now seen it in a number of different, unrelated pipelines. Something like this... The "results" in question are generally large-ish -- 2K+ (correction here thanks @pritidesai) -- and are present when we look ;)
We suspect that what's happening is a race somewhere, perhaps between the "next" task starting and the current TaskRun being updated with the Result. That's just a guess, but does this seem possible/likely?
I'll try to create a good test case, but this sort of race condition is not easy to trigger on demand, even though it is clearly occurring with some regularity. See the sketch below for what we think the failing lookup amounts to.
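For context on what the error amounts to from the consumer's side: the next task's parameter refers to a producer result by name, and if that result is not (yet) present in the producer's status, resolving the reference fails. A hedged sketch of that lookup, with made-up names rather than the actual Tekton resolver:

```go
// lookup_sketch.go
//
// Illustrative only: a toy version of resolving "$(tasks.<task>.results.<name>)"
// against whatever results the producing TaskRun currently reports. The real
// resolution logic lives in the pipeline reconciler; these names are invented.
package main

import (
	"fmt"
	"regexp"
)

var resultRef = regexp.MustCompile(`\$\(tasks\.([^.]+)\.results\.([^)]+)\)`)

// resolve substitutes a result reference, or errors if the referenced result
// is missing -- which is what a consumer would hit if it read the producer's
// status after "Succeeded" but before the results were written (or after the
// result was lost to truncation).
func resolve(param string, statuses map[string]map[string]string) (string, error) {
	m := resultRef.FindStringSubmatch(param)
	if m == nil {
		return param, nil
	}
	task, name := m[1], m[2]
	value, ok := statuses[task][name]
	if !ok {
		return "", fmt.Errorf("invalid task result reference: task %q has no result %q", task, name)
	}
	return resultRef.ReplaceAllString(param, value), nil
}

func main() {
	// The producer looks finished, but its large-ish result has not landed yet.
	statuses := map[string]map[string]string{"build": {}}

	if _, err := resolve("$(tasks.build.results.digest)", statuses); err != nil {
		fmt.Println(err)
	}
}
```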