AI jobs getting senderNonce: too many values
#3378
Labels
status: triage
Describe the bug
When sending AI jobs for expensive models (such as DeepSeek), or, in the case of the LLM pipeline, sending a large `max_tokens` parameter such as 163K tokens, a lot of payment tickets are sent at once. The Orch will then show this message.
There is currently a nonce cap of 150. We need either to let the Orch accept an unbounded nonce count, or some other way to manage this limit.
For instance, if LLM context windows keep increasing or prices keep rising, which I believe they will, we will need higher throughput of redeemed tickets.
This is also prevalent when multiple jobs are sent at the same time: the ticket nonces stack up and reach the limit quickly.
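As a rough illustration of the failure mode (this is a sketch, not the actual go-livepeer code; the names `maxSenderNonces`, `nonceTracker`, and `record` are made up for illustration), the snippet below shows how a per-sender nonce set capped at 150 starts rejecting tickets once a single large job has produced enough of them:

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// maxSenderNonces mirrors the cap described in this issue (150); the real
// constant in go-livepeer may have a different name or location.
const maxSenderNonces = 150

var errTooManyNonces = errors.New("senderNonce: too many values")

// nonceTracker keeps the set of ticket nonces seen per sender, the way an
// orchestrator might guard against nonce reuse within a payment window.
type nonceTracker struct {
	mu     sync.Mutex
	nonces map[string]map[uint32]bool
}

func newNonceTracker() *nonceTracker {
	return &nonceTracker{nonces: make(map[string]map[uint32]bool)}
}

// record stores a ticket nonce and fails once the per-sender set hits the cap.
func (t *nonceTracker) record(sender string, nonce uint32) error {
	t.mu.Lock()
	defer t.mu.Unlock()

	seen, ok := t.nonces[sender]
	if !ok {
		seen = make(map[uint32]bool)
		t.nonces[sender] = seen
	}
	if seen[nonce] {
		return fmt.Errorf("senderNonce %d already seen", nonce)
	}
	if len(seen) >= maxSenderNonces {
		// This is the condition hit here: many tickets for one expensive job
		// exhaust the allowed nonce slots before the window rotates.
		return errTooManyNonces
	}
	seen[nonce] = true
	return nil
}

func main() {
	tracker := newNonceTracker()
	// Simulate a large LLM job that sends more tickets than the cap allows.
	for nonce := uint32(0); nonce < 200; nonce++ {
		if err := tracker.record("0xSender", nonce); err != nil {
			fmt.Printf("ticket %d rejected: %v\n", nonce, err)
			break
		}
	}
}
```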
To Reproduce
Steps to reproduce the behavior:
1. Send an LLM job with a large `max_tokens` parameter (such as 163K tokens) and watch the Orch logs for the `senderNonce: too many values` error.
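For reference, a minimal sketch of a request that can trigger the behavior, assuming a gateway exposing an LLM endpoint at `/llm` with OpenAI-style `model`/`messages`/`max_tokens` fields; the URL, path, model name, and field names are assumptions and should be adjusted to the actual deployment:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Hypothetical gateway URL and endpoint path; adjust to your setup.
	gatewayURL := "http://localhost:9935/llm"

	// A request with a very large max_tokens, similar to the 163K value in the
	// report, which forces many payment tickets to be created for one job.
	body, _ := json.Marshal(map[string]interface{}{
		"model":      "deepseek-ai/DeepSeek-R1",
		"messages":   []map[string]string{{"role": "user", "content": "Write a long essay."}},
		"max_tokens": 163840,
	})

	resp, err := http.Post(gatewayURL, "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```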