staking tests are flaky #3128
Labels: A-chain (Area: Chain, client & related)

Comments
nearprotocol-bulldozer bot pushed a commit that referenced this issue on Aug 12, 2020:
The flakiness was primarily due to an insufficient number of blocks between the epoch switch and the checks for the expected stakes. The epoch was switching at heights equal to 2 modulo 10, and the checks were executed at those same heights. Moreover, the height was retrieved from one node, but the checks ran against the other. The fix extends the epoch by two blocks and moves the checks two blocks ahead.

Also adding checks before sending the second set of stakes; this covers the situation where the latest stake during the current epoch is higher than the current stake (a case the test was not previously exercising).

Reducing the timeout from 10 minutes to 6 minutes. staking2.py, staking_repro1.py and staking_repro2.py mostly test the same things, so it is still 18 minutes of testing per run. All the failures in the observable past occurred much earlier than 6 minutes into the test (e.g. see the failures here: http://nayduck.eastus.cloudapp.azure.com:3000/#/run/67).

Fixes #3128

Test plan:
----------
No failures in 150 runs: http://nayduck.eastus.cloudapp.azure.com:3000/#/run/69
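To make the race concrete, here is a minimal sketch of the height arithmetic described above. The function and constant names are hypothetical (this is not the actual nearcore test code): with an epoch length of 10 and a switch offset of 2, the original checks landed exactly on the boundary heights; the fix lengthens the epoch by two blocks and shifts the checks two blocks past the boundary.

```python
# Hypothetical illustration of the flaky-height arithmetic; names and
# constants are assumptions modeled on the commit message, not nearcore code.

EPOCH_LENGTH = 10   # original test config
SWITCH_OFFSET = 2   # epoch switched at heights where h % 10 == 2
MARGIN = 2          # the fix: two extra blocks of epoch, checks two blocks later


def is_epoch_boundary(height, epoch_length=EPOCH_LENGTH, offset=SWITCH_OFFSET):
    """True when the epoch switches at this height (h % epoch_length == offset)."""
    return height % epoch_length == offset


def old_check_height(height):
    """Original (flaky) schedule: checks ran at the same heights as the switch."""
    return height % EPOCH_LENGTH == SWITCH_OFFSET


def new_check_height(height):
    """Fixed schedule: epoch extended by MARGIN blocks, checks moved MARGIN
    blocks past the boundary, so stakes settle before being asserted."""
    return height % (EPOCH_LENGTH + MARGIN) == SWITCH_OFFSET + MARGIN


# Before the fix, height 12 is both an epoch boundary and a check height,
# so the check could observe either the old or the new stakes.
assert is_epoch_boundary(12) and old_check_height(12)

# After the fix (epoch length 12, boundary at h % 12 == 2, e.g. height 14),
# the check at height 16 lands two blocks after the boundary.
assert new_check_height(16) and not new_check_height(12)
```

The sketch only shows why co-located boundary and check heights are racy; the actual fix also accounts for the two nodes disagreeing on the latest height.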
All the staking tests occasionally miss staking transactions, e.g.:
staking2.py: http://52.149.162.182:3000/#/test/7719
staking_repro1.py: http://52.149.162.182:3000/#/test/7360
staking_repro2.py: http://52.149.162.182:3000/#/test/6521