This repository has been archived by the owner on Apr 4, 2024. It is now read-only.

fix(tests): make integration tests more consistent #1494

Closed
wants to merge 9 commits

Conversation

ramacarlucho
Contributor

Closes: #XXX

Description

Integration tests are not stable enough.
Instead of waiting for tx receipts, attempt to resend the tx a couple of times.
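
For reference, a minimal sketch of the retry idea (the helper name, retry count, and error handling are illustrative assumptions, not the exact code in this PR):

import time

def send_with_retry(w3, signed_tx, retries=3, delay=1):
    # try to broadcast the signed tx a few times instead of blocking on a receipt
    last_err = None
    for _ in range(retries):
        try:
            return w3.eth.send_raw_transaction(signed_tx.rawTransaction)
        except ValueError as err:  # node rejected the tx (nonce, mempool, etc.)
            last_err = err
            time.sleep(delay)
    raise last_err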


For contributor use:

  • Targeted PR against correct branch (see CONTRIBUTING.md)
  • Linked to Github issue with discussion and accepted design OR link to spec that describes this work.
  • Code follows the module structure standards.
  • Wrote unit and integration tests
  • Updated relevant documentation (docs/) or specification (x/<module>/spec/)
  • Added relevant godoc comments.
  • Added a relevant changelog entry to the Unreleased section in CHANGELOG.md
  • Re-reviewed Files changed in the Github PR explorer

For admin use:

  • Added appropriate labels to PR (ex. WIP, R4R, docs, etc)
  • Reviewers assigned
  • Squashed all commits, uses message "Merge pull request #XYZ: [title]" (coding standards)

@ramacarlucho ramacarlucho requested a review from a team as a code owner November 24, 2022 11:45
@ramacarlucho ramacarlucho requested review from Vvaradinov, 4rgon4ut and facs95 and removed request for a team November 24, 2022 11:45
@github-actions github-actions bot added the Type: Tests label Nov 24, 2022
@ramacarlucho ramacarlucho changed the title fix(tests): retry to send tx instead of wait fix(tests): make integration tests more consistent Nov 24, 2022
@fedekunze
Contributor

@ramacarlucho is this a duplicate of #1488? Can you explain the different approaches?

@ramacarlucho
Contributor Author

@ramacarlucho is this a duplicate of #1488? Can you explain the different approaches?

@fedekunze most of the tests are failing because we are waiting for a tx receipt on a failed send tx (even in that PR). So this is an attempt to solve that issue.

I think both PRs are trying to solve the same issue.

I still have some inconsistency in some tests that I'm trying to figure out.
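
To illustrate the failure mode described above (a rough sketch, not the actual test code): if the send itself fails or the tx is never included, waiting on the receipt just burns the whole timeout and the test dies there:

from web3.exceptions import TimeExhausted

# tx_hash may point at a tx that was dropped or never accepted by the node
tx_hash = w3.eth.send_raw_transaction(signed.rawTransaction)
try:
    receipt = w3.eth.wait_for_transaction_receipt(tx_hash, timeout=10)
except TimeExhausted:
    # no receipt ever shows up, so every assertion after this point fails
    raise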

@ramacarlucho
Contributor Author

@yihuang do you know what is causing this response in the following bit of code?

w3 = custom_ethermint.w3
contract, _ = deploy_contract(w3, CONTRACTS["TestChainID"])
assert 9000 == contract.caller.currentChainID()
msg = {
"to": contract.address,
"data": contract.encodeABI(fn_name="currentChainID"),
}
api_port = ports.api_port(custom_ethermint.base_port(2))
# in normal mode, grpc query works even if we don't pass chain_id explicitly
rsp = grpc_eth_call(api_port, msg)
print(rsp)
assert "code" not in rsp, str(rsp)
assert 9000 == int.from_bytes(base64.b64decode(rsp["ret"].encode()), "big")

Contract deployment should not be failing, so I don't understand the empty response.

@yihuang
Contributor

yihuang commented Nov 25, 2022

@yihuang do you know what is causing this response in the following bit of code?

Contract deployment should not be failing, so I don't understand the empty response.

and the gas_used is weird.

@mmsqe
Contributor

mmsqe commented Nov 28, 2022

Contract deployment should not be failing, so I don't understand the empty response.

Seems like a retry could help alleviate the gap.

0 {'hash': '0x0000000000000000000000000000000000000000000000000000000000000000', 'logs': [], 'ret': None, 'vm_error': '', 'gas_used': '4611686018427387903'}
1 {'hash': '0x0000000000000000000000000000000000000000000000000000000000000000', 'logs': [], 'ret': 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAIyg=', 'vm_error': '', 'gas_used': '4611686018427387903'}
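
For example, a sketch of the retry mmsqe suggests, reusing the existing grpc_eth_call helper from the test suite (the retry count and sleep are arbitrary assumptions):

import time

def grpc_eth_call_with_retry(api_port, msg, retries=5, delay=1):
    # poll the REST gateway until eth_call returns a non-empty result
    rsp = None
    for i in range(retries):
        rsp = grpc_eth_call(api_port, msg)
        if rsp.get("ret"):
            return rsp
        print(i, rsp)
        time.sleep(delay)
    return rsp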

Contributor

@adisaran64 left a comment


Left a suggestion. The test_cosmovisor_upgrade test is still failing with a timeout error; for some reason it looks like blocks aren't being produced past a certain point?
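
One quick way to confirm whether block production has actually stalled (a sketch using plain web3 polling; the suite's own wait helpers, if any, would be the better fit):

import time

def wait_for_new_block(w3, timeout=60):
    # poll the node until the height increases, or give up after `timeout` seconds
    start = w3.eth.block_number
    deadline = time.time() + timeout
    while time.time() < deadline:
        if w3.eth.block_number > start:
            return w3.eth.block_number
        time.sleep(1)
    raise TimeoutError(f"no new block after {timeout}s (stuck at {start})")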

tests/integration_tests/test_gas.py
tests/integration_tests/test_priority.py
@fedekunze
Contributor

@ramacarlucho @adisaran64 can you fix the conflicts?

@fedekunze fedekunze marked this pull request as draft December 5, 2022 08:48
@github-actions

github-actions bot commented Feb 6, 2023

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs.

Labels
Status: Stale, Type: Tests

5 participants