This repository has been archived by the owner on Nov 6, 2020. It is now read-only.

Node is failing to sync with error "Error: Error(Engine(RequiresClient)" #10085

Closed
ArseniiPetrovich opened this issue Dec 19, 2018 · 102 comments · Fixed by #10837
Labels
A3-stale 🍃 Pull request did not receive any updates in a long time. No review needed at this stage. Close it. F2-bug 🐞 The client fails to follow expected behavior. M4-core ⛓ Core client code / Rust. P2-asap 🌊 No need to stop dead in your tracks, however issue should be addressed as soon as possible.
Milestone

Comments

@ArseniiPetrovich

  • Parity Ethereum version: tried 2.2.5 and 2.2.1
  • Operating system: Ubuntu 16/18
  • Installation: downloaded binary
  • Fully synchronized: no and yes
  • Network: POA xDAI
  • Restarted: both
    Error log:
2018-12-19 13:31:46  Stage 3 block verification failed for #1210413 (0xc2a1…9c29)
Error: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
2018-12-19 13:31:46  
Bad block detected: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
RLP: f90249f90244a015bf167c5b4c07587408ce07860c6a8a6f4ea098c2457f86042575e4073db456a01dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347946dc0c0be4c8b2dfe750156dc7d59faabfb5b923da0518feb50424fef5033af4fd2d11281e7919218c2538d45db7433191e3c0347d0a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000090fffffffffffffffffffffffffffffffd8312782d8398968080845c1a1e0c9fde830202018f5061726974792d457468657265756d86312e33302e31826c6984126b9f9cb8416432efb986cffdd90973796712f64819aa5daa07ecd21a40284c6a82f1dd7fe64e031a733e95c40e6dae2f9d52c0b56c4f380b189ce17409fe79066d0cf4109b00c0c0
Header: Header { parent_hash: 0x15bf167c5b4c07587408ce07860c6a8a6f4ea098c2457f86042575e4073db456, timestamp: 1545215500, number: 1210413, author: 0x6dc0c0be4c8b2dfe750156dc7d59faabfb5b923d, transactions_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, uncles_hash: 0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347, extra_data: [222, 131, 2, 2, 1, 143, 80, 97, 114, 105, 116, 121, 45, 69, 116, 104, 101, 114, 101, 117, 109, 134, 49, 46, 51, 48, 46, 49, 130, 108, 105], state_root: 0x518feb50424fef5033af4fd2d11281e7919218c2538d45db7433191e3c0347d0, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, log_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, gas_used: 0, gas_limit: 10000000, difficulty: 340282366920938463463374607431768211453, seal: [[132, 18, 107, 159, 156], [184, 65, 100, 50, 239, 185, 134, 207, 253, 217, 9, 115, 121, 103, 18, 246, 72, 25, 170, 93, 170, 7, 236, 210, 26, 64, 40, 76, 106, 130, 241, 221, 127, 230, 78, 3, 26, 115, 62, 149, 196, 14, 109, 174, 47, 157, 82, 192, 181, 108, 79, 56, 11, 24, 156, 225, 116, 9, 254, 121, 6, 109, 12, 244, 16, 155, 0]], hash: Some(0xc2a1a685d2cedd5f3c69826bfb7a8210213dd58e177dd438512baaa717cb9c29) }
Uncles: 
Transactions:


We have launched a number of validator nodes on the xDAI network (statistics at https://dai-netstat.poa.network) on AWS, and they work fine. After that we tried to launch one more node the same way, but on local infrastructure (the Protofire validator). The node is hidden behind NAT, with port 30303 made public. It worked fine for some time, and then the errors started to occur (see logs above). We launched a full resync, which helped, but only for a short period of time.
We suspected the root of the issue might be that Ubuntu 18 was used, so we tried Ubuntu 16, but it didn't help.
Can you help us solve the issue?

@jam10o-new jam10o-new added Z1-question 🙋‍♀️ Issue is a question. Closer should answer. M4-core ⛓ Core client code / Rust. labels Dec 19, 2018
@jam10o-new
Contributor

Is this related to #9114 @sorpaas?

@ArseniiPetrovich this might be related to the fixes to emptyStep behaviour in 2.2.5 - but I'm not certain xDai uses emptySteps. Are you sure you're seeing this error in 2.2.1 and have you tried making the changes outlined in the changelog https://github.com/paritytech/parity-ethereum/releases/tag/v2.2.5 ?

I don't think this is related in any way to your OS.

@ArseniiPetrovich
Author

xDai does not use emptySteps, so it looks like the problems aren't related.

@5chdn 5chdn added this to the 2.3 milestone Jan 2, 2019
@jam10o-new jam10o-new added F3-annoyance 💩 The client behaves within expectations, however this “expected behaviour” itself is at issue. and removed Z1-question 🙋‍♀️ Issue is a question. Closer should answer. labels Jan 5, 2019
@5chdn 5chdn modified the milestones: 2.3, 2.4 Jan 10, 2019
@jcortejoso

Hi! I think I am seeing the same behavior on one node of our PoA network. We are running v2.3.0-nightly (commit bf9fedc4ee2eaa4dcbc6fcb9ef73bdf6967ee071); the secret store is enabled on this node. I find it strange because this node keeps importing new blocks. The other nodes (same configuration) work well. These are the logs:

Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:27:24 UTC Stage 3 block verification failed for #20576 (0x28ce…3a4b)
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: Error: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:27:24 UTC
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: Bad block detected: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: RLP: f90248f90243a0c0b6d8709608a422297d286640d11af4fabd9865b850a18198f41d9ec0d5764ea01dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d4934794a1345ed0b2d1e193aead673e33dac56515af128aa06517af1a718ec3f7e459f809438c1b96c9a394d85746b2cffd9538cfec336d2ea056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000090ffffffffffffffffffffffffffffffef8250608365b9aa80845c3c44f59fde830203008f5061726974792d457468657265756d86312e33312e31826c698412727431b8413aa81df658fd004c8b0589947dc2c7debeaf950ef11690156b6324272cd2bc8671a9f867df91b9cfc3acf5e95a05056f87d72463598a71f0740866cc5f909d2000c0c0
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: Header: Header { parent_hash: 0xc0b6d8709608a422297d286640d11af4fabd9865b850a18198f41d9ec0d5764e, timestamp: 1547453685, number: 20576, author: 0xa1345ed0b2d1e193aead673e33dac56515af128a, transactions_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, uncles_hash: 0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347, extra_data: [222, 131, 2, 3, 0, 143, 80, 97, 114, 105, 116, 121, 45, 69, 116, 104, 101, 114, 101, 117, 109, 134, 49, 46, 51, 49, 46, 49, 130, 108, 105], state_root: 0x6517af1a718ec3f7e459f809438c1b96c9a394d85746b2cffd9538cfec336d2e, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, log_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, gas_used: 0, gas_limit: 6666666, difficulty: 340282366920938463463374607431768211439, seal: [[132, 18, 114, 116, 49], [184, 65, 58, 168, 29, 246, 88, 253, 0, 76, 139, 5, 137, 148, 125, 194, 199, 222, 190, 175, 149, 14, 241, 22, 144, 21, 107, 99, 36, 39, 44, 210, 188, 134, 113, 169, 248, 103, 223, 145, 185, 207, 195, 172, 245, 233, 90, 5, 5, 111, 135, 215, 36, 99, 89, 138, 113, 240, 116, 8, 102, 204, 95, 144, 157, 32, 0]], hash: Some(0x28ce97bb48c7decc725eae5e994a1ab48cc56e03bff469b6e73e0552c75d3a4b) }
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: Uncles:
Jan 14 08:27:24 ip-172-31-0-123.ec2.internal docker[23873]: Transactions:
Jan 14 08:27:26 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:27:26 UTC    0/25 peers     78 KiB chain    4 MiB db  0 bytes queue    1 MiB sync  RPC:  0 conn,    0 req/s,    0 µs
Jan 14 08:27:56 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:27:56 UTC    0/25 peers     78 KiB chain    4 MiB db  0 bytes queue    1 MiB sync  RPC:  0 conn,    0 req/s,    0 µs
Jan 14 08:28:26 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:28:26 UTC    0/25 peers     78 KiB chain    4 MiB db  0 bytes queue    1 MiB sync  RPC:  0 conn,    0 req/s,    0 µs
Jan 14 08:28:56 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:28:56 UTC    0/25 peers     78 KiB chain    4 MiB db  0 bytes queue    1 MiB sync  RPC:  0 conn,    0 req/s,    0 µs
Jan 14 08:29:05 ip-172-31-0-123.ec2.internal docker[23873]: 2019-01-14 08:29:05 UTC Imported #19973 0x07d5…5578 (0 txs, 0.00 Mgas, 50 ms, 0.57 KiB)

@jam10o-new
Contributor

Hey @ArseniiPetrovich, just pinging to check whether you are still seeing anything similar on recent releases or if this issue can be closed.

I should have asked about the status of the validators on the xDai network and whether you could verify that they were all online / the blocks you see in your logs matched the blocks on blockscout, but it is probably a little late for checking that now 😅

@varasev
Contributor

varasev commented Feb 15, 2019

I caught the same error on one of our nine local nodes during a test run. The error desynchronized the first node from the rest, and it seemed to appear after finalizeChange had been called, but I'm not sure that was the reason (maybe a coincidence). I'm attaching the logs for all 9 nodes. Please pay attention to the different block hashes after the error on the first node versus the rest, and to the benign reports. See the log for the first node (which was a validator along with nodes 2-6) in the attached First incident.zip.

I tried to launch our test setup once more from scratch and caught the same error (Stage 3 block verification failed for #86, i.e. at a different block) and only on node #8 this time (which was not a validator). See Second incident.zip (and the node8 log file inside).

First incident.zip
Second incident.zip

@5chdn 5chdn modified the milestones: 2.4, 2.5 Feb 21, 2019
@phahulin
Contributor

phahulin commented Mar 5, 2019

Hi, is there any insight on the probable cause of this? We are still seeing it on 2.3.2.

@jam10o-new jam10o-new added F2-bug 🐞 The client fails to follow expected behavior. and removed F3-annoyance 💩 The client behaves within expectations, however this “expected behaviour” itself is at issue. labels Mar 5, 2019
@ArseniiPetrovich
Author

Hi, @joshua-mir. I can confirm the issue still exists and affects validators from time to time.

@gitcoinbot

Issue Status: 1. Open 2. Started 3. Submitted 4. Done


This issue now has a funding of 500.0 DAI (500.0 USD @ $1.0/DAI) attached to it as part of the poanetwork fund.

@HCastano
Contributor

HCastano commented Mar 9, 2019

I'm not looking to take the bounty, but I can take a look at this next week.

@SurfingNerd

I was able to reproduce the issue very frequently while running a trust node for the ARTIS tau1 testnet at home.
Parity 2.2.10 with the AuthorityRound engine:
https://github.com/lab10-coop/tau1

When I ran the node in a server house, everything was fine.
So I replaced my 5-year-old stock router at home with a new one.
That solved the problem, so I can't reproduce it anymore.

So either try to get a really bad router, or maybe the unstable network can be simulated?

@gitcoinbot

gitcoinbot commented Mar 13, 2019

Issue Status: 1. Open 2. Cancelled


Work has been started.

These users each claimed they can complete the work by 9 months, 3 weeks from now.
Please review their action plans below:

1) faabsorg has started work.

Let's take a look at it.
I'll need ssh access to one of the nodes that are having the issue.
jim@commercebyte.com

Learn more on the Gitcoin Issue Details page.

@HCastano
Contributor

HCastano commented Mar 15, 2019

@varasev Hey, can you share the chain spec and node config files you were using while you collected the First/Second Incident logs?

Also, in the future if you could run your node with --logging engine=debug,finality=debug it would be super helpful in debugging what's going on.
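For anyone running via a config file rather than CLI flags, the same logging level can be set there. A minimal sketch, assuming Parity's standard `config.toml` layout (the `[misc]` section name is from Parity's config format; verify against your client version):

```toml
# Hypothetical config.toml fragment: equivalent to passing
# --logging engine=debug,finality=debug on the command line.
[misc]
logging = "engine=debug,finality=debug"
```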

@varasev
Contributor

varasev commented Mar 16, 2019

@HCastano For that run, I used our posdao-test-setup repo and our slightly modified build of Parity.

To start the setup, run the following commands:

$ git clone https://github.com/poanetwork/posdao-test-setup
$ cd posdao-test-setup
$ npm i
$ npm run get-all-submodules
$ npm run cleanup
$ npm run compile-posdao-contracts
$ npm run make-spec
$ npm run start-test-setup

Then you can watch the logs for all nine local nodes in the parity-data directory during the first 100 blocks. If you don't see the error during the first launch, try starting again:

$ npm run stop-test-setup
$ npm run cleanup
$ npm run start-test-setup

The error doesn't appear every time, so it's not easy to catch, but I think waiting for the first 100 blocks of each test launch should be enough.

@HCastano
Contributor

@varasev Hey, I set up your fork of Parity Ethereum as well as the posdao-test-setup repo and did manage to see the Stage 3 block verification error (albeit not that often). From the logs I collected, it seems that multiple blocks are being produced at the same height by the same validator. If you take a look at this Gist you can see that starting at block 101 there are multiple blocks being built at the same height. This eventually leads to an error in node6 while trying to make block 104.

Looking further, I noticed that in the node.toml files there seem to be multiple nodes with the same engine_signer. I think that this is the reason that there were multiple blocks being made at the same height. This could also explain the Stage 3 block verification error. This error happens when the current candidate block's parent block isn't in the database. So if the current validators decided to build on the wrong chain their parent wouldn't be in the database and thus this error would occur.

I've tried running four nodes using the latest master of Parity Ethereum to try and replicate this problem there. I've set up two of the nodes to have the same engine_signer. Unfortunately I haven't been able to see the issue again. I've tried using a static validator set, and using the OwnedSet.sol contract as well.

Even though I wasn't able to confirm this was the root cause, going forward I don't think you should reuse your signer key on multiple nodes. Try running your setup without this and let me know if you see the issue again. Alternatively, try running with the same engine_signer on multiple nodes using the "canonical" Parity Ethereum client and see if you're able to replicate the bug.
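For reference, the setting in question lives in each node's TOML config. A minimal sketch of the suspect configuration, assuming Parity's `[mining]` section and using a placeholder address:

```toml
# Hypothetical node.toml fragment; the address is a placeholder.
# Reusing this same engine_signer in a second node's config makes both
# nodes seal blocks for the same validator slot, producing competing
# blocks at the same height.
[mining]
engine_signer = "0x0000000000000000000000000000000000000001"
```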

@varasev
Contributor

varasev commented Mar 19, 2019

Yes, we have a wrong set of nodes in that test repo: three of the validators each launch two nodes with the same engine_signer. We will fix that. But as far as I know, the error also occurs on the xDai network, which doesn't have such duplicated nodes. @ArseniiPetrovich @phahulin am I right?

@phahulin
Contributor

phahulin commented Mar 21, 2019

Right, I saw it just today on the xDai chain, where we (POA Network) run a single node for our validator. Here's the log:

2019-03-21 06:53:15 UTC Stage 3 block verification failed for #2762467 (0x1cb6…b473)
Error: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
2019-03-21 06:53:15 UTC 
Bad block detected: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace:
 None } })
RLP: f90249f90244a006a14efb409ea712ff9aa1a376beb26f1ba1c6bb1a49188f604800e61762ef69a01dcc4de8dec75d7aab85b567b6ccd41a
d312451b948a7413f0a142fd40d49347949e41ba620feba8198369c26351063b26ec5b7c9ea0691d46159abb7f64319bbdd54f1209bc8166cceed
5a167eb68ae7641881aa461a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421a056e81f171bcc55a6ff8345e692
c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000090fffffffffffffffffffffffffff
ffffd832a26e38398968080845c9334d69fde830203028f5061726974792d457468657265756d86312e33312e31826c69841283d75eb841db05a9
85134a9a0f986ba526de76cd31f6bc7df9dcb14dc27fadbc741194afb0591253e5b3cf24e55345c4a804a98c3ff2d59cce4363b57b39782b4fe90
a422900c0c0
Header: Header { parent_hash: 0x06a14efb409ea712ff9aa1a376beb26f1ba1c6bb1a49188f604800e61762ef69, timestamp: 15531511
90, number: 2762467, author: 0x9e41ba620feba8198369c26351063b26ec5b7c9e, transactions_root: 0x56e81f171bcc55a6ff8345e
692c0f86e5b48e01b996cadc001622fb5e363b421, uncles_hash: 0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d
49347, extra_data: [222, 131, 2, 3, 2, 143, 80, 97, 114, 105, 116, 121, 45, 69, 116, 104, 101, 114, 101, 117, 109, 13
4, 49, 46, 51, 49, 46, 49, 130, 108, 105], state_root: 0x691d46159abb7f64319bbdd54f1209bc8166cceed5a167eb68ae7641881a
a461, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, log_bloom: 0x000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, gas_used: 0, gas_limit: 10000000, difficulty: 340282366920938463463374607431768211453, seal: [[132, 18, 131, 215, 94], [184, 65, 219, 5, 169, 133, 19, 74, 154, 15, 152, 107, 165, 38, 222, 118, 205, 49, 246, 188, 125, 249, 220, 177, 77, 194, 127, 173, 188, 116, 17, 148, 175, 176, 89, 18, 83, 229, 179, 207, 36, 229, 83, 69, 196, 168, 4, 169, 140, 63, 242, 213, 156, 206, 67, 99, 181, 123, 57, 120, 43, 79, 233, 10, 66, 41, 0]], hash: Some(0x1cb6604ae6c76ffd888fc8412a994f2a34a6776d34a44f7a2458516d0028b473) }
Uncles: 
Transactions:

After this error the validator stops accepting blocks from the rest of the network and keeps producing new blocks in his own "fork".

@igorbarinov

Saw it on the xDai Chain again today 😬

@soc1c soc1c removed this from the 2.5 milestone Apr 2, 2019
@phahulin
Contributor

@lazaridiscom we've tested parity's solution on kovan and couldn't reproduce the issue anymore, so in that respect it looks resolved now.
Do you have any other scenarios (could you open your repos?) that would show that this solution is incomplete?

@openethereum openethereum deleted a comment Jun 24, 2019
@openethereum openethereum deleted a comment Jun 24, 2019
@gitcoinbot

Issue Status: 1. Open 2. Cancelled


The funding of 2500.0 DAI (2500.0 USD @ $1.0/DAI) attached to this issue has been cancelled by the bounty submitter

@gitcoinbot

⚡️ A tip worth 500.00000 DAI (500.0 USD @ $1.0/DAI) has been granted to @dvdplm for this issue from @igorbarinov. ⚡️

Nice work @dvdplm! To redeem your tip, login to Gitcoin at https://gitcoin.co/explorer and select 'Claim Tip' from dropdown menu in the top right, or check your email for a link to the tip redemption page.

@gitcoinbot

⚡️ A tip worth 100.00000 DAI (100.0 USD @ $1.0/DAI) has been granted to @VladLupashevskyi for this issue from @igorbarinov. ⚡️

Nice work @VladLupashevskyi! To redeem your tip, login to Gitcoin at https://gitcoin.co/explorer and select 'Claim Tip' from dropdown menu in the top right, or check your email for a link to the tip redemption page.

@gitcoinbot

⚡️ A tip worth 500.00000 DAI (500.0 USD @ $1.0/DAI) has been granted to @sorpaas for this issue from @igorbarinov. ⚡️

Nice work @sorpaas! To redeem your tip, login to Gitcoin at https://gitcoin.co/explorer and select 'Claim Tip' from dropdown menu in the top right, or check your email for a link to the tip redemption page.

@gitcoinbot

⚡️ A tip worth 100.00000 DAI (100.0 USD @ $1.0/DAI) has been granted to @lazaridiscom for this issue from @igorbarinov. ⚡️

Nice work @lazaridiscom! To redeem your tip, login to Gitcoin at https://gitcoin.co/explorer and select 'Claim Tip' from dropdown menu in the top right, or check your email for a link to the tip redemption page.

@ghost

ghost commented Jun 24, 2019

@igorbarinov , I do not accept this tip, ask the gitcoin clowns for a refund.

@ghost

ghost commented Jul 12, 2019

@debris, this is not solved by #10837.

@SidhMj

SidhMj commented Sep 6, 2019


I have the same issue (same logs as @jcortejoso's comment above). @jcortejoso let me know if you were able to solve this.

@jam10o-new
Contributor

@sidharthaA could you share more information like the version you are running and any logs showing the issue?

@SidhMj

SidhMj commented Sep 6, 2019

2019-09-06 13:59:51 UTC Starting Parity-Ethereum/v2.5.7-stable-6bd7db9-20190829/x86_64-linux-gnu/rustc1.36.0
2019-09-06 13:59:51 UTC Keys path /home/parity/data/keys/nitchain
2019-09-06 13:59:51 UTC DB path /home/parity/data/chains/nitchain/db/652780e9e78ae1c2
2019-09-06 13:59:51 UTC State DB configuration: fast
2019-09-06 13:59:51 UTC Operating mode: active
2019-09-06 13:59:51 UTC Configured for nitchain using AuthorityRound engine
2019-09-06 13:59:52 UTC Listening for new connections on 0.0.0.0:8546.
2019-09-06 13:59:57 UTC Public node URL: enode://7936e78db2ac455a9b829a08794d8bc371b54d900bc989d7d83af6e922f237e885555cca4139ba1af38d22351f45d9fea88c23a1407a72804dcac7f493096036@172.24.0.2:30303
2019-09-06 14:00:04 UTC Stage 3 block verification failed for #199 (0xef43…4d1b)
Error: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: Some(stack backtrace:
   0:     0x56351f6492ad - <no info>
   1:     0x56351f41cebc - <no info>
   2:     0x56351f0931dc - <no info>
   3:     0x56351f09a050 - <no info>
   4:     0x56351ef5f8b3 - <no info>
   5:     0x56351f125c2b - <no info>
   6:     0x56351e9c7021 - <no info>
   7:     0x56351eeeb1e9 - <no info>
   8:     0x56351eedf53b - <no info>
   9:     0x56351eef09c7 - <no info>
  10:     0x56351ef88ba4 - <no info>
  11:     0x56351f6a689e - <no info>
  12:     0x56351f6a93eb - <no info>
  13:     0x7fa88f2036b9 - <no info>
  14:     0x7fa88ed2341c - <no info>
  15:                0x0 - <no info>) } })
2019-09-06 14:00:04 UTC 
Bad block detected: Error(Engine(RequiresClient), State { next_error: None, backtrace: InternalBacktrace { backtrace: Some(stack backtrace:
   0:     0x56351f6492ad - <no info>
   1:     0x56351f41cebc - <no info>
   2:     0x56351f0931dc - <no info>
   3:     0x56351f09a050 - <no info>
   4:     0x56351ef5f8b3 - <no info>
   5:     0x56351f125c2b - <no info>
   6:     0x56351e9c7021 - <no info>
   7:     0x56351eeeb1e9 - <no info>
   8:     0x56351eedf53b - <no info>
   9:     0x56351eef09c7 - <no info>
  10:     0x56351ef88ba4 - <no info>
  11:     0x56351f6a689e - <no info>
  12:     0x56351f6a93eb - <no info>
  13:     0x7fa88f2036b9 - <no info>
  14:     0x7fa88ed2341c - <no info>
  15:                0x0 - <no info>) } })
RLP: f90248f90243a0b06b6154c6125da6956b8eb7a785209aa83ee30c910179852739bfaf24d09eaca01dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d4934794314a298054f660d8128b78ced1087d5921fda422a030fbb39f739745effd387faa16c8b6c2ae3cd9875e5a589753e13450433ab0c0a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000090fffffffffffffffffffffffffffffffb81c78401f6822580845d72568e9fde830205078f5061726974792d457468657265756d86312e33362e30826c698409583bdbb84190f798c546a9234e5900af266d6f9c27bb646fe760d7808abe52c9abc49b0124679115ee0d65a50bf3ddac80b909f3dd6c47e764160c7c9459a4b5756d015d8101c0c0
Header: Header { parent_hash: 0xb06b6154c6125da6956b8eb7a785209aa83ee30c910179852739bfaf24d09eac, timestamp: 1567774350, number: 199, author: 0x314a298054f660d8128b78ced1087d5921fda422, transactions_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, uncles_hash: 0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347, extra_data: [222, 131, 2, 5, 7, 143, 80, 97, 114, 105, 116, 121, 45, 69, 116, 104, 101, 114, 101, 117, 109, 134, 49, 46, 51, 54, 46, 48, 130, 108, 105], state_root: 0x30fbb39f739745effd387faa16c8b6c2ae3cd9875e5a589753e13450433ab0c0, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, log_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, gas_used: 0, gas_limit: 32932389, difficulty: 340282366920938463463374607431768211451, seal: [[132, 9, 88, 59, 219], [184, 65, 144, 247, 152, 197, 70, 169, 35, 78, 89, 0, 175, 38, 109, 111, 156, 39, 187, 100, 111, 231, 96, 215, 128, 138, 190, 82, 201, 171, 196, 155, 1, 36, 103, 145, 21, 238, 13, 101, 165, 11, 243, 221, 172, 128, 185, 9, 243, 221, 108, 71, 231, 100, 22, 12, 124, 148, 89, 164, 181, 117, 109, 1, 93, 129, 1]], hash: Some(0xef438e2c7509a27ce0bf9001c3e28b8dc173c0f7c9dc04e88057c19da3af4d1b) }
Uncles: 
Transactions:

2019-09-06 14:00:22 UTC    2/25 peers     22 KiB chain   32 KiB db  0 bytes queue  221 KiB sync  RPC:  0 conn,    0 req/s,    0 µs
2019-09-06 14:00:52 UTC    2/25 peers     22 KiB chain   32 KiB db  0 bytes queue  221 KiB sync  RPC:  0 conn,    0 req/s,    0 µs
2019-09-06 14:01:10 UTC Imported #97 0x8179…9d8b (0 txs, 0.00 Mgas, 1 ms, 0.57 KiB)

@joshua-mir
Running in Docker. I kept this node offline for a few hours; when it came back it started syncing, but at block #199 I got this error, and the node created a fork and started mining from #97, even though the current block at that time was #357.

@jam10o-new

It'd be useful to know if there are any transitions activated around that block in your chain specification as well. I'll reopen this for now.

@jam10o-new jam10o-new reopened this Sep 6, 2019
@SidhMj

SidhMj commented Sep 6, 2019

No chain spec updates. I was observing the block generation timing of 3 nodes, with

force_sealing = false
reseal_max_period = 60000

in the TOML, and this engine section in the chain spec:

{ "engine": { "authorityRound": { "params": { "stepDuration": "10", "validators": { "multi": { "0": { "list": [ "0x314a298054f660d8128b78ced1087d5921fda422", "0x643bb732810421056de703cff8c321b247fda642", "0xa89cae778b61f88926ba0198f20a5410f27cf47f" ] } } } } } } }

Interestingly, blocks are not generated at a fixed 60-second interval. With 3 nodes running they appear at an irregular frequency, and even with a single node running the gaps are 60, 60, 60, 90, 60, 90 seconds.
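Those gaps are consistent with AuRa's round-robin schedule. A minimal sketch of the documented primary-selection rule (step = unix timestamp // stepDuration; only the validator at index step % N may seal that step), using the spec values from the comment above — `primary_for` is a hypothetical helper name, not a client API:

```python
# Hedged sketch of AuRa round-robin primary selection, assuming the
# documented rule: step = timestamp // step_duration, and the validator
# at index (step % len(validators)) is the only one allowed to seal.
STEP_DURATION = 10  # seconds, "stepDuration" from the chain spec above

VALIDATORS = [  # "multi" validator list from the chain spec above
    "0x314a298054f660d8128b78ced1087d5921fda422",
    "0x643bb732810421056de703cff8c321b247fda642",
    "0xa89cae778b61f88926ba0198f20a5410f27cf47f",
]

def primary_for(timestamp: int,
                validators=VALIDATORS,
                step_duration=STEP_DURATION) -> str:
    """Return the validator address whose turn it is at this timestamp."""
    step = timestamp // step_duration
    return validators[step % len(validators)]
```

For the bad block #199 above (timestamp 1567774350), this rule picks `0x314a…da422`, which matches that block's author. With force_sealing = false, an authority only seals on its own turn once there are pending transactions or reseal_max_period (60 s) has elapsed; since a given validator's turn recurs only every 30 s here (3 validators × 10 s steps), empty-block gaps of 60 or 90 s rather than a steady 60 s are plausible, matching the observed pattern.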

@adria0 adria0 added the A3-stale 🍃 Pull request did not receive any updates in a long time. No review needed at this stage. Close it. label Jul 27, 2020
@adria0 adria0 closed this as completed Jul 27, 2020
@MarkusTeufelberger

MarkusTeufelberger commented Jul 27, 2020

Has this been fixed finally? If yes, in which commit?

@adria0

adria0 commented Jul 28, 2020

Hi @MarkusTeufelberger, this issue relates to versions that are no longer supported. On the developers call we decided to close all issues related to old and unsupported versions. If this issue is still pending, I suppose it's because nobody had time to work on it.

@MarkusTeufelberger

If it hasn't been fixed since, I would expect it to still be present in the current version, which I assume is supported? It just seems weird to close a "P2 - ASAP" bug as "Stale"...

@adria0

adria0 commented Jul 28, 2020

@MarkusTeufelberger, it happens that a lot of errors in old versions are already fixed in later versions, so keeping a 2.2.5 bug open adds a lot of noise when you try to figure out what the current issues are (we had around 300 issues open against old versions).

If this bug persists in the current version, I expect someone will file a new issue.

@igorbarinov

igorbarinov commented Jul 28, 2020 via email
