Problems with database, re-syncing the blockchain and duplicate log entries since upgrading to 10.0 #1144
Comments
OK, I've figured out the first of the duplicate log entry issues and have submitted #1145 to make things clearer in future. I still can't figure out the other duplicate issue, but will keep looking. My bigger concern is the database issue, though; can anyone figure out what's happening here? @hyc, have you come across anything like this before?
How many monerod processes are you running when this happens?
It should be just the one. I've got max-concurrency set to one, and I only call monerod once.
It would be nice to know, however.
Is there any easy way to tell? |
ps u |
OK, there's only one process running with:
Which is
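For reference, the check suggested above can be made a little more precise. The following is a sketch, not anything from the thread; the exact ps output format depends on the system, and the bracket trick and pgrep count are just common idioms for this kind of check:

```shell
# List the current user's processes and filter for monerod.
# The [m] bracket trick stops grep from matching its own command line.
ps u | grep '[m]onerod'

# Or count matching process names directly; prints 0 when none are running.
pgrep -c -x monerod
```

If more than one instance shows up, two daemons pointed at the same data directory would be contending for the same database files, which is presumably why the question came up.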
Closing... I'll keep testing and open a new issue if it seems persistent.
Was running 0.9.4, pulling down regular PRs. Decided to back up my .bitmonero directory and start to sync from scratch with 10.0. I'm on a 64-bit Odroid C2 ARMv8 system with a 128 GB SD card for the data directory. Got to about 400k blocks, then had to monerod exit and restart later, which is where the problems began, as detailed in #1128:

So I pulled in #1123 and #1128 and deleted my directory to sync from scratch again. Now getting:
Also note the duplicate log entries when starting up:
Is this starting two instances, or is the first one main.cpp and the second one daemon.cpp?
Why is it setting these limits twice?
Hope this makes some sense... I had no problems at all firing up 0.9.4 and syncing from scratch. Now 10.0 seems to hog my connection and eventually stops syncing, requiring a monerod exit and a restart, after which I get database problems.