How to check: migration seems to be hanging #1179
Comments
If it stops, I see a log output in the Docker console. Now I have tried it again and it stops after 800 records.
OK, my problem is fixed - it seems the migrate task was terminated too soon. After the system load was lower, I truncated the results and started the migration again. This time it was faster and finished before the task was killed.
I had the same issue and ended up doing "partial" migrations chunk by chunk... moving some, then deleting the migrated rows from the _bad_json table and re-starting the migration. I agree that the timeout should be increased, especially for installations with a large number of records.
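(For anyone trying the same chunk-by-chunk approach, here is a minimal sketch of the cleanup step. It assumes results_bad_json keeps the same primary key id as the migrated rows in results, which is an assumption about the schema, not something confirmed in this thread; check your own schema and back up the database first.)

-- Assumption: results_bad_json and results share the original `id` primary key.
-- Delete already-migrated rows in small chunks; repeat until 0 rows are affected,
-- then restart the migration for the remaining rows.
DELETE FROM results_bad_json
WHERE id IN (SELECT id FROM results)
LIMIT 1000;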
After starting the migration, two messages show up right away. I will look into it later to share more information.
I'll increase the timeout of the job. I tested with ~5,000 records and didn't get this, but that comes down to how fast the host system is.
@AleksCee how long was the process running before it stopped?
@alexjustesen it looks like 6 minutes from the first request to /results until the "killed" entry in the logs.
What hardware are you running on? That feels really slow.
It's a Synology NAS (DS716+), but at update time the backup jobs were still running.
@alexjustesen By the way, regarding timing: when starting docker-compose with MariaDB and Speedtest, the DB migration sometimes starts before the SQL server is ready to accept connections, because the DB container runs a release upgrade after being updated to latest. Could you perhaps check the connection in a small retry loop? If the database isn't ready, the Speedtest container crashes and restarts 2-3 times, which can occasionally be a problem after an update.
My Synology DS920+ wasn't able to handle the migration of 7,182 results, so I did the database migration with my laptop and now everything is working like a charm... Thank you!!!
I'm planning on updating that doc page with a health check so it waits for a healthy DB connection. GitBook has been having an issue with that component for the last few days, so I haven't been able to make updates. I'll probably have to just delete it and make a new one.
@alexjustesen my sqlite results_bad_json table has over 22 thousand entries.
[2024-02-20 17:49:59] production.ERROR: Error: Object of class stdClass could not be converted to string in /var/www/html/vendor/laravel/framework/src/Illuminate/Database/Connection.php:723
I guess this makes it not part of the previously described issue.
@sschneider 22k! What, are you running it every 5 minutes? I think you might be at the top of the leaderboard lol. That was way outside of my test criteria, so it sounds like I'll have to split the job up into batches that can be processed separately to avoid long-running processes.
@alexjustesen since 2022-10-31, when I set it up for the second or third time :-).
I have 45,576 in the old Speedtest (waiting for the importer ;) ) and 7,600 in yours, and about the same number of rows were lost when switching from SQLite to MySQL because I can't find a way to import the dump into MySQL.
@thegodfatherrelish different issue, follow #1205 for that one.
Hello, any advice? Thanks. Update 1: no progress after 24 hours, so I updated to the latest version and launched the migration again... after more than 30 minutes, still nothing.
Same issue running on a Synology NAS 920+, and my bad results table has 33,913 rows in it. I kicked the migration off and it did 17,703 and just stopped. It would be nice to do the migration fully, but it's not major. Maybe we could get a SQL script we could run manually to get the data over?
@alexjustesen I checked the records in more detail, and it might be that my issue is related to differences in the JSON: most records with "" and 764 without.
Import/export is coming in
After the hotfix I can start the migration, but it seems to hang now. How can I check or resume it?
The count of 635 has not changed for more than 10 minutes.
select count(*) from results; select count(*) from results_bad_json;
+----------+
| count(*) |
+----------+
| 635 |
+----------+
1 row in set (0.001 sec)
+----------+
| count(*) |
+----------+
| 7257 |
+----------+
1 row in set (0.007 sec)
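(A combined snapshot of the two counts above, re-run every few minutes, makes it easier to see whether the migration is still moving: if the first number stops increasing between runs, the job has likely stalled or been killed. This just restates the two queries shown above in one statement.)

-- One-statement snapshot of the two counts used above.
SELECT
  (SELECT COUNT(*) FROM results)          AS migrated,
  (SELECT COUNT(*) FROM results_bad_json) AS bad_json_rows;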