Unable to export large account #385
Thanks for the report @prairietree. Please try exporting and importing from the CLI, referring to the output of the occ commands below for the exact usage:

occ user:export --help
occ user:import --help
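For reference, a typical CLI export into a writable folder might look like the following (a sketch: the user name and destination path are placeholders, so check the --help output above for the exact arguments on your version):

```
# Run as the web server user from the Nextcloud installation directory.
# "alice" and /data/ocexport are placeholders; the destination folder
# must exist and be writable.
sudo -u www-data php occ user:export alice /data/ocexport
```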
This seems much better, but I am not sure what is going on. I ran: …
It has been stuck there for over 12 hours now, and there is no file in /data/ocexport/. Is it possible it is still running, or has it failed? Thanks
It appears from the stdout that the export command exited while files were being exported. Please run the same command with increased verbosity (-vvv) to get more output.
OK, I had some trouble getting the old process killed; I had to reboot for it to go away. I ran the command with three v's (-vvv) but am getting the same results: it does not give me any more output and stops at the same place. I have let it sit at this point for a few hours now.
Edit: If I leave off the path, what would be the default location? Edit two: I think I turned off Circles because of another error I was seeing earlier.
The export destination path is required, as there is no default location. It seems that the core issue is in the files export. @come-nc, how could @prairietree debug this?
Some more items came up in the log that might be useful: …
You need to let the command run until it’s finished. @Pytal, regarding the web UI, we should look into it: this is not the first report saying the UI claims the import is complete before it actually is, which is unexpected. The errors you get in the logs suggest there is a notification about a finished export but the exported file is not actually there; maybe you just deleted it in your testing. We should catch this error and handle it better in our Notifier, though.
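A minimal sketch of the hardening meant here, assuming the standard OCP notification API (the helper name is hypothetical, not the app's actual code):

```php
use OCP\Files\NotFoundException;
use OCP\Notification\AlreadyProcessedException;
use OCP\Notification\INotification;

// Sketch of a method inside the app's Notifier (implements OCP\Notification\INotifier).
public function prepare(INotification $notification, string $languageCode): INotification {
    try {
        // Hypothetical helper that resolves the exported file the notification points at.
        $file = $this->findExportFile($notification);
    } catch (NotFoundException $e) {
        // The file is gone: tell the notification manager to drop this
        // notification instead of letting the rendering crash.
        throw new AlreadyProcessedException();
    }
    // ... render subject, message and link as usual ...
    return $notification;
}
```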
Hello @come-nc. If that path is supposed to be relative to the Nextcloud data directory, then it would be the path to the bad export that I deleted. I did not want that big file to be included in the next export I did. But why does it have an issue with the file not being there? If it is not there, then preferably just do not include it in the export.
Edit: The only other thing I can think of trying is running …
You have a notification in the database that refers to the file. The file does not exist, so rendering the notification crashes. If you have access to the notifications, just dismiss them.
I do not know if it is relative or not. But it is not the export process, it is the notification system that runs into the error.
Which issue?
So they are not failing, just running. Did you ever see an error message?
I am not sure about the notifications. I have not seen them or had the option to dismiss them. Where would I find them?
Good to know.
Same one as described above, where it gets to the line 'Exporting files…' and just stops, without returning to the prompt.
That is the confusing part. I do not get an error message, other than maybe in the log file. It just gets to 'Exporting files…' and stops there, and nothing seems to be happening. The first few runs I let run more than 14 hours; the last few I have only let run an hour or so. It never even begins to create an export file in the location I give it. (Before that I did some runs from the web UI, and they did create files.)
I ran … I am out of ideas and need to get this done. Any ideas on how to move a single user manually?
I’m pretty sure it’s just slow and exporting files. Try to let it run for several days. Maybe empty the trashbin or delete old versions beforehand, if you can, to lighten the load. If it is just too slow, you can migrate the user without the files migrator (should be fast), and migrate the files through any other means: Nextcloud client sync, scp, sftp, …
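As a rough sketch, the manual route for the files part could be something like this (the host, paths and user name are placeholders; note that versions and trash will not transfer this way):

```
# Copy the user's files directly between the two data directories...
rsync -a /var/www/nextcloud/data/alice/files/ newhost:/var/www/nextcloud/data/alice/files/
# ...then make the target instance register the copied files:
sudo -u www-data php occ files:scan alice
```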
@come-nc Thanks for the reply. I did try a little longer run: I let one go for a day and a half, about 36 hours. Is there any way I can tell if it is doing something? I monitored it with top, and it looks like not much is going on; MySQL and Apache were both using about 1-3% CPU. That may be a little more, but not much different than what they use most of the time. I also looked to see if it is making a file, but it never does. I will look into emptying the trash bin, but part of the reason for using the migration tool was to get old versions of documents. There is not much else I want to migrate over: no settings or shares or data from other apps. Is there some combination of manually copying over the files and the export tool without files that would get me the document histories? Thanks for your help.
You do not see the file growing on local storage while it’s running? Maybe you can add some debug output directly in …
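For illustration only, a debug aid of the kind discussed could look like this (the loop and variable names are hypothetical, not the app's actual code):

```php
// Hypothetical debug output in the files export loop: print each path as it
// is added to the archive, so progress becomes visible on the console.
foreach ($files as $path) {
    fwrite(STDERR, "exporting $path\n"); // temporary debug line
    // ... existing code that writes $path into the zip ...
}
```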
Thanks for the help @come-nc. That was a helpful suggestion: I put that in the code, and I can see it processing the files now. I was not seeing any file growing, or otherwise, in the path I gave it. However, looking at the code, it looks like it makes a file in the /tmp directory. In my case that could be an issue, because /tmp is on the system drive and only has 33GB of free space. I have another data drive with lots of free space. But I will let it run; if it only uses the 27.6GB that Nextcloud says the user is using, then it might work. In case this does not work, would I just edit line 59 ($stream = fopen('php://temp', 'r+');) to open a stream in a different path?
No, this temp file only holds the content of one file, temporarily. The ZipStreamer library needs a stream as input, not a string, so when we need to add a string we pass it through a temporary stream. It should not fill /tmp, I think, and it is only used for JSON files, not for the exported files themselves.
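In code, the pattern being described is roughly this (a sketch of the idea, not the app's exact lines):

```php
// ZipStreamer expects a stream, so a string payload (e.g. JSON metadata)
// is wrapped in a small temporary stream before being added to the archive.
$stream = fopen('php://temp', 'r+');
fwrite($stream, $json); // $json stands in for the metadata string
rewind($stream);
// ... hand $stream to the zip writer, then clean up:
fclose($stream);
```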
@come-nc I believe you are right about that; not sure what I was thinking. However, /tmp does seem to be filling up with a file called 'oc_tmp_K1MM0c-.zip'. It is up to 25GB now.
Indeed, the command exports to /tmp and then moves the archive to the destination; I missed that.
The export archive is moved in user_migration/lib/Command/Export.php, line 209 (commit 6b19754).
IIRC we had a reason for doing it this way, using https://github.com/nextcloud/server/blob/03e965a513accef64015ac307f4a0784f2ffc52d/lib/private/TempManager.php#L79-L114, probably so that failed and incomplete export archives automatically get cleaned up by the OS. @come-nc?
@Pytal, could it be changed to use the same folder for the temp file, but with a .temp or .part file name until it is done, and then move it to the final file name?
Sounds good technically, @come-nc? Also, do you remember the original reason for exporting to /tmp first?
Yes, it sounds good. I do not remember why it was done like that.
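As a sketch, the agreed approach could look like this (file names are illustrative; the actual change landed in #402):

```php
// Write the archive next to its final destination under a temporary name,
// then rename it once the export has completed. Since source and target are
// on the same filesystem, the rename is atomic and cheap, and a leftover
// .part file clearly marks an unfinished export.
$final = $destination . '/export.zip'; // illustrative final name
$part  = $final . '.part';

$out = fopen($part, 'wb');
// ... stream the zip contents into $out ...
fclose($out);

rename($part, $final);
```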
Done in #402
Hello, I hope it is going well for you.
I will let you in on a little of my experience with using the user migration tool. It sounds like a nice idea, but so far it has been only frustration for me. The user I am trying to export has about 27GB of data. I know that is a lot, but I was hoping it would work.
The first issue I had was with exporting. It would run a few minutes (probably less than 10) and then say it was done successfully. But Nextcloud always said the file was zero KB and would not let me download it. I worked through some errors in the log in the admin account, but none of them ended up being related. I ran it a few more times while watching the log and the file size; sometimes I deleted the file in between runs. I think now it was just not done when it said it was. Also, I think it was not really a zero-byte file; it was just Nextcloud that said it was.
I finally let one run for a day or so, and the file ended up at about 18GB (I would hope I watched the file size for a while to make sure it was done, but I am not sure if I did). Nextcloud still said it was zero bytes, but it would let me download it. I thought maybe that was good, but I decided to try again since it seemed kind of small. On the second run, after a while, I had a 37GB file. I am not sure if I deleted the file in between runs. That seems kind of large, but I moved it over and tried to import it.
I had the same kind of user interface experience on import. It runs for a few minutes (maybe 5-7), then says "Import completed successfully", but when I look around I do not find any files. In this case there is an item in the log: it says "ValueError: Invalid or uninitialized Zip object". That makes me think it is a bad file.
It is not hard to get more than 27GB of data, but it feels like the migration tool is not made to work with that large a data set.
I saw someone talk about running it from the command line, but I was not able to find documentation on that. Could someone point me to it? I will try running it from the command line.
I think it would be nice to have examples of running it from the command line in the readme (https://github.com/nextcloud/user_migration#readme).
Also, it would be nice to have a more detailed report, like the total number of files imported or exported and the total size, or just something to help determine whether it ran successfully.
I am using "Nextcloud Hub 3 (25.0.4)"
Thanks for your help.