move_file may cause data loss #358
Comments
I've checked my backups. They show about 11K of data; more than 200 lines of history were lost on March 5 or 6. That's longer than the backup window autojump keeps (only 24 hours), so the data was already gone by the time I realized the loss had happened.
Hi lilydjwg,
This should make shutil.move use rename, which is atomic, avoiding losing data when interweaving writes happen. This will close wting#358.
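For context, a minimal sketch of the approach that commit describes: write to a temporary file in the same directory as the data file, so the final move reduces to a single rename on one filesystem and concurrent writers can never interleave. The helper name and temp-file prefix here are illustrative, not autojump's actual internals.

```python
# Sketch of a rename-based save path (not autojump's real code; the function
# and prefix names are made up). Because the temporary file is created next to
# the data file, the final step is one atomic rename instead of copy + delete.
import os
import tempfile


def save_data_atomically(data_path, text):
    dir_name = os.path.dirname(data_path)
    fd, temp_path = tempfile.mkstemp(prefix='autojump.tmp.', dir=dir_name)
    try:
        with os.fdopen(fd, 'w') as f:
            f.write(text)
            f.flush()
            os.fsync(f.fileno())          # make sure the bytes reach the disk
        os.rename(temp_path, data_path)   # atomic within one filesystem (POSIX)
    except Exception:
        os.unlink(temp_path)              # clean up the temp file on failure
        raise
```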
@sxe I've forked and made some changes to avoid the situation I've described. Would you try that and see if it solves the problem for you?
I will and report back, thx! Give me a couple of days to get some data.
@lilydjwg looks really good so far: 6469 14. Feb 18:07 autojump.txt.150214
I have also experienced data loss every week or so.
Days ago I found that autojump had forgotten some of my frequent paths. After reading the code, I think this may be caused by move_file not being truly atomic.

My system is Arch Linux and my autojump version is v22.2.4.

On saving paths, autojump creates a temporary file which, by default, lives in /tmp. /tmp isn't on the same filesystem as my autojump data file, so when move_file calls shutil.move, what really happens is a copy followed by a delete, which is not atomic. Running under strace I can see multiple write calls to the data file. When several processes hit this point at once, the interleaved writes result in a partial loss.

I suggest using a temporary file in the same directory as the data file. I can provide a patch if you accept doing it this way.