Addition of a `ghost backup` and potentially `ghost restore` command #468
Braindump: `ghost backup` and potentially `ghost restore` command. @acburdine can you assign me to this please 😄
I know there's a lot of active work going on here, which I'm super excited about! I wanted to flag up that we had a chat about this yesterday in a wider meeting about backups, where we ended up talking about whether Ghost should have a backup mechanism. It made me realise that there are a couple of different things at play here.

**Where we are:** The original spec on this issue is for a mysqldump + a tar of the content folder. This sounds a lot like what we were discussing yesterday. Meanwhile, it seems the implementation is doing an export of the database to JSON and then a zip of the content folder. These small differences are, I think, quite important. Also, in both the spec & the implementation, the job of performing the backup falls to the CLI.

**Where I'd like to be:** I think it is correct that this lives in the CLI. In terms of what the backup does, I think it would be wrong to overlook using mysqldump instead of, or as well as, doing a JSON backup. It may be that this needs to be coded so it's a possible future extension? I'm also not 100% sure on the method currently being used to generate the JSON export. Currently the exporter is being required, which seems odd when we have a backup mechanism in Ghost which could either be required or used over the API. There is some code duplication I'd like to avoid, and it's fine IMO if that needs changes to Ghost itself.

In terms of the file format, I'm not sure that using .zip instead of .tar is a trivial difference. I am not 100% sure what the differences are, but there's got to be a reason why the original request was for .tar, and my gut feeling is that .tar is the norm 🤔 I have a feeling there's a difference in terms of permissions, and also tooling.
Sorry for the wall of text. I am appreciative of all the work that is already going on here to make this a reality!
From: https://itsfoss.com/tar-vs-zip-vs-gz/
We are probably looking for `.tar.gz`
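For illustration, here's a minimal sketch of producing a `.tar.gz` of the content folder with the `tar` npm package (node-tar); the paths and file name are placeholders, not what the CLI actually does.

```js
// Minimal sketch (not the CLI's actual implementation): create a gzipped
// tarball of the content folder with the `tar` npm package (node-tar).
// The paths and file name below are placeholders.
const path = require('path');
const tar = require('tar');

async function archiveContent(installDir, outFile) {
    // `cwd` keeps entries relative, e.g. content/images/..., content/data/...
    await tar.c({gzip: true, file: outFile, cwd: installDir}, ['content']);
    return outFile;
}

archiveContent(process.cwd(), path.join(process.cwd(), 'ghost-backup.tar.gz'))
    .then((file) => console.log(`wrote ${file}`))
    .catch(console.error);
```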
Alright, I'm free! 😄 Something I'd like to reiterate (although based on what I've seen over the past however-long I've been active with Ghost, I don't think it's necessary) is that I'm super open to change. I'm no expert in Ghost, and I'm fairly new here, so I might not be following your standards 100% 😆 Sidenote: work on the PR related to this started around 3 months ago, so I don't fully remember the reasoning for doing some of the things I did.
oops 😁
There are two disadvantages I can see to using Ghost's exporter.
I think the dump strategy is in general more user friendly, because (a) you won't lose any data (clients, tokens, ...) and (b) the CLI could offer a command to insert a given dump.
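As a sketch of that dump strategy, something along these lines could shell out to mysqldump from Node; the connection details are placeholders and would really come from Ghost's database config.

```js
// Illustrative only: shelling out to mysqldump from Node. The connection
// details would come from Ghost's database config; these are placeholders.
const fs = require('fs');
const {spawn} = require('child_process');

function dumpMysql({host, user, password, database}, outFile) {
    return new Promise((resolve, reject) => {
        const out = fs.createWriteStream(outFile);
        // Pass the password via MYSQL_PWD so it doesn't show up in the process list
        const dump = spawn('mysqldump', ['-h', host, '-u', user, database], {
            env: Object.assign({}, process.env, {MYSQL_PWD: password})
        });

        dump.stdout.pipe(out);
        dump.on('error', reject);
        dump.on('close', (code) => {
            return code === 0 ? resolve(outFile) : reject(new Error(`mysqldump exited with code ${code}`));
        });
    });
}
```

Restoring would then mostly be a matter of piping that file back into `mysql`, which is what makes the "command to insert a given dump" idea straightforward.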
Yeah that would make sense, because of the gzip compression. The CLI requires Ubuntu 16.04.
Yeah that is true. I think it makes sense to offer the option to exclude the backup of images.
I'm not too familiar w/ database exports, so I can't say much - all I can say is that I just realized that w/ sqlite3 there's no need to export the database, since we can just add the db file to the archive.
I know the same assumption was made for git (#524 - the user said they were running Ubuntu 16.04) - however, tar is used by many more applications than git. If we end up using shell tar, do you think it should be documented somewhere?
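One way to frame the shell-tar question is to probe for the binary and fall back to a pure-JS archiver when it's missing. A rough sketch, assuming the `tar` npm package (node-tar) as the fallback:

```js
// Rough sketch of "prefer shell tar, fall back to a JS implementation".
// Assumes the `tar` npm package (node-tar) is available as the fallback.
const {execFile} = require('child_process');
const tar = require('tar');

function hasShellTar() {
    return new Promise((resolve) => {
        execFile('tar', ['--version'], (err) => resolve(!err));
    });
}

async function createArchive(contentDir, outFile) {
    if (await hasShellTar()) {
        // shell tar preserves ownership/permission metadata
        return new Promise((resolve, reject) => {
            execFile('tar', ['-czf', outFile, '-C', contentDir, '.'], (err) => {
                return err ? reject(err) : resolve(outFile);
            });
        });
    }

    // pure-JS fallback for platforms without a tar binary
    await tar.c({gzip: true, file: outFile, cwd: contentDir}, ['.']);
    return outFile;
}
```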
What about giving them the option to skip / force everything? Yargs already supports that.
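For illustration, skip/force style flags could be wired up with yargs roughly like this; the option names are invented for the example, not real `ghost backup` flags.

```js
// Illustrative only: skip/force style flags wired up with yargs. The option
// names here are invented for the example, not real ghost backup flags.
const yargs = require('yargs/yargs');
const {hideBin} = require('yargs/helpers');

const argv = yargs(hideBin(process.argv))
    .option('images', {
        type: 'boolean',
        default: true,
        describe: 'Include images in the backup (pass --no-images to skip them)'
    })
    .option('force', {
        type: 'boolean',
        default: false,
        describe: 'Overwrite an existing backup file without prompting'
    })
    .argv;

console.log(`images: ${argv.images}, force: ${argv.force}`);
```

Boolean options get the `--no-` negation for free, which covers the "skip" case.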
I might be beating a dead horse, but I just found out tar is coming to Windows, so we can definitely get away with using shell tar.
Hi there, just wanted to +1 for this feature :) Thank you for your hard work!
@kirrg001 @acburdine @ErisDS I wrote this spec a while ago, thoughts? The goal of the backup command is to create an archive of all non-standard Ghost data (user-created content) in a way which allows the data to be imported into a fresh installation at a later time. The backup command has 5 main content areas which need to be archived, all of which will be saved in a single archive. The initial implementation of the command will rely on the following preconditions.
- Images and logs will be cloned directly into the /content folder of the archive.
- Sqlite databases will be cloned verbatim into the archive.
- Mysql databases will be exported using `mysqldump`.
- Configuration files will be copied into the root of the archive.
- Extensions will be hooked in via the CLI's hook system.
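A compressed sketch of how that flow might hang together; everything here is illustrative and assumes `fs-extra` and the `tar` npm package are available, with the real paths and database details coming from the install's config.

```js
// A compressed sketch of the flow described above. Everything is illustrative:
// it assumes `fs-extra` and the `tar` npm package, and the real paths, config
// access and database handling would come from the CLI / Ghost config.
const path = require('path');
const fs = require('fs-extra');
const tar = require('tar');

async function backup(installDir, stagingDir) {
    // 1. images and logs -> /content inside the staging area
    for (const dir of ['images', 'logs']) {
        await fs.copy(path.join(installDir, 'content', dir), path.join(stagingDir, 'content', dir));
    }

    // 2. database: clone the sqlite file verbatim if it exists (the filename
    //    really depends on the configured database connection); for mysql,
    //    shell out to a dump instead (see the earlier mysqldump sketch)
    const sqliteDb = path.join(installDir, 'content', 'data', 'ghost.db');
    if (await fs.pathExists(sqliteDb)) {
        await fs.copy(sqliteDb, path.join(stagingDir, 'content', 'data', 'ghost.db'));
    }

    // 3. config files -> root of the archive
    await fs.copy(
        path.join(installDir, 'config.production.json'),
        path.join(stagingDir, 'config.production.json')
    );

    // 4. wrap the staging directory up into a single archive
    const outFile = path.join(installDir, `backup-${Date.now()}.tar.gz`);
    await tar.c({gzip: true, file: outFile, cwd: stagingDir}, ['.']);
    return outFile;
}
```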
Would just like to 👍 this feature; being able to automatically back up and restore the contents is key!
I don't get why #605 was closed. Is there a ghost-cli import feature? If not, then how do you import an old database when launching a new ghost instance programmatically?
Right now one needs to register an account just to use their local instance's import tool :(
@dm17 it was closed because it was a duplicate of this issue that you are commenting on. If you need export/import then you'll need to use Ghost's admin interface or the API (import/export will need session auth because they are not exposed to 3rd party integrations for security).
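For anyone wanting to script that, here is a rough sketch of pulling the JSON export over the Admin API with session auth (Node 18+ for global `fetch`); the endpoint paths follow the current Admin API layout and may differ between Ghost versions, so treat them as assumptions rather than a reference.

```js
// Rough sketch of scripting the JSON export over the Admin API with session
// auth (Node 18+ for global fetch). The endpoint paths follow the current
// Admin API layout and may differ between Ghost versions.
const fs = require('fs/promises');

async function downloadExport(baseUrl, username, password) {
    // 1. create a session; Ghost sets the session as a cookie
    const login = await fetch(`${baseUrl}/ghost/api/admin/session/`, {
        method: 'POST',
        headers: {'Content-Type': 'application/json', Origin: baseUrl},
        body: JSON.stringify({username, password})
    });
    if (!login.ok) {
        throw new Error(`login failed: ${login.status}`);
    }
    const cookie = login.headers.get('set-cookie');

    // 2. download the export with the session cookie attached
    const res = await fetch(`${baseUrl}/ghost/api/admin/db/`, {
        headers: {Cookie: cookie, Origin: baseUrl}
    });
    if (!res.ok) {
        throw new Error(`export failed: ${res.status}`);
    }
    await fs.writeFile('ghost-export.json', await res.text());
}
```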
Can you add to the Ghost docs exactly which data is included/excluded when importing/exporting JSON files via the Ghost admin interface? It says "posts", and also - based on testing - I was able to figure out that it does not import API keys for integrations... But what else is missing? It is a lot of data to go through, so good documentation would save a lot of time.
refs #468 - add `--from-export` argument to install command - add import command to import a ghost export file into an existing instance - add import parsing/loading code
refs #468 - add export command & export tasks
Migrating my Ghost blogs from one server to another these days, I've noticed that there are quite a lot of .json files in the content/data folder, which contain the website content. Are these saved when someone does the export in the admin interface, or are they generated programmatically? This could be an option to explore for the backup utility, i.e., every time a new post/page is added (or modified), or periodically (probably a better option), the site content is exported into this .json file. Also, the update command should probably perform a backup in the background, as it does work to restore the files, but it is unclear what it does in terms of website content/database. The other elements (not mentioned above by @vikaspotluri123) that need to be backed up:
Also, based on my recent experience, it is highly important to check on existing storage (non-volatile memory) beforehand. And another suggestion: backups could be set up in config.production.json (i.e. a backup toggle yes/no, a backup interval of daily, weekly, monthly, etc.).
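Purely hypothetical, but a backup section in config.production.json could look something like the sketch below; none of these keys exist today, this is only an illustration of the suggestion above.

```js
// Purely hypothetical: none of these config keys exist today. This is only a
// sketch of what a backup section in config.production.json could look like
// and how the CLI might read it.
const config = require('./config.production.json');

const defaults = {enabled: false, interval: 'weekly', keep: 5};
const backupConfig = Object.assign({}, defaults, config.backup || {});

if (backupConfig.enabled) {
    console.log(`backups enabled: ${backupConfig.interval}, keeping last ${backupConfig.keep}`);
}
```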
Our bot has automatically marked this issue as stale because there has not been any activity here in some time. The issue will be closed soon if there are no further updates, however we ask that you do not post comments to keep the issue open if you are not actively working on a PR. We keep the issue list minimal so we can keep focus on the most pressing issues. Closed issues can always be reopened if a new contributor is found. Thank you for understanding 🙂
@kevinansfield could you please add a pointer to how to import/export with the API? I can't find any documentation on the matter. Plus, the ghost-cli has an import/export option, but it's not documented either. @vikaspotluri123 is the backup done / postponed to a future release / dropped? Thanks a lot to both of you.
refs: TryGhost/Toolbox#334 refs: #468 - Only show the warning about upgrading if there's a major upgrade to do
refs: TryGhost/Toolbox#334 refs: #468 - this ensures that we export a members.csv file if the endpoint exists
refs: TryGhost/Toolbox#334 refs: #468 - the old streams wiring didn't handle the 404 error - my previous attempt at changing the stream code hung on success instead - this is more modern code, but works on node 12 for both the success and failure case
refs: TryGhost/Toolbox#334 refs: #468 - if the backup command is run multiple times, and the latter runs fail to export, the backup files would be old - this ensures that the backup files all belong together - also renames backup to content
refs: TryGhost/Toolbox#334 refs: #468 - we have to handle files differently depending on whether we're working locally or on a server - first pass worked locally, second worked on ubuntu, this one should work on both
refs: TryGhost/Toolbox#334 refs: #468 - zip is meant to be installed on ubuntu 18 but for some reason on DO it is not - use our JS lib instead, as that will work on any platform, although it may have issues with scaling
refs: TryGhost/Toolbox#334 refs: #468 - in Ghost CLI casper is always a symlink - we don't need to back it up, it will always be present and updated
This issue is a feature request.
Summary
Addition of a `ghost backup` command which gathers a dump of the MySQL database as well as a tar of the content folder would be great. Users could add it to a cron task and back up the files to whatever 3rd party service they wish. I discussed this a bit on Slack with Austin and we think it would be a useful addition.