Thoughts on FTP Deployment #28
Replies: 2 comments
-
Very cool! Statiq now has deployment support out-of-the-box for several hosts (Netlify, GitHub Pages, etc.) but I wonder if it also makes sense to dust off this FTP approach and put it in the box as well.
Does this mean there's a backend job that creates the hash file after upload? If so, that's a neat enhancement to the approach.
-
No, I kept all the creation local. What I meant is simply that the file itself never has to be saved locally. I created it using a memory stream and uploaded that to the host. My file, like yours, has a path and a hash. The host doesn't need to understand what the path means, which I felt was an advantage in case the provider doesn't make it easy to run arbitrary code. I think it would be fairly easy to generalize this approach. My code uses defined constants for the host, the local path and the remote path. I stored the hash file in the root of the host, just to keep it out of the way. If that didn't work for some people, it could be made to work by saving it under the remote (
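The in-memory creation and upload described above can be sketched as follows. This is a minimal Python illustration using `ftplib` and `io.BytesIO`; the original code is C# with a `MemoryStream`, and the manifest format (one `path,hash` line per file) and remote file name are assumptions for illustration, not details from the actual script.

```python
import io
from ftplib import FTP

def build_manifest(hashes: dict[str, str]) -> bytes:
    # One "path,hash" line per file, sorted for stable output.
    # The exact format is an assumption for this sketch.
    return "\n".join(f"{p},{h}" for p, h in sorted(hashes.items())).encode("utf-8")

def upload_hash_file(ftp: FTP, hashes: dict[str, str],
                     remote_name: str = "site.hashes") -> None:
    # Stream the manifest straight from memory; it never touches local disk.
    ftp.storbinary(f"STOR {remote_name}", io.BytesIO(build_manifest(hashes)))

# Usage (placeholder host and credentials):
# with FTP("ftp.example.com") as ftp:
#     ftp.login("user", "password")
#     upload_hash_file(ftp, {"index.html": "ab12..."})
```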
-
I used Dave's post about FTP deployment as inspiration for creating a similar approach in my Cake build script. I'll do a blog post about it soon, but here's the short version.
The general idea is that you save a file with the hashes for each file you are maintaining on the host. Before updating the site, you download that file and also generate hashes for your local files. Comparing the two sets of hashes is sufficient to decide what to upload and what to delete on the site. Read Dave's post for more details.
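The compare-and-decide step can be sketched like this. It's a language-neutral Python illustration of the idea, not the actual Cake/C# script; representing each manifest as a `{path: hash}` dictionary is an assumption.

```python
# Diff two {path: hash} manifests: anything new or changed locally is
# uploaded; anything on the host with no local counterpart is deleted.
def plan_sync(remote: dict[str, str], local: dict[str, str]):
    upload = [p for p, h in local.items() if remote.get(p) != h]
    delete = [p for p in remote if p not in local]
    return upload, delete

remote = {"index.html": "aaa", "old.html": "bbb", "about.html": "ccc"}
local  = {"index.html": "aaa", "about.html": "ddd", "new.html": "eee"}

upload, delete = plan_sync(remote, local)
# about.html changed and new.html is new, so both are uploaded;
# old.html no longer exists locally, so it is deleted from the host.
```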
I had to re-create the approach in a different context: uploading to my own service provider. Here are some observations based on the experience:
- I referenced the FluentApi assembly from my Cake script by adding it as a reference to my csproj. That gets it copied into the bin directory, where I `#reference` it from the script.
- I used `SHA256` for the hash, and it ran fast enough for my purposes.
- The hash file lives on the host and never actually has to be created locally. Download it into a `MemoryStream`.
- If your site doesn't generate pages deterministically, you can end up thinking the hash is causing the problem. In my case, two top-level pages had the same `Order`, so the nav buttons changed order at random, which made every post change when I regenerated the site. I ended up writing tests for the hash function to prove it wasn't the problem. If pages seem to be changing, they probably are, so look to your own code. 😄
- The two feed files contain a timestamp and will always need to be uploaded. If you want them uploaded only for "real" changes, you'll need to write some code. I didn't, because I wouldn't normally run my Deploy target without having first changed something.