This repository has been archived by the owner on Mar 9, 2021. It is now read-only.
While the new _files.tumblr is good, it would be better to have an option to get those links only.
With only the URLs saved, rather than the actual files, one could merge those lists outside the program.
Once merged, duplicate links could be cleaned up and batch-downloaded with a browser extension, saving space and bandwidth.
Just an idea. I've tried something similar by choosing to download only the meta files, but those URLs are blog-specific, if I recall correctly.
Thanks.
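The merge-and-dedupe step described above could be done outside the program with a few lines of scripting. A minimal sketch, assuming each exported list is a plain text file with one URL per line (the filenames here are hypothetical examples):

```python
# Sketch: merge several exported URL lists, drop duplicates (keeping first
# occurrence order), and write one combined list for a batch downloader.
from pathlib import Path

def merge_url_lists(list_files, merged_path):
    """Collect unique URLs from the given text files, one URL per line."""
    seen = set()
    merged = []
    for path in list_files:
        for line in Path(path).read_text().splitlines():
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                merged.append(url)
    Path(merged_path).write_text("\n".join(merged) + "\n")
    return merged

# Example: two overlapping lists collapse to three unique URLs.
Path("a.txt").write_text("https://example.com/1.jpg\nhttps://example.com/2.jpg\n")
Path("b.txt").write_text("https://example.com/2.jpg\nhttps://example.com/3.jpg\n")
urls = merge_url_lists(["a.txt", "b.txt"], "merged.txt")
print(len(urls))  # → 3
```

The resulting merged.txt can then be fed to any batch downloader or browser extension.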
I like this idea. It could be introduced as an additional download mode that saves only the URL list as a separate text file. This would also make it possible, for example, to compare the number of URLs with the number of downloaded files; if any files are missing, one could weed out the already-downloaded URLs and fetch the remaining ones with any downloader.
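The "weed out downloaded URLs" check could look something like the following sketch. It assumes (hypothetically) that each downloaded file keeps the basename of its source URL, which may not hold for every naming scheme:

```python
# Sketch: compare a saved URL list against a download folder and report
# the URLs whose corresponding files are still missing.
from pathlib import Path
from urllib.parse import urlparse

def missing_urls(url_list_path, download_dir):
    """Return URLs from the list whose basename is absent from download_dir."""
    have = {p.name for p in Path(download_dir).iterdir()}
    missing = []
    for line in Path(url_list_path).read_text().splitlines():
        url = line.strip()
        # Derive the expected filename from the URL path's last component.
        if url and Path(urlparse(url).path).name not in have:
            missing.append(url)
    return missing

# Example: one of two listed files has already been downloaded.
Path("dl").mkdir(exist_ok=True)
Path("dl/1.jpg").write_text("")
Path("urls.txt").write_text("https://example.com/1.jpg\nhttps://example.com/2.jpg\n")
remaining = missing_urls("urls.txt", "dl")
print(remaining)  # → ['https://example.com/2.jpg']
```

The remaining URLs could then be written back to a text file and passed to any external downloader.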