
Failed to download #89

Closed
StrangeDOS opened this issue Jan 19, 2018 · 18 comments
@StrangeDOS

I'm running the new Windows binary (Script Version : 2018.01.19) to test closed bug #79, and it's much better. Most of the titles I had flagged as problematic now download without issue. Most, but not all.

It still would not download the following...

comic_dl -i http://readcomiconline.to/Comic/Final-Crisis-Sketchbook -dd f:\DownloadComic --convert cbz --quality high --keep yes

This actually looks different from bug #79. It just stops. The log shows nothing in the 'All Links' section.

DEBUG: All Links : []

So it appears to be some sort of parsing issue.

However, the binary worked fine with all the other titles that I had flagged. If I come across any additional titles that won't download, I'll update this bug.
Error_Log.log

Also, thanks for the new version!

@Xonshiz
Owner

Xonshiz commented Jan 19, 2018

I'll push an update tomorrow; when you get time, please do check those. Hopefully that push will fix this issue.

@Xonshiz Xonshiz added the bug label Jan 19, 2018
@StrangeDOS
Author

I have the new version that you posted last night. I'll run it through my usual tests and let you know how it goes.

@Xonshiz
Owner

Xonshiz commented Jan 20, 2018 via email

@StrangeDOS
Author

Hmmm... this doesn't look any different. Trying to download the same title fails in the same manner.


comic_dl -i http://readcomiconline.to/Comic/Final-Crisis-Sketchbook -dd f:\DownloadComic --convert cbz --quality high --keep yes --verbose

Starting the script in Verbose Mode

Finished Downloading
Total Time Taken To Complete : 6.86100006104


And the debug log looks the same...
DEBUG: All Links : []
Error_Log.log

And I went ahead and tested for bug #90 . That still seems to be happening as well.
I tried the following...
comic_dl -i http://readcomiconline.to/Comic/Final-Crisis-Sketchbook -dd f:\DownloadComic --convert cbz --quality high --keep yes --verbose

It starts downloading, but it misses one comic book. Looking at the log, that comic book is missing from the list...
DEBUG: All Links : ['AmeriKarate/Issue-9', 'AmeriKarate/Issue-1?id=105667', 'AmeriKarate/Issue-2?id=109768', 'AmeriKarate/Issue-3?id=111894', 'AmeriKarate/Issue-4?id=114774', 'AmeriKarate/Issue-5?id=120191', 'AmeriKarate/Issue-6?id=121472', 'AmeriKarate/Issue-7?id=124024']

It shows issues 1-7, plus the extra 9 (which I assume is a placeholder for the 'to be released' issue), but issue 8 is missing.

(screenshot of the issue list attached)

Renaming the debug log since I'm attaching two logs in the same post.

Error_Log2.log

This is the same behavior I saw in bug #90.

@StrangeDOS
Author

I hope all that made sense. Let me know if you have any questions.

@Xonshiz
Owner

Xonshiz commented Jan 21, 2018

Okay, found the error. The website is returning a "503". Cloudflare changed something a few days ago, I believe, and cfscrape can't get past it now. I'll have to look for an alternative. The script may or may not work for some sites behind Cloudflare for a while. I'll have to open an issue on cfscrape.

@StrangeDOS
Author

Yeah. I ran a NetMon trace and saw the 503. I didn't know whether that was part of the problem, because even after the 503, information packets were still being delivered. Or at least they seemed to be. I'm not too strong on NetMon traces; I'm out of practice.

@Xonshiz
Owner

Xonshiz commented Jan 21, 2018

Nah, it's just how Cloudflare works. Your connection goes through CF first, and CF presents a check; during that 5-second delay you receive a 503 while, in the background, it asks your browser to run a few checks and compute an "answer" (I'm not sure "answer" is the technical term).

Anyway, it's CF messing things up. Uhh... I've noticed that other people are having the same trouble.
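The 5-second interstitial described above is easy to recognize in a raw response: Cloudflare serves the JavaScript challenge with a 503 status and tell-tale strings in the HTML. A minimal Python sketch of that heuristic (the function name and marker strings are my own illustration, not part of comic_dl or cfscrape):

```python
def looks_like_cf_challenge(status_code, body):
    """Heuristically detect Cloudflare's 5-second challenge page.

    The interstitial is served with HTTP 503 and contains tell-tale
    strings such as the 'jschl' form fields or a 'Just a moment' title.
    """
    if status_code != 503:
        return False
    markers = ("jschl_vc", "jschl-answer", "Just a moment", "cf-browser-verification")
    return any(m in body for m in markers)


# A plain 503 (server overload) is not a challenge:
print(looks_like_cf_challenge(503, "<html>Service Unavailable</html>"))  # False
# A 503 carrying the challenge form is:
print(looks_like_cf_challenge(503, "<input name='jschl_vc' value='x'>"))  # True
```

A check like this lets a scraper tell "the site is actually down" apart from "the site is behind a challenge", instead of silently parsing an empty link list.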

@StrangeDOS
Author

Ah. That would explain why my simple POSH 'Invoke-WebRequest' crashed and burned. lol

I was trying to parse out the comic lists on http://readcomiconline.to/. :)

@Xonshiz
Owner

Xonshiz commented Jan 21, 2018

LOL. I've been trying to get in touch with these RCO guys, but well, they don't want to interact. I don't like to scrape and waste their bandwidth, but they could at least reply. Ugh.

@StrangeDOS
Author

So are we stuck until CF gets an update?

@Xonshiz
Owner

Xonshiz commented Jan 21, 2018

Worst case, I'll have to bring Selenium back into action. I'm waiting on @Anorov to fix the issue; if that doesn't work, I'll bring out the bigger guns.
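The "bigger guns" fallback would mean driving a real browser so the challenge JavaScript actually runs, then reusing the resulting cookies. A rough sketch under that assumption (function names, the timeout, and the title heuristic are illustrative; this is not comic_dl's actual code):

```python
# Sketch of a Selenium fallback for Cloudflare-protected pages. Assumes the
# `selenium` package and a matching Chrome driver are installed.
import time


def challenge_cleared(page_title):
    """Heuristic: the CF interstitial titles the page 'Just a moment...'."""
    return "Just a moment" not in page_title


def fetch_with_browser(url, timeout=30):
    """Load `url` in a real browser and wait for the 5-second challenge to clear."""
    from selenium import webdriver  # imported lazily so the helper above works without it

    driver = webdriver.Chrome()
    try:
        driver.get(url)
        deadline = time.time() + timeout
        # The challenge JavaScript runs inside the browser; poll until it
        # redirects to the real page.
        while not challenge_cleared(driver.title):
            if time.time() > deadline:
                raise TimeoutError("Cloudflare challenge did not clear")
            time.sleep(1)
        # The solved-challenge cookies can then be reused by a plain HTTP client.
        return driver.page_source, driver.get_cookies()
    finally:
        driver.quit()
```

The trade-off is weight: a headless-browser fetch is far slower than cfscrape's pure-Python approach, which is why it stays a last resort.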

@StrangeDOS
Author

Sweet. I'm good for testing and bug reports, but my coding skills will be of no help. I've done some POSH scripts to auto-organize my file collections, but that's about it.

@Xonshiz
Owner

Xonshiz commented Jan 21, 2018

No issues. Testing is a lot of help. I don't download manga/comics now and don't get much time, so I'm not sure which websites need updating. You all help a lot in keeping this script working :D

@StrangeDOS
Author

Looks like we got a response.

Anorov/cloudflare-scrape#132

@Xonshiz
Owner

Xonshiz commented Jan 22, 2018

Crunchyroll is working just fine. I need to test RCO now.

@StrangeDOS
Author

What's Crunchyroll?

@Xonshiz
Owner

Xonshiz commented Jan 22, 2018

Fixed the issue; uploading the Windows binary in a while.
