This repository has been archived by the owner on Oct 4, 2024. It is now read-only.

Sudden FAILURE for every download #202

Closed
ozupey opened this issue Feb 26, 2020 · 26 comments

Comments

@ozupey

ozupey commented Feb 26, 2020

First of all, thanks for this great project. :)

After downloading 75,450 photos and videos successfully, every attempt is now returning an instant FAILURE. The quotas on the Google Console only show about 30% usage, but to rule this out, I created a new client ID with no quota usage at all, and it keeps happening.

Unfortunately, even at trace log level, nothing gets appended to gphotos.trace for these download attempts. If I search for these files on photos.google.com, I can see and download them perfectly fine. I've also made sure the partition still has free space (3.7 TB) and that I can write files to it fine.

Running version: 2.14.0, database schema version 5.7.

Any tips on how to debug this?

02-26 11:58:34 gphotos.GooglePhotosDownload INFO     downloading 1 photos/2017/09/IMG_0884.MOV
02-26 11:58:34 gphotos.GooglePhotosDownload ERROR    FAILURE 1 downloading photos/2017/09/IMG_0884.MOV
02-26 11:58:34 gphotos.GooglePhotosDownload INFO     downloading 2 photos/2016/09/IMG_0200 (2).JPG
02-26 11:58:35 gphotos.GooglePhotosDownload ERROR    FAILURE 2 downloading photos/2016/09/IMG_0200 (2).JPG
02-26 11:58:35 gphotos.GooglePhotosDownload INFO     downloading 3 photos/2016/09/IMG_0172 (2).JPG
02-26 11:58:36 gphotos.GooglePhotosDownload ERROR    FAILURE 3 downloading photos/2016/09/IMG_0172 (2).JPG
02-26 11:58:36 gphotos.GooglePhotosDownload INFO     downloading 4 photos/2016/09/IMG_0154 (2).JPG
02-26 11:58:37 gphotos.GooglePhotosDownload ERROR    FAILURE 4 downloading photos/2016/09/IMG_0154 (2).JPG
02-26 11:58:37 gphotos.GooglePhotosDownload INFO     downloading 5 photos/2016/09/IMG_0151 (2).JPG
02-26 11:58:38 gphotos.GooglePhotosDownload ERROR    FAILURE 5 downloading photos/2016/09/IMG_0151 (2).JPG
02-26 11:58:38 gphotos.GooglePhotosDownload INFO     downloading 6 photos/2016/09/IMG_0139 (2).JPG
02-26 11:58:39 gphotos.GooglePhotosDownload ERROR    FAILURE 6 downloading photos/2016/09/IMG_0139 (2).JPG
02-26 11:58:39 gphotos.GooglePhotosDownload INFO     downloading 7 photos/2016/09/IMG_0128.JPG
02-26 11:58:40 gphotos.GooglePhotosDownload ERROR    FAILURE 7 downloading photos/2016/09/IMG_0128.JPG
02-26 11:58:40 gphotos.GooglePhotosDownload INFO     downloading 8 photos/2016/09/IMG_0125 (2).JPG
02-26 11:58:41 gphotos.GooglePhotosDownload ERROR    FAILURE 8 downloading photos/2016/09/IMG_0125 (2).JPG
02-26 11:58:41 gphotos.GooglePhotosDownload INFO     downloading 9 photos/2016/09/IMG_0118 (2).JPG
02-26 11:58:42 gphotos.GooglePhotosDownload ERROR    FAILURE 9 downloading photos/2016/09/IMG_0118 (2).JPG
02-26 11:58:42 gphotos.GooglePhotosDownload INFO     downloading 10 photos/2016/09/IMG_0099 (2).JPG
02-26 11:58:43 gphotos.GooglePhotosDownload ERROR    FAILURE 10 downloading photos/2016/09/IMG_0099 (2).JPG
02-26 11:58:43 gphotos.GooglePhotosDownload INFO     downloading 11 photos/2016/09/IMG_0092 (2).JPG
02-26 11:58:43 gphotos.GooglePhotosDownload ERROR    FAILURE 11 downloading photos/2016/09/IMG_0092 (2).JPG
02-26 11:58:43 gphotos.GooglePhotosDownload INFO     downloading 12 photos/2016/09/IMG_0088 (2).JPG
02-26 11:58:44 gphotos.GooglePhotosDownload ERROR    FAILURE 12 downloading photos/2016/09/IMG_0088 (2).JPG
02-26 11:58:44 gphotos.GooglePhotosDownload INFO     downloading 13 photos/2016/09/IMG_0073 (2).JPG
02-26 11:58:45 gphotos.GooglePhotosDownload ERROR    FAILURE 13 downloading photos/2016/09/IMG_0073 (2).JPG
02-26 11:58:45 gphotos.GooglePhotosDownload INFO     downloading 14 photos/2016/09/IMG_0066.JPG
02-26 11:58:46 gphotos.GooglePhotosDownload ERROR    FAILURE 14 downloading photos/2016/09/IMG_0066.JPG
02-26 11:58:46 gphotos.GooglePhotosDownload INFO     downloading 15 photos/2016/09/IMG_0063 (2).JPG
02-26 11:58:47 gphotos.GooglePhotosDownload ERROR    FAILURE 15 downloading photos/2016/09/IMG_0063 (2).JPG
02-26 11:58:47 gphotos.GooglePhotosDownload INFO     downloading 16 photos/2016/09/IMG_0062 (2).JPG
02-26 11:58:48 gphotos.GooglePhotosDownload ERROR    FAILURE 16 downloading photos/2016/09/IMG_0062 (2).JPG
02-26 11:58:48 gphotos.GooglePhotosDownload INFO     downloading 17 photos/2016/09/IMG_0054 (2).JPG
02-26 11:58:49 gphotos.GooglePhotosDownload ERROR    FAILURE 17 downloading photos/2016/09/IMG_0054 (2).JPG
02-26 11:58:49 gphotos.GooglePhotosDownload INFO     downloading 18 photos/2016/09/IMG_0044 (2).JPG
02-26 11:58:50 gphotos.GooglePhotosDownload ERROR    FAILURE 18 downloading photos/2016/09/IMG_0044 (2).JPG
02-26 11:58:50 gphotos.GooglePhotosDownload INFO     downloading 19 photos/2016/09/IMG_0043 (2).JPG
02-26 11:58:51 gphotos.GooglePhotosDownload ERROR    FAILURE 19 downloading photos/2016/09/IMG_0043 (2).JPG
02-26 11:58:51 gphotos.GooglePhotosDownload INFO     downloading 20 photos/2016/09/IMG_0042 (2).JPG
02-26 11:58:52 gphotos.GooglePhotosDownload ERROR    FAILURE 20 downloading photos/2016/09/IMG_0042 (2).JPG
02-26 11:58:52 gphotos.GooglePhotosDownload INFO     downloading 21 photos/2016/09/IMG_0019 (2).JPG
02-26 11:58:53 gphotos.GooglePhotosDownload ERROR    FAILURE 21 downloading photos/2016/09/IMG_0019 (2).JPG
02-26 11:58:53 gphotos.GooglePhotosDownload INFO     downloading 22 photos/2016/09/IMG_0012.JPG
02-26 11:58:54 gphotos.GooglePhotosDownload ERROR    FAILURE 22 downloading photos/2016/09/IMG_0012.JPG
02-26 11:58:54 gphotos.GooglePhotosDownload INFO     downloading 23 photos/2016/09/IMG_3068.MOV
02-26 11:58:54 gphotos.GooglePhotosDownload ERROR    FAILURE 23 downloading photos/2016/09/IMG_3068.MOV
02-26 11:58:54 gphotos.GooglePhotosDownload INFO     downloading 24 photos/2016/09/IMG_3062.PNG
02-26 11:58:54 gphotos.GooglePhotosDownload ERROR    FAILURE 24 downloading photos/2016/09/IMG_3062.PNG
02-26 11:58:54 gphotos.GooglePhotosDownload INFO     downloading 25 photos/2016/08/IMG_3028.JPG
02-26 11:58:55 gphotos.GooglePhotosDownload ERROR    FAILURE 25 downloading photos/2016/08/IMG_3028.JPG
02-26 11:58:55 gphotos.GooglePhotosDownload INFO     downloading 26 photos/2016/08/IMG_3014.JPG
02-26 11:58:55 gphotos.GooglePhotosDownload ERROR    FAILURE 26 downloading photos/2016/08/IMG_3014.JPG
@gilesknap
Owner

Hi.

I'm surprised that there is no additional logging for this. The log file in the root folder is always at debug level. Is there any additional output in there?

@gilesknap
Owner

If not, I'll push a version with some additional debugging to help you diagnose.

@ozupey
Author

ozupey commented Feb 26, 2020

Thanks for the very quick response.

There is no additional output in stdout, gphotos.log, or gphotos.trace. The stdout and gphotos.log output match, and gphotos.trace only contains the result of the single search run at the beginning (mediaItems:search responds with 200); there is no further output during these FAILURE downloads.

Here is the full command and output:

photos@photos:~$ gphotos-sync photos/ --log-level trace --max-threads 1 --progress
02-26 12:08:42 WARNING  gphotos-sync 2.14.0 2020-02-26 12:08:42.395914
02-26 12:08:42 DEBUG    MINIMUM_DATE = 1800-01-01 00:00:00
02-26 12:08:42 INFO     Target filesystem /home/photos/photos is xfs
02-26 12:08:42 DEBUG    Checking if is filesystem supports symbolic links...
02-26 12:08:42 DEBUG    attempting to symlink /home/photos/photos/test_src_3149641293 to /home/photos/photos/test_dst_3612313734
02-26 12:08:42 DEBUG    Checking if File system supports unicode filenames...
02-26 12:08:42 INFO     Filesystem supports Unicode filenames
02-26 12:08:42 DEBUG    Checking if File system is case insensitive...
02-26 12:08:42 INFO     Case sensitive file system found
02-26 12:08:42 INFO     Max Path Length: 4096
02-26 12:08:42 INFO     Max filename length: 255
02-26 12:08:42 INFO     version: 2.14.0, database schema version 5.7
02-26 12:08:42 WARNING  Indexing Google Photos Files ...
02-26 12:08:42 INFO     searching for media start=2020-02-26 09:32:14, end=None, videos=True
02-26 12:08:42 DEBUG    mediaItems.search with body:
{'pageToken': None, 'pageSize': 100, 'filters': {'dateFilter': {'ranges': [{'startDate': {'year': 2020, 'month': 2, 'day': 26}, 'endDate': {'year': 3000, 'month': 1, 'day': 1}}]}, 'mediaTypeFilter': {'mediaTypes': ['ALL_MEDIA']}, 'featureFilter': {'includedFeatures': ['NONE']}, 'includeArchivedMedia': False}}
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 1 photos/2020/02/IMG_2992.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 2 photos/2020/02/IMG_2991.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 3 photos/2020/02/IMG_2990.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 4 photos/2020/02/IMG_2989.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 5 photos/2020/02/IMG_2988.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 6 photos/2020/02/IMG_2987.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 7 photos/2020/02/IMG_2986.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 8 photos/2020/02/IMG_2985.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 9 photos/2020/02/IMG_2984.JPG
02-26 12:08:43 DEBUG    Skipped Index (already indexed) 10 photos/2020/02/a3fab568-bf70-42d2-865e-d3eea3991705.jpg
02-26 12:08:43 DEBUG    search_media parsed 10 media_items with 100 PAGE_SIZE
02-26 12:08:43 DEBUG    mediaItems.search with body:
[ removed JSON output ]
02-26 12:08:44 WARNING  indexed 0 items
02-26 12:08:44 WARNING  Downloading Photos ...
02-26 12:08:45 INFO     downloading 1 photos/2017/09/IMG_0884.MOV
02-26 12:08:46 ERROR    FAILURE 1 downloading photos/2017/09/IMG_0884.MOV
02-26 12:08:46 INFO     downloading 2 photos/2016/09/IMG_0200 (2).JPG
02-26 12:08:47 ERROR    FAILURE 2 downloading photos/2016/09/IMG_0200 (2).JPG
02-26 12:08:47 INFO     downloading 3 photos/2016/09/IMG_0172 (2).JPG
02-26 12:08:48 ERROR    FAILURE 3 downloading photos/2016/09/IMG_0172 (2).JPG
.......

@gilesknap
Owner

OK, I'll drop some extra debugging in this evening. Will let you know when it is pushed.

@ozupey
Author

ozupey commented Feb 26, 2020

I added print(e) on line 300 in gphotos/GooglePhotosDownload.py and got this:

02-26 12:28:34 INFO     downloading 1 photos/2017/09/IMG_0884.MOV
429 Client Error: Too Many Requests for url: https://lh3.googleusercontent.com/lr/A...
02-26 12:28:34 ERROR    FAILURE 1 downloading photos/2017/09/IMG_0884.MOV
02-26 12:28:34 INFO     downloading 2 photos/2016/09/IMG_0200 (2).JPG
429 Client Error: Too Many Requests for url: https://lh3.googleusercontent.com/lr/A...
02-26 12:28:35 ERROR    FAILURE 2 downloading photos/2016/09/IMG_0200 (2).JPG
02-26 12:28:35 INFO     downloading 3 photos/2016/09/IMG_0172 (2).JPG
429 Client Error: Too Many Requests for url: https://lh3.googleusercontent.com/lr/A...
02-26 12:28:36 ERROR    FAILURE 3 downloading photos/2016/09/IMG_0172 (2).JPG
02-26 12:28:36 INFO     downloading 4 photos/2016/09/IMG_0154 (2).JPG
^C02-26 12:28:36 WARNING  Cancelling download threads ...
02-26 12:28:37 WARNING  Cancelled download threads
429 Client Error: Too Many Requests for url: https://lh3.googleusercontent.com/lr/A...
02-26 12:28:37 ERROR    FAILURE 4 downloading photos/2016/09/IMG_0154 (2).JPG

It looks like they do rate limiting on IP addresses as well:

photos@photos:~$ curl -I 'https://lh3.googleusercontent.com/lr/....'
HTTP/2 429

Because any other IP address works fine:

root@dev:~# curl -I 'https://lh3.googleusercontent.com/lr/....'
HTTP/2 200

Perhaps I shouldn't have run this with 128 threads at 870 Mbit/s for nearly 3 hours. :)

Judging by the code base, there is no code that specifically handles 429 responses. It would be a good idea to throttle and slow down, to avoid Google placing more severe limits on the IP address:
https://developers.google.com/photos/library/guides/best-practices#retrying-failed-requests

For 429 errors, the client may retry with minimum 30 s delay.
A better approach is to retry with increasing delays between attempts.

@gilesknap
Owner

Your first raise causes find_bad_items() to be invoked; that needs to be removed, since it was there to handle an earlier issue in the Google API that has since been fixed.

My error handling in this bit of the code is rather poor and your second fix helps.

I have never seen quota failure like this before. I've downloaded my entire 100,000 item library multiple times to the same PC.

I'm at work right now but will do a little more digging this evening.

Oh wait, I just saw your last bit about threads and MBits - yeah I have not tested at anywhere near that rate!

I will put some better error handling in the code, at least so that it reports the issue correctly.

I could add the suggested throttling too but would not be able to real-world test it! I originally added the --max-threads option to help people throttle their bandwidth and had not anticipated your use case.

Good to know that gphotos-sync (almost) held it together under such load. :-)

@gilesknap gilesknap pinned this issue Feb 26, 2020
@ozupey
Author

ozupey commented Feb 26, 2020

I have a library of 2.9 TB, I just wanted to speed things up :D

For a quick fix, just sleep all HTTP requests for 30 seconds whenever you hit a 429 and double the sleep duration until you get a 200 again. At that point, you can reset it to 30 seconds again.
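That doubling strategy can be sketched roughly as follows (the `fetch` callable, constants, and function name are illustrative, not gphotos-sync's actual code):

```python
import time

BASE_DELAY = 30   # seconds; Google's documented minimum wait for 429s
MAX_DELAY = 960   # stop doubling somewhere sensible


def download_with_backoff(fetch, url):
    """Call fetch(url), sleeping and doubling the delay after each HTTP 429."""
    delay = BASE_DELAY
    while True:
        response = fetch(url)
        if response.status_code != 429:
            # Success (or a non-retryable error); the delay resets on the
            # next call, matching the "reset to 30 seconds" suggestion.
            return response
        time.sleep(delay)
        delay = min(delay * 2, MAX_DELAY)
```

On the first 429 this sleeps 30 s, then 60 s, then 120 s, and so on, until the server stops returning 429.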

To real-world test it, just change your credentials quota to 1 and then exceed it. :)

rclone has implemented this pretty well. Here are the HTTP codes they apply this kind of throttling to: https://github.com/rclone/rclone/blob/29b4f211ab95b42b7544103984f025eab3281b2a/backend/googlephotos/googlephotos.go#L206

@gilesknap
Owner

gilesknap commented Feb 26, 2020 via email

@ozupey
Author

ozupey commented Feb 26, 2020

Unfortunately, Python is not my strong suit, so I don't think you or I would be happy with the resulting code. :) You could use the Google APIs console to set a very low quota to trigger the 429:
[Screenshot: Google APIs console quota settings, 2020-02-26]

I'd be happy to stress test it though and see if it holds up in a real-world scenario.

@gilesknap
Owner

OK, thanks for the excellent diagnosis and useful info.

I'll take a look at this sometime this week and get you to try it out.

@ozupey
Author

ozupey commented Feb 26, 2020

Just found one more quirk: the https://lh3.googleusercontent.com URLs that it tries to download now seem to be blocked for every IP making the request. I'm guessing they have something embedded that tells Google which IP originally requested them.

No matter which IP I use to access the https://lh3.googleusercontent.com URLs now, all of them are returning a 429, even IPs that never made a single request to Google ever.

I'm gonna wait 12-24 hours and try again. :)

@gilesknap
Owner

Did it come back to life after a day?

@ozupey
Author

ozupey commented Feb 28, 2020

It did. It kept giving failures for roughly 12 hours afterwards, but I think it reset at midnight Pacific time (which is also when the API quotas reset), after which there were no more failures. :)

@gilesknap
Owner

Cool.
I think this means that the backing off for 30 seconds is not likely to be effective.
For the API calls, I report the error that the quota is exceeded, and that is all I can do. It is then necessary to wait until the quota refreshes.

@ozupey
Author

ozupey commented Feb 28, 2020

I believe the reason for the long blacklisting is that the initial 429 errors were ignored. That's why Google recommends exponential backoff, where the sleeps get longer until the 429s stop.

@gilesknap
Owner

OK, that makes sense. I'll add the backoff soon.

@gilesknap
Owner

The latest push of the code on the 429-reponse branch handles backoff correctly. It reads the Retry-After response header to get the server's recommendation on how long to pause.

Not much credit is due to me, since it turns out I just needed to switch on a feature that is built into the Python urllib3 library.
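For reference, that urllib3 feature can be switched on through a requests transport adapter, roughly like this (the retry count and backoff factor here are illustrative; by default urllib3 also honours the Retry-After header):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry 429 responses with exponentially growing sleeps between attempts.
retries = Retry(
    total=5,                 # give up after five retries
    backoff_factor=30,       # sleeps grow roughly as backoff_factor * 2**n
    status_forcelist=[429],  # only retry Too Many Requests
)

session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retries))
# session.get(...) will now retry 429s transparently.
```

Any GET made through this session retries 429s without the calling code seeing them, which is also why the retries never show up in gphotos-sync's own logs.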

I'm not sure if you are still interested in gphotos-sync now that you are aware of the video transcoding issue. But if you are, please can you try a little stress test?

Thanks, giles.

@ozupey
Author

ozupey commented Feb 29, 2020

As soon as my bandwidth cap has reset for this month, I'll give it another go and let you know. Feel free to send me a reminder if I didn't get back to you by the end of next week.

Enjoy your weekend! :)

@gilesknap
Owner

@ozupey please can you try out that stress test if you have time?
Thanks

@ozupey
Author

ozupey commented Mar 9, 2020

@gilesknap Did I do anything wrong? It just instantly dies now:

03-09 07:33:53 ERROR
Process failed.
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/gphotos/Main.py", line 418, in main
    self.start(args)
  File "/usr/local/lib/python3.6/dist-packages/gphotos/Main.py", line 369, in start
    self.do_sync(args)
  File "/usr/local/lib/python3.6/dist-packages/gphotos/Main.py", line 333, in do_sync
    self.google_photos_idx.index_photos_media()
  File "/usr/local/lib/python3.6/dist-packages/gphotos/GooglePhotosIndex.py", line 154, in index_photos_media
    favourites=self.favourites,
  File "/usr/local/lib/python3.6/dist-packages/gphotos/GooglePhotosIndex.py", line 119, in search_media
    pageToken=page_token, pageSize=self.PAGE_SIZE
  File "/usr/local/lib/python3.6/dist-packages/gphotos/restclient.py", line 100, in execute
    result.raise_for_status()
  File "/usr/lib/python3/dist-packages/requests/models.py", line 935, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 429 Client Error: Too Many Requests for url: https://photoslibrary.googleapis.com/v1/mediaItems?pageSize=100
03-09 07:33:53 WARNING  Done.
03-09 07:33:53 INFO     Elapsed time = 0:00:00.205800

Note: I'm just testing this by changing my quotas to 1:
[Screenshot: Google APIs console quota settings, 2020-03-09]

@gilesknap
Owner

So you have tripped the 429 on the API call rather than on the Base URL.

I have not implemented 429 handling for the API calls (but probably should; it is easy to do). I implemented it in the download code, which uses the Base URL quota (the one you bumped into, I believe).

Now the other problem is that the Quotas you can adjust on the API console are on a daily basis. There is also an underlying rate quota which does not appear to be configurable.

I don't really think the request backoff can deal with hitting the daily quota. It is intended to deal with the rate quotas. Apparently these are 10/s/IPaddress and 100/s/user - at least for the APIs, maybe the BaseURL Rate Quota is higher. If you hit the daily quota and try to back off you would need to do so for several hours until the day rolls over.

Having written this all down, I have convinced myself that you probably never hit the rate quota, but did hit the daily quota. I assert that in this case aborting with a sensible error is the best we can do.

Perhaps you could verify that you are happy with that by switching 'All Requests' to a high number and leaving 'BaseUrl requests' at 1.

Thanks

@gilesknap
Owner

Closing this since I think we have done what we can. Reopen if you have any more to add. Thanks for an interesting issue!

@ozupey
Author

ozupey commented May 15, 2020

Just so you know, I just hit the 429s again on the latest version installed via pip(env). This time I stuck to all the default arguments, including max-threads.

05-15 14:33:11 ERROR    FAILURE 14561 downloading photos/2017/11/IMG_1944.JPG - 429 Client Error: Too Many Requests for url: https://lh3.googleusercontent.com/lr/AF...=d
05-15 14:33:11 ERROR    FAILURE 14562 downloading photos/2018/12/IMG_8856.JPG - 429 Client Error: Too Many Requests for url: https://lh3.googleusercontent.com/lr/AF...=d
05-15 14:33:11 WARNING  Downloaded 76603 Items, Failed 14562, Already Downloaded 76603

@gilesknap
Owner

Am I correct in saying that once you hit this you have to wait until the next day?

If that is correct then the only thing we can do is cap the maximum download rate to match Google.
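Capping the rate could be as simple as spacing requests, e.g. to stay under the 10 requests/second figure mentioned earlier (a sketch; the class name and numbers are illustrative, not part of gphotos-sync):

```python
import time


class RateLimiter:
    """Block so that successive calls are at least 1/max_per_second apart."""

    def __init__(self, max_per_second):
        self.interval = 1.0 / max_per_second
        self.next_allowed = 0.0

    def wait(self, now=None, sleep=time.sleep):
        # now/sleep are injectable for testing; real callers use the defaults.
        now = time.monotonic() if now is None else now
        if now < self.next_allowed:
            sleep(self.next_allowed - now)
            now = self.next_allowed
        self.next_allowed = now + self.interval


limiter = RateLimiter(10)  # 10 requests/second
# call limiter.wait() before each download request
```

With multiple download threads, a single shared limiter (plus a lock around `wait`) would cap the aggregate rate rather than the per-thread rate.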

@ozupey
Author

ozupey commented May 15, 2020

I'm not sure how long it will take until it is reset again. As mentioned before, Google recommends "exponential backoff". This means each time you hit a 429, you should wait longer and longer before the next HTTP request:

  1. Make a request to Cloud IoT Core.
  2. If the request fails, wait 1 + random_number_milliseconds seconds and retry the request.
  3. If the request fails, wait 2 + random_number_milliseconds seconds and retry the request.
  4. If the request fails, wait 4 + random_number_milliseconds seconds and retry the request.

If I look at the logs, gphotos-sync just skipped the photo instantly and moved on to the next without any delay whatsoever. This will make the block longer and longer and is the opposite of what Google recommends:

For 429 errors, the client may retry with minimum 30 s delay. For all other errors, retry may not be applicable.

A better approach is to retry with increasing delays between attempts. Usually, the delay is increased by a multiplicative factor with each attempt, an approach known as Exponential backoff.

So, ideally, as soon as you hit a 429, wait for 30 seconds and try again. If that fails, wait 60 seconds and try again. If that fails, wait 120 seconds and try again. As soon as the 429 vanishes, you can reset the delay.

@gilesknap
Owner

I don't think you'll see any retries in the logs since I'm letting the underlying http library handle the 429 errors.

It is my understanding that the Google API call quotas have a calls / second quota as well as a calls / day cap. Thus 429 handling should work with API calls. However, I believe that the download URLs only have a daily quota and therefore once the quota is exceeded then you are out of luck.

I will confess that I have not traced the code or the network to discover whether the library is correctly handling 429s. I don't have the bandwidth to do a genuine test. However, I might be able to do some unit testing using https://httpbin.org/status/429 or by generating my own 429 responses.
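Generating your own 429s also works without any network dependency: a throwaway local server that 429s the first couple of requests and then succeeds exercises the urllib3 Retry path end to end (a sketch, not the project's actual test suite):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


class FlakyHandler(BaseHTTPRequestHandler):
    """Answer 429 to the first two requests, then 200."""

    hits = 0

    def do_GET(self):
        FlakyHandler.hits += 1
        if FlakyHandler.hits <= 2:
            self.send_response(429)
            self.send_header("Retry-After", "0")  # tell the client not to sleep
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 0), FlakyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

session = requests.Session()
session.mount("http://", HTTPAdapter(
    max_retries=Retry(total=3, status_forcelist=[429], backoff_factor=0)))
response = session.get(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
```

If the retries are wired up correctly, the client ends up with a 200 after three attempts, all handled inside urllib3 with nothing visible to the caller.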
