External Storage: Google Drive: 403 User Rate Limit Exceeded #20481
Comments
The only way to fix this is to try to reduce the number of API calls. Such repeated calls could be buffered/prevented by using a local stat cache, similar to #7897 |
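For illustration only, a minimal sketch of such a stat cache; every name here is hypothetical and not taken from the actual files_external code:

```php
<?php
// Hypothetical memoizing wrapper: repeated stat() calls for the same path
// within one request hit the Google Drive API only once.
class CachedDriveStat {
    /** @var array path => cached metadata (false = not found) */
    private $statCache = array();
    /** @var callable does the real Drive API lookup */
    private $fetcher;

    public function __construct(callable $fetcher) {
        $this->fetcher = $fetcher;
    }

    public function stat($path) {
        if (!array_key_exists($path, $this->statCache)) {
            // Only a cache miss triggers an actual API request.
            $this->statCache[$path] = call_user_func($this->fetcher, $path);
        }
        return $this->statCache[$path];
    }

    public function invalidate($path) {
        // Call this after writes/renames so stale metadata is not reused.
        unset($this->statCache[$path]);
    }
}
```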
I really like the idea of buffering the API requests. That would both improve speed and reduce the number of requests. Another way to deal with that behavior is to retry the request when it fails with that error, as Google suggests. The question is whether that's the task of files_external or the client. For more abstract things I'd say this is the task of the client; for internal things I'd say this can be the task of files_external. |
A bit of research shows that the AWS library and others refer to a curl plugin called "BackoffStrategy". That might be it: https://github.com/Guzzle3/plugin-backoff/blob/master/CurlBackoffStrategy.php |
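For reference, this is roughly how that Guzzle 3 plugin is used standalone (my example, not something that exists in ownCloud; it assumes the Guzzle 3 API from the linked repository):

```php
<?php
use Guzzle\Http\Client;
use Guzzle\Plugin\Backoff\BackoffPlugin;

// Retry up to 5 times with exponentially growing delays whenever the
// server answers with one of the listed (retryable) status codes.
$client  = new Client('https://www.googleapis.com');
$backoff = BackoffPlugin::getExponentialBackoff(5, array(403, 429, 500, 503));
$client->addSubscriber($backoff);

// Requests sent through $client are now retried transparently.
$response = $client->get('/drive/v2/files')->send();
```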
there's also batching: https://developers.google.com/api-client-library/php/guide/batch |
BackoffStrategy seems to implement what I mentioned in the first post and is recommended by Google. Here's the link again (it is from the REST API, but it applies to the other APIs as well): https://developers.google.com/drive/web/handle-errors#implementing_exponential_backoff Batching will probably not help with regard to this issue, as "A set of n requests batched together counts toward your usage limit as n requests, not as one request." (from https://developers.google.com/drive/v2/web/batch) |
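A minimal sketch of the backoff loop Google describes, assuming a placeholder $sendRequest callable that performs the actual Drive call and throws on a 403 rate-limit response:

```php
<?php
/**
 * Retry $sendRequest with exponential backoff plus random jitter, as in
 * Google's error-handling guide. $sendRequest is a placeholder for whatever
 * actually issues the Drive API request.
 */
function requestWithBackoff(callable $sendRequest, $maxRetries = 5) {
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        try {
            return $sendRequest();
        } catch (\RuntimeException $e) {
            if ($attempt === $maxRetries) {
                throw $e; // give up after the last allowed retry
            }
            // Sleep 2^attempt seconds plus up to one extra second of jitter.
            usleep((int) (pow(2, $attempt) * 1000000 + mt_rand(0, 1000000)));
        }
    }
}
```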
Unfortunately it seems the library OC uses doesn't use the Curl library but rather PHP's […], so even if that plugin was added, I'm not sure it would fit in. |
@davitol this is what you observed yesterday |
@PVince81 thanks 😺 |
Setting this to critical; it has already been observed 3-4 times in different environments. |
Considering that the limit is 1000 requests per 100 seconds per user we probably need some change detection here. Otherwise this seems like something that one can easily fall into again. |
I too am experiencing this issue on OC 9. I am attempting to upload a music library to ownCloud so I have a lot of little files being synced. If you need any more logs, I will happily provide them. |
I have the same problem; it should be fixed somehow. Maybe we can cache the API calls, especially for the single files? Or introduce some sort of rate limit counter which postpones API calls that are not absolutely necessary? |
I think there is already some caching inside the library, I remember seeing some code that "remembers" calls made to the same URLs. Not sure if that works though. |
I am experiencing issues involving the 403 User Rate Limit Exceeded message. ownCloud 9.0.1 running on Debian 7 |
I'm not sure, but from what I heard, in some setups people seem to hit the limit less often. In theory one could add a few […]. The proper solution is to implement exponential backoff, which would use an adaptive sleep to wait longer and longer until the request goes through, retrying several times. |
(I haven't checked how our GDrive stuff works; I don't even know how often we "list" the remote directory.) Maybe it's interesting to incorporate this into our caching or even ETag logic: https://developers.google.com/drive/v2/web/manage-changes#retrieving_changes |
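Very roughly, polling that changes feed with the Drive v2 PHP bindings could look like the following; treat the parameter names and the $lastChangeId bookkeeping as assumptions taken from the linked v2 docs, not from ownCloud's code:

```php
<?php
// Assumes an already-authorized Google_Client; $lastChangeId would be the
// value ownCloud remembered from its previous scan (hypothetical bookkeeping).
$service = new Google_Service_Drive($client);
$params  = array('startChangeId' => $lastChangeId + 1, 'maxResults' => 100);
do {
    $changeList = $service->changes->listChanges($params);
    foreach ($changeList->getItems() as $change) {
        // Refresh the local file cache / ETag for $change->getFileId() here,
        // instead of re-listing whole directories on every sync run.
    }
    $params['pageToken'] = $changeList->getNextPageToken();
} while (!empty($params['pageToken']));
```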
Good news! We've updated the Google SDK library, and from grepping the code I saw that there are parts that will automatically do the exponential backoff: core/apps/files_external/3rdparty/google-api-php-client/src/Google/Task/Runner.php (line 23 in c23bc91) |
From looking at the code it seems that it should already work without any additional configs, so I'm going to close this. If you are able to, please try the 9.1beta1 build that contains this update and let me know if you're still getting "403 User Rate Limit Exceeded" as I'm not able to reproduce this locally. |
Most of them are fairly small. Some tiny spreadsheets and Word documents. I'd say there are about 225 files total, with maybe 1 or 2 of them being between 75 and 100M, some big Photoshop .PSD files. It hasn't even gotten to the big files yet... it's synced about 40 or so files, and every time I try to pause the sync then resume it to try to grab some more, by the time it verifies the files it already has, it hits the user rate limit before it can download any more. |
Would it be possible to return a different error code to the client in addition to trying to fix the rate limit issue? |
You are probably referring to owncloud/client#5187 (comment). In the case of PUT or any write operation, we could change Webdav to translate the exception to another failure code. @ogoffart any suggestion ? |
The problem is that a 503 for PUT will stop the sync. An alternative is to distinguish codes that should block the sync using another header or something. |
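To make that concrete, here is a rough sketch of what translating the storage exception could look like on the WebDAV side. Only Sabre's ServiceUnavailable (HTTP 503) is an existing class; the RateLimitExceededException name, the X-OC-Retry-Job header, and the surrounding $storage/$response objects are made up here for illustration:

```php
<?php
use Sabre\DAV\Exception\ServiceUnavailable;

// Hypothetical handling inside the PUT code path: map a storage-layer
// rate-limit failure to a 503 and mark it as retryable, so a client could
// skip the file and continue syncing instead of aborting the whole run.
try {
    $storage->writeBack($tmpFile, $path);
} catch (RateLimitExceededException $e) { // made-up exception name
    $response->setHeader('Retry-After', '30');
    $response->setHeader('X-OC-Retry-Job', 'true'); // made-up header
    throw new ServiceUnavailable('Remote storage rate limit exceeded', 0, $e);
}
```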
I need to build a good test case where this issue is reproducible every time. Any suggestions? |
@PVince81 upload a lot of files to Google Drive and sync everything to your local machine. I guess that should do it. A lot of small files, and perhaps 100.000 files or so. |
That sounds about right. That's exactly what I have… probably a little over 225 small Word docs and spreadsheets. Then as soon as I add that folder to sync, it pretty much grabs a set number of files, which seems to be about 100, hits the user rate limit error, then when it tries again it downloads a few more… but the problem is, eventually it reaches a point where, when trying again, just the verification that the original files are still there/haven't changed hits the rate limit before new files are downloaded. So I'm stuck with half of my Google Drive downloaded. That seems to be right around 100 files.
|
Ah, to be sure, I meant 100 000 files, not 100. That's what our users have anyway :)
|
Yup, I would say that would be a pretty solid test case, as mine stops working LONG before that. I just tried it again to confirm. While it hasn't been working, I basically have it turned off, or the desktop clients just keep trying over and over again to sync and it's just unnecessary load on the server… not much though, as I have a much smaller installation, only about 8 users.
|
I store my entire music library in Google Drive (~65GB). This includes mixtapes and other shit you cannot find on Spotify and the likes. I also tend to keep high-quality files like 320kbps MP3s and FLACs/ALACs. I tried to use Drive as my MASTER copy, but the Google Drive client, along with syncing between NTFS and HFS+, has fucked it up sometimes and created duplicates, doesn't sync files, etc. I'm paying for 1TB premium in Google. I just got this error and I assumed I would; I have moved 500GB of data through Google's network, so they have to have some sort of internal gauge for consumers, correct? |
Another way to put this to the test, I've found, is to upload/sync a folder with 1,000 files of any size, then rename them locally and resync. Expected behaviour is that the remote files are simply renamed and the operation completes correctly. Renaming at the rate GoodSync does on Google Drive (although quick, and we all prefer quick) hits these 403 User Rate Limits, and no files that are already uploaded get renamed either. |
I suspect that GoodSync doesn't rename the way the desktop client renames. Instead of a single MOVE it might be doing a copy of every file first. Or maybe it does a MKCOL on the new folder and then recursively moves every file there instead of doing it just once. Looking at the web server access log should tell. In general I'd expect a simple Webdav MOVE not to cause any rate limit issue. |
Here you go, a one-liner to enable retries in the GDrive lib: #27530 |
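The exact change is in the linked PR; purely as an illustration of what enabling retries on the bundled 1.x client could look like (the setClassConfig call is my assumption, the PR may differ):

```php
<?php
// Ask the bundled Google_Task_Runner to retry failed calls a few times,
// using the library's built-in exponential backoff, instead of failing
// on the first 403 "User Rate Limit Exceeded" response.
$client = new Google_Client();
$client->setClassConfig('Google_Task_Runner', 'retries', 5);
```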
Please all help testing this and let us know if it solved your problem. You might still bump into API limits from time to time but it shouldn't be as bad as before. |
This will be in 9.1.5 |
Hi there, and if I make these requests from 5 devices, will I count as 1 user? |
Steps to reproduce
Expected behaviour
Files should be uploaded without errors.
Actual behaviour
Error: (403) User Rate Limit Exceeded
Google Drive has a limit on the maximum number of requests per second, which is set to 10 at most according to Google's API documentation and the (max) value set in the Google Developer Console.
According to Google's documentation an application should implement exponential backoff when it gets that error, see https://developers.google.com/drive/web/handle-errors#implementing_exponential_backoff
Although owncloud/google returns a 403 error, the upload sometimes still succeeds, see also http://stackoverflow.com/questions/18578768/403-rate-limit-on-insert-sometimes-succeeds
Finally the ownCloud client uploaded all my files successfully; I guess it retried the ones that failed initially.
Server configuration
Operating system:
Linux af91f 2.6.32-504.8.1.el6.x86_64
Web server:
Apache (unknown version; shared hosting provider)
Database:
5.5.46 - MySQL Community Server
PHP version:
PHP Version 5.4.45
ownCloud version: (see ownCloud admin page)
ownCloud 8.2.0 (stable)
Updated from an older ownCloud or fresh install:
fresh install
List of activated apps:
default + external storage
The content of config/config.php:
config.txt
Are you using external storage, if yes which one: local/smb/sftp/...
Yes, Google Drive.
Are you using encryption: yes/no
No
Are you using an external user-backend, if yes which one: LDAP/ActiveDirectory/Webdav/...
No
Client configuration
Browser:
irrelevant
Operating system:
irrelevant; in my case it was a Windows XP Virtual Machine
Logs
Web server error log
empty
ownCloud log (data/owncloud.log)
owncloud.txt
Browser log
irrelevant