Add self.retry_after to SparkRateLimitError class #49


Merged
merged 1 commit into from
Nov 21, 2017

Conversation

sardoodledom
Contributor

If one sets wait_on_rate_limit to True when instantiating the
CiscoSparkAPI class, they will hit an error telling them that
retry_after is not an attribute of the SparkRateLimitError class.

This is because the RestSession class checks for that attribute
when it catches the error, at line 282 of restsession.py.

Error:
  File "/home/daspano/../ciscosparkapi/restsession.py", line 282, in request
    if self.wait_on_rate_limit and e.retry_after:
AttributeError: 'SparkRateLimitError' object has no attribute 'retry_after'
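The check at restsession.py line 282 is part of a wait-and-retry loop. A minimal sketch of that kind of logic (the function name and exception stub here are illustrative, not the library's actual implementation):

```python
import time


class SparkRateLimitError(Exception):
    """Minimal stand-in for the library's rate-limit exception."""

    def __init__(self, retry_after):
        super().__init__("429 Too Many Requests")
        self.retry_after = retry_after  # seconds to wait, as an int


def request_with_retry(send, wait_on_rate_limit=True):
    """Illustrative version of the check described above: when the API
    signals rate limiting, sleep for retry_after seconds and try again."""
    while True:
        try:
            return send()
        except SparkRateLimitError as e:
            # This line raises AttributeError if the exception class
            # never sets self.retry_after -- the bug this PR fixes.
            if wait_on_rate_limit and e.retry_after:
                time.sleep(e.retry_after)
            else:
                raise
```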

This commit adds the retry_after attribute as an int, as my testing
shows the Retry-After header value is always an integer number of seconds.

Ex:
-----------------------------------Response------------------------------------
429 Too Many Requests
Date: Wed, 15 Nov 2017 21:47:00 GMT
Content-Type: application/json
Content-Length: 182
Connection: keep-alive
TrackingID: ROUTER_5A0CB5D2-B92F-01BB-7C3F-AC12D9227C3F
Cache-Control: no-cache
Retry-After: 194
Server: Redacted
l5d-success-class: 1.0
Via: 1.1 linkerd
content-encoding: gzip
Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
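Based on the description above, the fix amounts to capturing the Retry-After header when the exception is built. A hedged sketch, assuming the response object exposes a headers mapping (the real ciscosparkapi class wraps a requests.Response and differs in detail):

```python
class SparkRateLimitError(Exception):
    """Sketch of the fix: capture Retry-After from a 429 response.

    The 'response' argument is assumed to expose a dict-like
    'headers' attribute, as requests.Response does.
    """

    def __init__(self, response):
        super().__init__("[429] Too Many Requests")
        self.response = response
        # Per the response example above, Retry-After is an integer
        # number of seconds (e.g. "Retry-After: 194"); fall back to a
        # default when the header is absent.
        self.retry_after = int(response.headers.get("Retry-After", 15))
```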

@cmlccie cmlccie self-assigned this Nov 21, 2017
@cmlccie cmlccie added the bug label Nov 21, 2017
@cmlccie
Collaborator

cmlccie commented Nov 21, 2017

@dlspano, thank you for catching this! This is a regression bug from when I "cleaned up" and simplified the exception classes after the rate-limit handling was added. I've added a couple of consistency and syntax updates to your patch, and I'm running the test suite now. The changes should be merged shortly.

@cmlccie cmlccie merged commit b2d86c6 into WebexCommunity:master Nov 21, 2017
@cmlccie
Collaborator

cmlccie commented Nov 21, 2017

@dlspano , the patch is included in v0.8.3, which is now available on PyPI.

Also, apparently the way I edited and merged in your changes caused your commit to get "merged" in with my updates to the patch, not showing you as the author of the original fix. I do apologize; this was not intentional. Please find another bug, update, enhancement or example that you would like to contribute to, and I'll make sure I don't squash your commit. I do appreciate your contributions to the package.

@cmlccie
Collaborator

cmlccie commented Nov 21, 2017

Scratch that... I didn't mess it up. Your contribution is there! 😎

@sardoodledom
Contributor Author

Thanks Chris! Our team is starting to do a lot of integration work with Spark's APIs, creating bots, etc. If we run into anything we can help out with, we will.

@sardoodledom sardoodledom deleted the fix-rate-limit-err-2 branch November 29, 2017 20:56