Requests-cache is a transparent, persistent cache for the requests library (requires requests >= 1.1.0).
Just write:
```python
import requests
import requests_cache

requests_cache.install_cache('demo_cache')
```
All responses, including headers and cookies, will then be transparently cached to a demo_cache.sqlite database. For example, the following code will take only 1-2 seconds instead of 10, and will run instantly on the next launch:
```python
for i in range(10):
    requests.get('http://httpbin.org/delay/1')
```
It can be useful when you are writing a simple data scraper with constantly changing parsing logic or data formats and don't want to re-download pages or write complex error-handling and persistence code.
requests-cache ignores all cache headers; it just caches the data for the time you specify.
If you need a library that knows how to use HTTP headers and status codes, take a look at httpcache and CacheControl.
- Documentation at readthedocs.org
- Source code and issue tracking at GitHub.
- Working example at Real Python.