
Caching headers #76
Open · danielrichman opened this issue Sep 16, 2014 · 6 comments

@danielrichman (Member)

We should probably set some. The response is not really cacheable for longer than a few minutes, due to the release of new datasets.

@danielrichman (Member, Author)

If we don't trust browsers, we might want to tell them to never cache. We can give Varnish caching advice via other headers if necessary later on...?

@rjw57 (Member) commented Nov 21, 2014

How about an ETag which is a hash of (dataset which would be used) + (launch parameters)? That should be safe, since we can then make the decision for the browser.
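
A minimal sketch of that idea, assuming a Flask-style GET handler; the route, dataset name and `run_prediction` helper below are hypothetical placeholders rather than the real API:

```python
import hashlib

from flask import Flask, jsonify, request

app = Flask(__name__)


def compute_etag(dataset_name, launch_params):
    # Hash of (dataset which would be used) + (launch parameters).
    h = hashlib.sha256()
    h.update(dataset_name.encode("utf-8"))
    for key in sorted(launch_params):
        h.update(key.encode("utf-8"))
        h.update(str(launch_params[key]).encode("utf-8"))
    return h.hexdigest()


def run_prediction(dataset_name, launch_params):
    # Hypothetical stand-in for the actual predictor call.
    return {"dataset": dataset_name, "request": launch_params, "path": []}


@app.route("/api/v1/")  # hypothetical route
def predict():
    dataset_name = "gfs-2014112100"         # whichever dataset would be used
    launch_params = request.args.to_dict()  # launch site, time, altitude, ...

    etag = compute_etag(dataset_name, launch_params)
    if etag in request.if_none_match:
        return "", 304  # the browser's cached copy is still valid

    response = jsonify(run_prediction(dataset_name, launch_params))
    response.set_etag(etag)
    return response
```

Note that the 304 path only spares the response body; the request still reaches Python, which is the hit-rate concern raised further down the thread.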

@rjw57 (Member) commented Nov 21, 2014

(And then maybe some sort of 30-60 second max-age for GET requests so that Varnish can help against /.-ing.)
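
A hedged sketch of that, again assuming a Flask-style app; the 60-second figure is just the upper end of the range suggested above:

```python
from flask import Flask, request

app = Flask(__name__)


@app.after_request
def add_cache_headers(response):
    # Let caches hold successful GET responses briefly so Varnish can absorb a burst.
    if request.method == "GET" and response.status_code == 200:
        response.cache_control.public = True
        response.cache_control.max_age = 60  # 30-60 seconds, per the suggestion above
    return response
```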

@danielrichman (Member, Author)

I do like the ETag header idea but I think it solves a different problem to the one I was thinking of. Someone rerunning a prediction is atypical, probably?

Isn't our API via POST? Will browsers even cache these requests?

I was mainly thinking about what happens if someone publicly shares a prediction URL; the Varnish thing.

...actually, some googling suggests that Varnish can't cache POST either >_>

@rjw57 (Member) commented Nov 22, 2014

The v1 API does everything via GET, which should be cacheable.

@danielrichman (Member, Author)

You are absolutely correct. I was confused (and at Maths happy hour) so somehow convinced myself that we were POSTing JSON.

So I think my other point stands: I don't think the ETag idea is worth it, because it probably won't yield a very high hit rate at all, and the request still has to go through to Python. We just need some sensible cache headers for Varnish to use in case we have a burst and need to switch it on?
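
One hedged way to get "browsers never cache, Varnish may" (per the earlier comment about not trusting browsers) is to split the directives: `s-maxage` only applies to shared caches such as Varnish, while `max-age=0` keeps browsers revalidating. A sketch, still assuming a Flask-style app:

```python
from flask import Flask, request

app = Flask(__name__)


@app.after_request
def cache_for_varnish_only(response):
    # Browsers revalidate every request; a shared cache (Varnish, if switched on
    # during a burst) may serve the response for up to 60 seconds.
    if request.method == "GET" and response.status_code == 200:
        response.headers["Cache-Control"] = "public, max-age=0, s-maxage=60"
    return response
```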
