
update the caching #2396

Merged: 2 commits into OWASP-BLT:main, Jul 3, 2024
Conversation

@JisanAR03 (Contributor)

@arkid15r sir, can you please check this commit? It updates the caching.

@arkid15r (Contributor) left a comment

Maybe instead we shouldn't cache anonymous user responses at all?

@DonnieBLT (Collaborator)

I think it is good to cache anonymous users; it speeds up the homepage significantly. What is the issue?

@arkid15r (Contributor) commented Jul 3, 2024

There is a standard Django way to cache a page. In newhome there is some extra logic for authenticated users' email check/verification that could become a middleware and cover all routes instead of just the home page.
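
A minimal sketch of how that check might move into a middleware, assuming a hypothetical `is_email_verified` flag and `/verify-email/` route (neither is taken from this repo):

```python
# Hypothetical middleware sketch: the class name, the is_email_verified
# attribute, and the /verify-email/ path are assumptions for illustration.
from django.shortcuts import redirect


class EmailVerificationMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        user = request.user
        # Only authenticated users need the verification check; anonymous
        # requests pass straight through, so their responses stay cacheable.
        if user.is_authenticated and not getattr(user, "is_email_verified", True):
            if request.path != "/verify-email/":
                return redirect("/verify-email/")
        return self.get_response(request)
```

Registered in `MIDDLEWARE` after `AuthenticationMiddleware`, this would apply the check to every route, not just the home page.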

@DonnieBLT (Collaborator)

Cool, well I'm fine with having it only on the homepage. I feel like if it's a middleware, we will have to check all the routes and the API. The main issue we are solving is the slow homepage load; I'm open to any solution for that.

@JisanAR03 (Contributor, Author)

@arkid15r, any suggestions for me?

@arkid15r (Contributor) commented Jul 3, 2024

The point of not having authenticated user related logic is that we could use cache_page freely -- the content would be the same for everyone.
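
A sketch of what `cache_page` could look like once the view serves identical content to everyone (the 10-minute timeout and the import path are assumptions, not verified against the repo):

```python
# urls.py: sketch only. The view/module names are assumed and the timeout
# is an arbitrary illustrative value.
from django.urls import path
from django.views.decorators.cache import cache_page

from website.views import newhome  # import path is an assumption

urlpatterns = [
    # With no per-user logic in the view, one cached copy serves all visitors.
    path("", cache_page(60 * 10)(newhome), name="home"),
]
```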

Also it may be a good idea to start from the other end in terms of page loading optimization -- index Issue::is_hidden or Point::created or make other changes to speed up the view logic.
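
For the indexing idea, a sketch of what those indexes might look like on the models (the field names come from the comment above; everything else about the real `Issue` and `Point` models is omitted):

```python
# models.py: sketch only. db_index on the columns the homepage filters or
# sorts by lets those queries use an index instead of scanning whole tables.
from django.db import models


class Issue(models.Model):
    is_hidden = models.BooleanField(default=False, db_index=True)


class Point(models.Model):
    created = models.DateTimeField(auto_now_add=True, db_index=True)
```

A `makemigrations` / `migrate` run would then create the indexes in the database.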

@arkid15r (Contributor) commented Jul 3, 2024

> @arkid15r, any suggestions for me?

No, not at this time. I believe it's better to leave the review to @DonnieBLT.

@JisanAR03 (Contributor, Author)

@arkid15r OK, sure. Along with that I can also optimize the home page, so I'm thinking of creating an issue and starting to optimize it.

@DonnieBLT sir, should I do that? And could you please review this PR too?

@DonnieBLT (Collaborator)

Sorry, I'm not sure what issue this PR is solving. I first tried full-page caching and then realized that each user has a different view. Ideally, we could have full-page caching and then rewrite sections like the user icon or user-specific issues with JavaScript, but I thought I had come up with a quick solution: just using the user session as the cache key, and it seems to be working OK for me. I'm not sure what problem we ran into. Is this more of an architecture/design change, or did we run into a specific bug?
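
A sketch of the session-keyed approach described above (the key format, template name, and 5-minute timeout are assumptions, not the repo's actual code):

```python
# views.py: illustration only. Cache one rendered copy per session so
# per-user content never leaks across sessions.
from django.core.cache import cache
from django.http import HttpResponse
from django.template.loader import render_to_string


def newhome(request):
    session_key = request.session.session_key or "anonymous"
    cache_key = f"homepage:{session_key}"  # hypothetical key format
    html = cache.get(cache_key)
    if html is None:
        html = render_to_string("home.html", {}, request=request)  # template name assumed
        cache.set(cache_key, html, 60 * 5)  # timeout is an assumption
    return HttpResponse(html)
```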

@JisanAR03 (Contributor, Author) commented Jul 3, 2024

@DonnieBLT, this PR now only updates the login-related cache. When a user triggers sign-in, sign-out, or logout, my last PR cleared the entire cache; this PR clears only the login-related cache entries.
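
A sketch of what that selective invalidation could look like with Django's auth signals (the cache key format mirrors the hypothetical one above and is not taken from the PR's actual diff):

```python
# signals.py: illustration of clearing only the session-scoped entries on
# login/logout instead of flushing everything with cache.clear().
from django.contrib.auth.signals import user_logged_in, user_logged_out
from django.core.cache import cache
from django.dispatch import receiver


@receiver(user_logged_in)
@receiver(user_logged_out)
def clear_login_related_cache(sender, request, user, **kwargs):
    session_key = request.session.session_key or "anonymous"
    # Drop only this session's cached homepage entry.
    cache.delete(f"homepage:{session_key}")
```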

@DonnieBLT enabled auto-merge (squash) on July 3, 2024 19:01
@DonnieBLT (Collaborator)

Awesome thanks! Feel free to create another issue to address the other concerns.

@DonnieBLT merged commit db61ab2 into OWASP-BLT:main on Jul 3, 2024
8 checks passed
@JisanAR03 (Contributor, Author) commented Jul 3, 2024

> Awesome thanks! Feel free to create another issue to address the other concerns.

@DonnieBLT Thank you so much

@JisanAR03 mentioned this pull request on Jul 3, 2024
@JisanAR03 deleted the update_cacheing branch on July 9, 2024 15:22