Add caching capabilities with in-memory cache #48

Open
luispfgarces opened this issue Nov 28, 2021 · 0 comments
Labels
enhancement New feature or request help wanted Extra attention is needed medium priority Medium priority issue

Comments

@luispfgarces
Contributor

Purpose

Create caching functionality for rules fetched from the data source. Implement the first cache provider as an in-memory cache.

Main functionalities

Extend Rules.Framework by adding the following caching behaviors:

  • Cache responses from the rules data source.
  • Cache responses from specific operations - optionally activated if users so desire.
  • Cache invalidation of specific sets of cached values.

Proposal

Perform changes at the following points:

  • Cache responses from the rules data source - with a cache key composed of the content type and the date interval used to fetch from the rules data source.
    • e.g. If one tries to match a rule for "ContentType1" on date 2021-11-28, RulesEngine will ask IRulesDataSource for the collection of rules active for "ContentType1" between 2021-11-28 (limit inclusive) and 2021-11-29 (limit exclusive). When this happens and caching is active, the result will be stored in the cache under a key composed of "ContentType1", 2021-11-28, and 2021-11-29.
    • If a request placed to the rules data source generates a cache key that already has a value in the cache provider, the stored value must be returned as the response from the rules data source.
  • Cache responses from specific RulesEngine operations - MatchOneAsync(...), MatchManyAsync(...), SearchAsync(...) - considering the following conditions:
    • Generate a unique hash from the operation's parameter values.
    • Register, per operation, the requests made with the same hash over the last X minutes.
      • Should this be a configurable value?
    • Cache the responses of operation hashes whose request counters exceed defined thresholds:
      • A threshold based on the absolute number of requests - configurable.
      • A threshold based on the percentage of requests (requests made for a specific operation hash vs. requests made for all operation hashes) - configurable.
    • Return the cached response when the threshold criteria are met.
    • Implement mechanisms to ensure cache invalidation.
      • Always use a sliding window, where older requests leave the statistics during a cleanup activity and new requests are registered as they are made.
      • Two alternatives to ensure cache consistency: a maximum TTL for cached values, or cache invalidation on operations that cause changes to the rules data source.
    • This feature is optional - adding caching capabilities does not imply caching operation responses; it must be explicitly configured.
  • Invalidate all cached values for any change made to a content type (adding/updating rules).
    • Consider invalidating only a subset if possible: if the changed rule(s) only affect evaluation from date A to date B, only cached values between A and B need to be invalidated; the remaining ones are still valid.
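To make the data-source caching and subset-invalidation ideas above concrete, here is a minimal language-agnostic sketch in Python (the actual Rules.Framework implementation would be .NET; `cache_key`, `fetch_rules`, and `invalidate_range` are hypothetical names, not library API). It keys cached values by content type plus the fetch date interval, and invalidates only cached intervals overlapping a changed date range:

```python
from datetime import date

# Hypothetical sketch: cache keyed by (content type, interval start, interval end).
cache = {}

def cache_key(content_type, start, end):
    # Key mirrors the proposal: content type plus the date interval
    # used to fetch from the rules data source.
    return (content_type, start.isoformat(), end.isoformat())

def fetch_rules(data_source, content_type, start, end):
    key = cache_key(content_type, start, end)
    if key in cache:                                   # cache hit: return stored value
        return cache[key]
    rules = data_source(content_type, start, end)      # cache miss: query data source
    cache[key] = rules
    return rules

def invalidate_range(content_type, start, end):
    # Subset invalidation: drop only cached intervals overlapping [start, end),
    # leaving cached values outside the changed range valid.
    stale = [k for k in cache
             if k[0] == content_type
             and k[1] < end.isoformat()
             and k[2] > start.isoformat()]
    for k in stale:
        del cache[k]
```

With this shape, a repeated match for the same content type and date interval is served from the cache without touching the data source, and a rule change affecting dates A to B only evicts the overlapping entries.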
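The threshold-and-sliding-window criteria for operation responses can also be sketched. Assumptions: Python instead of .NET, hypothetical names (`invoke`, `operation_hash`), only the absolute-count threshold shown (the percentage threshold would be checked analogously), and window size/threshold hard-coded where the proposal wants them configurable:

```python
import hashlib
import time
from collections import deque

WINDOW_SECONDS = 300   # "last X minutes" - assumed configurable
ABS_THRESHOLD = 3      # absolute request-count threshold - assumed configurable

windows = {}           # operation hash -> deque of request timestamps
response_cache = {}    # operation hash -> cached response

def operation_hash(operation, params):
    # Unique hash derived from the operation name and its parameter values.
    raw = operation + "|" + "|".join(str(p) for p in params)
    return hashlib.sha256(raw.encode()).hexdigest()

def invoke(operation, params, compute, now=None):
    now = time.monotonic() if now is None else now
    h = operation_hash(operation, params)
    window = windows.setdefault(h, deque())
    # Sliding window: older requests leave the statistics as new ones arrive.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(now)
    if h in response_cache:
        return response_cache[h]          # threshold criteria already met earlier
    result = compute()
    if len(window) >= ABS_THRESHOLD:      # threshold reached: start caching
        response_cache[h] = result
    return result
```

A production version would also need one of the two consistency mechanisms from the proposal (a max TTL on `response_cache` entries, or invalidation when the rules data source changes).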

Create cache abstractions to allow:

  • Setting a new cached value by cache key.
  • Invalidating a cached value by cache key.
  • Verifying whether a cache key has a value.

Then implement a specific in-memory cache, considering the abstractions above and grouping cached values by content type.
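The abstraction and its in-memory implementation might look like the sketch below. This is Python for illustration only (the real Rules.Framework abstraction would be a .NET interface); `CacheProvider`, `InMemoryCacheProvider`, and the `"<content type>:<rest>"` key convention are all assumptions, not library API:

```python
from abc import ABC, abstractmethod
from typing import Any, Dict

class CacheProvider(ABC):
    """Hypothetical cache abstraction: set, invalidate, and check by cache key."""

    @abstractmethod
    def set(self, key: str, value: Any) -> None: ...

    @abstractmethod
    def invalidate(self, key: str) -> None: ...

    @abstractmethod
    def has(self, key: str) -> bool: ...

class InMemoryCacheProvider(CacheProvider):
    """In-memory provider grouping cached values by content type."""

    def __init__(self) -> None:
        # content type -> {remainder of cache key -> cached value}
        self._groups: Dict[str, Dict[str, Any]] = {}

    @staticmethod
    def _split(key: str):
        # Assumed key convention: "<content type>:<rest of key>".
        content_type, _, rest = key.partition(":")
        return content_type, rest

    def set(self, key: str, value: Any) -> None:
        content_type, rest = self._split(key)
        self._groups.setdefault(content_type, {})[rest] = value

    def invalidate(self, key: str) -> None:
        content_type, rest = self._split(key)
        self._groups.get(content_type, {}).pop(rest, None)

    def has(self, key: str) -> bool:
        content_type, rest = self._split(key)
        return rest in self._groups.get(content_type, {})

    def get(self, key: str) -> Any:
        content_type, rest = self._split(key)
        return self._groups[content_type][rest]

    def invalidate_content_type(self, content_type: str) -> None:
        # Drop the whole group, e.g. after a rule add/update for that type.
        self._groups.pop(content_type, None)
```

Grouping by content type makes the "invalidate everything for a changed content type" case a single dictionary removal rather than a scan over all keys.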

NOTE: the proposal is open to debate; the actual implementation plan will be added here later.

@luispfgarces luispfgarces added enhancement New feature or request help wanted Extra attention is needed labels Nov 28, 2021
@luispfgarces luispfgarces changed the title Add caching capabilities wit in-memory cache Add caching capabilities with in-memory cache Dec 2, 2021
@Daniel-C-Dias Daniel-C-Dias added the medium priority Medium priority issue label May 4, 2023