ConcurrentLfu Quickstart

ConcurrentLfu is a thread-safe, bounded-size, approximate LFU cache.

Constructor

int capacity = 128;
var lfu = new ConcurrentLfu<int, SomeItem>(capacity);
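
The snippets on this page use SomeItem as the cached value type and Arg as an extra factory argument. Neither type is part of the library; a minimal placeholder definition like the following is enough to run the examples:

public class Arg { }

public class SomeItem
{
    public SomeItem(int key) { }
    public SomeItem(int key, Arg arg) { }
}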

Getting Items

bool success1 = lfu.TryGet(1, out var value);
var value1 = lfu.GetOrAdd(1, (k) => new SomeItem(k));
var value2 = lfu.GetOrAdd(1, (k, arg) => new SomeItem(k, arg), new Arg());
var value3 = await lfu.GetOrAddAsync(0, (k) => Task.FromResult(new SomeItem(k)));
var value4 = await lfu.GetOrAddAsync(0, (k, arg) => Task.FromResult(new SomeItem(k, arg)), new Arg());

Removing Items

bool success2 = lfu.TryRemove(1); // remove item with key == 1
lfu.Clear();
lfu.Eviction.Policy.Value.Trim(1); // remove the least frequently used item

Updating Items

var item = new SomeItem(1);
bool success3 = lfu.TryUpdate(1, item);
lfu.AddOrUpdate(1, item);

Diagnostics

Console.WriteLine(lfu.Metrics.Value.HitRatio);

// enumerate keys
foreach (var k in lfu.Keys)
{
   Console.WriteLine(k);
}

// enumerate key value pairs
foreach (var kvp in lfu)
{
   Console.WriteLine($"{kvp.Key} {kvp.Value}");
}

Builder API

Below is an example combining many of the available builder options:

var lfu = new ConcurrentLfuBuilder<string, Disposable>()
    .AsAsyncCache()
    .AsScopedCache()
    .WithAtomicGetOrAdd()
    .WithScheduler(new ForegroundScheduler())
    .WithCapacity(3)
    .WithKeyComparer(StringComparer.OrdinalIgnoreCase)
    .WithConcurrencyLevel(8)
    .Build();
| Builder Method | Description |
|---|---|
| AsAsyncCache | Build an IAsyncCache; the GetOrAdd method becomes GetOrAddAsync. |
| AsScopedCache | Build an IScopedCache. IDisposable values are wrapped in a lifetime scope. Scoped caches return lifetimes that prevent values from being disposed until the calling code completes (see the sketch after this table). |
| WithAtomicGetOrAdd | Execute the cache's GetOrAdd method atomically, so that the value factory is invoked at most once per key. Other threads attempting to update the same key are blocked until the value factory completes. Incurs a small performance penalty. |
| WithScheduler | Sets the scheduler used to run cache maintenance operations. |
| WithCapacity | Sets the maximum number of values to keep in the cache. If more items than this are added, the cache eviction policy determines which values to remove. If omitted, the default capacity is 128. |
| WithKeyComparer | Use the specified equality comparer to compare keys. If omitted, the default comparer is EqualityComparer<K>.Default. |
| WithConcurrencyLevel | Sets the estimated number of threads that will update the cache concurrently. If omitted, the default concurrency level is Environment.ProcessorCount. |
| WithExpireAfterAccess | Evict after a fixed duration since an entry's most recent read or write. |
| WithExpireAfterWrite | Evict after a fixed duration since an entry's creation or most recent replacement. |
| WithExpireAfter | Evict after a duration calculated for each item using the specified IExpiryCalculator. Expiry time is fully configurable, and may be set independently at creation, after a read, and after a write. |
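
As a concrete illustration of AsScopedCache, the sketch below builds a synchronous scoped cache and reads a value through a lifetime. It assumes the ScopedGetOrAdd method and the Scoped<T> wrapper described above, plus a placeholder Disposable class with a parameterless constructor; treat it as a sketch rather than a definitive reference.

var scopedLfu = new ConcurrentLfuBuilder<int, Disposable>()
    .AsScopedCache()
    .WithCapacity(128)
    .Build();

// the lifetime keeps the value alive until the using block ends
using (var lifetime = scopedLfu.ScopedGetOrAdd(1, k => new Scoped<Disposable>(new Disposable())))
{
    var value = lifetime.Value;
}

Time-based expiry is configured on the builder in the same way. Assuming WithExpireAfterWrite accepts a TimeSpan, a cache with a five minute time-to-live per write looks like this:

var expiringLfu = new ConcurrentLfuBuilder<int, SomeItem>()
    .WithCapacity(128)
    .WithExpireAfterWrite(TimeSpan.FromMinutes(5))
    .Build();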

Eviction policy compatibility

The table below summarizes which cache options are compatible with each eviction policy. Time-based expiry policies cannot be combined with one another. If an option is not listed, it is supported in all variants of ConcurrentLfu.

| Option | Bounded Size (Default) | ExpireAfterWrite | ExpireAfterAccess | ExpireAfter |
|---|---|---|---|---|
| Default | Supported | Supported | Supported | Supported |
| WithAtomicGetOrAdd | Supported | Supported | Supported | Not Supported |
| AsAsyncCache | Supported | Supported | Supported | Supported |
| AsScopedCache | Supported | Supported | Supported | Not Supported |