Optimise Performance for Reverse Chaining in FHIR Search #1772

Open
alexanderkiel opened this issue Jun 4, 2024 · 1 comment

Currently the implementation of the _has search parameter in blaze.db.impl.search-param.has uses a cache for resource handles. The cache is filled by the computation function resource-handles*, which can take a long time. One site has experienced a warning from Caffeine that the long-running computation halted eviction. So we should revisit why we need that cache and, if so, how we can handle the cache updates differently.
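
For context, here is a minimal Java sketch of the pattern described above (the actual implementation is Clojure; the class, key/value types, and method names are illustrative assumptions). The point is that the expensive computation runs inside the cache's synchronous get, i.e. inside ConcurrentHashMap's computeIfAbsent:

import java.util.List;
import java.util.function.Function;

import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

class ResourceHandleCache {

  // Small cache of resource handles, analogous to the one in
  // blaze.db.impl.search-param.has (types and names are hypothetical).
  private final Cache<String, List<String>> cache = Caffeine.newBuilder()
      .maximumSize(100)
      .build();

  List<String> resourceHandles(String query, Function<String, List<String>> compute) {
    // The mapping function runs while holding the hash table's bin lock; if it
    // is slow (like resource-handles*), it can hold up Caffeine's eviction and
    // maintenance work, which is what the warning mentioned above reports.
    return cache.get(query, compute);
  }
}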

alexanderkiel self-assigned this Jun 4, 2024

ben-manes commented Jun 5, 2024

This is a tiny cache at 100 entries, which means the number of internal hash bins is kept very low. These bins act as locks in ConcurrentHashMap, so having fewer of them leads to a higher chance of collisions between two independent writes. A simple solution is to increase the initialCapacity, which merely increases the table (array) size, to reduce the chance of colliding writes. For example, setting it to 10_000 would not waste much space and would make collisions much less common, though it would not eliminate them.

Caffeine.newBuilder()
    .initialCapacity(10_000)
    .maximumSize(100)
    .build();

A slightly more complex approach is to use AsyncCache, optionally with JCiP-style computations to avoid wasting threads. This performs the mapping update immediately and defers the loading to be performed within the context of a future. An example in Java is below:

AsyncCache<K, V> cache = Caffeine.newBuilder().buildAsync();

V get(K key, Function<K, V> mappingFunction) {
  // Insert the future first so the map update is cheap and other callers can
  // wait on it; the expensive computation then runs outside the hash table lock.
  var future = new CompletableFuture<V>();
  var prior = cache.asMap().putIfAbsent(key, future);
  if (prior != null) {
    // Another caller is already computing this key; wait for its result.
    return prior.join();
  }
  try {
    var value = mappingFunction.apply(key);
    future.complete(value);
    return value;
  } catch (Throwable t) {
    future.completeExceptionally(t);
    throw t;
  }
}
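
In other words, the expensive computation no longer runs while holding the hash table's bin lock: the future is inserted up front as a cheap map update, concurrent callers for the same key simply wait on that future, and the slow work happens outside the cache, so eviction and other maintenance are not stalled by it.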
