Perf: Lru cache #93

Merged: 11 commits, Aug 6, 2021
69 changes: 10 additions & 59 deletions lib/Onyx.js
@@ -102,19 +102,6 @@ function isCollectionKey(key) {
return _.contains(_.values(onyxKeys.COLLECTION), key);
}

/**
* Find the collection a collection item belongs to
* or return null if the item is not part of a collection
* @param {string} key
* @returns {string|null}
*/
function getCollectionKeyForItem(key) {
return _.chain(onyxKeys.COLLECTION)
.values()
.find(name => key.startsWith(name))
.value();
}

/**
* Checks to see if a given key matches with the
* configured key of our connected subscriber
@@ -412,48 +399,6 @@ function connect(mapping) {
return connectionID;
}

/**
* Remove cache items that are no longer connected through Onyx
* @param {string} key
*/
function cleanCache(key) {
// Don't remove default keys from cache, they don't take much memory and are accessed frequently
if (_.has(defaultKeyStates, key)) {
return;
}

const hasRemainingConnections = _.some(callbackToStateMapping, {key});

// When the key is still used in other places don't remove it from cache
if (hasRemainingConnections) {
return;
}

// When this is a collection - also recursively remove any unused individual items
if (isCollectionKey(key)) {
cache.drop(key);

getAllKeys().then(cachedKeys => _.chain(cachedKeys)
.filter(name => name.startsWith(key))
.forEach(cleanCache));

return;
}

// When this is a collection item - check if the collection is still used
const collectionKey = getCollectionKeyForItem(key);
if (collectionKey) {
// When there's an active subscription for a collection don't remove the item
const hasRemainingConnectionsForCollection = _.some(callbackToStateMapping, {key: collectionKey});
if (hasRemainingConnectionsForCollection) {
return;
}
}

// Otherwise remove the value from cache
cache.drop(key);
}

/**
* Remove the listener for a react component
*
@@ -471,11 +416,7 @@ function disconnect(connectionID, keyToRemoveFromEvictionBlocklist) {
removeFromEvictionBlockList(keyToRemoveFromEvictionBlocklist, connectionID);
}

const key = callbackToStateMapping[connectionID].key;
delete callbackToStateMapping[connectionID];

// When the last subscriber disconnects, drop cache as well
cleanCache(key);
}

/**
@@ -742,13 +683,16 @@ function mergeCollection(collectionKey, collection) {
* @param {function} registerStorageEventListener a callback when a storage event happens.
* This applies to web platforms where the local storage emits storage events
* across all open tabs and allows Onyx to stay in sync across all open tabs.
* @param {Number} [options.maxCachedKeysCount=55] Sets how many recent keys we should try to keep in cache.
* Setting this to 0 keeps the cache forever
* @param {Boolean} [options.captureMetrics]
*/
function init({
keys,
initialKeyStates,
safeEvictionKeys,
registerStorageEventListener,
maxCachedKeysCount = 55,
captureMetrics = false,
}) {
if (captureMetrics) {
@@ -757,6 +701,13 @@ function init({
applyDecorators();
}

if (maxCachedKeysCount > 0) {
setInterval(() => {
// Try to free anything dated from cache
cache.removeLeastRecentUsedKeys(maxCachedKeysCount);
}, 10 * 1000);
Contributor Author:

Created this interval-based cleanup because trying to free the cache after each disconnect is too frequent.
For example, when I switch to a new chat, 10 old connections disconnect at the same time and cause 10 cache cleans, where only 1 would be enough.

Contributor Author:

Instead of an interval-based solution, perhaps we can hook into something that happens less frequently.
I think it would be OK to clean the cache every minute or even less frequently, as long as we clean it.

Contributor:

I feel the interval-based solution may be too aggressive, since most of the time we don't really need to clean the cache at all.

What if we hook the cache clean into the connection of a safe eviction key like reportActions_*? If we are connecting to another large object, that seems like a good opportunity to clean the cache in preparation.

Contributor Author:

Thanks for the idea, that sounds much better, will update.
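A rough sketch of what that hook could look like; `isSafeEvictionKey` and the exact wiring inside `connect` are assumptions for illustration, not code from this PR:

```js
// Hypothetical sketch of the suggested approach - not code from this PR.
// Instead of a timer, trigger the LRU cleanup when a subscriber connects
// to a safe eviction key (e.g. a reportActions_* collection).
function connect(mapping) {
    // ...existing connect() setup elided...

    // isSafeEvictionKey() is an assumed helper that would check mapping.key
    // against the keys passed via `safeEvictionKeys` during init()
    if (isSafeEvictionKey(mapping.key)) {
        // Connecting to another large object is a good opportunity
        // to evict stale entries before the new data lands in cache
        cache.removeLeastRecentUsedKeys(maxCachedKeysCount);
    }

    return connectionID;
}
```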

}

// Let Onyx know about all of our keys
onyxKeys = keys;

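For reference, a minimal sketch of how an app might opt into the new option during init; the key names and the __DEV__ flag are placeholders, not part of this PR:

```js
// Minimal usage sketch - key names below are placeholders for illustration.
Onyx.init({
    keys: ONYXKEYS,
    safeEvictionKeys: [ONYXKEYS.COLLECTION.REPORT_ACTIONS],

    // Keep roughly the 55 most recently used keys in cache;
    // 0 would disable the periodic LRU cleanup and keep the cache forever
    maxCachedKeysCount: 55,

    // Optionally decorate Onyx methods with timing stats
    captureMetrics: __DEV__,
});
```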
36 changes: 36 additions & 0 deletions lib/OnyxCache.js
@@ -17,6 +17,13 @@ class OnyxCache {
*/
this.storageKeys = new Set();

/**
* @private
* Unique list of keys maintained in access order (most recent at the end)
* @type {Set<string>}
*/
this.recentKeys = new Set();

/**
* @private
* A map of cached values
@@ -53,6 +60,7 @@
* @returns {*}
*/
getValue(key) {
this.addToAccessedKeys(key);
return this.storageMap[key];
}

@@ -83,6 +91,7 @@
*/
set(key, value) {
this.addKey(key);
this.addToAccessedKeys(key);
this.storageMap[key] = value;

return value;
@@ -106,6 +115,7 @@
const storageKeys = this.getAllKeys();
const mergedKeys = _.keys(data);
this.storageKeys = new Set([...storageKeys, ...mergedKeys]);
_.each(mergedKeys, key => this.addToAccessedKeys(key));
}

/**
@@ -144,6 +154,32 @@

return this.pendingPromises[taskName];
}

/**
* @private
* Marks a key as most recently accessed (moves it to the end of the list)
* @param {string} key
*/
addToAccessedKeys(key) {
// Removing and re-adding a key ensures it's at the end of the list
this.recentKeys.delete(key);
this.recentKeys.add(key);
}

/**
* Remove keys that don't fall into the range of recently used keys
* @param {number} recentKeysSize - the number of most recently used keys to keep in cache
*/
removeLeastRecentUsedKeys(recentKeysSize) {
if (this.recentKeys.size > recentKeysSize) {
// Get the last N keys by doing a negative slice
const recentlyAccessed = [...this.recentKeys].slice(-recentKeysSize);
const storageKeys = _.keys(this.storageMap);
const keysToRemove = _.difference(storageKeys, recentlyAccessed);

_.each(keysToRemove, this.drop);
}
}
}

const instance = new OnyxCache();
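As an aside, a small standalone sketch of the Set-based recency ordering that removeLeastRecentUsedKeys relies on; this is illustration only, not code from the PR:

```js
// A JS Set preserves insertion order, so deleting and re-adding a key
// moves it to the end, which the cache treats as "most recently used".
const recentKeys = new Set(['a', 'b', 'c']);

// Accessing 'a' again: remove and re-add so it becomes the newest entry
recentKeys.delete('a');
recentKeys.add('a');
console.log([...recentKeys]); // ['b', 'c', 'a']

// Keeping only the 2 most recent keys mirrors removeLeastRecentUsedKeys(2):
// everything not in this slice would be dropped from the cache
const keysToKeep = [...recentKeys].slice(-2);
console.log(keysToKeep); // ['c', 'a']
```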
19 changes: 9 additions & 10 deletions lib/decorateWithMetrics.js
@@ -24,8 +24,6 @@ function decorateWithMetrics(func, alias = func.name) {
throw new Error(`"${alias}" is already decorated`);
}

stats[alias] = [];

function decorated(...args) {
const startTime = performance.now() - PERFORMANCE_OFFSET;

@@ -48,6 +46,7 @@
methodName: alias,
startTime,
endTime,
duration: endTime - startTime,
args,
});
});
@@ -78,8 +77,7 @@
*/
function getMetrics() {
const summaries = _.chain(stats)
.map((data, methodName) => {
const calls = _.map(data, call => ({...call, duration: call.endTime - call.startTime}));
.map((calls, methodName) => {
const total = sum(calls, 'duration');
const avg = (total / calls.length) || 0;
const max = _.max(calls, 'duration').duration || 0;
@@ -144,21 +142,22 @@ function toDuration(millis, raw = false) {
* @param {'console'|'csv'|'json'|'string'} [options.format=console] The output format of this function
* `string` is useful when __DEV__ is set to `false` as writing to the console is disabled, but the result of this
* method would still get printed as output
* @param {string[]} [options.methods] Print stats only for these method names
* @returns {string|undefined}
*/
function printMetrics({raw = false, format = 'console'} = {}) {
function printMetrics({raw = false, format = 'console', methods} = {}) {
const {totalTime, summaries, lastCompleteCall} = getMetrics();

const tableSummary = MDTable.factory({
heading: ['method', 'total time spent', 'max', 'min', 'avg', 'time last call completed', 'calls made'],
leftAlignedCols: [0],
});

const methodCallTables = _.chain(summaries)
.filter(method => method.avg > 0)
.sortBy('avg')
.reverse()
.map(({methodName, calls, ...methodStats}) => {
const methodNames = _.isArray(methods) ? methods : _.keys(summaries);

const methodCallTables = _.chain(methodNames)
.map((methodName) => {
const {calls, ...methodStats} = summaries[methodName];
tableSummary.addRow(
methodName,
toDuration(methodStats.total, raw),
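Finally, a hedged usage sketch for the updated printMetrics options; the import path and the method aliases passed to `methods` are assumptions about how the decorated functions were registered:

```js
// Usage sketch - the aliases in `methods` are assumed, not taken from this PR.
import {printMetrics} from 'react-native-onyx/lib/decorateWithMetrics';

// Print stats only for the listed methods and return the report as a string
// (useful when __DEV__ is false and console output is disabled)
const report = printMetrics({
    raw: true, // keep durations as raw milliseconds
    format: 'string',
    methods: ['Onyx:get', 'Onyx:set'],
});
console.log(report);
```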