
WIP: Filter cache #8942

Closed
wants to merge 4 commits into from

Conversation

IgorMinar
Contributor

Use a cache to avoid recomputing filters during dirty-checking.

In the modified large-table benchmark this delivers a 3x speed improvement for the number filter (apply duration), but makes the uppercase filter 20% slower.
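Roughly, the idea is the following (an illustrative sketch under assumed names, not the actual patch): memoize a filter's result keyed on its primitive input, so that repeated digests with an unchanged input skip the filter call.

// Hypothetical illustration of the caching idea; names are not Angular's.
function makeCachedFilter(filterFn) {
  var lastInput, lastResult, hasCached = false;
  return function (input) {
    // Only primitives can be compared cheaply and safely by value.
    var primitive = input == null || typeof input === 'string' ||
                    typeof input === 'number' || typeof input === 'boolean';
    if (primitive && hasCached && input === lastInput) {
      return lastResult;               // unchanged input: reuse the previous result
    }
    lastResult = filterFn(input);
    lastInput = input;
    hasCached = primitive;
    return lastResult;
  };
}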

TODO:

  • more benchmarking
  • special case dates, since they are not primitives but can be cheaply compared via getTime() (see the sketch after this list)
  • decide on whether to make the cache an opt-in or an opt-out
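The date special case mentioned above could look roughly like this (a sketch under assumed names, not part of the patch): two Date instances are distinct objects, but their timestamps compare cheaply.

// Hypothetical sketch: treat Dates like primitives by comparing timestamps.
function inputsAreEqual(a, b) {
  if (a instanceof Date && b instanceof Date) {
    return a.getTime() === b.getTime();   // cheap value comparison, no deep equality
  }
  return a === b;                         // primitives (and identical references)
}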

they pass without any modifications.
BREAKING CHANGE: filters that have hidden state won't be recalculated if the hidden state changes

TODO: more info...
avoid hasOwnProperty calls and more
@IgorMinar IgorMinar added this to the 1.3.0-rc.2 milestone Sep 5, 2014
@IgorMinar
Contributor Author

@tbosch, @vojtajina please take a look and provide preliminary feedback

@jbedard
Collaborator

jbedard commented Sep 5, 2014

Have you thought of doing this for all expressions instead of just filters? $parse could record the inputs of an expression (variables + function calls?) while parsing, then use something such as a $$watchDelegate that does $watchGroup on all the inputs of the expression. For an expression such as a == 1 || a == 2 || a == 3 it could watch only a and avoid evaluating the operators/constants. I think this would reduce the watch time of all expressions but would increase memory usage by storing all the old values...
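As a rough sketch of this suggestion (a simplified illustration using the public $watchGroup API, not the actual branch): record the inputs of an expression and re-evaluate the full expression only when one of them changes.

// Simplified illustration: watch only the input `a` of `a == 1 || a == 2 || a == 3`.
angular.module('demo', []).run(function ($rootScope, $parse) {
  var fullExpr = $parse('a == 1 || a == 2 || a == 3');
  var inputs = ['a'];   // in the real proposal, $parse would record these while parsing
  var result;

  $rootScope.$watchGroup(inputs, function () {
    result = fullExpr($rootScope);   // operators/constants are only evaluated on change
  });
});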

@jbedard
Collaborator

jbedard commented Sep 5, 2014

... 5 hours later ... check out jbedard@844d04c (note that this is branched off #8901) - it actually works pretty well for a quick PoC of what I mentioned. All the watched expressions I've looked at are faster, and object/array literals (in my test) are 10x faster because they don't have to build an object each time (I assume that's the reason). Expressions with noop filters are also 2-3x faster just by avoiding the noop filter call... (see the linked PR for the benchmarks I'm using).

This avoids executing the expression on every digest by only executing it when its inputs change (or when an input value is non-primitive, similar to your PR). If an input changes, the expression is fully evaluated, which involves a lot of duplicate evaluation of those inputs. This means that expressions with frequently changing inputs (rare?) or non-primitive inputs could actually be slower due to the double evaluation. But if $parse were refactored further so that those input values could be passed directly into the compiled expression instead of being re-evaluated, this might well be worth it...

Sorry if this is off topic or the wrong place to put this, but I've been thinking about this lately and this PR seems pretty similar...
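To make the "pass the input values directly" point concrete, a hypothetical sketch (invented names, not the actual $parse internals) of how pre-computed input values might be handed to the compiled expression so they are not evaluated twice:

// Hypothetical: the compiled expression accepts already-computed input values.
function compiledExpr(scope, inputValues) {
  var a = inputValues ? inputValues[0] : scope.a;   // reuse the value the watcher just read
  return a === 1 || a === 2 || a === 3;
}

// The watcher computes the inputs once and passes them along, avoiding double evaluation.
var inputGetters = [function (scope) { return scope.a; }];
function watchFn(scope) {
  var values = inputGetters.map(function (get) { return get(scope); });
  return compiledExpr(scope, values);
}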

@@ -573,6 +573,12 @@ function isPromiseLike(obj) {
}


function isPrimitive(value) {
var valueType;
return (value == null || (valueType = typeof value) === 'string' || valueType === 'number');
Contributor Author

I missed boolean here
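For reference, a version of the helper with the missed case added might look like this (a sketch, not necessarily the committed fix):

function isPrimitive(value) {
  var valueType;
  return (value == null ||
          (valueType = typeof value) === 'string' ||
          valueType === 'number' ||
          valueType === 'boolean');   // the case noted above
}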

@IgorMinar
Contributor Author

@jbedard very cool. I need more time to digest your changes, but it looks promising.

I'm surprised that you were able to make noop filters faster. Does this include simple expressions like a | noop? I find it hard to believe that any caching can be faster than executing the noop filter.

I need to dig more into your benchmarks

@IgorMinar
Contributor Author

awesome work btw!

@jbedard
Collaborator

jbedard commented Sep 5, 2014

Normally the expression tree for a | noop is binaryFn + |op + getterFn + fnInvoke/argFns + noop, so reducing the watcher to just getterFn actually cuts out quite a bit...
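A simplified illustration of that point (stand-in functions, not actual $parse output): what runs per digest for a | noop before and after watching only the input.

// Stand-ins for the pieces of the expression tree.
function getterA(scope) { return scope.a; }
function noopFilter(value) { return value; }

// Before: the whole tree (getter + filter invocation) executes every digest.
function watchBefore(scope) { return noopFilter(getterA(scope)); }

// After: only the input getter executes; the filter re-runs only when `a` changes.
function watchAfter(scope) { return getterA(scope); }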

@lgalfaso
Contributor

lgalfaso commented Sep 6, 2014

Even though I like the general idea, if this is going to be an opt-out, then I would like it to land in 1.3 before we get into the next RC. The reason is that if this is enabled by default, then it will break modules like https://github.com/lgalfaso/angular-dynamic-locale

@IgorMinar
Contributor Author

I know. If no unknown problems arise, we need to get this or the general expression caching in ASAP.

I previously thought that we wouldn't be able to implement this easily, but it turned out to be doable.

@tbosch
Contributor

tbosch commented Sep 6, 2014

Should we delay rc1 then until this is in? I would vote for that...


}

if (isPrimitive(input)) {
if (filterCache[filterCacheKeyInput] === input && (input === undefined || filterCacheKeyInput in filterCache)) {
Contributor

Am I reading this wrong, or if this is the first time the filter is executed and the input is undefined, will the result be undefined and not the value from the filter?

Contributor Author

darn it Lucas, how do you always find these corner-cases? you rock!
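One way the corner case could be addressed (a sketch reusing the names from the diff above; the helper name, result key, and filterFn parameter are invented, and this is not the committed fix): only trust the cached value when the key is actually present, so a first run with an undefined input still invokes the filter.

// Sketch: a lookup that is not fooled by an undefined input on the first run.
function cachedFilterResult(filterCache, filterCacheKeyInput, filterCacheKeyResult, input, filterFn) {
  if (filterCache[filterCacheKeyInput] === input &&
      (input !== undefined || filterCacheKeyInput in filterCache)) {
    return filterCache[filterCacheKeyResult];            // genuine cache hit
  }
  filterCache[filterCacheKeyInput] = input;
  return (filterCache[filterCacheKeyResult] = filterFn(input));
}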

@rubenv

rubenv commented Sep 10, 2014

angular-gettext would certainly break because of this. Nonetheless, I'd also welcome the change.

The breakage for locale filters can easily be avoided if you provide a clear method to wipe $$filterCache globally (once) whenever the current language changes.

@OverZealous

Would it make sense for a filter to have an opt-out defined on the filter function? This is just me brainstorming, but what about allowing a property to be defined on the function object, something like this:

module.filter('myfilter', function() {
    function myFilter(value) {
        //...
    }
    myFilter.$useFilterCache = false;
    return myFilter;
});

Then if a filter function has $useFilterCache defined and === to false, it is never cached. (Alternatively, invert the name and change it to something easier to check, like $noFilterCache = true.)
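A sketch of how a lookup might honor such a flag (hypothetical, following the $useFilterCache naming in the comment above; the applyFilter helper and cache layout are invented):

// Hypothetical: skip the cache entirely for filters that opt out.
function applyFilter(filterFn, input, cache, cacheKey) {
  var cacheable = filterFn.$useFilterCache !== false &&
                  (input == null || typeof input === 'string' ||
                   typeof input === 'number' || typeof input === 'boolean');
  if (cacheable && (cacheKey in cache) && cache[cacheKey].input === input) {
    return cache[cacheKey].result;           // unchanged primitive input: reuse
  }
  var result = filterFn(input);
  if (cacheable) cache[cacheKey] = { input: input, result: result };
  return result;
}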

@knalli

knalli commented Sep 11, 2014

The breakage for locale filters can easily be avoided if you provide a clear method to wipe $$filterCache globally (once) whenever the current language changes.

I second that. Very good improvement, but we still have to be able to instrument the cache. If I understand it correctly, it is currently a plain object? Perhaps you could extend this with a clear/reset function?
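A sketch of what such a reset hook could look like (a hypothetical API, not anything in this PR): wrap the cache object and expose a clear() that modules like angular-dynamic-locale or angular-gettext could call when the language changes.

// Hypothetical wrapper around the cache object with an explicit reset.
function createFilterCache() {
  var store = Object.create(null);
  return {
    get: function (key) { return store[key]; },
    put: function (key, value) { store[key] = value; },
    has: function (key) { return key in store; },
    clear: function () { store = Object.create(null); }   // e.g. call on locale change
  };
}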

@IgorMinar IgorMinar mentioned this pull request Sep 14, 2014
@IgorMinar
Contributor Author

closing in favor of #9006 and #9082

@IgorMinar IgorMinar closed this Sep 14, 2014