
Commit

review comments
abhimanyusinghgaur committed Nov 26, 2020
1 parent 7457341 commit d675d69
Showing 2 changed files with 9 additions and 7 deletions.
4 changes: 2 additions & 2 deletions wiki/content/query-language/functions.md
@@ -15,7 +15,7 @@ Comparison functions (`eq`, `ge`, `gt`, `le`, `lt`) in the query root (aka `func`
be applied on [indexed predicates]({{< relref "query-language/schema.md#indexing" >}}). Since v1.2, comparison functions
can now be used on [@filter]({{<relref "query-language/graphql-fundamentals.md#applying-filters" >}}) directives even on predicates
that have not been indexed.
Filtering on non-indexed predicates can be slow for large datasets, as they require
iterating over all of the possible values at the level where the filter is being used.

All other functions, in the query root or in the filter, can only be applied to indexed predicates.
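As an illustration of the filtering behavior described above (the predicates `name` and `age` here are hypothetical), a comparison in an `@filter` works even when the predicate carries no index, at the cost of scanning every candidate value at that level:

```
{
  adults(func: has(name)) @filter(ge(age, 18)) {
    name
    age
  }
}
```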
@@ -265,7 +265,7 @@ Index Required: An index is required for the `eq(predicate, ...)` forms (see tab
| `int` | `int` |
| `float` | `float` |
| `bool` | `bool` |
-| `string` | `exact`, `hash` |
+| `string` | `exact`, `hash`, `term`, `fulltext` |
| `dateTime` | `dateTime` |

Test for equality of a predicate or variable to a value or find in a list of values.
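A sketch of what the expanded table allows (the predicate `name` here is hypothetical): with only a `term` index declared in the schema, `eq` can now be used at the query root on a string predicate:

```
name: string @index(term) .

{
  q(func: eq(name, "Alice")) {
    name
  }
}
```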
12 changes: 7 additions & 5 deletions worker/tokens.go
@@ -105,11 +105,13 @@ func pickTokenizer(ctx context.Context, attr string, f string) (tok.Tokenizer, e
return nil, errors.Errorf("Attribute:%s does not have proper index for comparison", attr)
}

-	// If we didn't find a sortable or !isLossy() tokenizer for eq function,
-	// then let's see if we can find a term or fulltext tokenizer
-	for _, t := range tokenizers {
-		if t.Identifier() == tok.IdentTerm || t.Identifier() == tok.IdentFullText {
-			return t, nil
+	// If we didn't find a !isLossy() tokenizer for eq function on string type predicates,
+	// then let's see if we can find a non-trigram tokenizer
+	if typ, err := schema.State().TypeOf(attr); err == nil && typ == types.StringID {
+		for _, t := range tokenizers {
+			if t.Identifier() != tok.IdentTrigram {
+				return t, nil
+			}
 		}
 	}

