Remove the :typecast_latest variable
ddnexus committed Nov 17, 2024
1 parent 8a9ae6f commit 3837c60
Showing 3 changed files with 6 additions and 40 deletions.
36 changes: 5 additions & 31 deletions docs/api/keyset.md
@@ -181,21 +181,6 @@ end
Pagy::Keyset(set, filter_newest:)
```

-==- `:typecast_latest`
-
-A lambda to override the automatic typecasting of your ORM. For example, `SQLite` stores dates and times as strings, and
-the query interpolation may fail when composing and comparing string dates. The `:typecast_latest` variable is an effective
-last-resort option when fixing the typecasting in your models and/or the data in your storage is not possible.
-
-```ruby
-typecast_latest = lambda do |latest|
-  latest[:timestamp] = Time.parse(latest[:timestamp]).strftime('%F %T')
-  latest
-end
-
-Pagy::Keyset(set, typecast_latest:)
-```

==- `:jsonify_keyset_attributes`

A lambda to override the generic json encoding of the `keyset` attributes. Use it when the generic `to_json` method would lose
@@ -237,7 +222,7 @@ _(Notice that it doesn't work with `Sequel::Dataset` sets)_

==- Records may repeat or be missing from successive pages

-!!!danger Your set is not `uniquely ordered`
+!!!danger The set may not be `uniquely ordered`

```rb
# Neither column is unique
@@ -251,30 +236,19 @@ Product.order(:name, :production_date)
Product.order(:name, :production_date, :id)
```
!!!

!!!danger You may have an encoding problem
-The generic `to_json` method used to encode the `page` loses some information when decoded.
+The generic `to_json` method used to encode the `page` may lose some information when decoded.

!!!success
- Check the actual executed DB query and the actual stored value
- Identify the column that has a format that doesn't match the keyset
-- Use your custom encoding with the [:jsonify_keyset_attributes](#jsonify-keyset-attributes) variable
+- Override the encoding with the [:jsonify_keyset_attributes](#jsonify-keyset-attributes) variable
!!!
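
(Aside, not part of the diff: such an override might look like the sketch below. The `created_at` column and the time format are illustrative assumptions, not from this commit.)

```ruby
# Hypothetical sketch: keep sub-second precision that the generic
# to_json encoding could otherwise lose on the decode round-trip.
jsonify_keyset_attributes = lambda do |attributes|
  attributes[:created_at] = attributes[:created_at].strftime('%F %T.%6N')
  attributes.to_json
end

Pagy::Keyset(set, jsonify_keyset_attributes:)
```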

-!!!danger You may have a typecasting problem
-Your ORM and the storage formats don't match for one or more columns. It's a common case with `SQLite` and `Time` columns.
-They may have been stored as strings formatted differently from the default format used by your current ORM.
-
-!!!success
-- Check the actual executed DB query and the actual stored value
-- Identify the column that has a format that doesn't match the keyset
-- Fix the typecasting consistency of your ORM with your DB, or consider using your custom typecasting with the
-  [:typecast_latest](#typecast-latest) variable
-!!!
-
==- The order is OK, but the DB is still slow

-!!!danger Most likely your index is not right, or your case needs a custom query
+!!!danger Most likely the index is not right, or your case needs a custom query

!!! Success

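(Aside, not part of the diff: the success checklist above is truncated in this view. The usual remedy is a composite index matching the keyset `ORDER BY`. A sketch, assuming Rails migrations and the `products` table from the ordering example above:)

```ruby
# Hypothetical migration: a composite index on the same columns, in the
# same order, as the keyset ordering (name, production_date, id), so the
# DB can walk the index instead of sorting the whole table.
class AddKeysetIndexToProducts < ActiveRecord::Migration[7.1]
  def change
    add_index :products, %i[name production_date id]
  end
end
```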
2 changes: 1 addition & 1 deletion gem/lib/pagy/keyset.rb
@@ -43,7 +43,7 @@ def initialize(set, **vars)
  return unless @page

  latest = JSON.parse(B64.urlsafe_decode(@page)).transform_keys(&:to_sym)
-  @latest = @vars[:typecast_latest]&.(latest) || typecast_latest(latest)
+  @latest = typecast_latest(latest)
  raise InternalError, 'page and keyset are not consistent' \
    unless @latest.keys == @keyset.keys
end
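
(Aside, not part of the diff: after this change the decoded `page` attributes always go through the built-in, ORM-specific typecasting, with no user-supplied override. A minimal sketch of the resulting decode path; `decode_latest` is a hypothetical name, and `B64` is the internal helper visible in the hunk above:)

```ruby
# Sketch only: mirrors the post-commit logic shown in the hunk above.
def decode_latest(page)
  latest = JSON.parse(B64.urlsafe_decode(page)).transform_keys(&:to_sym)
  typecast_latest(latest) # no :typecast_latest override hook anymore
end
```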
8 changes: 0 additions & 8 deletions test/pagy/keyset_test.rb
@@ -41,14 +41,6 @@
  _(records.size).must_equal 10
  _(records.first.id).must_equal 13
end
-it 'uses :typecast_latest' do
-  pagy = Pagy::Keyset.new(model.order(:id),
-                          page: "eyJpZCI6MTB9",
-                          limit: 10,
-                          typecast_latest: ->(latest) { latest })
-  _ = pagy.records
-  _(pagy.latest).must_equal({id: 10})
-end
it 'uses :jsonify_keyset_attributes' do
  pagy = Pagy::Keyset.new(model.order(:id),
                          page: "eyJpZCI6MTB9",
