Releases: graphql/dataloader
v2.2.2
What's Changed
- fix: expose `name` in Dataloader instance types by @henrinormak in #334
New Contributors
- @henrinormak made their first contribution in #334
Full Changelog: v2.2.1...v2.2.2
v2.2.1
v2.2.0
What's Changed
- fix: incorrect variable name in README.md by @danielcaballero in #313
- Add typings and test for priming with promise by @LinusU in #252
- Add cppdataloader to README.md by @jafarlihi in #314
- Chore / Add Changesets to Contributing.md by @thekevinbrown in #322
- Fixes #294. by @thekevinbrown in #321
- Bump qs from 6.5.2 to 6.5.3 by @dependabot in #323
- Bump decode-uri-component from 0.2.0 to 0.2.2 by @dependabot in #320
- fix: propagate batchFn sync throws to the loader instead of crashing by @boopathi in #318
- feat: add `name` to `DataLoader` by @SimenB in #326
- chore: add prettier to eslint run by @SimenB in #327
- Update tested node versions by @oporkka in #329
- chore: fix flow errors by @SimenB in #330
New Contributors
- @danielcaballero made their first contribution in #313
- @LinusU made their first contribution in #252
- @jafarlihi made their first contribution in #314
- @thekevinbrown made their first contribution in #322
- @boopathi made their first contribution in #318
- @SimenB made their first contribution in #326
- @oporkka made their first contribution in #329
Full Changelog: v2.1.0...v2.2.0
v2.1.0
Minor Changes
- 28cf959:
  - Do not return void results from arrow functions 3b0bae9
  - Fix typo in `loader.load()` error message 249b2b9
  - Fix typo in SQL example cae1a3d
  - Fix typo in TypeScript declaration ef6d32f
  - Most browsers don't have `setImmediate`, and `setImmediate || setTimeout` doesn't work there: it throws "setImmediate is not defined", so `setImmediate` should be checked with `typeof`. Some environments, like Cloudflare Workers, also don't allow assigning `setTimeout` to another variable. 3e62fbe (see the sketch after this list)
  - Fix typo in
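For illustration only, the `typeof` check described above can be sketched like this; it is not the library's actual source:

```js
// Illustrative sketch of the scheduling fallback described above, not the
// library's exact source. `setImmediate` is feature-detected with `typeof`,
// and `setTimeout` is called directly rather than aliased, since some
// environments (e.g. Cloudflare Workers) disallow reassigning it.
const enqueuePostPromiseJob =
  typeof setImmediate === 'function'
    ? function (fn) {
        setImmediate(fn); // Node.js: run after the microtask queue drains.
      }
    : function (fn) {
        setTimeout(fn); // Browser/worker fallback: call it, don't alias it.
      };
```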
Patch Changes
- 3135e9a: Fix typo in jsdoc comment; flip "objects are keys" to "keys are objects"
v2.0.0
This is the first release since becoming part of the GraphQL Foundation and the most significant since the initial release over four years ago. Read more about the history of the project and this release in the blog post.
Breaking:
- #216: `.loadMany()` now returns an array which may contain `Error` if one of the requested keys failed.
  Previously `.loadMany()` was exactly the same as calling `Promise.all()` on multiple `.load()` calls. While syntactically a minor convenience, this wasn't particularly useful over what could be done with `Promise.all` directly, and if one key failed, the entire call to `.loadMany()` would fail. As of this version, `.loadMany()` can return a mix of values and `Error` instances in the case that some keys failed, but the Promise it returns will never be rejected. This is similar to the behavior of the new `Promise.allSettled` method in the upcoming version of JavaScript.
  This will break any code which relied on `.loadMany()`. To support this change, either ensure each item in the result of `.loadMany()` is checked against `instanceof Error` (as shown in the sketch after this list), or replace calls like `loader.loadMany([k1, k2])` with `Promise.all([loader.load(k1), loader.load(k2)])`.
- #220: The timing of calls to `batchLoadFn` when `{ batch: false }` has changed to the end of the run-loop tick.
  Previously, when batching was disabled, the `batchLoadFn` would be called immediately when `.load()` was called. This differed from the batching-enabled case, where the `batchLoadFn` is called at the end of the run-loop tick. This timing difference could lead to subtle race conditions in code which dynamically toggled batching on or off. As a simplification, the `batchLoadFn` is now always called at the end of the run-loop tick, regardless of whether batching is disabled.
  Hopefully this will not break your code. It could cause issues for any code which relied on this synchronous call to `batchLoadFn` for loaders where batching was disabled.
- #222: Promises for cached values now wait to resolve until the rest of the batch resolves.
  Previously, when `.load()` encountered a cached value it would return an already resolved (or rejected) Promise. However, when additional dependent loads happened after these, the difference in time between the cache-hit value resolving and the cache-miss value resolving would result in additional unnecessary network requests. As of this version, when `.load()` encounters a cached value it returns a Promise which waits to resolve until the call to `batchLoadFn` also resolves. This should result in better whole-program performance and is the most significant conceptual change and improvement. This is actually not a new innovation but a correction to match the original behavior of Facebook's "Loader" from 2010, which this library is inspired by.
  This changes the timing of when Promises are resolved and thus could introduce subtle behavioral changes in your code, especially if your code is prone to race conditions. Please test carefully.
  This also means each return of `.load()` is a new Promise instance. Where prior versions returned the same Promise instance for cached results, this version does not. This may break code which uses the returned Promise as a memoization key or in some other way assumed reference equality.
- #226: The names of private class variables have changed.
  This really shouldn't break your code because you definitely don't reach into class private variables, right? I just figured it would be something you'd like to know, you know... just in case.
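For illustration, a brief sketch of handling the new `.loadMany()` result shape; `userLoader` here is a hypothetical loader instance, not part of this release:

```js
// `userLoader` is a hypothetical DataLoader instance. The Promise returned by
// .loadMany() resolves with a mix of values and Error instances, and it never
// rejects when individual keys fail.
userLoader.loadMany(['a', 'b', 'c']).then(results => {
  for (const result of results) {
    if (result instanceof Error) {
      console.error('load failed:', result); // per-key failure
    } else {
      console.log('loaded:', result);
    }
  }
});

// Or keep the previous all-or-nothing behavior by calling Promise.all directly:
Promise.all([userLoader.load('a'), userLoader.load('b')]).then(users => {
  console.log(users);
});
```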
New:
- #176 #209: MIT licensed (no longer BSD+Patents) and copyrights moved from Facebook to the GraphQL Foundation
- #182: The DataLoader instance is now available as `this` in `batchLoadFn`
- #228: Support for custom batch scheduling functions
  The dirty secret of DataLoader is that most of it is quite boring. The interesting bit is the batch scheduling function, which takes advantage of Node.js's unique run-loop scheduler to achieve automatic batching without any additional latency. However, since its release, ports to other languages have found this bit difficult to replicate and have either replaced it with something conceptually simpler (like manual dispatch) or with a scheduler custom fit to a GraphQL execution engine. These are interesting innovations which deserve ground for experimentation in this original library as well.
  Via `batchScheduleFn`, you can now provide a custom batch scheduling function and experiment with manual dispatch, added-latency dispatch, or any other behavior which might work best for your application.
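As a rough example (not from the release itself), a schedule function that trades a small, arbitrary 10 ms delay for potentially larger batches might look like this:

```js
// A minimal sketch: dispatch each batch after a short, fixed delay instead of
// at the end of the run-loop tick. The 10 ms figure is an arbitrary example,
// and promiseToGetKeys is the same placeholder batch function used elsewhere
// in these notes.
const loader = new DataLoader(keys => promiseToGetKeys(keys), {
  batchScheduleFn: callback => setTimeout(callback, 10),
});
```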
Types:
- #145: Improved TypeScript/Flow types for custom `cacheKeyFn` and `cacheMap`
- #146: TypeScript types allow `batchLoadFn` to return a `PromiseLike`, supporting use of bluebird
- #214 #219: TypeScript/Flow types allow `batchLoadFn` to return `ArrayLike`, supporting returning read-only arrays
- #168: Flow types now use strict mode, allowing safe import into other strict mode code
- #217: Fixed an issue where TypeScript/Flow would incorrectly report an error when providing an `Error` to `.prime()`
Fixes:
- #215: Fixed an issue where a cache could still consume memory, even when caching was disabled
- #223: Fixed an issue where providing an `Error` to `.prime()` could incorrectly cause an unhandled promise rejection warning
Documentation:
- Added references to a ton more ports of DataLoader into other languages (keep 'em coming!)
- #213: All examples have been updated to latest JavaScript (preferring async/await over Promise chaining)
- Improved documentation for custom `cacheMap`, along with an LRU example.
- Improved documentation for using higher-order functions on `batchLoadFn`.
- Improved documentation for converting Map results to Array results in `batchLoadFn`.
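For reference, a short sketch of that Map-to-Array pattern; `queryUsersByIds` is a hypothetical fetch helper:

```js
// The batch function must return values in the same order as the given keys,
// so results keyed by id are converted back into a key-ordered array.
// queryUsersByIds is a hypothetical helper returning an array of user rows.
const userLoader = new DataLoader(async keys => {
  const users = await queryUsersByIds(keys);
  const byId = new Map(users.map(user => [user.id, user]));
  return keys.map(key => byId.get(key) || new Error(`No user found for ${key}`));
});
```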
v1.4.0
New:
- Direct support for using Dataloader in a browser (#134)
Note: Dataloader in the browser cannot rely on the same post-promise job queuing behavior that allows for best performance in Node environments. A fallback behavior is used in a browser.
- The integrity of custom-provided Cache Maps is now checked during construction. (#86)
  This leads to better error messages when used with custom caches that do not provide the full required interface. It may now produce eager errors where latent bugs were allowed in prior versions.
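As a rough illustration (assuming the documented `get`/`set`/`delete`/`clear` interface), a custom cache is expected to look something like this hypothetical class:

```js
// A hypothetical custom cache providing the Map-like surface DataLoader
// expects (get, set, delete, clear); a missing method is the kind of problem
// the new construction-time check reports early.
class ObjectCacheMap {
  constructor() {
    this._data = Object.create(null);
  }
  get(key) {
    return this._data[key];
  }
  set(key, value) {
    this._data[key] = value;
  }
  delete(key) {
    delete this._data[key];
  }
  clear() {
    this._data = Object.create(null);
  }
}

const loader = new DataLoader(keys => promiseToGetKeys(keys), {
  cacheMap: new ObjectCacheMap(),
});
```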
Fixed:
- Flow type no longer complains when using `require("dataloader")` (#135)
- Flow types from public API are now directly exported (cc0c62b)
- Fixed an issue where Flow may complain about the provided batch function's return type. (#100)
  Due to more recent versions of Flow treating `Array` as invariant, DataLoader uses `$ReadOnlyArray`, which is covariant.
v1.3.0
New:
- Thanks to contributions from many, the documentation for DataLoader is now significantly better, with portions of `README.md` reworked and improved and more information in `examples/`.
- DataLoader installs now come with both TypeScript and Flow type definitions. (#43, #45)
- Batch loads can now be limited to a certain number of keys (#42)
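Assuming this refers to the `maxBatchSize` option, usage looks roughly like:

```js
// Assuming the maxBatchSize option: batches larger than the limit are split
// into multiple calls to the batch function. promiseToGetKeys is the same
// placeholder batch function used elsewhere in these notes.
const loader = new DataLoader(keys => promiseToGetKeys(keys), {
  maxBatchSize: 2,
});

// Loading three keys in the same tick dispatches two batches,
// e.g. ['a', 'b'] and then ['c'].
loader.load('a');
loader.load('b');
loader.load('c');
```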
v1.2.0
New:
Prime values in the cache - #18. — If you have values cached locally, this provides an API for adding those already known values to a DataLoader cache:
```js
let loader = new DataLoader(keys => promiseToGetKeys(keys));
loader.prime('abc', 'My Value');
loader.load('abc').then(val => console.log(val)); // Logs "My Value"
```
This also works for priming errors:
```js
let loader = new DataLoader(keys => promiseToGetKeys(keys));
loader.prime('abc', new Error('My Error'));
loader.load('abc').catch(err => console.log(err)); // Logs "My Error"
```
This is particularly useful for allowing two loaders to interact based on two fetchable keys for one type:
```js
let userByIDLoader = new DataLoader(ids => promiseToGetByID(ids).then(users => {
  for (let user of users) {
    userByUsernameLoader.prime(user.username, user);
  }
  return users;
}));

let userByUsernameLoader = new DataLoader(names => promiseToGetByUsername(names).then(users => {
  for (let user of users) {
    userByIDLoader.prime(user.id, user);
  }
  return users;
}));
```
v1.1.0
New:
- Provide a custom cache-key function. #15 — Since JavaScript does not use value equality, but fetch keys are often treated as values, this allows for a custom function to be provided to produce a cache-key for any load-key. Example:

  ```js
  var loader = new DataLoader(keys => promiseToGetKeys(keys), {
    cacheKeyFn: key => key._id,
  });
  var first = loader.load({ _id: 'abc123' });
  var second = loader.load({ _id: 'abc123' });
  assert(first === second);
  ```

- Provide a custom cache instance. #17 — By default, DataLoader uses a Map as a cache. However this simple cache does not support anything like TTL or a cache eviction policy. Now you may provide any object which implements the Map API to use as a cache. Example:

  ```js
  var loader = new DataLoader(keys => promiseToGetKeys(keys), {
    cacheMap: new LRUCacheMap(),
  });
  ```