Enable type parameter lower-bound syntax #14520
Comments
What is this useful for? |
@RyanCavanaugh : In short, it mimics contravariance, just as `? super T` does in Java. We will try to sort an array of cats to see the necessity of this feature. To do comparison-based sorting, we need a `Comparator`:

```ts
interface Comparator<T> {
  compare(x: T, y: T): number
}
```

The following code shows that the class `Cat` is a subclass of `Animal`:

```ts
class Animal {}
class Cat extends Animal {}
```

Now we can write a sorting function that supports arbitrary comparators of cats:

```ts
function sort(cats: Cat[], comparator: Comparator<Cat>): void {
  // Some comparison-based sorting algorithm.
  // The following line uses the comparator to compare two cats.
  comparator.compare(cats[0], cats[1]);
  // ...
}
```

Now, we will try to use the `sort` function. First, we implement a comparator for cats:

```ts
class CatComparator implements Comparator<Cat> {
  compare(x: Cat, y: Cat): number {
    throw new Error('Method not implemented.');
  }
}
```

Then we create a list of cats:

```ts
const cats = [ new Cat(), new Cat(), new Cat() ]
```

Now we can call the sorting function:

```ts
sort(cats, new CatComparator());
```

We have not seen the need for contravariance so far. Now, suppose we are told that someone has already implemented a comparator for animals:

```ts
class AnimalComparator implements Comparator<Animal> {
  compare(x: Animal, y: Animal): number {
    throw new Error('Method not implemented.');
  }
}
```

Since a `Cat` is an `Animal`, a comparator of animals should also be able to compare cats. Naturally, we would want to reuse it:

```ts
sort(cats, new AnimalComparator());
```

However, since the following two types:

- `Comparator<Animal>`
- `Comparator<Cat>`

are not related from the point of view of TypeScript's type system, we cannot do that. Therefore, I would like the `sort` function to be declarable as:

```ts
function sort<T super Cat>(cats: Cat[], comparator: Comparator<T>): void {
  // Some comparison-based sorting algorithm.
  // The following line uses the comparator to compare two cats.
  comparator.compare(cats[0], cats[1]);
  // ...
}
```

or, as in Java,

```ts
function sort(cats: Cat[], comparator: Comparator<? super Cat>): void {
  // Some comparison-based sorting algorithm.
  // The following line uses the comparator to compare two cats.
  comparator.compare(cats[0], cats[1]);
  // ...
}
```

I am aware of the fact that TypeScript does not complain if I pass a `Comparator<Animal>` where a `Comparator<Cat>` is expected, because method parameters are checked bivariantly. |
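A minimal sketch of a related point, assuming a compiler with `--strictFunctionTypes` enabled: if `compare` is declared as a function-typed property rather than with method syntax, TypeScript checks its parameters contravariantly, and the animal comparator is accepted without any lower-bound syntax:

```ts
// Assumes --strictFunctionTypes. Function-typed properties are checked
// contravariantly in their parameters; method-syntax members stay bivariant.
interface Comparator<T> {
  compare: (x: T, y: T) => number;
}

class Animal {}
class Cat extends Animal {}

declare function sort(cats: Cat[], comparator: Comparator<Cat>): void;
declare const animalComparator: Comparator<Animal>;

// Comparator<Animal> is a subtype of Comparator<Cat> under contravariance.
sort([new Cat(), new Cat()], animalComparator); // OK
```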
This is the best example of contravariance I've read in a long time. Props! 🙌 |
As a reference point, flowtype uses
|
Hi @jyuhuan , since you probably already know this https://github.com/Microsoft/TypeScript/wiki/FAQ#why-are-function-parameters-bivariant I'm afraid a lower bound isn't that useful in current TypeScript, where variance is unsound. Indeed, there are also cases where a lower bound can be useful without variance. Like:

```ts
interface Array<T> {
  concat<U super T>(arg: U): Array<U>
}

var a = [new Cat]
a.concat(new Animal) // inferred as Animal[]
```

In a case like an immutable sequence container, a lower bound helps TypeScript enable a pattern where the new generic type is wider than the original type. Yet such usage still needs more proposal work, since TS also has union types. For example, should `a.concat(new Animal)` above be inferred as `Animal[]` or as `(Cat | Animal)[]`? |
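For comparison, a sketch of how union types can already express the widened result for a standalone helper (a hypothetical free function, not the real `Array#concat` signature):

```ts
class Animal {}
class Cat extends Animal {}

// A hypothetical helper: with union types, the widened element type can be
// expressed today without a lower bound, at the cost of a union result.
function concatOne<T, U>(xs: readonly T[], x: U): (T | U)[] {
  return [...xs, x];
}

const cats = [new Cat()];
const widened = concatOne(cats, new Animal()); // (Cat | Animal)[]
```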
migrated from #14728: currently, when we see a constraint `A extends B`, the following holds:

```ts
declare var a: A;
declare var b: B;
b = a; // allowed
a = b; // not allowed
```

consider adding a new constraint of the reversed relation, `A super B`:

```ts
declare var a: A;
declare var b: B;
b = a; // not allowed
a = b; // allowed
```

use case: i have an `expect(...).toEqual(...)` assertion; what i need it to declare is that the expected value may be any supertype of the actual value:

```ts
expect(new BigFatClass()).toEqual({ value: true });
```
|
the idea is that U is always a subset, never a superset; if so, it should not be a problem |
So |
i always forget that a subtype of a union is a subset of that union, and conversely a supertype of a union must be a superset, so you're right that that's enough |
@aaronbeall the problem with `Partial<T>` is that it is shallow:

```ts
type Super<T> = Partial<T>;
type Data = { nested: { value: number; }; }
const one: Super<Data> = {}; // works
const another: Super<Data> = { nested: {} }; // bummer
```

so i am back to hammering the expected values with the type assertions:

```ts
expect(data).toEqual(<Data>{ nested: {} });
```
|
@Aleksey-Bykov Probably doesn't make this feature any less valuable but for your case I think you can use a recursive partial type:
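The definition itself was not preserved in this copy; presumably it was the usual recursive-partial formulation, consistent with the `DeepPartial` name used in the replies below:

```ts
// A common recursive-partial formulation: every property, at every depth,
// becomes optional.
type DeepPartial<T> = {
  [P in keyof T]?: DeepPartial<T[P]>;
};
```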
(This was suggested as an addition to the standard type defs, which I think would be very useful.) |
@aaronbeall

```ts
type Super<T> = DeepPartial<T>
type Data = { value: number; }
const one: Data = 5; // not allowed
const another: Super<Data> = null; // not allowed
const yetAnother: Super<Data> = 5; // allowed
const justLike: {} = 5; // <-- reason why (allowed)
```
|
@Aleksey-Bykov Woah, you just answered my question. But now I have another one, why is this allowed?

```ts
const a1: {} = 0;
const a2: {} = '0';
const a3: {} = false;
const a4: {} = { a: 0 };
```

Is `{}` assignable from any non-null value? EDIT: I tried the following:

```ts
type DeepPartial<T> = {[P in keyof T]?: T[P] | (DeepPartial<T[P]> & object); };
```

It is pretty late, maybe that won't work either, I'll probably think of some example that will break it once I wake up. |
`getOrElse` and `orElse` use self-types in order to support typed upper bounds.[0] In TypeScript 2.4, generic functions were checked more strictly[1]. This causes the implicit downward type cast to fail, so we explicitly invoke the cast in the method body. This workaround is backwards-compatible with TypeScript 2.3. Bounded polymorphism has been implemented[2], but true F-bounded polymorphism hasn't been. This means a type like `interface Option<A, B = Option<A, B>>` is invalid. Alternatively, we can solve this with a lower type bound, but these don't work against concrete classes[3]. --- We should also upgrade monapt's TypeScript dependency to 2.4, but there are unrelated errors compiling the tests. [0] microsoft/TypeScript#13337 [1] https://github.com/Microsoft/TypeScript/wiki/What%27s-new-in-TypeScript#stricter-checking-for-generic-functions [2] https://github.com/Microsoft/TypeScript/wiki/What%27s-new-in-TypeScript#type-parameters-as-constraints [3] microsoft/TypeScript#14520
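To illustrate the F-bounded point from that commit message (a sketch; `Comparable` here is a hypothetical example, not part of monapt): a type parameter may be constrained by a type that mentions itself, but a type-parameter *default* may not reference the parameter being defaulted:

```ts
// Valid today: an F-bounded constraint, where T appears in its own constraint.
interface Comparable<T extends Comparable<T>> {
  compareTo(other: T): number;
}

// Invalid, per the commit message above: a self-referential default.
// interface Option<A, B = Option<A, B>> {}
```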
A bit of a somewhat working workaround for anyone interested 😃

```ts
type Range<Lower, Upper> = Upper & Partial<Extract<Lower, Upper>>

// test
type Big = { f1: number, f2: string, f3: string }
type Small = { f1: number }

const scoped: Range<Big, Small> = { f1: 1, f2: 'one' }
const small: Small = scoped
const big: Big = scoped // no no
```
|
In the initial commit, a custom field's original type was added to the *field templates* only as `originalType`. Custom fields' `type` property was `"Custom"`*. This allowed for type safety throughout the UI logic.

*Actually, it was `"Unknown"`, but I changed it to custom for clarity.

Connection validation logic, however, uses the *field instance* of the node/field. Like the templates, *field instances* with custom types have their `type` set to `"Custom"`, but they didn't have an `originalType` property. As a result, all custom fields could be connected to all other custom fields. To resolve this, we need to add `originalType` to the *field instances*, then switch the validation logic to use this instead of `type`. This ended up needing a bit of finagling:

- If we make `originalType` a required property on field instances, existing workflows will break during connection validation, because they won't have this property. We'd need a new layer of logic to migrate the workflows, adding the new `originalType` property. While this layer is probably needed anyways, typing `originalType` as optional is much simpler. Workflow migration logic can come later. (Technically, we could remove all references to field types from the workflow files, and let the templates hold all this information. This feels like a significant change and I'm reluctant to do it now.)
- Because `originalType` is optional, anywhere we care about the type of a field, we need to use it over `type`. So there are a number of `field.originalType ?? field.type` expressions. This is a bit of a gotcha; we'll need to remember this in the future.
- We use `Array.prototype.includes()` often in the workflow editor, e.g. `COLLECTION_TYPES.includes(type)`. In these cases, the const array is of type `FieldType[]`, and `type` is `FieldType`. Because we now support custom types, the arg `type` is now widened from `FieldType` to `string`. This causes a TS error. This behaviour is somewhat controversial (see microsoft/TypeScript#14520). These expressions are now rewritten as `COLLECTION_TYPES.some((t) => t === type)` to satisfy TS. It's logically equivalent.
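The `.some()` rewrite described in that last bullet, as a runnable sketch (the `FieldType` union and `COLLECTION_TYPES` values here are assumptions, not the project's actual definitions):

```ts
type FieldType = 'Collection' | 'IntegerCollection' | 'Custom';
const COLLECTION_TYPES: FieldType[] = ['Collection', 'IntegerCollection'];

declare const type: string; // widened: may hold a custom type name

// COLLECTION_TYPES.includes(type); // TS error: 'string' is not assignable to 'FieldType'
const isCollection = COLLECTION_TYPES.some((t) => t === type); // type-checks, same result
```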
This seems like the most common thing I want from TypeScript that it can't currently do. Examples:

```ts
type NumEx = number | undefined | null | { toString(): string };

/** Numeric text box component. `number` must be a valid value of `Num`,
 * but TypeScript does not accept `<number extends Num extends NumEx>` */
function NumericTextField<Num extends NumEx>(...): JSX.Element {...}

/** This should say `null extends CFV`, i.e. that null must be a valid value of CFV,
 * but TypeScript doesn't support that kind of constraint AFAICT. */
function WithNullCheckbox<CFV = CustomFieldValue>(...) {...}
```

Until just now I thought the syntax should "obviously" be the reversed `extends` clause, but that is ambiguous:

```ts
type U = ...;
type V = ...;
// It's unclear whether `U` or `V` is meant as the parameter name, and for
// backward compatibility, `U` must be the name and `V` must be the constraint.
function Foo<U extends V>() {}
```

So I propose that when this feature is implemented, the TypeScript compiler should detect what the user is trying to do in most cases and offer a correction, e.g. if the final syntax were `Y includes X`:

```ts
// Currently the error is "Cannot find name 'Y'" but if `X` was already defined,
// I propose "Cannot find name 'Y'. Did you mean 'Y includes X'?"
function Foo<X extends Y>() {}

// Currently the error is "Type parameter name cannot be 'number'". I propose
// "Type parameter name cannot be 'number'. Did you mean 'T includes number'?"
function Foo<number extends T>() {}

// These don't parse, so better error messaging would need parser changes.
function Foo<{ toString(): string } extends T>() {}
function Foo<(X|Y) extends T>() {}
function Foo<X[] extends T>() {}
```
|
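A sketch of approximating the `null extends CFV` constraint with today's conditional types (the helper and names are hypothetical):

```ts
// Collapses the parameter type when CFV does not include null, producing an
// error at the call site instead of a declaration-site lower bound.
type MustIncludeNull<CFV> = null extends CFV ? unknown : ['CFV must include null'];

declare function WithNullCheckbox<CFV>(value: CFV & MustIncludeNull<CFV>): void;

WithNullCheckbox<string | null>(null); // ok: null is a valid CFV value
// WithNullCheckbox<string>('x');      // error: 'x' is not assignable to the branded type
```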
The use case I came across for this is an "action executor" in the context of an application.

```ts
interface AppContext { database; service1; service2; serviceN; }
function executeAction<T>(actionFn: (context: AppContext, args) => T) { ... }
```

Naturally, you can pass in an action that takes a subset of `AppContext`:

```ts
interface Action<
  TContext, // super AppContext
  TResult
> {
  (context: TContext): void;
  name: string;
  metadata;
}
```

How do I constrain `TContext` to be a supertype of `AppContext`? However, there is a relatively easy workaround, though kind of annoying, which is to always use a function type when you want a contravariant type parameter:

```ts
interface Action<
  TContextFn extends (context: AppContext) => void,
  TResult
> {
  (context: Parameters<TContextFn>[0]): TResult;
  name: string;
  metadata;
}
```

But that makes it kind of hard to explicitly type an `Action`. |
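A sketch of how that function-type workaround composes with an executor (the `AppContext` fields here are assumptions for illustration):

```ts
interface AppContext { database: unknown; service1: unknown }

// The constraint checks the action's context parameter contravariantly:
// any function accepting a supertype-shaped subset of AppContext satisfies it.
function executeAction<TFn extends (context: AppContext) => unknown>(
  actionFn: TFn
): ReturnType<TFn> {
  const context: AppContext = { database: null, service1: null };
  return actionFn(context) as ReturnType<TFn>;
}

// An action needing only part of the context still type-checks:
const result = executeAction((ctx: { database: unknown }) => 123); // number
```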
This would also allow safer property writes:

```ts
interface Foo { a: true }
const foo: Foo = { a: true };

function f(o: { a: boolean }) {
  o.a = Math.random() < 0.5; // is accepted but unsafe
}
f(foo); // is accepted but unsafe

function g<T super boolean>(o: { a: T }) {
  //       ^^^^^ <--- proposed feature
  o.a = Math.random() < 0.5; // accepted because it's safe
}
g(foo); // rejected
```
|
@jcalz the underlying issue here is that TS unsoundly treats mutable records as covariant. |
Yes the solution is to have readonly properties with the current assignability, and mutable properties that are invariant. Perhaps a strict setting would make all properties of a received object readonly (preserving current assignability for the sake of external compatibility), highlighting unsafe code. The author could then mark the property mutable to make it invariant, or choose to override the error e.g. via cast to continue using unsafe code. #18770 |
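A sketch of the readonly direction described above (reads through a widened view are safe; writes are disallowed):

```ts
interface Foo { a: true }
const foo: Foo = { a: true };

function h(o: { readonly a: boolean }) {
  // o.a = Math.random() < 0.5; // error: cannot assign to 'a' because it is read-only
  console.log(o.a); // reading a widened view is covariant-safe
}
h(foo); // fine
```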
This is my use case. I would like to be able to do things like:
|
I’d like to present a compelling use case for introducing such a feature, inspired by Ramda’s `R.includes`:

```ts
import * as R from "ramda";

type State = 0 | 1 | 2;
const includes0 = R.includes(0 as const);
const states: State[] = [1, 0, 1, 2];
includes0(states)
```

Ideally, the type signature of `includes0` would accept any list whose element type is a supertype of `0`. If you paste the code above into the TypeScript Playground, it actually results in an error:

```ts
includes0(states)
// ~~~~~~
// Argument of type 'State[]' is not assignable to parameter of type 'readonly 0[]'.
//   Type 'State' is not assignable to type '0'.
//     Type '1' is not assignable to type '0'.
```

This error highlights limitations in the current typing of `R.includes`:

```ts
export function includes(s: string): (list: readonly string[] | string) => boolean;
export function includes<T>(target: T): (list: readonly T[]) => boolean;
```

The additional overload for string-based use cases exists to support scenarios like checking for a substring within a string. I’ve encountered several similar issues due to the contravariant nature of function parameter types, where I desperately wish for syntax like `<U super T>`. |
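A sketch of approximating the desired signature today by inferring the widening from the list instead (a hypothetical helper, not Ramda's actual typing):

```ts
// U is inferred at the second call so that T | U covers the list's elements,
// emulating "any list of a supertype of T".
function includesTarget<T>(target: T) {
  return <U>(list: readonly (T | U)[]): boolean => list.includes(target);
}

type State = 0 | 1 | 2;
const states: State[] = [1, 0, 1, 2];
includesTarget(0 as const)(states); // ok: U infers so that 0 | U covers State
```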
My use case is related to `Array.prototype.includes` as well. For example, `includes` rejects an item of a wider type:

```ts
// Type of myArr is ["a", "b"]
const myArr = ['a', 'b'] as const;

// This item could come from reading env or an API response, anywhere that I cannot know the exact value or type
const myItem = process.env.MY_ITEM;

// This will error out saying: Argument of type 'string' is not assignable to parameter of type '"a" | "b"'
myArr.includes(myItem)
```

The reason I call `includes` in the first place is that I cannot know whether `myItem` is one of the allowed values. |
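Two common workarounds today, sketched (widen the receiver, or wrap the check in a user-defined type guard):

```ts
const myArr = ['a', 'b'] as const;
declare const myItem: string;

// 1) Widen the array so includes() accepts any string.
(myArr as readonly string[]).includes(myItem);

// 2) A guard that both widens the check and narrows the value on success.
function isIn<T extends string>(arr: readonly T[], value: string): value is T {
  return (arr as readonly string[]).includes(value);
}
if (isIn(myArr, myItem)) {
  myItem; // narrowed to "a" | "b"
}
```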
@linkfang very much agree, I have run into this many times especially when trying to validate a value is in some allowed set. |
**TypeScript Version:** 2.1.1

**Code**
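The original snippet is not preserved in this copy; judging from the expected behavior below, it presumably resembled the following use of the proposed lower-bound syntax:

```ts
class Animal {}
class Cat extends Animal {}
class Kitten extends Cat {}

// Proposed (currently rejected) lower-bound syntax:
function foo<A super Kitten>(a: A): void {}
```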
**Expected behavior:**
The type parameter `A` has the type `Kitten` as lower-bound.

**Actual behavior:**
Compilation failure. The syntax is unsupported.
**Discussion:**
The upper-bound counterpart of the failed code works fine:
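Presumably something like this (a sketch; the original block isn't shown in this copy):

```ts
// Upper bound via `extends` compiles today:
function foo<A extends Kitten>(a: A): void {}
```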
People in issue #13337 have suggested to use a declaration like `<X extends Y>` to lower-bound `Y` with `X`. But this does not cover the case where `X` is an actual type (instead of a type parameter).