Pass down the original input, making it available as second parameter in "custom" function #260
I want to focus on the problem you are trying to solve with this issue in the next few days, so I appreciate any input. I had the same thought. However, type safety is the big problem. Strictly speaking, we would have to type the input as

The most promising approach so far is PR #223, which marks the required keys as dependencies, so that the corresponding pipe function is only executed once the type of those dependencies is guaranteed. However, I have to investigate this first before I can make a decision.
I see. After thinking about it, I do agree with you. My current approach would be hard to maintain and error-prone if there are many changes over time in big/complex forms with a lot of custom validations.

I took a look at #223. It's clever, but I'm not really a fan of it: it will look ugly once you start adding more and more fields/validations, and the custom validations will be separated from the fields you want to apply them to (in some cases you want that, but in most cases you don't).

How about splitting the parsing into two parts? The first one would be only the type validation (the same checks as in TypeScript), and the second one the methods themselves (like `minLength`, `email`, `custom`). This way it ensures strong typing in the first run, and then proceeds to validate all the fields.

```ts
export function parse(schema, input, info?) {
  // Add "_full_input" (so it's available in nested custom methods)
  // and "_parse_mode" (to indicate the parsing mode)
  info._full_input = input;
  info._parse_mode = 'types';

  // Execute _parse checking only the types (skipping ALL pipes)
  const typesResult = schema._parse(input, info);
  if (typesResult.issues) {
    throw new ValiError(typesResult.issues);
  }

  // If everything is good, now parse the pipes (skipping the types)
  info._parse_mode = 'pipes';
  const pipesResult = schema._parse(input, info);
  if (pipesResult.issues) {
    throw new ValiError(pipesResult.issues);
  }
  return pipesResult.output;
}
```

Declaration:

```ts
const schema = object({
  email: string([email()]),
  password: string([minLength(6, 'Minimum 6 characters')]),
  repeatPassword: string([
    minLength(6),
    custom((input, { password }) => input === password, `Passwords don't match`),
  ]),
});
```

Execution. Under the hood it would work like this:

```ts
// First run
object({
  email: string(),
  password: string(),
  repeatPassword: string(),
});

// Second run, only if the typings are correct
object({
  email: [email()],
  password: [minLength(6, 'Minimum 6 characters')],
  repeatPassword: [
    minLength(6),
    custom((input, { password }) => input === password, `Passwords don't match`),
  ],
});
```
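As a self-contained illustration of this two-run idea, here is a minimal sketch. All names and shapes (`Issue`, `FieldSchema`, `runTwoPhase`) are invented for the sketch and are not Valibot's internals:

```typescript
// Hypothetical two-phase parser: phase 1 checks types only, phase 2 runs
// the pipes (custom validations) with safe access to the full input.

type Issue = { path: string; message: string };

type FieldSchema = {
  type: 'string';
  pipe: ((value: string, fullInput: Record<string, unknown>) => Issue | null)[];
};

type ObjectSchema = Record<string, FieldSchema>;

function runTwoPhase(schema: ObjectSchema, input: Record<string, unknown>): Issue[] {
  // Phase 1: type validation only (skips all pipes)
  const typeIssues: Issue[] = [];
  for (const key of Object.keys(schema)) {
    if (typeof input[key] !== 'string') {
      typeIssues.push({ path: key, message: 'Expected string' });
    }
  }
  if (typeIssues.length > 0) return typeIssues;

  // Phase 2: pipes, which can now safely read sibling fields
  const pipeIssues: Issue[] = [];
  for (const [key, field] of Object.entries(schema)) {
    for (const action of field.pipe) {
      const issue = action(input[key] as string, input);
      if (issue) pipeIssues.push(issue);
    }
  }
  return pipeIssues;
}

const schema: ObjectSchema = {
  password: { type: 'string', pipe: [] },
  repeatPassword: {
    type: 'string',
    pipe: [
      (value, full) =>
        value === full.password
          ? null
          : { path: 'repeatPassword', message: "Passwords don't match" },
    ],
  },
};
```

Because phase 2 only runs after phase 1 succeeded, the cross-field check never reads a field of the wrong type.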
Your idea is great. I will consider it. However, there are two problems. Since each schema is independent, it can be combined with any other schema. For example,

Another disadvantage is that if any type is wrong, none of the pipelines will be executed. The only way to avoid this is to know which keys must be valid to execute a pipeline function.

It would probably be an advantage to enable both in the final implementation: your idea and #223. I will probably start researching and testing it this weekend.
I see. How about adding two new things:

```ts
// smartObject.ts
export const smartObject = (objectSchema) => ({ ...objectSchema, _newBoundary: true });
```

```ts
// object.ts, inside the _parse method
for (const [key, schema] of cachedEntries) {
  const value = (input as Record<string, unknown>)[key];
  const result = schema._parse(
    value,
    _newBoundary ? { ...info, _full_input: input } : info,
  );
```

Usage:

```ts
import { smartObject } from 'valibot';

const registerSchema = smartObject({ email ... });
```

Then the `smartCustom` function:

```ts
smartCustom((full_input_value) => ..., 'error message'); // all types of the closest smartObject must be valid
smartCustom(['pass1', 'pass2'], (full_input_value) => ..., 'error message'); // only the pass1 and pass2 types must be valid
```

So the new mental model would be something like this?

But still, regarding the second disadvantage, I don't think it's that big of a deal. Most of the time your typings are going to be correct, and I can't think of any case in which you would care about executing pipes when there is some kind of typing error in the schema. I'm not sure if there would be more edge cases; the ones I can think of are methods like
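The boundary idea can be sketched in isolation. Here, a node marked with `_newBoundary` injects the object it is parsing into the `info` passed to its children as `_full_input`, so nested custom checks can read sibling fields. All names are illustrative, not Valibot code:

```typescript
type Info = { _full_input?: unknown };

type Node = {
  _newBoundary?: boolean;
  children?: Record<string, Node>;
  check?: (value: unknown, info: Info) => string | null;
};

function parseNode(node: Node, input: unknown, info: Info): string[] {
  const issues: string[] = [];
  if (node.check) {
    const issue = node.check(input, info);
    if (issue) issues.push(issue);
  }
  if (node.children) {
    // If this node is a boundary, children see this object as _full_input
    const childInfo: Info = node._newBoundary ? { ...info, _full_input: input } : info;
    for (const [key, child] of Object.entries(node.children)) {
      issues.push(...parseNode(child, (input as Record<string, unknown>)[key], childInfo));
    }
  }
  return issues;
}

const registerSchema: Node = {
  _newBoundary: true,
  children: {
    password: {},
    repeatPassword: {
      // Reads the sibling "password" field through the boundary's _full_input
      check: (value, info) =>
        value === (info._full_input as { password?: unknown })?.password
          ? null
          : "Passwords don't match",
    },
  },
};
```

Nested boundaries would simply shadow the outer `_full_input`, which is why the proposal talks about the "closest" `smartObject`.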
As soon as I have a little more time in the next few days, I'll get back to you.
Sorry that it took so long to get back to you. I just added a `partialCheck` action. Here is how it can be used:

```ts
import * as v from 'valibot';

const RegisterSchema = v.pipe(
  v.object({
    email: v.pipe(
      v.string(),
      v.nonEmpty('Please enter your email.'),
      v.email('The email address is badly formatted.'),
    ),
    password1: v.pipe(
      v.string(),
      v.nonEmpty('Please enter your password.'),
      v.minLength(8, 'Your password must have 8 characters or more.'),
    ),
    password2: v.string(),
  }),
  v.forward(
    v.partialCheck(
      [['password1'], ['password2']],
      (input) => input.password1 === input.password2,
      'The two passwords do not match.',
    ),
    ['password2'],
  ),
);
```
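The core idea behind `partialCheck` (run a cross-field check only once the fields it depends on are individually valid, and forward the resulting issue to a chosen key) can be sketched without the library. All names below are invented for the sketch, not Valibot's source:

```typescript
type FieldResult = { valid: boolean; value: unknown };

// Runs `check` only when every dependency key passed its own validation,
// and attaches a failure to the forwarded key.
function partialCheckSketch(
  fields: Record<string, FieldResult>,
  deps: string[],
  check: (input: Record<string, unknown>) => boolean,
  message: string,
  forwardTo: string,
): { path: string; message: string } | null {
  // Skip the check entirely if any dependency is itself invalid
  if (!deps.every((key) => fields[key]?.valid)) return null;
  const input = Object.fromEntries(deps.map((key) => [key, fields[key].value]));
  return check(input) ? null : { path: forwardTo, message };
}
```

This is exactly the dependency tracking discussed earlier in the thread: a wrong type in `password1` suppresses the comparison instead of running it against garbage.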
Hi, I forgot about it haha, it's been a while. Yes, I think so. Thanks a lot! I hope to have the opportunity to use it again soon.
Hi, I want to revisit this topic. Is there a way that
What exactly is not working? It works as expected for me via the shared link.
Ah, I'm apparently blind. Thought I missed
That would be weird, because I implemented the resolver. The code is pretty simple. Are your versions of Valibot and the resolver up to date?
Yeah, really strange. So the versions are:

Here is the CodeSandbox: https://codesandbox.io/p/sandbox/wdxr64?file=%2Fpackage.json%3A16%2C5-18%2C24
Ah, I believe this condition makes it escape early:
Yes, if I now have it like so (https://codesandbox.io/p/sandbox/wdxr64?file=%2Fsrc%2FApp.tsx%3A40%2C18):

then it works correctly. The only issue is that it doesn't work if I use this:

Maybe this should also be taken into account. So basically:
Suggestion:
Pass the full input being tested down the tree (at call time, for some or all of the available methods, like `parse`/`parseAsync`), either as a third parameter to the `_parse` method or as a new property (`_full_input`) on the `info` object. This would allow us to access the full input as the second parameter of the `custom` function, which I think would provide a good solution for customization and DX.
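A minimal self-contained sketch of the suggestion. The helper shapes below are invented for illustration and are not Valibot's actual internals:

```typescript
// The full input travels in the info object as _full_input, and custom
// forwards it to the user's requirement as a second parameter.

type Info = { _full_input?: unknown };

const custom =
  (requirement: (value: unknown, fullInput: unknown) => boolean, message: string) =>
  (value: unknown, info: Info): string | null =>
    requirement(value, info._full_input) ? null : message;

// A field-level validator that compares against a sibling field
// read from the whole form via info._full_input
const repeatPasswordCheck = custom(
  (value, full) => value === (full as { password?: unknown })?.password,
  "Passwords don't match",
);
```

Note that `full` is necessarily untyped (`unknown`/`any`) here, which is exactly the typing concern raised under Cons below.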
Pros
Good DX, granularity, execution order, and it solves some of the issues with form libraries and their resolvers (works with react-hook-form and valibotResolver). Not being able to compare fields easily, or having a bad DX, may be a deal-breaker for devs using this library, and this suggestion fixes most of the common use cases.
Cons
The second parameter of the `custom` function most likely can't be well typed (it will have the `any` type), because the function runs before the whole original input has been checked against the whole schema, so you may access properties without checking their existence beforehand. Also, I don't think there is a way to infer or pass down the whole output object type at schema definition; you may have to provide it manually beforehand 😕
Branch with this implementation
passing down full input
Additional feature to complement this suggestion (I didn't test/implement this one)
We could also add a `deferedCustom` function, which returns an action with `mode = 'defered'`, so it doesn't execute during the main `_parse` run. Instead, when the pipe encounters a `deferedCustom` action (`action.mode === 'defered'`), it skips it until the main `_parse` execution finishes without issues; then it runs again, but only with the `deferedCustom` actions (this guarantees that the input matches the schema structure). An easy implementation/mental model could be:
And inside the `executePipe` function
PS: I'm new to this library, so I only took a look at the things I use/need; I'm not sure whether these implementations may introduce bugs in the rest of the library, but I think everything should work fine. (All tests also passed in my branch where I implemented these changes.)
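The deferred-action idea above could look roughly like this. It is a sketch with invented shapes, not the library's API: actions marked `mode: 'defered'` are skipped during the main pipe run and executed in a second pass only if the first pass produced no issues.

```typescript
type Action = {
  mode?: 'defered';
  run: (value: unknown, fullInput: unknown) => string | null;
};

function executePipeSketch(pipe: Action[], value: unknown, fullInput: unknown): string[] {
  // First pass: run everything except deferred actions
  const issues = pipe
    .filter((a) => a.mode !== 'defered')
    .map((a) => a.run(value, fullInput))
    .filter((i): i is string => i !== null);
  if (issues.length > 0) return issues;

  // Second pass: the structure is now known to be valid, so deferred
  // actions can safely read the full input
  return pipe
    .filter((a) => a.mode === 'defered')
    .map((a) => a.run(value, fullInput))
    .filter((i): i is string => i !== null);
}

// Factory for a deferred cross-field check
const deferedCustom = (
  check: (fullInput: unknown) => boolean,
  message: string,
): Action => ({
  mode: 'defered',
  run: (_value, fullInput) => (check(fullInput) ? null : message),
});
```

The two-pass structure is what guarantees a `deferedCustom` never observes an input that failed the schema's type checks.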