Define functools.partial as overloaded function instead of its own class #2878
Conversation
4ff51af to 9322a05
Thanks for your PR, and sorry for the long wait!
I think this approach makes sense, but I just have one comment.
Note that this PR generates a small number of false positives in our internal repositories, since keyword arguments can no longer be used with the result of […]
This change also causes false positives in situations where […]
With the new mypy release this also caused a couple of new errors in our codebase: two from […]
This change caused errors in my code base where I had used […]
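The truncated reports above seem to concern keyword arguments used on the result of `partial`. A minimal snippet (with invented names) that exercises that pattern:

```python
from functools import partial

def connect(host: str, *, timeout: float = 1.0) -> str:
    return f"{host}:{timeout}"

dial = partial(connect, "localhost")

# Fine at runtime, but if partial's result is typed as a plain
# Callable, the keyword argument can no longer be expressed in the
# type, which is the kind of false positive reported above.
print(dial(timeout=2.5))  # -> localhost:2.5
```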
So, I opened this issue 3 years ago. I see that some use cases are now handled, but there are still issues with properly checking simple snippets like:
(Thanks for all the work on mypy and Typeshed since then, by the way! I haven't touched much Python in the meanwhile, and it's good to see the improvements.)
I thought a bit more about what can be done. One possibility could be to define different overloads for `__init__` and `__call__`, like:
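(The original snippet is missing from this extract; a hedged sketch of the idea, trimmed to arity 2 and using an invented `partial2` name, might look like:)

```python
from typing import Callable, Generic, TypeVar, overload

T1 = TypeVar("T1")
T2 = TypeVar("T2")
R = TypeVar("R")

class partial2(Generic[T1, T2, R]):
    """Hypothetical sketch: a partial over a two-argument function,
    with per-arity overloads on __init__ and __call__."""

    @overload
    def __init__(self, func: Callable[[T1, T2], R]) -> None: ...
    @overload
    def __init__(self, func: Callable[[T1, T2], R], arg1: T1) -> None: ...
    def __init__(self, func, *args):
        self.func = func
        self.args = args

    @overload
    def __call__(self, arg1: T1, arg2: T2) -> R: ...
    @overload
    def __call__(self, arg2: T2) -> R: ...
    def __call__(self, *args):
        return self.func(*self.args, *args)
```

Nothing ties the chosen `__init__` overload to the matching `__call__` overload, which is exactly the problem described next.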
But this would have the same problem with the snippet above, since there'd be no relationship between how the `partial` had been constructed (which `__init__` had been invoked) and its `__call__` (the wrong arity for the `Callable` could be invoked). Also, every `partial` would have to be parametrized on the types of the max arity supported by the stub (even for the types that aren't used).

Ideally, we'd want to "overload the whole class", something like:
But this obviously cannot work.
This is when I thought of defining `partial`'s type as a simple function.

Ideally we'd want to return a different class (with different type parameters) in each case. I don't think we'd be able to properly type that class, since the only 2 type constructors that I've seen with an arbitrary number of type parameters are `Tuple` and `Callable` (besides other stuff like `Generic`), and neither can be subclassed (though that's a detail that might change). It'd probably have to be defined as a `SpecialForm`.
So, that's why I settled on it simply returning a `Callable`.

This is obviously not ideal, since this way we'd lose the ability to type-check inspections of `func`, `args`, and `keywords`. But when people use `functools.partial`, it's to obtain some callables and call them, so I think the most common use case can actually be covered by the overloading, at the expense of these 3 other attributes.

I picked a maximum arity of 5, as that's also the arity used to type
`map` (above that it'll just rely on `Any`). This also won't help in the case of partial application with keyword arguments.

I don't particularly like this approach, and I'm sure it had already been thought of when first implementing the current type stub for `functools.partial`, but I thought it worthwhile to open this PR since I haven't seen anyone else proposing this.

On the other hand, I think this approach can also be justified by other precedents we have, like
`itertools.cycle` and `itertools.tee`, which are implemented as returning custom types and yet are defined in Typeshed as simply being functions that return an abstract type like `Iterator` (even if the actual implementations provide a couple more details, namely `{'__setstate__'}` and `{'__setstate__', '__copy__'}` respectively).

Another advantage of doing things this way is that it wouldn't require any special type checker support, so the stub could improve the current checking in both mypy and pytype.
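(For concreteness, a hedged sketch of the overloaded-function shape, trimmed to arity 2 and using an invented `my_partial` name rather than the PR's actual diff:)

```python
import functools
from typing import Any, Callable, TypeVar, overload

T1 = TypeVar("T1")
T2 = TypeVar("T2")
R = TypeVar("R")

# Hypothetical sketch, not the PR's actual diff: partial application
# typed as an overloaded function that returns a plain Callable.
@overload
def my_partial(func: Callable[[T1], R], arg1: T1) -> Callable[[], R]: ...
@overload
def my_partial(func: Callable[[T1, T2], R], arg1: T1) -> Callable[[T2], R]: ...
@overload
def my_partial(func: Callable[..., R], *args: Any) -> Callable[..., R]: ...
def my_partial(func, *args):
    # Runtime behavior is exactly functools.partial; only the
    # declared return type changes.
    return functools.partial(func, *args)

inc = my_partial(lambda a, b: a + b, 1)
print(inc(2))  # -> 3
```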
Again: I'm not particularly fond of this change, so I understand if the current approach continues to be preferred.