Refactor ops to allow non-funsor parameters #491
Conversation
@eb8680 finally got the tests to pass 😓

For some reason, I am getting

in NumPyro `DirichletMultinomial(concentration=[1.5 0.5], total_count=10.0, value=Finitary(stack, (Tensor(7.0, OrderedDict(), 'real'), Number(3.0))))`

@eb8680, @fehiepsi's issue with

Yeah, enabling it raises two errors
but it seems to me
Addresses #489, partly pair-coded with @eb8680.

This major op refactoring aims to unblock @ordabayevy in #477 and #482 by allowing ops to have arbitrary `*args, **kwargs` parameters. Changes include:
- `nullop` is renamed to `null` for consistency (`NullOp` remains unchanged).
- `@make_op` has moved to a classmethod `@MyOpClass.make`, e.g. `@BinaryOp.make`.
- Ops now have an `arity` attribute, and funsor terms wrap only the first `[:arity]`-many args (and patterns are validated to ensure this).
- Ops are now cached in a `WeakValueDict`.
- `.subclass_register()` handlers have a different interface, now including `cls`.
- There is now a `WrappedTransformOp` class, and its instances each have an op. `ops.WrappedTransformOp(fn)` must now be written as a kwarg: `ops.WrappedTransformOp(fn=fn)`.
- `is_numeric_array()` is demoted from an `Op` to a `@singledispatch` function.
- The `ops.stack` interface has changed to `ops.stack(parts, dim)`, as in NumPy and PyTorch. `funsor.tensor.stack` is replaced by `ops.stack`.
- The `ops.einsum` interface has changed to `ops.einsum(operands, equation)`, where `operands` is a list or tuple.
- The `Finitary` funsor interface has changed from variously-many args to a single tuple arg.

Tested
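The `arity`-plus-classmethod pattern described above can be sketched in plain Python. This is a minimal illustration of the shape of the API, not funsor's actual implementation; the class bodies, the `_cache` attribute, and the `scaled_add` example op are all assumptions:

```python
import weakref


class Op:
    """Sketch: an op whose first `arity`-many args are funsors and
    whose remaining args/kwargs are plain (non-funsor) parameters."""
    arity = 0
    _cache = weakref.WeakValueDictionary()  # ops cached weakly (hypothetical)

    def __init__(self, fn):
        self.fn = fn
        self.__name__ = fn.__name__

    def __call__(self, *args, **kwargs):
        # In funsor proper, only the first `arity`-many args would be
        # funsors; here we simply apply the wrapped function.
        return self.fn(*args, **kwargs)

    @classmethod
    def make(cls, fn):
        """Decorator replacing the old standalone @make_op."""
        op = cls(fn)
        cls._cache[fn.__name__] = op
        return op


class BinaryOp(Op):
    arity = 2


@BinaryOp.make
def scaled_add(x, y, scale=1.0):
    # x, y play the role of funsor args; `scale` is a non-funsor parameter
    # of the kind this PR enables.
    return scale * (x + y)


print(scaled_add(2.0, 3.0, scale=10.0))  # 50.0
print(BinaryOp.arity)                    # 2
```

The point of the classmethod form is that `@BinaryOp.make` both constructs the op and records which subclass (and hence which `arity`) it belongs to, which is what lets patterns validate the funsor/parameter split.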
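Demoting `is_numeric_array()` from an `Op` to a `@singledispatch` follows the standard `functools` pattern: a default implementation plus per-type registrations. A hedged sketch, in which the registered types are stdlib stand-ins rather than the array types funsor actually registers:

```python
from functools import singledispatch


@singledispatch
def is_numeric_array(x):
    # Default: unknown types are not numeric arrays.
    return False


# A backend would register its array type here (e.g. a NumPy backend
# registering numpy.ndarray); we register stdlib buffer types instead.
@is_numeric_array.register(memoryview)
@is_numeric_array.register(bytearray)
def _(x):
    return True


print(is_numeric_array(bytearray(b"abc")))  # True
print(is_numeric_array(42))                 # False
```

Dispatching on type this way avoids needing an `Op` subclass and funsor's interpretation machinery for what is really just a backend predicate.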
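The new `ops.stack(parts, dim)` and `ops.einsum(operands, equation)` signatures differ from their NumPy counterparts in argument order (`np.stack` takes the sequence first with an `axis` keyword; `np.einsum` takes the equation first with operands unpacked). A NumPy sketch of the correspondence, where `stack` and `einsum` are hypothetical wrappers matching the signatures described in this PR:

```python
import numpy as np


def stack(parts, dim=0):
    # ops.stack(parts, dim): parts is a sequence of arrays, dim an
    # integer axis, mirroring np.stack(arrays, axis=...).
    return np.stack(parts, axis=dim)


def einsum(operands, equation):
    # ops.einsum(operands, equation): operands is a list or tuple,
    # whereas np.einsum takes the equation first and operands unpacked.
    return np.einsum(equation, *operands)


x = np.ones((2, 3))
y = np.ones((2, 3))
print(stack([x, y], dim=0).shape)           # (2, 2, 3)
print(einsum([x, y.T], "ij,jk->ik").shape)  # (2, 2)
```

Passing `operands` as a single tuple rather than varargs matches the `Finitary` change above: a single tuple arg is easier to pattern-match than variously-many positional args.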