-
How is that different from what
-
Here is roughly your first example:

```python
from functools import partial

import pytensor
import pytensor.tensor as pt
from pytensor.compile.builders import OpFromGraph

x1 = pt.dvector("x1")
x2 = pt.dvector("x2")
y = 2 * x2
y.name = "y"
z = x1 ** 2 * y
z.name = "z"

# Wrap the subgraph (y, x1) -> z in a reusable Op, then fix y with partial
# so callers only need to supply the second input.
op = OpFromGraph([y, x1], [z])
op = partial(op, y)

# Apply the same subgraph to two different inputs without cloning anything.
out = op(3 * x1) + op(2 * x1)

func = pytensor.function([x1, x2], [out])
pytensor.dprint(func)
```
Which reminds me, we could add a helper that collects missing inputs and assumes they are constant, like we do with Scan. And we should add some rewrites to pull common computations out of multiple OpFromGraphs (or merge them), again like we do with Scan.
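Just for illustration, a rough sketch of what such a helper might look like; `op_from_graph_with_closure` is a made-up name, not an existing API, and it simply collects the root variables of the outputs and closes over them:

```python
from functools import partial

from pytensor.compile.builders import OpFromGraph
from pytensor.graph.basic import Constant, graph_inputs


def op_from_graph_with_closure(explicit_inputs, outputs):
    """Hypothetical helper: build an OpFromGraph, automatically collecting
    root variables of `outputs` that were not listed explicitly and closing
    over them, roughly the way Scan handles missing non-sequences."""
    implicit = [
        var
        for var in graph_inputs(outputs)
        if var not in explicit_inputs and not isinstance(var, Constant)
    ]
    op = OpFromGraph(implicit + list(explicit_inputs), outputs)
    # Pre-apply the implicit inputs so callers only pass the explicit ones.
    return partial(op, *implicit)
```

With the graph above, `op_from_graph_with_closure([x1], [z])` would pick up `x2` automatically.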
-
It seems to me that the main reason anyone clones graphs is to change the inputs of part of a graph. Pretty much anywhere else we'd do something like this with a function call. I'm guessing that a lot of things would get easier if pytensor had support for functions, and I don't think they would be too difficult to add. Something along these lines might work, for instance:
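As a minimal sketch only, built on top of OpFromGraph; the `graph_function` decorator below is hypothetical, not an existing PyTensor API:

```python
from pytensor.compile.builders import OpFromGraph


def graph_function(*input_templates):
    """Hypothetical decorator: trace the decorated body once on symbolic
    template inputs, and return an OpFromGraph that can later be applied
    to arbitrary new inputs without cloning the original graph."""

    def decorator(fn):
        outputs = fn(*input_templates)
        if not isinstance(outputs, (list, tuple)):
            outputs = [outputs]
        return OpFromGraph(list(input_templates), list(outputs))

    return decorator
```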
Usage could look something like this:
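Continuing the hypothetical `graph_function` sketch above:

```python
import pytensor
import pytensor.tensor as pt

x = pt.dvector("x")


@graph_function(x)
def square_plus_one(v):
    return v**2 + 1.0


a = pt.dvector("a")
# The same subgraph is applied to two different inputs, no manual cloning.
out = square_plus_one(2 * a) + square_plus_one(3 * a)
func = pytensor.function([a], out)
```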