When building a compute graph I create two tensors, e.g. A and B, where tensor B's shape (ne) depends on A's data, which is unknown until the graph has been run through compute_forward and cannot legally be read while the graph is being built.
How can I deal with such cases? I would like graph construction to be more dynamic. Thanks a lot.
I think tensor shapes are not intended to be set dynamically during compute graph evaluation. Branching evaluation based on some condition might be possible, but it is not implemented (and could be seen as adding too much complexity). What you can do is create two separate compute graphs and change how you construct the second one based on the results of the first one.
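A minimal sketch of that two-graph approach, assuming a reasonably recent ggml API (ggml_new_graph / ggml_build_forward_expand / ggml_graph_compute_with_ctx); the exact graph-compute entry point has changed between ggml versions, and the op used to produce A is only a placeholder:

```c
#include "ggml.h"

void two_stage_example(void) {
    struct ggml_init_params params = {
        /*.mem_size   =*/ 64*1024*1024,
        /*.mem_buffer =*/ NULL,
        /*.no_alloc   =*/ false,
    };

    // --- stage 1: compute A with its own graph ---
    struct ggml_context * ctx0 = ggml_init(params);

    struct ggml_tensor * x = ggml_new_tensor_1d(ctx0, GGML_TYPE_F32, 4);
    // ... fill x with input data ...
    struct ggml_tensor * a = ggml_sum(ctx0, x);   // placeholder op producing A

    struct ggml_cgraph * gf0 = ggml_new_graph(ctx0);
    ggml_build_forward_expand(gf0, a);
    ggml_graph_compute_with_ctx(ctx0, gf0, /*n_threads =*/ 1);

    // read the value that decides B's shape, on the host, after the compute
    const int n = (int) ggml_get_f32_1d(a, 0);

    // --- stage 2: build the second graph using the result of the first ---
    struct ggml_context * ctx1 = ggml_init(params);

    struct ggml_tensor * b = ggml_new_tensor_1d(ctx1, GGML_TYPE_F32, n);
    // ... build the rest of the graph that uses b ...

    struct ggml_cgraph * gf1 = ggml_new_graph(ctx1);
    ggml_build_forward_expand(gf1, b);
    ggml_graph_compute_with_ctx(ctx1, gf1, /*n_threads =*/ 1);

    ggml_free(ctx1);
    ggml_free(ctx0);
}
```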
Thanks a lot. Is it also possible to build nested compute graphs? For example, I have a "big" compute graph, and inside it I build and run a smaller compute graph, using its result in the big graph.