Implement Dot and BatchedDot in PyTensor #878
Merged

Commits (13, all by HangenYuu):
- ffad937 Added PyTorch link and unit tests for normal dot
- 5121a85 Changed implementation of dot. Renamed tests
- 2721c5a Changed dot implementation
- 03bb3a8 Reverted logic to correct scope for math.dot
- 2cf0ed2 Reverted folder structure and added BatchedDot
- fcb3b79 Merge branch 'main' into torch_dot
- 307a3fb Fixed minor typo in test naming
- e2500bf Merge branch 'torch_dot' of github.com:HangenYuu/pytensor into torch_dot
- 143a75a Fixed __init__.py file for tests to run
- 2d74b31 Rewrite test to reuse pytorch function
- 4deea70 Removed get_test_value
- cab9db8 Changed variable names
- f459866 Merge branch 'main' into torch_dot
The first new file re-exports the PyTorch linker:

```python
from pytensor.link.pytorch.linker import PytorchLinker
```
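For context, a minimal sketch of how the exported linker is wired into a compiled function, mirroring the Mode/RewriteDatabaseQuery pattern used in the tests further down (the exclude list is taken from the test file; other setups may differ):

```python
import numpy as np

from pytensor.compile.function import function
from pytensor.compile.mode import Mode
from pytensor.configdefaults import config
from pytensor.graph.rewriting.db import RewriteDatabaseQuery
from pytensor.link.pytorch import PytorchLinker
from pytensor.tensor.type import matrix

# Build a small graph and compile it through the PyTorch backend.
x = matrix("x")
y = matrix("y")
out = x.dot(y)

opts = RewriteDatabaseQuery(include=[None], exclude=["cxx_only", "BlasOpt"])
pytorch_mode = Mode(PytorchLinker(), opts)
fn = function([x, y], out, mode=pytorch_mode)

x_val = np.eye(2).astype(config.floatX)
y_val = np.ones((2, 2)).astype(config.floatX)
print(fn(x_val, y_val))
```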
The BatchedDot dispatch registers a PyTorch implementation built on torch.bmm:

```python
import torch

from pytensor.link.pytorch.dispatch import pytorch_funcify
from pytensor.tensor.blas import BatchedDot


@pytorch_funcify.register(BatchedDot)
def pytorch_funcify_BatchedDot(op, **kwargs):
    def batched_dot(a, b):
        if a.shape[0] != b.shape[0]:
            raise TypeError("Shapes must match in the 0-th dimension")
        return torch.bmm(a, b)

    return batched_dot
```
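A short standalone illustration (not part of the diff) of the torch.bmm behaviour this dispatch relies on, and why the explicit batch-dimension check is there:

```python
import torch

# torch.bmm multiplies two batches of matrices; the leading (batch)
# dimensions of the two inputs must match.
a = torch.randn(10, 5, 3)
b = torch.randn(10, 3, 2)
assert torch.bmm(a, b).shape == (10, 5, 2)

# Without the explicit check in batched_dot, a batch-size mismatch would
# surface as PyTorch's own RuntimeError; the dispatch raises TypeError
# instead, which is what the test below expects for compatibility with the
# other backends.
```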
The Dot dispatch maps PyTensor's Dot Op onto torch.matmul:

```python
import torch

from pytensor.link.pytorch.dispatch import pytorch_funcify
from pytensor.tensor.math import Dot


@pytorch_funcify.register(Dot)
def pytorch_funcify_Dot(op, **kwargs):
    def dot(x, y):
        return torch.matmul(x, y)

    return dot
```
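A quick standalone check (not part of the diff) of the assumption behind this mapping: for the vector and matrix operands that Dot handles, torch.matmul agrees with np.dot:

```python
import numpy as np
import torch

v = np.array([1.0, 2.0])
m = np.array([[6.0, 3.0], [3.0, 0.0]])

# Compare torch.matmul with np.dot for 1-D x 1-D, 1-D x 2-D,
# 2-D x 1-D, and 2-D x 2-D operands.
for lhs, rhs in [(v, v), (v, m), (m, v), (m, m)]:
    torch_res = torch.matmul(torch.from_numpy(lhs), torch.from_numpy(rhs)).numpy()
    assert np.allclose(torch_res, np.dot(lhs, rhs))
```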
The new BatchedDot test:

```python
import numpy as np
import pytest

from pytensor.compile.function import function
from pytensor.compile.mode import Mode
from pytensor.configdefaults import config
from pytensor.graph.fg import FunctionGraph
from pytensor.graph.op import get_test_value
from pytensor.graph.rewriting.db import RewriteDatabaseQuery
from pytensor.link.pytorch import PytorchLinker
from pytensor.tensor import blas as pt_blas
from pytensor.tensor.type import tensor3
from tests.link.pytorch.test_basic import compare_pytorch_and_py


def test_pytorch_BatchedDot():
    # tensor3 . tensor3
    a = tensor3("a")
    a.tag.test_value = (
        np.linspace(-1, 1, 10 * 5 * 3).astype(config.floatX).reshape((10, 5, 3))
    )
    b = tensor3("b")
    b.tag.test_value = (
        np.linspace(1, -1, 10 * 3 * 2).astype(config.floatX).reshape((10, 3, 2))
    )
    out = pt_blas.BatchedDot()(a, b)
    fgraph = FunctionGraph([a, b], [out])
    compare_pytorch_and_py(fgraph, [get_test_value(i) for i in fgraph.inputs])

    # A dimension mismatch should raise a TypeError for compatibility
    inputs = [get_test_value(a)[:-1], get_test_value(b)]
    opts = RewriteDatabaseQuery(include=[None], exclude=["cxx_only", "BlasOpt"])
    pytorch_mode = Mode(PytorchLinker(), opts)
    pytensor_pytorch_fn = function(fgraph.inputs, fgraph.outputs, mode=pytorch_mode)

    with pytest.raises(TypeError):
        pytensor_pytorch_fn(*inputs)
```

Review thread on the `pytensor_pytorch_fn = function(...)` line:

- Reviewer: This does the same? (suggested change)
- Reply: But if I am not mistaken …
The new Dot test:

```python
import numpy as np

from pytensor.configdefaults import config
from pytensor.graph.fg import FunctionGraph
from pytensor.graph.op import get_test_value
from pytensor.tensor.type import matrix, scalar, vector
from tests.link.pytorch.test_basic import compare_pytorch_and_py


def test_pytorch_dot():
    y = vector("y")
    y.tag.test_value = np.r_[1.0, 2.0].astype(config.floatX)
    x = vector("x")
    x.tag.test_value = np.r_[3.0, 4.0].astype(config.floatX)
    A = matrix("A")
    A.tag.test_value = np.array([[6, 3], [3, 0]], dtype=config.floatX)
    alpha = scalar("alpha")
    alpha.tag.test_value = np.array(3.0, dtype=config.floatX)
    beta = scalar("beta")
    beta.tag.test_value = np.array(5.0, dtype=config.floatX)

    # 2D * 2D
    out = A.dot(A * alpha) + beta * A
    fgraph = FunctionGraph([A, alpha, beta], [out])
    compare_pytorch_and_py(fgraph, [get_test_value(i) for i in fgraph.inputs])

    # 1D * 2D and 1D * 1D
    out = y.dot(alpha * A).dot(x) + beta * y
    fgraph = FunctionGraph([y, x, A, alpha, beta], [out])
    compare_pytorch_and_py(fgraph, [get_test_value(i) for i in fgraph.inputs])
```
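For reference (not part of the diff), the plain NumPy equivalent of the first case the test checks, using the same test values:

```python
import numpy as np

A = np.array([[6.0, 3.0], [3.0, 0.0]])
alpha, beta = 3.0, 5.0

# "2D * 2D" case from the test: A.dot(A * alpha) + beta * A
print(A.dot(A * alpha) + beta * A)
```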
Review comment: We are getting rid of the test_value machinery. Just pass these values directly to the test function; there is no point in putting them in the tag only to retrieve them again.
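A sketch of how the BatchedDot test could look after that change, keeping everything else from the diff as-is; this illustrates the reviewer's suggestion and is not necessarily the code that was finally merged:

```python
import numpy as np
import pytest

from pytensor.compile.function import function
from pytensor.compile.mode import Mode
from pytensor.configdefaults import config
from pytensor.graph.fg import FunctionGraph
from pytensor.graph.rewriting.db import RewriteDatabaseQuery
from pytensor.link.pytorch import PytorchLinker
from pytensor.tensor import blas as pt_blas
from pytensor.tensor.type import tensor3
from tests.link.pytorch.test_basic import compare_pytorch_and_py


def test_pytorch_BatchedDot():
    # Keep the test inputs as plain arrays instead of stashing them in
    # variable tags and reading them back with get_test_value.
    a = tensor3("a")
    b = tensor3("b")
    a_val = np.linspace(-1, 1, 10 * 5 * 3).astype(config.floatX).reshape((10, 5, 3))
    b_val = np.linspace(1, -1, 10 * 3 * 2).astype(config.floatX).reshape((10, 3, 2))

    out = pt_blas.BatchedDot()(a, b)
    fgraph = FunctionGraph([a, b], [out])
    compare_pytorch_and_py(fgraph, [a_val, b_val])

    # A batch-size mismatch should still raise a TypeError.
    opts = RewriteDatabaseQuery(include=[None], exclude=["cxx_only", "BlasOpt"])
    pytorch_mode = Mode(PytorchLinker(), opts)
    pytensor_pytorch_fn = function(fgraph.inputs, fgraph.outputs, mode=pytorch_mode)
    with pytest.raises(TypeError):
        pytensor_pytorch_fn(a_val[:-1], b_val)
```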