
Optimizations for Det should also apply to SlogDet #1039

Open
ricardoV94 opened this issue Oct 18, 2024 · 2 comments · May be fixed by #1041
@ricardoV94 (Member)

Description

import pytensor
import pytensor.tensor as pt

x_diag = pt.vector("x_diag")
x = pt.diag(x_diag)
y = pt.log(pt.linalg.det(x))

# With Det, the diagonal structure is exploited: the graph reduces to log(prod(x_diag)).
pytensor.function([x_diag], y).dprint()
# Log [id A] 1
#  └─ Prod{axes=None} [id B] 0
#     └─ x_diag [id C]

# With SlogDet, the same rewrite does not fire: the dense matrix is still built and
# passed to the Op.
_, y = pt.linalg.slogdet(x)
pytensor.function([x_diag], y).dprint(depth=3)
# SLogDet.1 [id A] 4
#  └─ AdvancedSetSubtensor [id B] 3
#     ├─ Alloc [id C] 2
#     ├─ x_diag [id D]
#     ├─ ARange{dtype='int64'} [id E] 1
#     └─ ARange{dtype='int64'} [id E] 1
#        └─ ···
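For comparison, a minimal sketch of the decomposed form, assuming the Det-level rewrite fires regardless of which ops consume the determinant (the sketch is illustrative, not output from the issue):

import pytensor
import pytensor.tensor as pt

x_diag = pt.vector("x_diag")
x = pt.diag(x_diag)

# Build slogdet manually from Det; the Det node is then eligible for the
# same det(diag(x)) -> prod(x) rewrite shown above.
det = pt.linalg.det(x)
sign, logabsdet = pt.sign(det), pt.log(pt.abs(det))

pytensor.function([x_diag], logabsdet).dprint()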
@ricardoV94 (Member, Author) commented Oct 19, 2024

We should probably have linalg.slogdet just return sign(det(x)), log(abs(det(x))), and only later specialize that pattern to the SlogDet Op.

That way we don't need to worry about two forms of Det during the linalg rewrites.
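A minimal sketch of that proposal (the helper name and docstring are illustrative assumptions, not the actual PyTensor API for this change):

import pytensor.tensor as pt

def slogdet_via_det(x):
    """Hypothetical user-facing slogdet built from Det.

    Because the graph only contains a Det node, all existing Det
    rewrites (e.g. det(diag(x)) -> prod(x)) keep applying; a later
    specialization pass could fuse this sign/log(abs) pattern back
    into the single SlogDet Op for runtime efficiency.
    """
    det = pt.linalg.det(x)
    return pt.sign(det), pt.log(pt.abs(det))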

@ricardoV94 (Member, Author)

Another point, brought up by @jessegrabowski, is that SlogDet doesn't have a gradient implementation. One more reason to introduce it only as a specialization.
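For illustration, a small sketch of the gradient argument, assuming the decomposed form: Det defines a gradient, so log|det(x)| is differentiable even though SlogDet is not.

import pytensor
import pytensor.tensor as pt

x = pt.matrix("x")
logabsdet = pt.log(pt.abs(pt.linalg.det(x)))

# Works because Det has a gradient; d log|det(x)| / dx = inv(x).T
g = pytensor.grad(logabsdet, x)
f = pytensor.function([x], g)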

tanish1729 linked a pull request on Oct 19, 2024 that will close this issue (#1041).