Hi, this is more of a naive question than an issue, so apologies if it's posted in the wrong place.
I asked the Turing team whether they had any plans or interest in supporting Diffractor as a backend, and they mentioned it would be feasible in principle. I was wondering whether Diffractor could be an interesting option for Turing (with expected speedups, especially in the areas where current AD backends are problematically slow). If so, are there any long-term plans to work towards this integration, or was Diffractor designed with different goals in mind than Bayesian sampling? Thanks for the hard work!
I think it should be feasible.
Right now it's mostly interesting as an alternative to ForwardDiff.jl, since the reverse-mode work is, at least for now, on the back burner.
Once we upstream some of the things we have been working on, it will be especially interesting for computing Jacobians.