Support differentiable coefficients for observables #6598
Conversation
Hello. You may have forgotten to update the changelog!
Codecov Report
All modified and coverable lines are covered by tests ✅

@@            Coverage Diff             @@
##           master    #6598      +/-   ##
==========================================
- Coverage   99.60%   99.60%   -0.01%
==========================================
  Files         476      476
  Lines       45237    45215      -22
==========================================
- Hits        45060    45035      -25
- Misses        177      180       +3

☔ View full report in Codecov by Sentry.
Should we deprecate/remove
Potentially worth considering now.
Co-authored-by: Astral Cai <astral.cai@xanadu.ai>
Co-authored-by: Andrija Paurevic <46359773+andrijapau@users.noreply.github.com>
Context:
With legacy operator arithmetic, we supported trainable coefficients for observables using the `hadamard_grad` transform. This no longer works with generic operator arithmetic.

Description of the Change:
Uses the `split_to_single_terms` transform in gradient preprocessing if any of the observables have trainable coefficients (see the sketch below). In order to get this to pass tests, various bugs in `split_non_commuting` needed fixing.

Benefits:

Possible Drawbacks:

Related GitHub Issues:
[sc-71490]
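A minimal sketch of the behavior this change enables, differentiating a QNode with respect to trainable observable coefficients. The device, circuit, and parameter values here are illustrative assumptions, not taken from the PR:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, diff_method="parameter-shift")
def circuit(weights, coeffs):
    qml.RX(weights[0], wires=0)
    qml.CNOT(wires=[0, 1])
    # Observable with trainable coefficients; per this PR, gradient
    # preprocessing applies split_to_single_terms so each term is
    # measured separately and coefficient gradients can be computed.
    H = qml.Hamiltonian(coeffs, [qml.PauliZ(0), qml.PauliZ(0) @ qml.PauliZ(1)])
    return qml.expval(H)

weights = np.array([0.4], requires_grad=True)
coeffs = np.array([0.7, -0.3], requires_grad=True)

# Gradients with respect to both the gate parameters and the
# observable coefficients.
dweights, dcoeffs = qml.grad(circuit)(weights, coeffs)
print(dweights, dcoeffs)
```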