
Add gradient methods to FinDiffGradients #9104

Merged (9 commits) on Nov 17, 2022

Conversation

@ikkoham (Contributor) commented Nov 9, 2022

Summary

Resolves #9096

Details and comments

@ikkoham added the labels Changelog: New Feature, Changelog: Bugfix, and mod: algorithms on Nov 9, 2022
@ikkoham added this to the 0.23.0 milestone on Nov 9, 2022
@ikkoham requested review from a team, manoelmarques, and woodsp-ibm as code owners on November 9, 2022 at 08:04
@qiskit-bot (Collaborator):

Thank you for opening a new pull request.

Before your PR can be merged it will first need to pass continuous integration tests and be reviewed. Sometimes the review process can be slow, so please be patient.

While you're waiting, please feel free to review other open PRs. While only a subset of people are authorized to approve pull requests for merging, everyone is encouraged to review open pull requests. Doing reviews helps reduce the burden on the core team and helps make the project's code better for everyone.

One or more of the following people are requested to review this:

@ikkoham (Contributor, Author) commented Nov 9, 2022

I'd like to add @a-matsuo as a reviewer of this PR. But I can't...

@coveralls commented Nov 9, 2022

Pull Request Test Coverage Report for Build 3489201436

  • 71 of 75 (94.67%) changed or added relevant lines in 2 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage increased (+0.006%) to 84.609%

Changes missing coverage (covered / changed or added lines):
  • qiskit/algorithms/gradients/finite_diff_estimator_gradient.py: 26 of 28 (92.86%)
  • qiskit/algorithms/gradients/finite_diff_sampler_gradient.py: 45 of 47 (95.74%)

Totals (Coverage Status):
  • Change from base Build 3489200644: +0.006%
  • Covered Lines: 62663
  • Relevant Lines: 74062

💛 - Coveralls

self,
estimator: BaseEstimator,
epsilon: float,
method: Literal["central", "forward", "backward"] = "central",
Contributor:

Since the gradients have already been released, this new argument needs to go last; otherwise, using the arguments positionally as

FiniteDiffEstimatorGradient(my_estimator, my_epsilon, my_options)

would no longer work the same as before 🙂

Contributor Author:

Sure... it just feels unnatural. Maybe we should make it a keyword-only argument instead?

Member:

Is there any strategy for deciding whether a parameter goes directly into a primitive/algorithm constructor versus into the options? Here, as a constructor parameter, it is fixed when you build the gradient. As a field in options it could be altered per run if desired, which seems to be where it is used to change what is done. Exposing it as a parameter perhaps makes it easier for an end user to find than in options, although it seems less flexible. But I guess the same is true for epsilon. Perhaps this is not something an algorithm using a gradient would want to change per run.

Contributor:

From my understanding, options only cover what affects the circuit evaluation, since they are simply forwarded to the underlying primitives. Everything else (including all gradient settings) doesn't go there, which I think is good because then we can clearly document what the effect is.

We could make it keyword-only, yes; since we have started using that in the algorithms anyway, it might be the most natural choice 🙂
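For illustration only (not the PR's final code), a minimal sketch of how the new argument could be made keyword-only so that existing positional calls keep working; the class name and placeholder arguments below are hypothetical:

```python
from typing import Literal

class FiniteDiffEstimatorGradientSketch:
    """Sketch only: the new ``method`` argument sits after ``*``,
    so it can only be passed by keyword and positional calls are unchanged."""

    def __init__(
        self,
        estimator,
        epsilon: float,
        options=None,
        *,  # everything after this marker must be passed by keyword
        method: Literal["central", "forward", "backward"] = "central",
    ):
        self._estimator = estimator
        self._epsilon = epsilon
        self._options = options
        self._method = method

# The pre-existing positional call pattern still works ...
grad = FiniteDiffEstimatorGradientSketch("my_estimator", 1e-6, {"shots": 100})
# ... and the new argument is opt-in by keyword.
grad_fwd = FiniteDiffEstimatorGradientSketch("my_estimator", 1e-6, method="forward")
```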

Member:

If the options are limited to values that are passed through as run options of the underlying primitive, then it's clear. Given these are primitive-like, and I'm fairly sure I have seen some primitives whose options go beyond what run options support, I just wanted to check. Having these as explicit parameters seems better to me; it exposes things more directly. It does mean, at present, that to change method (or epsilon) you need to create a new gradient object, as there is no public way to update them.

Contributor Author:

I think we could change the implementation to make it the gradient's own run option; it is not necessary to forward all values to the primitives.
(I agree that an instance option is fine in this case, but if the user ever needs a run option...)
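A loose, hypothetical sketch of that idea: the gradient consumes its own run options before forwarding the rest to the underlying primitive. Nothing here is the merged API; the class name and the ``method`` run option are illustrative only.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class _GradientSketch:
    """Hypothetical: keep gradient-level settings separate from the
    options that are forwarded verbatim to the underlying primitive."""

    method: str = "central"
    default_options: Dict[str, Any] = field(default_factory=dict)

    def run(self, **run_options: Any) -> Dict[str, Any]:
        # Gradient-specific settings are consumed here ...
        method = run_options.pop("method", self.method)
        # ... and only the remaining options reach the primitive.
        primitive_options = {**self.default_options, **run_options}
        return {"method": method, "primitive_options": primitive_options}

# 'shots' would be forwarded to the primitive; 'method' stays with the gradient.
print(_GradientSketch().run(method="forward", shots=1024))
```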

    [circuit] * n, [observable] * n, plus.tolist() + minus.tolist(), **options
)
if self._method == "central":
    plus = parameter_values_ + self._epsilon * offset / 2
Contributor:

I think this should remain as it was: perturb by epsilon in both directions and divide by 2 * epsilon, so that we don't change what happens. Especially for finite differences of noisy functions, halving the shift distance could make a difference 🤔

Contributor Author:

This is API breaking, but I believe it is a bug fix, because the previous behavior goes against the general definition.

Contributor:

Hmm, I'm not sure this can be considered a bug; in the literature you find both versions, with eps and eps/2. Personally I'm more used to seeing eps rather than eps/2, and SPSA in Qiskit also uses an eps shift 🙂

Maybe it would be better to keep it as is, if there is no clear evidence that it's wrong? 😄

Contributor Author:

Well, OK. We just need to adjust epsilon, and personally I don't have a strong opinion, so I'll put it back.

Contributor:

I thought it might be better to use eps in the denominator for all cases, but it seems some references use +eps and -eps for the central method instead of eps/2. I think it's OK to leave it as is.
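For reference only, a small sketch contrasting the conventions discussed in this thread; ``f``, ``x``, and ``eps`` are placeholders and this is not the PR's implementation:

```python
import numpy as np

def finite_difference(f, x, eps, method="central"):
    """Illustrative only: the finite-difference conventions discussed above."""
    if method == "central":
        # Convention kept after the revert: shift by +/- eps, divide by 2 * eps.
        return (f(x + eps) - f(x - eps)) / (2 * eps)
    if method == "central_half":
        # Alternative convention: shift by +/- eps / 2, divide by eps.
        return (f(x + eps / 2) - f(x - eps / 2)) / eps
    if method == "forward":
        return (f(x + eps) - f(x)) / eps
    if method == "backward":
        return (f(x) - f(x - eps)) / eps
    raise ValueError(f"Unknown method: {method}")

# Both central conventions approximate the same derivative, differing only in
# the effective step size; all printed values here are close to cos(0.3).
for m in ("central", "central_half", "forward", "backward"):
    print(m, finite_difference(np.sin, 0.3, 1e-6, m))
```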

@a-matsuo (Contributor):

LGTM. I like the way gradient_factories is used in the unit tests, and I will use it in the future. It's cooler than writing an if for FiniteDiffEstimatorGradient and FiniteDiffSamplerGradient 🙂
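As an aside, a rough sketch of the factory pattern that comment refers to: parameterizing tests over callables that build each gradient, instead of branching per class. The helpers below are stand-ins, not the PR's actual test code.

```python
import unittest

# Hypothetical stand-ins for the gradient classes under test.
def make_estimator_gradient(method):
    return {"kind": "estimator", "method": method}

def make_sampler_gradient(method):
    return {"kind": "sampler", "method": method}

# One list of factories instead of per-class if/else branches in each test.
gradient_factories = [make_estimator_gradient, make_sampler_gradient]

class TestGradients(unittest.TestCase):
    def test_all_methods(self):
        for factory in gradient_factories:
            for method in ("central", "forward", "backward"):
                with self.subTest(factory=factory.__name__, method=method):
                    gradient = factory(method)
                    self.assertEqual(gradient["method"], method)

if __name__ == "__main__":
    unittest.main()
```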

Cryoris previously approved these changes Nov 10, 2022

@Cryoris (Contributor) left a comment:

LGTM, thanks for the updates @ikkoham! 🙂

@ikkoham (Contributor, Author) commented Nov 16, 2022

Hmm... Sphinx is difficult.

@woodsp-ibm (Member):

The release note only mentions a feature. Should the Changelog: Bugfix label be removed, leaving just the Feature one?

@ikkoham removed the Changelog: Bugfix label on Nov 17, 2022
@ikkoham (Contributor, Author) commented Nov 17, 2022

@woodsp-ibm Thanks. I reverted it in commit a5fa842 but forgot to remove the label.

@Cryoris (Contributor) left a comment:

LGTM!

@mergify mergify bot merged commit 2bd4afa into Qiskit:main Nov 17, 2022
Cryoris added a commit to Cryoris/qiskit-terra that referenced this pull request Jan 12, 2023
* implement gradient methods

* fix docs

* arg order/refactor

* fix docs

* revert central difference

* Update qiskit/algorithms/gradients/finite_diff_estimator_gradient.py

* fix docs

* Update qiskit/algorithms/gradients/finite_diff_sampler_gradient.py

Co-authored-by: Julien Gacon <gaconju@gmail.com>

Co-authored-by: Julien Gacon <gaconju@gmail.com>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
ElePT pushed a commit to ElePT/qiskit that referenced this pull request Jun 27, 2023
ElePT pushed a commit to ElePT/qiskit-algorithms-test that referenced this pull request Jul 17, 2023
Labels: Changelog: New Feature, mod: algorithms
Projects: None yet
Development: successfully merging this pull request may close these issues:
  • Add forward and backward options to finite difference gradient
6 participants