initializing AQGD optimizer #768
Comments
At some point there was a change to the JSON Schema specification for exclusiveMaximum, and we corrected our code to match the latest schema spec in 0.6.2. Try installing the latest released version of Aqua, which is 0.6.2.
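For context, the semantics of `exclusiveMaximum` changed between JSON Schema Draft 4 and Draft 6: it went from a boolean modifier on `maximum` to a standalone numeric bound. A minimal hand-rolled sketch of the two behaviors (illustrative only, not Aqua's actual validator):

```python
def exclusive_max_ok_draft4(value, schema):
    """Draft 4: 'exclusiveMaximum' is a boolean that modifies 'maximum'."""
    maximum = schema.get("maximum")
    if maximum is None:
        return True
    if schema.get("exclusiveMaximum", False):
        return value < maximum   # strict upper bound
    return value <= maximum      # inclusive upper bound

def exclusive_max_ok_draft6(value, schema):
    """Draft 6+: 'exclusiveMaximum' is itself the numeric strict bound."""
    bound = schema.get("exclusiveMaximum")
    return bound is None or value < bound

# A schema written for one draft is interpreted differently under the other,
# which is the kind of mismatch that produced the validation error here.
print(exclusive_max_ok_draft4(1.0, {"maximum": 1.0, "exclusiveMaximum": True}))  # False
print(exclusive_max_ok_draft6(0.5, {"exclusiveMaximum": 1.0}))                   # True
```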
Thanks a lot. I had to upgrade the whole qiskit package to make it work perfectly (Y)
I'm sorry to close the issue without permission.
I had expected you to close this issue since you created it and a fix was given that worked - you are permitted to do this as well as re-open, so no problem :) The issue arises now from this line. As for momentum, it's used as per this line, where both the value and 1 minus the value are used. Is it this line you are questioning?
Hi, thanks for your reply.
Should I change it and make a pull request? It's working now on my machine, and I also tested it on a Windows machine and Colab.
Yeah, that's the line. I'm actually accustomed to 0.9, as it's the default value in the TensorFlow and PyTorch libraries. When I was trying to explain quantum gradients to my colleagues, it actually stopped me for a while 😅 so I had to implement the whole algorithm from scratch to keep it consistent with the flow of other ML libraries. It's only a suggestion.
Hi, sure, having a PR to fix the `self._previous_loss` test would be great. I recall the defaults were chosen based on some testing that was done - mostly to do with the analytic gradient of the RY variational form when used in VQE for a ground-state energy computation, where these values seemed to do well. I was looking at PyTorch SGD and it defaults to 0 - were you looking at some other code?
Of course, the default would be zero for pure SGD as you mentioned, sir, but with momentum most people use 0.9, like in this simple example from Keras (the first code snippet). I was not aware of the RY thing; I'll check it some other time.
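The convex-combination form discussed above, where both the momentum value and 1 minus the value weight the update, can be sketched as follows. The function and parameter names here are illustrative, not AQGD's actual code:

```python
def momentum_update(params, grads, velocity, eta=0.1, momentum=0.25):
    """One gradient step where the velocity is a convex combination of the
    previous velocity and the new gradient (momentum and 1 - momentum)."""
    velocity = [momentum * v + (1.0 - momentum) * g
                for v, g in zip(velocity, grads)]
    params = [p - eta * v for p, v in zip(params, velocity)]
    return params, velocity

# With momentum=0 this reduces to plain gradient descent:
p, v = momentum_update([1.0], [2.0], [0.0], eta=0.1, momentum=0.0)
print(p)  # [0.8]
```

Note that in this form a momentum of 0.9 weights the gradient by only 0.1, which differs from the PyTorch/Keras convention where the gradient keeps full weight; that convention gap is what the comment above is pointing at.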
Closing as this was fixed by #770 |
Hi @woodsp-ibm, I'm now having the same issue because of the while loop 😢
Hi @kareem1925, the PR was for the master branch. It has not yet been released but will be part of the upcoming 0.7.0, currently planned for March. In master the line has changed: https://github.com/Qiskit/qiskit-aqua/blob/769ca8f7fbb91fcfb4dd47c956b6358bb53212ef/qiskit/aqua/components/optimizers/aqgd.py#L141 so installing Qiskit from a source clone of the repo should work. If you installed a stable release, 0.6.4 or earlier, it will still behave as you reported; only master was changed by the PR.
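The underlying pitfall in the released versions is a Python truthiness test on the stored loss: a legitimate loss of exactly 0.0 is falsy, so the convergence check inside the while loop is skipped. A minimal sketch of the bug and the style of fix (names assumed here, not Aqua's exact code):

```python
def converged_buggy(previous_loss, current_loss, tol=1e-6):
    # Bug: a previous loss of exactly 0.0 is falsy, so the check is skipped
    # and the loop cannot terminate on convergence.
    if previous_loss:
        return abs(previous_loss - current_loss) < tol
    return False

def converged_fixed(previous_loss, current_loss, tol=1e-6):
    # Fix: compare explicitly against None so loss == 0.0 is handled.
    if previous_loss is not None:
        return abs(previous_loss - current_loss) < tol
    return False

print(converged_buggy(0.0, 0.0))  # False -- convergence missed
print(converged_fixed(0.0, 0.0))  # True
```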
This fixes issue qiskit-community#768, where the if condition on the loss value breaks the while loop.
[Stable] Release 0.6.5 - Remove cvxopt from install, backport fix for issue #768
…ix-issue-#768 Fixing issue qiskit-community/qiskit-aqua#768
Information
Qiskit Aqua version: 0.6.1
Python version: 3.6.9
Operating system: Ubuntu 18.04
What is the current behavior?
An error happens when I try to initialize the AQGD optimizer.
Steps to reproduce the problem
What is the expected behavior?
I expect that nothing goes wrong, just as with the following code:
opt = optimizers.ADAM(maxiter=20,lr=0.2)
There are no errors when executing this line.
Suggested solutions
I believe something goes wrong while validating the configuration file of this optimizer. I think modifying it may help; I'm still looking into it.