Implement and switch to lazy initval evaluation framework #4983
Conversation
force-pushed from e119af0 to ec38672
    
Codecov Report

```
@@             Coverage Diff             @@
##             main    #4983       +/-   ##
===========================================
+ Coverage   61.45%   78.21%   +16.76%
===========================================
  Files         130      131        +1
  Lines       24461    24525       +64
===========================================
+ Hits        15033    19183     +4150
+ Misses       9428     5342     -4086
```
    
force-pushed from ec38672 to 2beeb04
Could the remaining test failure …
    
force-pushed from 2beeb04 to b371760
I added two more tests: dependent initvals and resizing work. But something is odd with the random state. This is from the failing … That backend is deprecated and the test doesn't seem to be very targeted. It looks like the … Also, what exactly do we want in terms of RNG for initial value evaluation? Should it be independent, optionally with their own …?
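One option for the open RNG question, sketched under the assumption that per-chain reproducibility is the goal: derive independent child RNGs from one root seed with NumPy's `SeedSequence` (illustrative only, not necessarily what this PR implements):

```python
import numpy as np

# Spawn one independent, reproducible child RNG per chain from a root seed.
root = np.random.SeedSequence(42)
chain_rngs = [np.random.default_rng(s) for s in root.spawn(4)]

# Each chain's initial-value jitter would then draw from its own generator.
jitter = [rng.uniform(-1, 1, size=3) for rng in chain_rngs]
```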
    
force-pushed from 94764d5 to 756acec

force-pushed from 59f6c3c to 5f2ee98

force-pushed from 4b25e3d to f5aa61a
  
Good time to update the brittle … There's more context here (with a different pseudo-API): #4924 (comment)
    
force-pushed from f8c6562 to 40aa4fd
LGTM. What happens when an init point is -inf?
    
          
What do you mean? The model logp at the initial point? If so, same as before.
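As a reference point, a minimal check of the model logp at a candidate start point, assuming the v4-era `Model.compile_logp` helper:

```python
import numpy as np
import pymc as pm  # assuming the v4-era import name

with pm.Model() as model:
    x = pm.Normal("x")

# -inf or nan logp at the start point means the sampler cannot start there.
logp_fn = model.compile_logp()
print(logp_fn({"x": np.array(0.0)}))     # finite
print(logp_fn({"x": np.array(np.nan)}))  # nan: a bad starting point
```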
    
Both the pre-commit and docs checks worked on master. But that should be easy to fix(?)
    
force-pushed from 65294ff to d28f897
    
          
Rebased after #5070. Should be fixed now.
    
force-pushed from ea49486 to 1f6e8f9
  
Commits in this push:

- Related to pymc-devs#4924. With this commit, `"moment"` or `"prior"` become legal initvals. Furthermore, `rv.tag.test_value` is no longer assigned or used for initvals. The tolerance on `test_mle_jacobian` was eased to account for non-deterministic starting points of the optimization.
- …odel. This function can also handle variable-specific jittering and user-defined overrides. The `pm.sampling` module was adapted to use the new functionality. This changed the signature of `init_nuts` (see the sketch after this list):
  - the `start` kwarg becomes `initvals`
  - `initvals` are required to be complete for all chains
  - `seeds` can now be specified for all chains
- The test relied on monkey-patching the jitter so that the model's initial logp would fail predictably. This does not seem to be possible with the new numpy random generators, so a different test strategy has to be developed.
- To unblock this PR/branch from the aeppl integration.
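A hedged sketch of the reworked sampling call described in the second commit above, assuming `initvals` accepts one dict per chain and `random_seed` one seed per chain:

```python
import pymc as pm  # assuming the v4-era import name

with pm.Model():
    x = pm.Normal("x")
    # `start` becomes `initvals`: complete per-chain start dicts,
    # plus per-chain seeds (illustrative of the new signature).
    idata = pm.sample(
        chains=2,
        initvals=[{"x": -1.0}, {"x": 1.0}],
        random_seed=[101, 202],
    )
```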
force-pushed from 1f6e8f9 to ff3b7f7
I can't approve since I'm the original author.
That one nitpick comment I made shouldn't hold us back from anything.
```python
    rvs_to_jitter : set
        The random variables for which jitter should be added.
    """
    # TODO: implement this
```
Created an issue for this: #5077
```python
import aesara.tensor as at
from aesara.graph.basic import graph_inputs

def find_rng_nodes(variables):
    # Collect all RNG shared variables among the graph inputs.
    return [
        node
        for node in graph_inputs(variables)
        if isinstance(
            node,
            (
                at.random.var.RandomStateSharedVariable,
                at.random.var.RandomGeneratorSharedVariable,
            ),
        )
    ]
```
Can be extracted
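For context, a sketch of how the extracted helper could be used to reseed every RNG in a graph; the `reseed_rngs` name and the seeding scheme are assumptions, not this PR's implementation:

```python
import numpy as np
import aesara.tensor as at

def reseed_rngs(variables, seed):
    # Give each RNG shared variable found by find_rng_nodes a fresh,
    # seed-derived state so repeated evaluations are reproducible.
    rng = np.random.default_rng(seed)
    for node in find_rng_nodes(variables):
        child_seed = int(rng.integers(2**30))
        if isinstance(node, at.random.var.RandomStateSharedVariable):
            node.set_value(np.random.RandomState(child_seed), borrow=True)
        else:
            node.set_value(np.random.default_rng(child_seed), borrow=True)
```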
🥳

Changes
- `Model.initial_values` and `Distribution(initval=...)` can now take `None`, `ndarray`, `Variable`, `"moment"` or `"prior"` (see the sketch after this list)
- `Model.initial_values` are now managed by RV instead of by value var tensor
- `init_nuts` signature changes
- `pm.sample()` `start` kwarg change of name and meaning
- `Model.update_start_vals` was removed (with an informative error message)
- New module `initial_point.py`
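A short sketch of the accepted initval options listed above (hedged; names per the PR description, exact behavior is an assumption):

```python
import numpy as np
import pymc as pm  # assuming the v4-era import name

with pm.Model() as model:
    a = pm.Normal("a", initval=np.array(0.5))               # ndarray
    b = pm.HalfNormal("b", initval="moment")                # start at the moment
    c = pm.Uniform("c", lower=0, upper=1, initval="prior")  # draw from the prior
    d = pm.Normal("d", initval=None)                        # default strategy
```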
ToDo

- … (`test_initvals.py::TestInitvalEvaluation::test_initval_resizing`)
- Mention changes in the RELEASE-NOTES.md (doing this in the HackMD document)

Closes #4924
Closes #4484