
refactor pareto and laplace #4691

Merged: 9 commits into pymc-devs:v4 on May 14, 2021

Conversation

@farhanreynaldo (Contributor):

This PR closes a subset of #4686.

I'm still unclear about the default shape of the RandomVariable class, so I might need suggestions on that part. The tests_to_run parameter in test_distribution_random.py also baffles me; your input is much appreciated.

@ricardoV94 (Member) commented May 13, 2021:

It seems like the Pareto has already been half refactored here: #4593

It should be enough to remove the unused random and _random methods in the class. No need to create a custom pareto op, as it already exists in Aesara and is imported here:

https://github.com/pymc-devs/pymc3/blob/351dad0eb5504c3019fe59813c56290ad7123068/pymc3/distributions/continuous.py#L40

The tests in distributions_random look fine.
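
For context, the pattern described above looks roughly like this. This is a minimal sketch, assuming the rv_op/dist structure used by the already refactored v4 distributions; the helper names (BoundedContinuous, floatX, pymc3.aesaraf) are taken from the existing codebase, not from this PR:

import aesara.tensor as at
from aesara.tensor.random.basic import pareto  # the existing Aesara op

from pymc3.aesaraf import floatX
from pymc3.distributions.continuous import BoundedContinuous

class Pareto(BoundedContinuous):
    # Reuse the Aesara op directly; no custom random/_random methods are needed.
    rv_op = pareto

    @classmethod
    def dist(cls, alpha, m, **kwargs):
        alpha = at.as_tensor_variable(floatX(alpha))
        m = at.as_tensor_variable(floatX(m))
        return super().dist([alpha, m], **kwargs)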

@twiecki mentioned this pull request on May 13, 2021.
@farhanreynaldo changed the title from "refactor pareto" to "refactor pareto and laplace" on May 13, 2021.
@farhanreynaldo (Contributor, Author):

> It seems like the Pareto has already been half refactored here: #4593
>
> It should be enough to remove the unused random and _random methods in the class. No need to create a custom pareto op, as it already exists in Aesara and is imported here:
>
> https://github.com/pymc-devs/pymc3/blob/351dad0eb5504c3019fe59813c56290ad7123068/pymc3/distributions/continuous.py#L40
>
> The tests in distributions_random look fine.

Ah thank you, I just noticed that.

pymc_dist = pm.Pareto
pymc_dist_params = {"alpha": 3.0, "m": 2.0}
expected_rv_op_params = {"alpha": 3.0, "m": 2.0}
reference_dist_params = {"alpha": 3.0, "scale": 2.0}
@ricardoV94 (Member), May 13, 2021:

reference_dist_params should be "b" and "scale", I think; that's why the tests are failing.

@farhanreynaldo (Contributor, Author):

Instead of seeded_numpy_distribution_builder, I should use seeded_scipy_distribution_builder to match the params, right?

@ricardoV94 (Member):

Yes, you're right. Aesara is using scipy for the Pareto.

pymc_dist_params = {"alpha": 3.0, "m": 2.0}
expected_rv_op_params = {"alpha": 3.0, "m": 2.0}
reference_dist_params = {"alpha": 3.0, "scale": 2.0}
size = 15
@ricardoV94 (Member):

No need to specify size; 15 is already the default.
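
Putting the review suggestions above together ("b"/"scale" reference params, the scipy builder, and dropping the explicit size), the test class would look roughly like the following sketch; BaseTestDistribution and the check names in tests_to_run are assumed from the surrounding test file, not spelled out in this thread:

import pymc3 as pm
from pymc3.tests.test_distributions_random import (
    BaseTestDistribution,
    seeded_scipy_distribution_builder,
)

class TestPareto(BaseTestDistribution):
    pymc_dist = pm.Pareto
    pymc_dist_params = {"alpha": 3.0, "m": 2.0}
    expected_rv_op_params = {"alpha": 3.0, "m": 2.0}
    # scipy.stats.pareto calls the shape parameter "b" and the scale "scale"
    reference_dist_params = {"b": 3.0, "scale": 2.0}
    reference_dist = seeded_scipy_distribution_builder("pareto")
    tests_to_run = [
        "check_pymc_params_match_rv_op",
        "check_pymc_draws_match_reference",
    ]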

@ricardoV94 (Member) commented May 13, 2021:

@farhanreynaldo I noticed we lost the default transform of the Pareto in #4593. I pushed a fix to this PR.

Make sure to pull the changes to your local repo before pushing new commits!

pymc_dist_params = {"mu": 0.0, "b": 1.0}
expected_rv_op_params = {"mu": 0.0, "b": 1.0}
reference_dist_params = {"loc": 0.0, "scale": 1.0}
size = 15
@ricardoV94 (Member), May 13, 2021:

size can be removed here too.
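
Analogously for the Laplace hunk above, with size dropped. This is a sketch that assumes seeded_numpy_distribution_builder is the appropriate reference builder here, since NumPy's Generator.laplace takes loc and scale directly:

import pymc3 as pm
from pymc3.tests.test_distributions_random import (
    BaseTestDistribution,
    seeded_numpy_distribution_builder,
)

class TestLaplace(BaseTestDistribution):
    pymc_dist = pm.Laplace
    pymc_dist_params = {"mu": 0.0, "b": 1.0}
    expected_rv_op_params = {"mu": 0.0, "b": 1.0}
    reference_dist_params = {"loc": 0.0, "scale": 1.0}
    reference_dist = seeded_numpy_distribution_builder("laplace")
    tests_to_run = [
        "check_pymc_params_match_rv_op",
        "check_pymc_draws_match_reference",
    ]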

@ricardoV94 (Member) commented May 13, 2021:

I'm sorry, I just realized the Laplace is already implemented in Aesara; I must have missed it in #4640.

You can simply import the op from there; no need to recreate it.

https://github.com/pymc-devs/aesara/blob/abf6026c9c2bad19216275e127e3043f30d93360/aesara/tensor/random/basic.py#L448

I suggest you check whether I missed any other ops that are already implemented there. The Truncated Normal and Wald seem to be there already as well.

@farhanreynaldo (Contributor, Author):

> @farhanreynaldo I noticed we lost the default transform of the Pareto in #4593. I pushed a fix to this PR.
>
> Make sure to pull the changes to your local repo before pushing new commits!

I am terribly sorry, I did not intend to do so. Should I run the following commands before pushing new commits to the current branch?

git fetch upstream
git rebase upstream/v4
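
For reference, the full sequence that usually applies here; a sketch assuming "upstream" points at pymc-devs/pymc3, "origin" at the fork, and that the PR branch has already been pushed (the force push is a consequence of rebasing, not something stated in this thread):

git fetch upstream                        # get the latest upstream commits
git rebase upstream/v4                    # replay local commits on top of upstream/v4
git push --force-with-lease origin HEAD   # rebasing rewrites history, so the PR branch needs a force push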

@farhanreynaldo (Contributor, Author):

> I'm sorry, I just realized the Laplace is already implemented in Aesara; I must have missed it in #4640.
>
> You can simply import the op from there; no need to recreate it.
>
> https://github.com/pymc-devs/aesara/blob/abf6026c9c2bad19216275e127e3043f30d93360/aesara/tensor/random/basic.py#L448
>
> I suggest you check whether I missed any other ops that are already implemented there. The Truncated Normal and Wald seem to be there already as well.

I just saw the link you sent me, but the LaplaceRV class doesn't implement the rng_fn method. Is it going to work as it should?
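
A minimal sketch of why the missing rng_fn is not a problem, assuming Aesara's RandomVariable base class provides a default rng_fn that dispatches to the NumPy Generator method matching the op's name:

import numpy as np
from aesara.tensor.random.basic import laplace

# LaplaceRV defines no rng_fn of its own, so the base RandomVariable's default
# rng_fn is used, which calls getattr(rng, "laplace"), i.e. Generator.laplace.
rng = np.random.default_rng(123)
draws = laplace.rng_fn(rng, 0.0, 1.0, (3,))  # loc, scale, size
print(draws.shape)  # (3,)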

@farhanreynaldo (Contributor, Author):

I think I messed up the git merge once again...

@ricardoV94 (Member):

@farhanreynaldo No problem. I'm not very good at git either :p

I think I fixed the commit history in this branch. Let me know if there was anything else you wanted to change or if I can go ahead and merge?

@farhanreynaldo (Contributor, Author):

> @farhanreynaldo No problem. I'm not very good at git either :p
>
> I think I fixed the commit history in this branch. Let me know if there was anything else you wanted to change or if I can go ahead and merge?

Yes, you can merge it. Thank you so much for your help on this PR!

@ricardoV94 merged commit 791a1c4 into pymc-devs:v4 on May 14, 2021.
@ricardoV94 (Member):

@farhanreynaldo Thanks so much for taking this on! Looking forward to your next PR!

twiecki pushed a commit that referenced this pull request on Jun 5, 2021:
* refactor pareto
* refactor laplace
* Reintroduce `Pareto` default transform

Co-authored-by: Farhan Reynaldo <farhanreynaldo@gmail.com>
Co-authored-by: Ricardo <ricardo.vieira1994@gmail.com>