Implementation of the K-support norm proximity operator #71
Conversation
Pull Request Test Coverage Report for Build 216
💛 - Coveralls
Still working on some tests to improve the coverage, but it would be cool to have a preliminary review.
WIP PR: still a few things to change.
self.ridge = proximity.Ridge(linear.Identity(), weights)
self.elasticnet_alpha_0 = proximity.ElasticNet(linear.Identity(),
                                               alpha=0,
                                               beta=weights)
self.elasticnet_beta_0 = proximity.ElasticNet(linear.Identity(),
                                              alpha=weights,
                                              beta=0)
self.one_support = proximity.KSupportNorm(beta=0.2, k_value=1)
I might need to change the beta parameter so it basically rescales with the ridge in the case of k_value = dimension.
Are you planning to make this change in the current PR?
Do you think it would be a good idea to rescale the beta parameter the same way, so that it gives the same output as the ridge?
I think you are the expert in this context, so I will defer to your judgement. Either way I will be happy to merge the PR.
I'm making it compatible with the ridge; it will ease the comparison between the two methods.
@sfarrens I'm not sure that I can add more tests to increase the coverage, what do you think?
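For context, the comparison being discussed might look something like the sketch below. It is only a sketch: it assumes the new class is applied through the op method like the other ModOpt proximity operators, and it assumes a beta = 2 * weights correspondence between the two penalties, neither of which is guaranteed by this PR.

import numpy as np
from modopt.opt import linear, proximity

# Hypothetical check: with k_value equal to the data dimension, the squared
# k-support norm reduces to a plain squared l2 penalty, so its prox should
# agree with Ridge once beta is rescaled (the factor below is an assumption).
data = np.arange(10).astype(float)
weights = 0.5

ridge = proximity.Ridge(linear.Identity(), weights)
k_support = proximity.KSupportNorm(beta=2 * weights, k_value=data.size)

np.testing.assert_allclose(ridge.op(data), k_support.op(data), rtol=1e-6)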
modopt/opt/proximity.py
\underset{y \in \mathbb{C}^{N}}{\text{min}}\ \frac{1}{2}\|x - y\|_2^2 +
\frac{\beta}{2}\min\Big\{\sum_{I \in \mathcal{G}_k}\|v_I\|_2^2 :
\text{supp}(v_I) \subseteq I,\ \sum_{I \in \mathcal{G}_k} v_I = y\Big\}
I don't think this will render without the .. math:: command.
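For reference, and assuming the expression stays in the docstring rather than being replaced by a citation, the Sphinx directive could be used roughly as below; the raw string keeps the backslashes intact, and the surrounding docstring text is only a placeholder, not the PR's actual wording.

class KSupportNorm(object):
    r"""K-support norm proximity operator.

    Notes
    -----
    Solves the following optimisation problem:

    .. math::
        \underset{y \in \mathbb{C}^{N}}{\text{min}}\ \frac{1}{2}\|x - y\|_2^2
        + \frac{\beta}{2}\min\Big\{\sum_{I \in \mathcal{G}_k}\|v_I\|_2^2 :
        \text{supp}(v_I) \subseteq I,\ \sum_{I \in \mathcal{G}_k} v_I = y\Big\}

    """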
The expression is a bit complex; maybe we should just refer the user to the papers in the references?
That would be completely acceptable
I think I will refer to the paper
…ordinate has not been found
c624358 to c7e17a0 (compare)
…was wrong on the previous version
This implements the squared k-support norm and solves #68.
The implementation of the proximity operator is based on this paper, although it was originally proposed here.
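For readers arriving from #68, a minimal usage sketch follows. It assumes the constructor arguments shown in the test snippet above (beta, k_value) and that the operator is applied through the op method like the other ModOpt proximity operators; the data and parameter values are arbitrary.

import numpy as np
from modopt.opt.proximity import KSupportNorm

data = np.arange(10).astype(float)

# k_value=1 behaves like an l1-type penalty, k_value=data.size like a ridge;
# intermediate values interpolate between the two regimes.
prox_op = KSupportNorm(beta=0.2, k_value=3)
result = prox_op.op(data)  # proximal step for the squared k-support norm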