Author: Nitish Shirish Keskar
OBA is a second-order method for convex L1-regularized optimization with active-set prediction. OBA belongs to the family of Orthant-Based methods (such as OWL) and uses a selective corrective mechanism that improves both efficiency and robustness.
The OBA package
- solves general convex L1-regularized problems, including Logistic Regression and LASSO.
- is written in pure-MATLAB with minimal dependencies and emphasizes simplicity and cross-platform compatibility.
- includes both Newton and quasi-Newton options for the proposed algorithm.
The algorithm can be run using the syntax
`x = OBA(funObj, lambda, [options]);`
Here,
- `funObj` is an object with member functions for computing the function value, gradient, and Hessian-vector products at the iterates. Logistic Regression and LASSO classes are provided with the package. The file `funTemplate.m` can be used as a base for designing a custom function.
- `lambda` is the positive scalar for inducing sparsity in the solution.
- `options` is an optional argument for changing the default parameters used in OBA. For ease of use, the user can generate the default options struct using `options = GenOptions()` and change the parameters therein before passing it to OBA.
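A custom objective can be modeled on `funTemplate.m`. As a rough sketch of what such a class might look like, here is a least-squares objective (the smooth part of a LASSO problem); the method names `func`, `grad`, and `hessvec` are illustrative assumptions, and the actual names and signatures should be taken from `funTemplate.m` in the package:

```matlab
% Hypothetical custom objective for OBA: f(x) = 0.5*||Ax - b||^2.
% Method names are placeholders; consult funTemplate.m for the real interface.
classdef MyLeastSquares
    properties
        A   % data matrix
        b   % observation vector
    end
    methods
        function obj = MyLeastSquares(A, b)
            obj.A = A;
            obj.b = b;
        end
        function f = func(obj, x)           % function value at x
            r = obj.A * x - obj.b;
            f = 0.5 * (r' * r);
        end
        function g = grad(obj, x)           % gradient: A'(Ax - b)
            g = obj.A' * (obj.A * x - obj.b);
        end
        function Hv = hessvec(obj, ~, v)    % Hessian-vector product: A'(Av)
            Hv = obj.A' * (obj.A * v);
        end
    end
end
```

Supplying Hessian-vector products (rather than the full Hessian) is what allows OBA's Newton mode to rely on CG iterations, so large problems never need an explicit Hessian matrix.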
The parameters and their default values are
- `options.optol`: termination tolerance (default: 1e-6)
- `options.qn`: 0 for Newton's method, 1 for quasi-Newton (default: 0)
- `options.mem_size`: quasi-Newton memory size (default: 20)
- `options.maxiter`: maximum number of iterations (default: 1000)
- `options.printlev`: print level, 0 (no printing) or 1 (default: 1)
- `options.CGtol`: CG termination tolerance (Newton's method) (default: 1e-1)
- `options.maxCGiter`: maximum number of CG iterations (Newton's method) (default: 1000)
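Putting the pieces together, a call might look like the following. `GenOptions` and `OBA` come from the package; `LeastSquaresObj` is a placeholder for one of the provided problem classes (e.g. the LASSO class), whose exact constructor should be checked in the package source:

```matlab
% Hypothetical end-to-end usage sketch.
options = GenOptions();        % default parameter struct
options.qn = 1;                % switch from Newton to quasi-Newton
options.mem_size = 10;         % shrink the quasi-Newton memory
options.printlev = 0;          % run silently
lambda = 0.1;                  % regularization weight

funObj = LeastSquaresObj(A, b);    % placeholder for a provided class
x = OBA(funObj, lambda, options);  % approximate minimizer of f(x) + lambda*||x||_1
```

Since `options` is optional, `x = OBA(funObj, lambda);` runs with all defaults.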
For detailed documentation of OBA and its associated functions, use `help OBA`.
If you use OBA for your research, please cite the paper
@article{OBA_Keskar2016,
author = {N. Keskar and J. Nocedal and F. Öztoprak and A. Wächter},
title = {A second-order method for convex $\ell_1$-regularized optimization with active-set prediction},
journal = {Optimization Methods and Software},
volume = {0},
number = {0},
pages = {1-17},
year = {2016},
doi = {10.1080/10556788.2016.1138222},
URL = {http://dx.doi.org/10.1080/10556788.2016.1138222},
eprint = {http://dx.doi.org/10.1080/10556788.2016.1138222}
}