Merge #172
172: Add args/kwargs to `optimize_hyperparameters!()` r=odunbar a=odunbar

Resolves #168 to add args + kwargs into `optimize_hyperparameters!()` for `GPJL()` optimizer
## Content
- Adds ability to modify the arguments for `Optim` methods. 
- Adds a warning to the docstring about the positional `method` argument (with a default) forced upon us by `GaussianProcesses.jl`
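The pass-through pattern these changes introduce can be sketched in isolation. This is a minimal, self-contained example; `outer!` and `inner!` are hypothetical stand-ins for `optimize_hyperparameters!` and `optimize!`, not part of the package:

```julia
# Minimal sketch of the args/kwargs forwarding pattern in this PR.
# `inner!` mimics a wrapped optimizer call with a positional `method`
# argument and keyword options.
function inner!(state, method = "LBFGS"; noise = true, iterations = 100)
    push!(state, (method, noise, iterations))
    return state
end

# The wrapper accepts arbitrary positional and keyword arguments and
# splats them through unchanged, while forcing `noise = false` just as
# the wrapped `optimize!` call does.
function outer!(state, args...; kwargs...)
    inner!(state, args...; noise = false, kwargs...)
end

state = []
outer!(state)                          # defaults are forwarded
outer!(state, "CG"; iterations = 10)   # method and kwarg are forwarded
```

Because the wrapper never names the forwarded arguments, any future options of the wrapped call are supported without touching the wrapper again.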

## Examples 
I've verified these work in the `examples/Emulator/GaussianProcess/learn_noise.jl` example (after adding `Optim, LineSearches` to `examples/Emulator/GaussianProcess/Project.toml`).

1. Add kwargs only 
```julia
...
kwargs = (key1="val1", key2="val2")
optimize_hyperparameters!(emulator; kwargs...)
```
2. Modify args only (the user MUST add `Optim` and set the first argument to be the `method`, default `LBFGS()`)
```julia
using Optim
...
other_args = (LBFGS(), arg1, arg2) 
optimize_hyperparameters!(emulator, other_args...)
```
3. Modify the method linesearch [option](https://julianlsolvers.github.io/LineSearches.jl/stable/) (relating to #168)
```julia
using Optim, LineSearches
...
method_arg = LBFGS(linesearch=BackTracking())
optimize_hyperparameters!(emulator, method_arg)
```



Co-authored-by: odunbar <odunbar@caltech.edu>
bors[bot] and odunbar authored Jul 28, 2022
2 parents ada708d + df1a70f commit 3f51ea5
Showing 2 changed files with 8 additions and 5 deletions.
4 changes: 2 additions & 2 deletions src/Emulator.jl
Original file line number Diff line number Diff line change
Expand Up @@ -153,8 +153,8 @@ $(DocStringExtensions.TYPEDSIGNATURES)
Optimizes the hyperparameters in the machine learning tool.
"""
function optimize_hyperparameters!(emulator::Emulator{FT}) where {FT <: AbstractFloat}
optimize_hyperparameters!(emulator.machine_learning_tool)
function optimize_hyperparameters!(emulator::Emulator{FT}, args...; kwargs...) where {FT <: AbstractFloat}
optimize_hyperparameters!(emulator.machine_learning_tool, args...; kwargs...)
end


Expand Down
9 changes: 6 additions & 3 deletions src/GaussianProcess.jl
Original file line number Diff line number Diff line change
Expand Up @@ -168,13 +168,16 @@ end
$(DocStringExtensions.TYPEDSIGNATURES)
Optimize Gaussian process hyperparameters using the in-built package method.
Warning: if one uses `GPJL()` and wishes to modify positional arguments, the first positional argument must be the `Optim` method (default `LBFGS()`).
"""
function optimize_hyperparameters!(gp::GaussianProcess{GPJL})
function optimize_hyperparameters!(gp::GaussianProcess{GPJL}, args...; kwargs...)
N_models = length(gp.models)
for i in 1:N_models
# always regress with noise_learn=false; if gp was created with noise_learn=true
# we've already explicitly added noise to the kernel
optimize!(gp.models[i], noise = false)

optimize!(gp.models[i], args...; noise = false, kwargs...)
println("optimized hyperparameters of GP: ", i)
println(gp.models[i].kernel)
end
Expand Down Expand Up @@ -270,7 +273,7 @@ function build_models!(
end


function optimize_hyperparameters!(gp::GaussianProcess{SKLJL})
function optimize_hyperparameters!(gp::GaussianProcess{SKLJL}, args...; kwargs...)
println("SKlearn, already trained. continuing...")
end

Expand Down
