Visualizing hyperparameter tuning results for arbitrary numbers of parameters #416
This sounds like a good suggestion to me.
@ablaom @tlienart
If you run:

```r
m <- train(Species ~ ., method = "glmnet", data = trainSet)
m
plot(m)
```

you automatically get scores (Accuracy/Kappa) for 9 combinations of alpha/lambda.

```r
m <- train(Species ~ ., method = "glmnet", data = trainSet, tuneLength = 5)
m
plot(m)
```

This automatically generates a grid with 5 values of alpha and 5 of lambda, for a total grid of 25 elements. I'm not sure if this plot option from caret is what @baggepinnen had in mind, but I kinda like it. I'd love to see an option like `tuneLength` in MLJ, but you could also include something smarter.
Here is the code:

```r
library(caret)
data("iris")
set.seed(123)

my_index <- createDataPartition(iris$Sepal.Length, p = 0.75, list = FALSE)
trainSet <- iris[my_index, ]
testSet  <- iris[-my_index, ]

# caret's default grids per model:
# https://github.com/topepo/caret/blob/master/models/files/glmnet.R
getModelInfo("glmnet")

# default tuning: alpha = 0.10/0.55/1.00, 3 default lambda values,
# scored by Accuracy/Kappa
set.seed(123)
m <- train(Species ~ ., method = "glmnet", data = trainSet)
m
plot(m)

# tuneLength = 5: a 5 x 5 alpha/lambda grid
set.seed(123)
m <- train(Species ~ ., method = "glmnet", data = trainSet, tuneLength = 5)
m
plot(m)
```
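For comparison, here is a hedged sketch of roughly the same experiment in MLJ, assuming MLJLinearModels' `LogisticClassifier` (elastic net) as the glmnet stand-in and the Plots recipe for tuned machines; `Grid(resolution=5)` plays the role of `tuneLength = 5`. The bounds and measure are illustrative, not settled defaults:

```julia
using MLJ, Plots

X, y = @load_iris

# assumption: elastic-net logistic regression as the glmnet analogue
Clf = @load LogisticClassifier pkg=MLJLinearModels verbosity=0
model = Clf(penalty=:en)

# log-scaled ranges for the two penalty strengths (bounds are illustrative)
r_lambda = range(model, :lambda, lower=1e-4, upper=1.0, scale=:log)
r_gamma  = range(model, :gamma,  lower=1e-4, upper=1.0, scale=:log)

tuned = TunedModel(model=model,
                   tuning=Grid(resolution=5),   # 5 x 5 grid, like tuneLength = 5
                   resampling=CV(nfolds=5),
                   range=[r_lambda, r_gamma],
                   measure=cross_entropy)

mach = machine(tuned, X, y)
fit!(mach)
plot(mach)   # measurements over the grid, via MLJ's plot recipe
```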
@azev77 The key challenge for us would be setting up default grids for our existing models. Is it possible to scrape a list of default grids for each caret model? This could be quite useful for MLJ devs. (Although, in the case of numeric parameters, I am proposing we specify default ranges (ParamRange objects), which roughly specify the search space without specifying the resolution. These are bounded intervals, or, in the semi-bounded case, an upper/lower limit plus an "origin" and "unit". From these, either grids or pdfs could be constructed, depending on further parameters appropriate to the particular tuning strategy: random, Latin hypercube, Bayesian, and so forth. See the sketch below.)
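As a hedged illustration of that proposal, using MLJ's existing `range` constructor; the `iterator` and `sampler` calls show how I'd expect a tuning strategy to consume such ranges, and the details should be treated as assumptions:

```julia
using MLJ
import Distributions

model = (@load RidgeRegressor pkg=MLJLinearModels verbosity=0)()

# bounded interval: a grid can be derived at any chosen resolution
r_bounded = range(model, :lambda, lower=1e-3, upper=10.0, scale=:log)
iterator(r_bounded, 5)           # 5-point grid over the range

# semi-bounded: lower limit only, so an `origin` and `unit` fix the scale
r_semi = range(model, :lambda, lower=0.0, origin=1.0, unit=1.0)
rand(sampler(r_semi, Distributions.Gamma), 3)  # 3 draws from a pdf fitted to the range
```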
@azev77 Thanks for that, suggestion noted. As tuning is iterative, control is to be externalized, per the plan for implementing any kind of control of any iterative model (tuning included). See here: #139
Btw, just to be clear: I don't mean a time limit just for tuning, I mean a time limit for training in general. (A sketch of what that might look like is below.)
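For concreteness, a minimal sketch of the externalized-control idea, assuming the kind of API that #139 points toward; names like `IteratedModel`, `Step`, `TimeLimit`, and `Patience` are taken from MLJ's later iteration controls and are assumptions here, and the model choice is arbitrary:

```julia
using MLJ

# assumption: any iterative model works; EvoTrees is just an example
Booster = @load EvoTreeClassifier pkg=EvoTrees verbosity=0

iterated = IteratedModel(model=Booster(),
                         resampling=Holdout(fraction_train=0.7),
                         measure=log_loss,
                         controls=[Step(10),        # add 10 iterations per step
                                   TimeLimit(1/60), # budget: ~1 minute (t in hours)
                                   Patience(5)])    # stop after 5 non-improving steps

X, y = @load_iris
mach = machine(iterated, X, y)
fit!(mach)   # trains until a control, e.g. the time limit, says stop
```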
Facebook AI has a new take on hyperparameter visualization.
Hey guys, my conversation w/ @yalwan-iqvia about TreeParzen.jl got me thinking about HP optimization frameworks. A nice comparison with Hyperopt shows what can be done for HP visualization. A 3-minute clip: https://www.youtube.com/watch?v=-UeC4MR3PHM. It would really be amazing for MLJ to incorporate this!
Yeah, Optuna is cool and the team behind it is pretty solid. This is a project by itself, though: to do an Optuna.jl (with an interface to MLJ). Maybe something worth announcing on discourse to see if there are any takers. (A rough sketch of where such a bridge might plug in is below.)
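Purely as a hypothetical sketch of where such a bridge might plug in, here is the shape of a tuning-strategy stub; the type and method names mimic MLJTuning's strategy interface as I read it, and the Optuna side is entirely imaginary:

```julia
import MLJTuning

# hypothetical strategy delegating trial suggestions to an Optuna backend
struct OptunaTuning <: MLJTuning.TuningStrategy
    sampler::String   # e.g. "TPE"; field is illustrative only
end

function MLJTuning.setup(tuning::OptunaTuning, model, range, n, verbosity)
    # here one would create an Optuna study (via PyCall, or a native port
    # such as the TreeParzen.jl work mentioned above)
    return (range=range, trials=Any[])
end

function MLJTuning.models(tuning::OptunaTuning, model, history,
                          state, n_remaining, verbosity)
    # here one would ask the study for the next trial's hyperparameters,
    # mutate copies of `model` accordingly, and return them with the state
    return nothing, state
end
```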
@azev77 Could you please re-post this suggestion at MLJTuning.jl? Thanks |
Suggestion of @baggepinnen, copied from #85: