lazy activation of models not working from within packages #22
I'm afraid I cannot reproduce your problem:

julia> module Abc
import XGBoost: dump_model, save, Booster
using MLJ
using MLJBase
import MLJModels
using MLJModels.XGBoost_
function __init__()
@info "Abc"
end
end
[ Info: Recompiling stale cache file /Users/anthony/.julia/compiled/v1.1/XGBoost/rSeEh.ji for XGBoost [009559a3-9522-5dbb-924b-0b6ed2b22bb9]
[ Info: Abc
Main.Abc
julia> using MLJ
julia> task = load_boston()
SupervisedTask @ 5…85
julia> model = Abc.XGBoostRegressor()
MLJModels.XGBoost_.XGBoostRegressor(num_round = 1,
booster = "gbtree",
disable_default_eval_metric = 0,
eta = 0.3,
gamma = 0.0,
max_depth = 6,
min_child_weight = 1.0,
max_delta_step = 0.0,
subsample = 1.0,
colsample_bytree = 1.0,
colsample_bylevel = 1.0,
lambda = 1.0,
alpha = 0.0,
tree_method = "auto",
sketch_eps = 0.03,
scale_pos_weight = 1.0,
updater = "grow_colmaker",
refresh_leaf = 1,
process_type = "default",
grow_policy = "depthwise",
max_leaves = 0,
max_bin = 256,
predictor = "cpu_predictor",
sample_type = "uniform",
normalize_type = "tree",
rate_drop = 0.0,
one_drop = 0,
skip_drop = 0.0,
feature_selector = "cyclic",
top_k = 0,
tweedie_variance_power = 1.5,
objective = "reg:linear",
base_score = 0.5,
eval_metric = "rmse",
seed = 0,) @ 1…89
julia> mach = machine(model, task)
Machine{XGBoostRegressor} @ 1…99
julia> evaluate!(mach)
┌ Info: Evaluating using cross-validation.
│ nfolds=6.
│ shuffle=false
│ measure=MLJ.rms
│ operation=StatsBase.predict
└ Resampling from all rows.
Cross-validating: 100%[=========================] Time: 0:00:01
6-element Array{Float64,1}:
15.071084701486205
16.70750413097405
22.12771143813795
20.89991496287021
15.434870166858115
11.602463981185641

Have you got MLJModels in your load path? You need MLJModels and MLJ in your project. Perhaps send me the result of
Please see attached package.
It might be a concurrency issue and be unstable. I cannot say that I see it always, but in most cases it is present. Julia version 1.0.3, macOS.
Strange. I still can't reproduce your problem after activating the environment you sent:

(working) pkg> activate .
(Abc) pkg> instantiate
Updating registry at `~/.julia/registries/General`
Updating git-repo `https://github.com/JuliaRegistries/General.git`
julia> module Abc
import XGBoost: dump_model, save, Booster
using MLJ
using MLJBase
import MLJModels
using MLJModels.XGBoost_
function __init__()
@info "Abc"
end
end
Main.Abc
julia> using MLJ
julia> task = load_boston()
SupervisedTask @ 1…38
julia> model = Abc.XGBoostRegressor()
MLJModels.XGBoost_.XGBoostRegressor(num_round = 1,
booster = "gbtree",
disable_default_eval_metric = 0,
eta = 0.3,
gamma = 0.0,
max_depth = 6,
min_child_weight = 1.0,
max_delta_step = 0.0,
subsample = 1.0,
colsample_bytree = 1.0,
colsample_bylevel = 1.0,
lambda = 1.0,
alpha = 0.0,
tree_method = "auto",
sketch_eps = 0.03,
scale_pos_weight = 1.0,
updater = "grow_colmaker",
refresh_leaf = 1,
process_type = "default",
grow_policy = "depthwise",
max_leaves = 0,
max_bin = 256,
predictor = "cpu_predictor",
sample_type = "uniform",
normalize_type = "tree",
rate_drop = 0.0,
one_drop = 0,
skip_drop = 0.0,
feature_selector = "cyclic",
top_k = 0,
tweedie_variance_power = 1.5,
objective = "reg:linear",
base_score = 0.5,
eval_metric = "rmse",
seed = 0,) @ 5…98
julia> mach = machine(model, task)
Machine{XGBoostRegressor} @ 1…64
julia> evaluate!(mach)
┌ Info: Evaluating using cross-validation.
│ nfolds=6.
│ shuffle=false
│ measure=MLJ.rms
│ operation=StatsBase.predict
└ Resampling from all rows.
Cross-validating: 100%[=========================] Time: 0:00:02
6-element Array{Float64,1}:
15.071084701486205
16.70750413097405
22.12771143813795
20.89991496287021
15.434870166858115
11.602463981185641
julia> versioninfo()
Julia Version 1.0.3
Commit 099e826241 (2018-12-18 01:34 UTC)
Platform Info:
OS: macOS (x86_64-apple-darwin14.5.0)
CPU: Intel(R) Core(TM) i7-8850H CPU @ 2.60GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.0 (ORCJIT, skylake)
Environment:
JULIA_PATH = /Applications/Julia-1.1.app/Contents/Resources/julia/bin/julia
Run on macOS.
Can you try to run it without the REPL, from the command line with ./build.jl only? Again, I think something like a concurrency issue is here. Also, I have a slightly older laptop:
Yes, now I can reproduce your issue. Many thanks for this. I would say we have uncovered a limitation of Requires.jl. Do you not agree? A secondary question is whether the

Edit July 23, 2020: I can confirm that if the interface is provided by a package without the use of Requires, then the issue is not there.
Yes, it might be a restriction of Requires.jl. See also the double call of __init__ described in the issue body. So, we have some workaround. Regarding how to fix: since the issue is confirmed, maybe just file the same issue with my sample on Requires.jl's issue list, if nobody can dive into it now. Regarding loading of models, for now I'm using
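For context, glue code behind this kind of lazy activation is typically wired with Requires.jl along the following lines. This is a minimal sketch, not the actual MLJModels source: `GlueProvider` and the included file name are hypothetical, while the UUID is XGBoost's, as shown in the recompilation log above.

```julia
module GlueProvider  # hypothetical stand-in for a package like MLJModels

using Requires  # provides the @require macro for conditional code loading

function __init__()
    # The block below runs only once Requires.jl observes XGBoost being
    # loaded. When that load happens inside *another* package's __init__,
    # the callback timing becomes fragile, which is consistent with the
    # symptom (UndefVarError: XGBoost_ not defined) reported in this thread.
    @require XGBoost="009559a3-9522-5dbb-924b-0b6ed2b22bb9" begin
        include("XGBoost_.jl")  # hypothetical file defining the glue module
    end
end

end
```

Because the glue is registered in `__init__` and only fires on a later observed load, any mechanism that loads XGBoost before or during that registration can defeat it.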
Although I am doubtful, I thought it worth mentioning that there was a refactor of @load that possibly resolves this issue. MLJModels 0.4.0 (which now owns the method) incorporates the changes.
Update: This issue is unresolved under MLJModels 0.5.0.
@ablaom is this still a (relevant) issue?
Noted. The long-term plan is to "disintegrate" MLJModels into individual packages, eliminating all use of Requires.jl. Then loading a model whose glue code is currently provided by MLJModels should be no different from loading models from packages that natively support the MLJ model interface (e.g., EvoTrees.jl, MLJLinearModels.jl). In these cases, I am not aware of any issue, but let me know if you discover one.
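For comparison, loading a natively-interfaced model goes through MLJ's @load macro with no Requires.jl glue involved. A sketch, assuming MLJ and EvoTrees are installed in the active environment:

```julia
using MLJ

# @load imports the interface-providing package (EvoTrees here) and
# returns the model type for construction.
Tree = @load EvoTreeRegressor pkg=EvoTrees

model = Tree(max_depth = 6)  # construct an instance with a keyword override
```

Since the interface ships with the model package itself, nothing depends on a callback firing at the right moment.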
A partial workaround is here: JuliaAI/MLJ.jl#613 (comment)
I think we had better start the disintegration of MLJModels soon.
PRs welcome 😄 Happy to provide guidance. The repos are called `MLJGLMInterface.jl`, and so forth. If you want to start on one, let me know which, and I'll get you commit access. Here's the issue: #244 (comment)
Great. I will work on them in my spare time.
Pretty sure this has been resolved by the above PR.
I'm trying to make a module with MLJModels:

but having an error:

ERROR: LoadError: UndefVarError: XGBoost_ not defined

Looks like there is an issue with lazy activation in

One workaround I found is

Also I added debug output into the function __init__ of the module MLJModels, and I see that this method is called twice. I have something like:

Maybe it is related to a chain of __init__ methods.