
[R-package] Create portable configuration with 'configure' scripts #2960

Closed
jameslamb opened this issue Mar 30, 2020 · 3 comments


@jameslamb
Collaborator

With recent changes (see #629 (comment)), the R package is getting closer to being ready for CRAN.

I tried submitting the R package as of #2936 to the win-builder service, a free service that lets you preview how CRAN's checks will go for a package on the various Windows configurations CRAN uses.

The results were not good: the R package cannot be built, because CMake isn't available on the CRAN check machines.

*** arch - i386
installing via 'install.libs.R' to d:/RCompile/CRANguest/R-devel/lib/00LOCK-lightgbm/00new/lightgbm
[1] "Trying to build with: 'Visual Studio 16 2019'"
Warning in system(paste0(tmp_cmake_cmd, " ..")) : 'cmake' not found
[1] "Trying to build with: 'Visual Studio 15 2017'"
Warning in system(paste0(tmp_cmake_cmd, " ..")) : 'cmake' not found
[1] "Trying to build with: 'Visual Studio 14 2015'"
Warning in system(paste0(tmp_cmake_cmd, " ..")) : 'cmake' not found
Warning in system(paste0(cmake_cmd, " ..")) : 'cmake' not found
Warning in system(build_cmd) : 'cmake' not found
Error in eval(ei, envir) : Cannot find lib_lightgbm.dll
* removing 'd:/RCompile/CRANguest/R-devel/lib/lightgbm'

I see some evidence in GitHub issues that we shouldn't assume CMake is available, but other packages currently on CRAN (qtbase, Eigen, and 50+ more) do seem to assume it.
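For context, the hard failure above could at least be softened by probing for CMake instead of assuming it. A rough shell sketch (illustrative only, not LightGBM's current install logic):

```shell
# Rough sketch: probe for CMake instead of assuming it is installed.
# Illustrative only -- not LightGBM's current install logic.
if command -v cmake >/dev/null 2>&1; then
    echo "cmake found: $(cmake --version | head -n 1)"
else
    echo "cmake not found; a CMake-free fallback build would be needed" >&2
fi
```

On the CRAN check machines this would take the second branch, which is exactly why a fallback build path matters.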

In Writing R Extensions, I see the following:

If your package needs some system-dependent configuration before installation you can include an executable (Bourne) shell script configure in your package which (if present) is executed by R CMD INSTALL before any other action is performed. This can be a script created by the Autoconf mechanism, but may also be a script written by yourself. Use this to detect if any nonstandard libraries are present such that corresponding code in the package can be disabled at install time rather than giving error messages when the package is compiled or used. To summarize, the full power of Autoconf is available for your extension package (including variable substitution, searching for libraries, etc.).
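The mechanism the manual describes can be sketched as a small Bourne script. This is NOT LightGBM's actual script; the `src/Makevars.in` template and the `@OPENMP_FLAG@` placeholder are made-up stand-ins, shown only to illustrate the probe-then-substitute pattern:

```shell
#!/bin/sh
# Illustrative sketch of an Autoconf-style 'configure' script.
# NOT LightGBM's real script; the Makevars.in template is a demo stand-in.

CXX="${CXX:-c++}"

mkdir -p src
# A real package would ship src/Makevars.in; create a demo one if absent.
[ -f src/Makevars.in ] || printf 'PKG_CXXFLAGS = @OPENMP_FLAG@\n' > src/Makevars.in

# Feature probe: keep -fopenmp only if a tiny test program compiles with it.
OPENMP_FLAG="-fopenmp"
cat > conftest.cpp <<'EOF'
#include <omp.h>
int main() { return omp_get_max_threads() > 0 ? 0 : 1; }
EOF
if ! "$CXX" $OPENMP_FLAG conftest.cpp -o conftest 2>/dev/null; then
    OPENMP_FLAG=""   # compiler missing or no OpenMP: single-threaded build
fi
rm -f conftest conftest.cpp

# Variable substitution, as Autoconf's config.status would do.
sed "s|@OPENMP_FLAG@|$OPENMP_FLAG|" src/Makevars.in > src/Makevars
echo "configure: OPENMP_FLAG='$OPENMP_FLAG'"
```

R CMD INSTALL runs such a script before anything else, so the generated `src/Makevars` is in place by the time compilation starts, and a missing feature degrades the build instead of erroring it out.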

Projects similar to LightGBM have taken this approach.

I think that to get to CRAN, we'll either have to find a way to safely assume CMake is available, or create these configure scripts along with a build setup that doesn't require CMake. A good first step would be to download a source tarball of xgboost and compare its contents both to what's in their repo and to what make Rpack produces from that repo.
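The comparison step could look like the pipeline below. To stay self-contained it fabricates a tiny "tarball" and "repo" (the `demo/` paths and file names are invented); with a real xgboost source tarball and checkout, the same sort-and-diff of file listings applies:

```shell
# Sketch of the tarball-vs-repo comparison suggested above.
# The demo/ tree is fabricated so the pipeline runs standalone.
mkdir -p demo/pkg/R demo/repo/R-package/R
printf 'x <- 1\n' > demo/pkg/R/a.R
printf 'x <- 1\n' > demo/repo/R-package/R/a.R
printf 'y <- 2\n' > demo/repo/R-package/R/b.R   # present in repo only

tar -czf demo/pkg.tar.gz -C demo pkg

# File list inside the "CRAN" tarball, with the top-level dir stripped.
tar -tzf demo/pkg.tar.gz | grep -v '/$' | sed 's|^pkg/||' | sort > demo/cran-files.txt
# File list in the "repo" checkout.
( cd demo/repo/R-package && find . -type f | sed 's|^\./||' ) | sort > demo/repo-files.txt

# Lines prefixed '>' exist only in the repo, '<' only in the tarball.
diff demo/cran-files.txt demo/repo-files.txt || true
```

With the real packages, the `>` lines would show what make Rpack adds or renames relative to the repo.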

cc @Laurae2 @StrikerRUS

@jameslamb jameslamb changed the title [R-package] Create portable configuration with autoconf [R-package] Create portable configuration with 'configure' scripts Mar 30, 2020
@Laurae2 Laurae2 mentioned this issue Mar 30, 2020
@jameslamb
Collaborator Author

Added this to #2302, where we keep all feature requests. If you'd like to contribute this feature (or help with it), leave a comment here and we will re-open the issue.

@jameslamb
Collaborator Author

jameslamb commented Apr 8, 2020

I'm making good progress on this one! I got the version on my fork (jameslamb#15) building successfully on 64-bit Windows with R-devel (the R 4.0.0 release candidate), with all unit tests passing. There are a few R CMD check issues, but it's getting very close.

The build link will only be valid for about two more days, so I copied the logs here.

Install logs:
* installing *source* package 'lightgbm' ...
** using staged installation
** libs

*** arch - i386
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c application/application.cpp -o application/application.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/boosting.cpp -o boosting/boosting.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/gbdt.cpp -o boosting/gbdt.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/gbdt_model_text.cpp -o boosting/gbdt_model_text.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/gbdt_prediction.cpp -o boosting/gbdt_prediction.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/prediction_early_stop.cpp -o boosting/prediction_early_stop.o
boosting/prediction_early_stop.cpp: In function 'LightGBM::PredictionEarlyStopInstance LightGBM::CreatePredictionEarlyStopInstance(const string&, const LightGBM::PredictionEarlyStopConfig&)':
boosting/prediction_early_stop.cpp:86:1: warning: control reaches end of non-void function [-Wreturn-type]
 }
 ^
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/bin.cpp -o io/bin.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/config.cpp -o io/config.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/config_auto.cpp -o io/config_auto.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/dataset.cpp -o io/dataset.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/dataset_loader.cpp -o io/dataset_loader.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/file_io.cpp -o io/file_io.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/json11.cpp -o io/json11.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/metadata.cpp -o io/metadata.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/parser.cpp -o io/parser.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/tree.cpp -o io/tree.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c metric/dcg_calculator.cpp -o metric/dcg_calculator.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c metric/metric.cpp -o metric/metric.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c objective/objective_function.cpp -o objective/objective_function.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/linker_topo.cpp -o network/linker_topo.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/linkers_mpi.cpp -o network/linkers_mpi.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/linkers_socket.cpp -o network/linkers_socket.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/network.cpp -o network/network.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/data_parallel_tree_learner.cpp -o treelearner/data_parallel_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/feature_parallel_tree_learner.cpp -o treelearner/feature_parallel_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/gpu_tree_learner.cpp -o treelearner/gpu_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/serial_tree_learner.cpp -o treelearner/serial_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/tree_learner.cpp -o treelearner/tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/voting_parallel_tree_learner.cpp -o treelearner/voting_parallel_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c c_api.cpp -o c_api.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++  -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c lightgbm_R.cpp -o lightgbm_R.o
d:/Compiler/gcc-4.9.3/mingw_32/bin/g++ -shared -s -static-libgcc -o lightgbm.dll tmp.def application/application.o boosting/boosting.o boosting/gbdt.o boosting/gbdt_model_text.o boosting/gbdt_prediction.o boosting/prediction_early_stop.o io/bin.o io/config.o io/config_auto.o io/dataset.o io/dataset_loader.o io/file_io.o io/json11.o io/metadata.o io/parser.o io/tree.o metric/dcg_calculator.o metric/metric.o objective/objective_function.o network/linker_topo.o network/linkers_mpi.o network/linkers_socket.o network/network.o treelearner/data_parallel_tree_learner.o treelearner/feature_parallel_tree_learner.o treelearner/gpu_tree_learner.o treelearner/serial_tree_learner.o treelearner/tree_learner.o treelearner/voting_parallel_tree_learner.o c_api.o lightgbm_R.o -fopenmp -pthread -lws2_32 -lIphlpapi -Ld:/Compiler/gcc-4.9.3/local330/lib/i386 -Ld:/Compiler/gcc-4.9.3/local330/lib -LD:/RCompile/recent/R/bin/i386 -lR
installing to d:/RCompile/CRANguest/R-devel/lib/00LOCK-lightgbm/00new/lightgbm/libs/i386

*** arch - x64
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c application/application.cpp -o application/application.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/boosting.cpp -o boosting/boosting.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/gbdt.cpp -o boosting/gbdt.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/gbdt_model_text.cpp -o boosting/gbdt_model_text.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/gbdt_prediction.cpp -o boosting/gbdt_prediction.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c boosting/prediction_early_stop.cpp -o boosting/prediction_early_stop.o
boosting/prediction_early_stop.cpp: In function 'LightGBM::PredictionEarlyStopInstance LightGBM::CreatePredictionEarlyStopInstance(const string&, const LightGBM::PredictionEarlyStopConfig&)':
boosting/prediction_early_stop.cpp:86:1: warning: control reaches end of non-void function [-Wreturn-type]
 }
 ^
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/bin.cpp -o io/bin.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/config.cpp -o io/config.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/config_auto.cpp -o io/config_auto.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/dataset.cpp -o io/dataset.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/dataset_loader.cpp -o io/dataset_loader.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/file_io.cpp -o io/file_io.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/json11.cpp -o io/json11.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/metadata.cpp -o io/metadata.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/parser.cpp -o io/parser.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c io/tree.cpp -o io/tree.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c metric/dcg_calculator.cpp -o metric/dcg_calculator.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c metric/metric.cpp -o metric/metric.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c objective/objective_function.cpp -o objective/objective_function.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/linker_topo.cpp -o network/linker_topo.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/linkers_mpi.cpp -o network/linkers_mpi.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/linkers_socket.cpp -o network/linkers_socket.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c network/network.cpp -o network/network.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/data_parallel_tree_learner.cpp -o treelearner/data_parallel_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/feature_parallel_tree_learner.cpp -o treelearner/feature_parallel_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/gpu_tree_learner.cpp -o treelearner/gpu_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/serial_tree_learner.cpp -o treelearner/serial_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/tree_learner.cpp -o treelearner/tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c treelearner/voting_parallel_tree_learner.cpp -o treelearner/voting_parallel_tree_learner.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c c_api.cpp -o c_api.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -std=gnu++11 -I"D:/RCompile/recent/R/include" -DNDEBUG -I./include -DR_VER_ABOVE_35 -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1 -funroll-loops -Wno-unknown-pragmas -static-libstdc++ -O3    -I"d:/Compiler/gcc-4.9.3/local330/include"  -fopenmp -pthread -static-libstdc++   -O2 -Wall  -mtune=core2 -c lightgbm_R.cpp -o lightgbm_R.o
d:/Compiler/gcc-4.9.3/mingw_64/bin/g++ -m64 -shared -s -static-libgcc -o lightgbm.dll tmp.def application/application.o boosting/boosting.o boosting/gbdt.o boosting/gbdt_model_text.o boosting/gbdt_prediction.o boosting/prediction_early_stop.o io/bin.o io/config.o io/config_auto.o io/dataset.o io/dataset_loader.o io/file_io.o io/json11.o io/metadata.o io/parser.o io/tree.o metric/dcg_calculator.o metric/metric.o objective/objective_function.o network/linker_topo.o network/linkers_mpi.o network/linkers_socket.o network/network.o treelearner/data_parallel_tree_learner.o treelearner/feature_parallel_tree_learner.o treelearner/gpu_tree_learner.o treelearner/serial_tree_learner.o treelearner/tree_learner.o treelearner/voting_parallel_tree_learner.o c_api.o lightgbm_R.o -fopenmp -pthread -lws2_32 -lIphlpapi -Ld:/Compiler/gcc-4.9.3/local330/lib/x64 -Ld:/Compiler/gcc-4.9.3/local330/lib -LD:/RCompile/recent/R/bin/x64 -lR
installing to d:/RCompile/CRANguest/R-devel/lib/00LOCK-lightgbm/00new/lightgbm/libs/x64
** R
** data
** demo
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded from temporary location
*** arch - i386
*** arch - x64
** testing if installed package can be loaded from final location
*** arch - i386
*** arch - x64
** testing if installed package keeps a record of temporary installation path
* MD5 sums
packaged installation of 'lightgbm' as lightgbm_2.3.2.zip
* DONE (lightgbm)
CHECK logs:
* using log directory 'd:/RCompile/CRANguest/R-devel/lightgbm.Rcheck'
* using R version 4.0.0 alpha (2020-03-26 r78078)
* using platform: x86_64-w64-mingw32 (64-bit)
* using session charset: ISO8859-1
* checking for file 'lightgbm/DESCRIPTION' ... OK
* checking extension type ... Package
* this is package 'lightgbm' version '2.3.2'
* package encoding: UTF-8
* checking CRAN incoming feasibility ... NOTE
Maintainer: 'James Lamb <jaylamb20@gmail.com>'

New submission

License components with restrictions and base license permitting such:
  MIT + file LICENSE
File 'LICENSE':
  The MIT License (MIT)
  
  Copyright (c) Microsoft Corporation
  
  Permission is hereby granted, free of charge, to any person obtaining a copy
  of this software and associated documentation files (the "Software"), to deal
  in the Software without restriction, including without limitation the rights
  to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
  copies of the Software, and to permit persons to whom the Software is
  furnished to do so, subject to the following conditions:
  
  The above copyright notice and this permission notice shall be included in all
  copies or substantial portions of the Software.
  
  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
  SOFTWARE.

Possibly mis-spelled words in DESCRIPTION:
  LightGBM (11:88, 18:41, 19:60, 19:264)

The Date field is over a month old.
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking serialization versions ... OK
* checking whether package 'lightgbm' can be installed ... WARNING
Found the following significant warnings:
  boosting/prediction_early_stop.cpp:86:1: warning: control reaches end of non-void function [-Wreturn-type]
See 'd:/RCompile/CRANguest/R-devel/lightgbm.Rcheck/00install.out' for details.
* checking installed package size ... OK
* checking package directory ... OK
* checking for future file timestamps ... OK
* checking DESCRIPTION meta-information ... NOTE
Authors@R field gives no person with name and author role
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* loading checks for arch 'i386'
** checking whether the package can be loaded ... OK
** checking whether the package can be loaded with stated dependencies ... OK
** checking whether the package can be unloaded cleanly ... OK
** checking whether the namespace can be loaded with stated dependencies ... OK
** checking whether the namespace can be unloaded cleanly ... OK
** checking loading without being on the library search path ... OK
** checking use of S3 registration ... OK
* loading checks for arch 'x64'
** checking whether the package can be loaded ... OK
** checking whether the package can be loaded with stated dependencies ... OK
** checking whether the package can be unloaded cleanly ... OK
** checking whether the namespace can be loaded with stated dependencies ... OK
** checking whether the namespace can be unloaded cleanly ... OK
** checking loading without being on the library search path ... OK
** checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... [8s] OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking contents of 'data' directory ... OK
* checking data for non-ASCII characters ... OK
* checking data for ASCII and uncompressed saves ... OK
* checking line endings in shell scripts ... OK
* checking line endings in C/C++/Fortran sources/headers ... OK
* checking line endings in Makefiles ... OK
* checking compilation flags in Makevars ... WARNING
Non-portable flags in variable 'PKG_CPPFLAGS':
  -fPIC -funroll-loops -Wno-unknown-pragmas
* checking for GNU extensions in Makefiles ... OK
* checking for portable use of $(BLAS_LIBS) and $(LAPACK_LIBS) ... OK
* checking use of PKG_*FLAGS in Makefiles ... OK
* checking use of SHLIB_OPENMP_*FLAGS in Makefiles ... OK
* checking pragmas in C/C++ headers and code ... OK
* checking compiled code ... OK
* checking examples ...
** running examples for arch 'i386' ... ERROR
Running examples in 'lightgbm-Ex.R' failed
The error most likely occurred in:

> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: dimnames.lgb.Dataset
> ### Title: Handling of column names of 'lgb.Dataset'
> ### Aliases: dimnames.lgb.Dataset dimnames<-.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.construct(dtrain)
Error in dataset$construct() : 
  lgb.Dataset.construct: cannot create Dataset handle
Calls: lgb.Dataset.construct -> <Anonymous>
Execution halted
** running examples for arch 'x64' ... [14s] OK
* checking for unstated dependencies in 'tests' ... OK
* checking tests ...
** running tests for arch 'i386' ... [9s] ERROR
  Running 'testthat.R' [7s]
Running the tests in 'tests/testthat.R' failed.
Complete output:
  > library(testthat)
  > library(lightgbm)
  Loading required package: R6
  > 
  > test_check(
  +     package = "lightgbm"
  +     , stop_on_failure = TRUE
  +     , stop_on_warning = FALSE
  + )
  -- 1. Error: train and predict binary classification (@test_basic.R#12)  -------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lightgbm(...)
   4. data$construct()
  
  [LightGBM] [Warning] Unknown parameter: min_hess
  -- 2. Error: train and predict softmax (@test_basic.R#38)  ---------------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lightgbm(...)
   4. data$construct()
  
  -- 3. Error: use of multiple eval metrics works (@test_basic.R#62)  ------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lightgbm(...)
   4. data$construct()
  
  -- 4. Error: lgb.Booster.upper_bound() and lgb.Booster.lower_bound() work as exp
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lightgbm(...)
   4. data$construct()
  
  -- 5. Error: lgb.Booster.upper_bound() and lgb.Booster.lower_bound() work as exp
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lightgbm(...)
   4. data$construct()
  
  -- 6. Error: lightgbm() performs evaluation on validation sets if they are provi
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lightgbm(...)
   4. data$construct()
  
  -- 7. Error: cv works (@test_basic.R#203)  -------------------------------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.cv(...)
   2. data$construct()
  
  -- 8. Error: lgb.train() works as expected with multiple eval metrics (@test_bas
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 9. Error: lgb.train() works with force_col_wise and force_row_wise (@test_bas
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(params = params, data = dtrain, nrounds = nrounds)
   2. data$construct()
  
  -- 10. Error: lgb.train() works as expected with sparse features (@test_basic.R#
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 11. Error: lgb.train() works with early stopping for classification (@test_ba
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 12. Error: lgb.train() works with early stopping for regression (@test_basic.
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 13. Error: custom objective works (@test_custom_objective.R#41)  ------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(param, dtrain, num_round, watchlist, eval = evalerror)
   2. data$construct()
  
  -- 14. Error: lgb.Dataset: basic construction, saving, loading (@test_dataset.R#
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.Dataset.save(dtest1, tmp_file)
   2. dataset$save_binary(fname)
   3. self$construct()
  
  -- 15. Error: lgb.Dataset: getinfo & setinfo (@test_dataset.R#29)  -------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. dtest$construct()
  
  -- 16. Error: lgb.Dataset: slice, dim (@test_dataset.R#44)  --------------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.Dataset.construct(dtest)
   2. dataset$construct()
  
  -- 17. Error: lgb.Dataset: colnames (@test_dataset.R#55)  ----------------------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.Dataset.construct(dtest)
   2. dataset$construct()
  
  -- 18. Failure: lgb.Dataset: Dataset should be able to construct from matrix and
  is.na(handle) isn't false.
  
  -- 19. Error: lgb.Dataset$setinfo() should convert 'group' to integer (@test_dat
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. ds$construct()
  
  -- 20. Error: learning-to-rank with lgb.train() works as expected (@test_learnin
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(params = params, data = dtrain, nrounds = 10L)
   2. data$construct()
  
  -- 21. Error: learning-to-rank with lgb.cv() works as expected (@test_learning_t
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.cv(...)
   2. data$construct()
  
  -- 22. Error: lgb.get.eval.result() should throw an informative error for incorr
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 23. Error: lgb.get.eval.result() should throw an informative error for incorr
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 24. Error: lgb.intereprete works as expected for binary classification (@test
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(params = params, data = dtrain, nrounds = 10L)
   2. data$construct()
  
  -- 25. Error: lgb.intereprete works as expected for multiclass classification (@
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 26. Error: lgb.plot.importance() should run without error for well-formed inp
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(params, dtrain, 10L)
   2. data$construct()
  
  -- 27. Error: lgb.plot.interepretation works as expected for binary classificati
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(params = params, data = dtrain, nrounds = 10L)
   2. data$construct()
  
  -- 28. Error: lgb.plot.interepretation works as expected for multiclass classifi
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. lightgbm::lgb.train(...)
   2. data$construct()
  
  -- 29. Error: Feature penalties work properly (@test_parameters.R#14)  ---------
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. base::lapply(...)
   2. lightgbm:::FUN(X[[i]], ...)
   3. lightgbm::lightgbm(...)
   6. data$construct()
  
  -- 30. Error: training should warn if you use 'dart' boosting, specified with 'b
  lgb.Dataset.construct: cannot create Dataset handle
  Backtrace:
   1. testthat::expect_warning(...)
   6. lightgbm::lightgbm(...)
   9. data$construct()
  
  == testthat results  ===========================================================
  [ OK: 495 | SKIPPED: 1 | WARNINGS: 0 | FAILED: 30 ]
  1. Error: train and predict binary classification (@test_basic.R#12) 
  2. Error: train and predict softmax (@test_basic.R#38) 
  3. Error: use of multiple eval metrics works (@test_basic.R#62) 
  4. Error: lgb.Booster.upper_bound() and lgb.Booster.lower_bound() work as expected for binary classification (@test_basic.R#83) 
  5. Error: lgb.Booster.upper_bound() and lgb.Booster.lower_bound() work as expected for regression (@test_basic.R#98) 
  6. Error: lightgbm() performs evaluation on validation sets if they are provided (@test_basic.R#135) 
  7. Error: cv works (@test_basic.R#203) 
  8. Error: lgb.train() works as expected with multiple eval metrics (@test_basic.R#257) 
  9. Error: lgb.train() works with force_col_wise and force_row_wise (@test_basic.R#375) 
  1. ...
  
  Error: testthat unit tests failed
  Execution halted
** running tests for arch 'x64' ... [25s] OK
  Running 'testthat.R' [25s]
* checking PDF version of manual ... OK
* checking for detritus in the temp directory ... OK
* DONE
Status: 2 ERRORs, 2 WARNINGs, 2 NOTEs
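
The "Non-portable flags" WARNING above is exactly the kind of thing a `configure`-generated `src/Makevars` is meant to avoid: R already supplies `-fPIC` and the optimization flags itself, and OpenMP should come from R's own `$(SHLIB_OPENMP_CXXFLAGS)` make variable rather than a hard-coded `-fopenmp`. As a rough sketch only (the preprocessor defines are copied from the build log above; the exact file layout is not what the package currently ships), a CRAN-friendly `src/Makevars` could look like:

```
# src/Makevars (sketch) -- defer all toolchain decisions to R's Makeconf.
# The flags CRAN flagged as non-portable (-fPIC -funroll-loops
# -Wno-unknown-pragmas) are simply dropped; R adds -fPIC on its own.
PKG_CPPFLAGS = -I./include -DUSE_SOCKET -DLGB_R_BUILD -DMM_MALLOC=1 -DMM_PREFETCH=1
# OpenMP the portable way, via the variable R itself defines:
PKG_CXXFLAGS = $(SHLIB_OPENMP_CXXFLAGS)
PKG_LIBS = $(SHLIB_OPENMP_CXXFLAGS)
```

A `configure` (and `configure.win`) script would then only need to probe for things R cannot tell us (e.g. whether the compiler actually supports OpenMP) and fill in a `Makevars.in` template accordingly, instead of shelling out to CMake at install time.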
Test logs on x64:

R version 4.0.0 alpha (2020-03-26 r78078)
Copyright (C) 2020 The R Foundation for Statistical Computing
Platform: x86_64-w64-mingw32/x64 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> pkgname <- "lightgbm"
> source(file.path(R.home("share"), "R", "examples-header.R"))
> options(warn = 1)
> options(pager = "console")
> base::assign(".ExTimings", "lightgbm-Ex.timings", pos = 'CheckExEnv')
> base::cat("name\tuser\tsystem\telapsed\n", file=base::get(".ExTimings", pos = 'CheckExEnv'))
> base::assign(".format_ptime",
+ function(x) {
+   if(!is.na(x[4L])) x[1L] <- x[1L] + x[4L]
+   if(!is.na(x[5L])) x[2L] <- x[2L] + x[5L]
+   options(OutDec = '.')
+   format(x[1L:3L], digits = 7L)
+ },
+ pos = 'CheckExEnv')
> 
> ### * </HEADER>
> library('lightgbm')
Loading required package: R6
> 
> base::assign(".oldSearch", base::search(), pos = 'CheckExEnv')
> base::assign(".old_wd", base::getwd(), pos = 'CheckExEnv')
> cleanEx()
> nameEx("dim")
> ### * dim
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: dim.lgb.Dataset
> ### Title: Dimensions of an 'lgb.Dataset'
> ### Aliases: dim.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
Loading required package: Matrix
> 
> stopifnot(nrow(dtrain) == nrow(train$data))
> stopifnot(ncol(dtrain) == ncol(train$data))
> stopifnot(all(dim(dtrain) == dim(train$data)))
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("dim", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()

detaching 'package:Matrix'

> nameEx("dimnames.lgb.Dataset")
> ### * dimnames.lgb.Dataset
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: dimnames.lgb.Dataset
> ### Title: Handling of column names of 'lgb.Dataset'
> ### Aliases: dimnames.lgb.Dataset dimnames<-.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.construct(dtrain)
> dimnames(dtrain)
[[1]]
NULL

[[2]]
  [1] "cap-shape=bell"                   "cap-shape=conical"               
  [3] "cap-shape=convex"                 "cap-shape=flat"                  
  [5] "cap-shape=knobbed"                "cap-shape=sunken"                
  [7] "cap-surface=fibrous"              "cap-surface=grooves"             
  [9] "cap-surface=scaly"                "cap-surface=smooth"              
 [11] "cap-color=brown"                  "cap-color=buff"                  
 [13] "cap-color=cinnamon"               "cap-color=gray"                  
 [15] "cap-color=green"                  "cap-color=pink"                  
 [17] "cap-color=purple"                 "cap-color=red"                   
 [19] "cap-color=white"                  "cap-color=yellow"                
 [21] "bruises?=bruises"                 "bruises?=no"                     
 [23] "odor=almond"                      "odor=anise"                      
 [25] "odor=creosote"                    "odor=fishy"                      
 [27] "odor=foul"                        "odor=musty"                      
 [29] "odor=none"                        "odor=pungent"                    
 [31] "odor=spicy"                       "gill-attachment=attached"        
 [33] "gill-attachment=descending"       "gill-attachment=free"            
 [35] "gill-attachment=notched"          "gill-spacing=close"              
 [37] "gill-spacing=crowded"             "gill-spacing=distant"            
 [39] "gill-size=broad"                  "gill-size=narrow"                
 [41] "gill-color=black"                 "gill-color=brown"                
 [43] "gill-color=buff"                  "gill-color=chocolate"            
 [45] "gill-color=gray"                  "gill-color=green"                
 [47] "gill-color=orange"                "gill-color=pink"                 
 [49] "gill-color=purple"                "gill-color=red"                  
 [51] "gill-color=white"                 "gill-color=yellow"               
 [53] "stalk-shape=enlarging"            "stalk-shape=tapering"            
 [55] "stalk-root=bulbous"               "stalk-root=club"                 
 [57] "stalk-root=cup"                   "stalk-root=equal"                
 [59] "stalk-root=rhizomorphs"           "stalk-root=rooted"               
 [61] "stalk-root=missing"               "stalk-surface-above-ring=fibrous"
 [63] "stalk-surface-above-ring=scaly"   "stalk-surface-above-ring=silky"  
 [65] "stalk-surface-above-ring=smooth"  "stalk-surface-below-ring=fibrous"
 [67] "stalk-surface-below-ring=scaly"   "stalk-surface-below-ring=silky"  
 [69] "stalk-surface-below-ring=smooth"  "stalk-color-above-ring=brown"    
 [71] "stalk-color-above-ring=buff"      "stalk-color-above-ring=cinnamon" 
 [73] "stalk-color-above-ring=gray"      "stalk-color-above-ring=orange"   
 [75] "stalk-color-above-ring=pink"      "stalk-color-above-ring=red"      
 [77] "stalk-color-above-ring=white"     "stalk-color-above-ring=yellow"   
 [79] "stalk-color-below-ring=brown"     "stalk-color-below-ring=buff"     
 [81] "stalk-color-below-ring=cinnamon"  "stalk-color-below-ring=gray"     
 [83] "stalk-color-below-ring=orange"    "stalk-color-below-ring=pink"     
 [85] "stalk-color-below-ring=red"       "stalk-color-below-ring=white"    
 [87] "stalk-color-below-ring=yellow"    "veil-type=partial"               
 [89] "veil-type=universal"              "veil-color=brown"                
 [91] "veil-color=orange"                "veil-color=white"                
 [93] "veil-color=yellow"                "ring-number=none"                
 [95] "ring-number=one"                  "ring-number=two"                 
 [97] "ring-type=cobwebby"               "ring-type=evanescent"            
 [99] "ring-type=flaring"                "ring-type=large"                 
[101] "ring-type=none"                   "ring-type=pendant"               
[103] "ring-type=sheathing"              "ring-type=zone"                  
[105] "spore-print-color=black"          "spore-print-color=brown"         
[107] "spore-print-color=buff"           "spore-print-color=chocolate"     
[109] "spore-print-color=green"          "spore-print-color=orange"        
[111] "spore-print-color=purple"         "spore-print-color=white"         
[113] "spore-print-color=yellow"         "population=abundant"             
[115] "population=clustered"             "population=numerous"             
[117] "population=scattered"             "population=several"              
[119] "population=solitary"              "habitat=grasses"                 
[121] "habitat=leaves"                   "habitat=meadows"                 
[123] "habitat=paths"                    "habitat=urban"                   
[125] "habitat=waste"                    "habitat=woods"                   

> colnames(dtrain)
  [1] "cap-shape=bell"                   "cap-shape=conical"               
  [3] "cap-shape=convex"                 "cap-shape=flat"                  
  [5] "cap-shape=knobbed"                "cap-shape=sunken"                
  [7] "cap-surface=fibrous"              "cap-surface=grooves"             
  [9] "cap-surface=scaly"                "cap-surface=smooth"              
 [11] "cap-color=brown"                  "cap-color=buff"                  
 [13] "cap-color=cinnamon"               "cap-color=gray"                  
 [15] "cap-color=green"                  "cap-color=pink"                  
 [17] "cap-color=purple"                 "cap-color=red"                   
 [19] "cap-color=white"                  "cap-color=yellow"                
 [21] "bruises?=bruises"                 "bruises?=no"                     
 [23] "odor=almond"                      "odor=anise"                      
 [25] "odor=creosote"                    "odor=fishy"                      
 [27] "odor=foul"                        "odor=musty"                      
 [29] "odor=none"                        "odor=pungent"                    
 [31] "odor=spicy"                       "gill-attachment=attached"        
 [33] "gill-attachment=descending"       "gill-attachment=free"            
 [35] "gill-attachment=notched"          "gill-spacing=close"              
 [37] "gill-spacing=crowded"             "gill-spacing=distant"            
 [39] "gill-size=broad"                  "gill-size=narrow"                
 [41] "gill-color=black"                 "gill-color=brown"                
 [43] "gill-color=buff"                  "gill-color=chocolate"            
 [45] "gill-color=gray"                  "gill-color=green"                
 [47] "gill-color=orange"                "gill-color=pink"                 
 [49] "gill-color=purple"                "gill-color=red"                  
 [51] "gill-color=white"                 "gill-color=yellow"               
 [53] "stalk-shape=enlarging"            "stalk-shape=tapering"            
 [55] "stalk-root=bulbous"               "stalk-root=club"                 
 [57] "stalk-root=cup"                   "stalk-root=equal"                
 [59] "stalk-root=rhizomorphs"           "stalk-root=rooted"               
 [61] "stalk-root=missing"               "stalk-surface-above-ring=fibrous"
 [63] "stalk-surface-above-ring=scaly"   "stalk-surface-above-ring=silky"  
 [65] "stalk-surface-above-ring=smooth"  "stalk-surface-below-ring=fibrous"
 [67] "stalk-surface-below-ring=scaly"   "stalk-surface-below-ring=silky"  
 [69] "stalk-surface-below-ring=smooth"  "stalk-color-above-ring=brown"    
 [71] "stalk-color-above-ring=buff"      "stalk-color-above-ring=cinnamon" 
 [73] "stalk-color-above-ring=gray"      "stalk-color-above-ring=orange"   
 [75] "stalk-color-above-ring=pink"      "stalk-color-above-ring=red"      
 [77] "stalk-color-above-ring=white"     "stalk-color-above-ring=yellow"   
 [79] "stalk-color-below-ring=brown"     "stalk-color-below-ring=buff"     
 [81] "stalk-color-below-ring=cinnamon"  "stalk-color-below-ring=gray"     
 [83] "stalk-color-below-ring=orange"    "stalk-color-below-ring=pink"     
 [85] "stalk-color-below-ring=red"       "stalk-color-below-ring=white"    
 [87] "stalk-color-below-ring=yellow"    "veil-type=partial"               
 [89] "veil-type=universal"              "veil-color=brown"                
 [91] "veil-color=orange"                "veil-color=white"                
 [93] "veil-color=yellow"                "ring-number=none"                
 [95] "ring-number=one"                  "ring-number=two"                 
 [97] "ring-type=cobwebby"               "ring-type=evanescent"            
 [99] "ring-type=flaring"                "ring-type=large"                 
[101] "ring-type=none"                   "ring-type=pendant"               
[103] "ring-type=sheathing"              "ring-type=zone"                  
[105] "spore-print-color=black"          "spore-print-color=brown"         
[107] "spore-print-color=buff"           "spore-print-color=chocolate"     
[109] "spore-print-color=green"          "spore-print-color=orange"        
[111] "spore-print-color=purple"         "spore-print-color=white"         
[113] "spore-print-color=yellow"         "population=abundant"             
[115] "population=clustered"             "population=numerous"             
[117] "population=scattered"             "population=several"              
[119] "population=solitary"              "habitat=grasses"                 
[121] "habitat=leaves"                   "habitat=meadows"                 
[123] "habitat=paths"                    "habitat=urban"                   
[125] "habitat=waste"                    "habitat=woods"                   
> colnames(dtrain) <- make.names(seq_len(ncol(train$data)))
> print(dtrain, verbose = TRUE)
<lgb.Dataset>
  Public:
    construct: function () 
    create_valid: function (data, info = list(), ...) 
    dim: function () 
    finalize: function () 
    get_colnames: function () 
    get_params: function () 
    getinfo: function (name) 
    initialize: function (data, params = list(), reference = NULL, colnames = NULL, 
    save_binary: function (fname) 
    set_categorical_feature: function (categorical_feature) 
    set_colnames: function (colnames) 
    set_reference: function (reference) 
    setinfo: function (name, info) 
    slice: function (idxset, ...) 
    update_params: function (params) 
  Private:
    categorical_feature: NULL
    colnames: X1 X2 X3 X4 X5 X6 X7 X8 X9 X10 X11 X12 X13 X14 X15 X16 X ...
    free_raw_data: TRUE
    get_handle: function () 
    handle: 6.94528394338397e-316
    info: list
    params: list
    predictor: NULL
    raw_data: NULL
    reference: NULL
    set_predictor: function (predictor) 
    used_indices: NULL
    version: 1
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("dimnames.lgb.Dataset", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("getinfo")
> ### * getinfo
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: getinfo
> ### Title: Get information of an 'lgb.Dataset' object
> ### Aliases: getinfo getinfo.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.construct(dtrain)
> 
> labels <- lightgbm::getinfo(dtrain, "label")
> lightgbm::setinfo(dtrain, "label", 1 - labels)
> 
> labels2 <- lightgbm::getinfo(dtrain, "label")
> stopifnot(all(labels2 == 1 - labels))
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("getinfo", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.Dataset")
> ### * lgb.Dataset
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.Dataset
> ### Title: Construct 'lgb.Dataset' object
> ### Aliases: lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.save(dtrain, "lgb.Dataset.data")
[LightGBM] [Info] Saving data to binary file lgb.Dataset.data
> dtrain <- lgb.Dataset("lgb.Dataset.data")
> lgb.Dataset.construct(dtrain)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.Dataset", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.Dataset.construct")
> ### * lgb.Dataset.construct
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.Dataset.construct
> ### Title: Construct Dataset explicitly
> ### Aliases: lgb.Dataset.construct
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.construct(dtrain)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.Dataset.construct", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.Dataset.create.valid")
> ### * lgb.Dataset.create.valid
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.Dataset.create.valid
> ### Title: Construct validation data
> ### Aliases: lgb.Dataset.create.valid
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.Dataset.create.valid", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.Dataset.save")
> ### * lgb.Dataset.save
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.Dataset.save
> ### Title: Save 'lgb.Dataset' to a binary file
> ### Aliases: lgb.Dataset.save
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.save(dtrain, "data.bin")
[LightGBM] [Info] Saving data to binary file data.bin
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.Dataset.save", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.Dataset.set.categorical")
> ### * lgb.Dataset.set.categorical
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.Dataset.set.categorical
> ### Title: Set categorical feature of 'lgb.Dataset'
> ### Aliases: lgb.Dataset.set.categorical
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.save(dtrain, "lgb.Dataset.data")
[LightGBM] [Warning] File lgb.Dataset.data exists, cannot save binary to it
> dtrain <- lgb.Dataset("lgb.Dataset.data")
> lgb.Dataset.set.categorical(dtrain, 1L:2L)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.Dataset.set.categorical", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.Dataset.set.reference")
> ### * lgb.Dataset.set.reference
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.Dataset.set.reference
> ### Title: Set reference of 'lgb.Dataset'
> ### Aliases: lgb.Dataset.set.reference
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset(test$data, label = test$label)
> lgb.Dataset.set.reference(dtest, dtrain)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.Dataset.set.reference", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.cv")
> ### * lgb.cv
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.cv
> ### Title: Main CV logic for LightGBM
> ### Aliases: lgb.cv
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> params <- list(objective = "regression", metric = "l2")
> model <- lgb.cv(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , nfold = 3L
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001989 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001922 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002038 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
[LightGBM] [Info] Start training from score 0.476739
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Start training from score 0.482497
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Start training from score 0.487103
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	valid's l2:0.000307078+0.000434274 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	valid's l2:0.000307078+0.000434274 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	valid's l2:0.000307078+0.000434274 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[4]:	valid's l2:0.000307078+0.000434274 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[5]:	valid's l2:0.000307078+0.000434274 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[6]:	valid's l2:0.000307078+0.000434274 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[7]:	valid's l2:0.000307078+0.000434274 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.cv", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.dump")
> ### * lgb.dump
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.dump
> ### Title: Dump LightGBM model to json
> ### Aliases: lgb.dump
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002575 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> json_model <- lgb.dump(model)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.dump", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.get.eval.result")
> ### * lgb.get.eval.result
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.get.eval.result
> ### Title: Get record evaluation result from booster
> ### Aliases: lgb.get.eval.result
> 
> ### ** Examples
> 
> # train a regression model
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002617 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> 
> # Examine valid data_name values
> print(setdiff(names(model$record_evals), "start_iter"))
[1] "test"
> 
> # Examine valid eval_name values for dataset "test"
> print(names(model$record_evals[["test"]]))
[1] "l2"
> 
> # Get L2 values for "test" dataset
> lgb.get.eval.result(model, "test", "l2")
[1] 6.441652e-17 1.972152e-31 0.000000e+00 0.000000e+00 0.000000e+00
[6] 0.000000e+00 0.000000e+00 0.000000e+00
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.get.eval.result", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.importance")
> ### * lgb.importance
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.importance
> ### Title: Compute feature importance in a model
> ### Aliases: lgb.importance
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> 
> params <- list(
+   objective = "binary"
+   , learning_rate = 0.01
+   , num_leaves = 63L
+   , max_depth = -1L
+   , min_data_in_leaf = 1L
+   , min_sum_hessian_in_leaf = 1.0
+ )
> model <- lgb.train(params, dtrain, 10L)
[LightGBM] [Warning] Starting from the 2.1.2 version, default value for the "boost_from_average" parameter in "binary" objective is true.
This may cause significantly different results comparing to the previous versions of LightGBM.
Try to set boost_from_average=false, if your old models produce bad results
[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002725 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
[LightGBM] [Info] Start training from score -0.071580
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
> 
> tree_imp1 <- lgb.importance(model, percentage = TRUE)
> tree_imp2 <- lgb.importance(model, percentage = FALSE)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.importance", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.interprete")
> ### * lgb.interprete
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.interprete
> ### Title: Compute feature contribution of prediction
> ### Aliases: lgb.interprete
> 
> ### ** Examples
> 
> Sigmoid <- function(x) 1.0 / (1.0 + exp(-x))
> Logit <- function(x) log(x / (1.0 - x))
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> setinfo(dtrain, "init_score", rep(Logit(mean(train$label)), length(train$label)))
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> 
> params <- list(
+     objective = "binary"
+     , learning_rate = 0.01
+     , num_leaves = 63L
+     , max_depth = -1L
+     , min_data_in_leaf = 1L
+     , min_sum_hessian_in_leaf = 1.0
+ )
> model <- lgb.train(params, dtrain, 10L)
[LightGBM] [Warning] Starting from the 2.1.2 version, default value for the "boost_from_average" parameter in "binary" objective is true.
This may cause significantly different results comparing to the previous versions of LightGBM.
Try to set boost_from_average=false, if your old models produce bad results
[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.003009 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
> 
> tree_interpretation <- lgb.interprete(model, test$data, 1L:5L)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.interprete", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.load")
> ### * lgb.load
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.load
> ### Title: Load LightGBM model
> ### Aliases: lgb.load
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001811 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> lgb.save(model, "model.txt")
> load_booster <- lgb.load(filename = "model.txt")
> model_string <- model$save_model_to_string(NULL) # saves best iteration
> load_booster_from_str <- lgb.load(model_str = model_string)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.load", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.model.dt.tree")
> ### * lgb.model.dt.tree
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.model.dt.tree
> ### Title: Parse a LightGBM model json dump
> ### Aliases: lgb.model.dt.tree
> 
> ### ** Examples
> 
> 
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> 
> params <- list(
+   objective = "binary"
+   , learning_rate = 0.01
+   , num_leaves = 63L
+   , max_depth = -1L
+   , min_data_in_leaf = 1L
+   , min_sum_hessian_in_leaf = 1.0
+ )
> model <- lgb.train(params, dtrain, 10L)
[LightGBM] [Warning] Starting from the 2.1.2 version, default value for the "boost_from_average" parameter in "binary" objective is true.
This may cause significantly different results comparing to the previous versions of LightGBM.
Try to set boost_from_average=false, if your old models produce bad results
[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001908 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
[LightGBM] [Info] Start training from score -0.071580
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
> 
> tree_dt <- lgb.model.dt.tree(model)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.model.dt.tree", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.plot.importance")
> ### * lgb.plot.importance
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.plot.importance
> ### Title: Plot feature importance as a bar graph
> ### Aliases: lgb.plot.importance
> 
> ### ** Examples
> 
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> 
> params <- list(
+     objective = "binary"
+     , learning_rate = 0.01
+     , num_leaves = 63L
+     , max_depth = -1L
+     , min_data_in_leaf = 1L
+     , min_sum_hessian_in_leaf = 1.0
+ )
> 
> model <- lgb.train(params, dtrain, 10L)
[LightGBM] [Warning] Starting from the 2.1.2 version, default value for the "boost_from_average" parameter in "binary" objective is true.
This may cause significantly different results comparing to the previous versions of LightGBM.
Try to set boost_from_average=false, if your old models produce bad results
[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002576 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
[LightGBM] [Info] Start training from score -0.071580
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
> 
> tree_imp <- lgb.importance(model, percentage = TRUE)
> lgb.plot.importance(tree_imp, top_n = 10L, measure = "Gain")
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.plot.importance", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.plot.interpretation")
> ### * lgb.plot.interpretation
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.plot.interpretation
> ### Title: Plot feature contribution as a bar graph
> ### Aliases: lgb.plot.interpretation
> 
> ### ** Examples
> 
> library(lightgbm)
> Sigmoid <- function(x) {1.0 / (1.0 + exp(-x))}
> Logit <- function(x) {log(x / (1.0 - x))}
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> setinfo(dtrain, "init_score", rep(Logit(mean(train$label)), length(train$label)))
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> 
> params <- list(
+   objective = "binary"
+   , learning_rate = 0.01
+   , num_leaves = 63L
+   , max_depth = -1L
+   , min_data_in_leaf = 1L
+   , min_sum_hessian_in_leaf = 1.0
+ )
> model <- lgb.train(params, dtrain, 10L)
[LightGBM] [Warning] Starting from the 2.1.2 version, default value for the "boost_from_average" parameter in "binary" objective is true.
This may cause significantly different results comparing to the previous versions of LightGBM.
Try to set boost_from_average=false, if your old models produce bad results
[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002502 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
> 
> tree_interpretation <- lgb.interprete(model, test$data, 1L:5L)
> lgb.plot.interpretation(tree_interpretation[[1L]], top_n = 10L)
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.plot.interpretation", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.prepare")
> ### * lgb.prepare
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.prepare
> ### Title: Data preparator for LightGBM datasets (numeric)
> ### Aliases: lgb.prepare
> 
> ### ** Examples
> 
> library(lightgbm)
> data(iris)
> 
> str(iris)
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : Factor w/ 3 levels "setosa","versicolor",..: 1 1 1 1 1 1 1 1 1 1 ...
> 
> str(lgb.prepare(data = iris)) # Convert all factors/chars to numeric
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : num  1 1 1 1 1 1 1 1 1 1 ...
> 
> ## Not run: 
> ##D # When lightgbm package is installed, and you do not want to load it
> ##D # You can still use the function!
> ##D lgb.unloader()
> ##D str(lightgbm::lgb.prepare(data = iris))
> ##D # 'data.frame':	150 obs. of  5 variables:
> ##D # $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
> ##D # $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
> ##D # $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
> ##D # $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
> ##D # $ Species     : num  1 1 1 1 1 1 1 1 1 1 ...
> ## End(Not run)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.prepare", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.prepare2")
> ### * lgb.prepare2
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.prepare2
> ### Title: Data preparator for LightGBM datasets (integer)
> ### Aliases: lgb.prepare2
> 
> ### ** Examples
> 
> library(lightgbm)
> data(iris)
> 
> str(iris)
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : Factor w/ 3 levels "setosa","versicolor",..: 1 1 1 1 1 1 1 1 1 1 ...
> 
> # Convert all factors/chars to integer
> str(lgb.prepare2(data = iris))
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : int  1 1 1 1 1 1 1 1 1 1 ...
> 
> ## Not run: 
> ##D # When lightgbm package is installed, and you do not want to load it
> ##D # You can still use the function!
> ##D lgb.unloader()
> ##D str(lightgbm::lgb.prepare2(data = iris))
> ##D # 'data.frame':	150 obs. of  5 variables:
> ##D # $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
> ##D # $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
> ##D # $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
> ##D # $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
> ##D # $ Species     : int  1 1 1 1 1 1 1 1 1 1 ...
> ## End(Not run)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.prepare2", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.prepare_rules")
> ### * lgb.prepare_rules
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.prepare_rules
> ### Title: Data preparator for LightGBM datasets with rules (numeric)
> ### Aliases: lgb.prepare_rules
> 
> ### ** Examples
> 
> library(lightgbm)
> data(iris)
> 
> str(iris)
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : Factor w/ 3 levels "setosa","versicolor",..: 1 1 1 1 1 1 1 1 1 1 ...
> 
> new_iris <- lgb.prepare_rules(data = iris) # Autoconverter
> str(new_iris$data)
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : num  1 1 1 1 1 1 1 1 1 1 ...
> 
> data(iris) # Erase iris dataset
> iris$Species[1L] <- "NEW FACTOR" # Introduce junk factor (NA)
Warning in `[<-.factor`(`*tmp*`, 1L, value = "NEW FACTOR") :
  invalid factor level, NA generated
> 
> # Use conversion using known rules
> # Unknown factors become 0, excellent for sparse datasets
> newer_iris <- lgb.prepare_rules(data = iris, rules = new_iris$rules)
> 
> # Unknown factor is now zero, perfect for sparse datasets
> newer_iris$data[1L, ] # Species became 0 as it is an unknown factor
  Sepal.Length Sepal.Width Petal.Length Petal.Width Species
1          5.1         3.5          1.4         0.2       0
> 
> newer_iris$data[1L, 5L] <- 1.0 # Put back real initial value
> 
> # Is the newly created dataset equal? YES!
> all.equal(new_iris$data, newer_iris$data)
[1] TRUE
> 
> # Can we test our own rules?
> data(iris) # Erase iris dataset
> 
> # We remapped values differently
> personal_rules <- list(Species = c("setosa" = 3L,
+                                    "versicolor" = 2L,
+                                    "virginica" = 1L))
> newest_iris <- lgb.prepare_rules(data = iris, rules = personal_rules)
> str(newest_iris$data) # SUCCESS!
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : int  3 3 3 3 3 3 3 3 3 3 ...
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.prepare_rules", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.prepare_rules2")
> ### * lgb.prepare_rules2
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.prepare_rules2
> ### Title: Data preparator for LightGBM datasets with rules (integer)
> ### Aliases: lgb.prepare_rules2
> 
> ### ** Examples
> 
> library(lightgbm)
> data(iris)
> 
> str(iris)
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : Factor w/ 3 levels "setosa","versicolor",..: 1 1 1 1 1 1 1 1 1 1 ...
> 
> new_iris <- lgb.prepare_rules2(data = iris) # Autoconverter
> str(new_iris$data)
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : int  1 1 1 1 1 1 1 1 1 1 ...
> 
> data(iris) # Erase iris dataset
> iris$Species[1L] <- "NEW FACTOR" # Introduce junk factor (NA)
Warning in `[<-.factor`(`*tmp*`, 1L, value = "NEW FACTOR") :
  invalid factor level, NA generated
> 
> # Use conversion using known rules
> # Unknown factors become 0, excellent for sparse datasets
> newer_iris <- lgb.prepare_rules2(data = iris, rules = new_iris$rules)
> 
> # Unknown factor is now zero, perfect for sparse datasets
> newer_iris$data[1L, ] # Species became 0 as it is an unknown factor
  Sepal.Length Sepal.Width Petal.Length Petal.Width Species
1          5.1         3.5          1.4         0.2       0
> 
> newer_iris$data[1L, 5L] <- 1.0 # Put back real initial value
> 
> # Is the newly created dataset equal? YES!
> all.equal(new_iris$data, newer_iris$data)
[1] TRUE
> 
> # Can we test our own rules?
> data(iris) # Erase iris dataset
> 
> # We remapped values differently
> personal_rules <- list(
+   Species = c(
+     "setosa" = 3L
+     , "versicolor" = 2L
+     , "virginica" = 1L
+   )
+ )
> newest_iris <- lgb.prepare_rules2(data = iris, rules = personal_rules)
> str(newest_iris$data) # SUCCESS!
'data.frame':	150 obs. of  5 variables:
 $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
 $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
 $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
 $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
 $ Species     : int  3 3 3 3 3 3 3 3 3 3 ...
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.prepare_rules2", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.save")
> ### * lgb.save
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.save
> ### Title: Save LightGBM model
> ### Aliases: lgb.save
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001908 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> lgb.save(model, "model.txt")
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.save", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.train")
> ### * lgb.train
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.train
> ### Title: Main training logic for LightGBM
> ### Aliases: lgb.train
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002022 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.train", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("lgb.unloader")
> ### * lgb.unloader
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: lgb.unloader
> ### Title: LightGBM unloading error fix
> ### Aliases: lgb.unloader
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002355 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> 
> ## Not run: 
> ##D lgb.unloader(restore = FALSE, wipe = FALSE, envir = .GlobalEnv)
> ##D rm(model, dtrain, dtest) # Not needed if wipe = TRUE
> ##D gc() # Not needed if wipe = TRUE
> ##D 
> ##D library(lightgbm)
> ##D # Do whatever you want again with LightGBM without object clashing
> ## End(Not run)
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("lgb.unloader", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("predict.lgb.Booster")
> ### * predict.lgb.Booster
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: predict.lgb.Booster
> ### Title: Predict method for LightGBM model
> ### Aliases: predict.lgb.Booster
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002516 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> preds <- predict(model, test$data)
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("predict.lgb.Booster", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("readRDS.lgb.Booster")
> ### * readRDS.lgb.Booster
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: readRDS.lgb.Booster
> ### Title: readRDS for 'lgb.Booster' models
> ### Aliases: readRDS.lgb.Booster
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+   params = params
+   , data = dtrain
+   , nrounds = 10L
+   , valids = valids
+   , min_data = 1L
+   , learning_rate = 1.0
+   , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001810 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> saveRDS.lgb.Booster(model, "model.rds")
> new_model <- readRDS.lgb.Booster("model.rds")
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("readRDS.lgb.Booster", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("saveRDS.lgb.Booster")
> ### * saveRDS.lgb.Booster
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: saveRDS.lgb.Booster
> ### Title: saveRDS for 'lgb.Booster' models
> ### Aliases: saveRDS.lgb.Booster
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> data(agaricus.test, package = "lightgbm")
> test <- agaricus.test
> dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
> params <- list(objective = "regression", metric = "l2")
> valids <- list(test = dtest)
> model <- lgb.train(
+     params = params
+     , data = dtrain
+     , nrounds = 10L
+     , valids = valids
+     , min_data = 1L
+     , learning_rate = 1.0
+     , early_stopping_rounds = 5L
+ )
[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.002494 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 232
[LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
[LightGBM] [Info] Start training from score 0.482113
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[1]:	test's l2:6.44165e-17 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[2]:	test's l2:1.97215e-31 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[3]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[4]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[5]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[6]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[7]:	test's l2:0 
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
[8]:	test's l2:0 
> saveRDS.lgb.Booster(model, "model.rds")
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("saveRDS.lgb.Booster", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("setinfo")
> ### * setinfo
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: setinfo
> ### Title: Set information of an 'lgb.Dataset' object
> ### Aliases: setinfo setinfo.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.construct(dtrain)
> 
> labels <- lightgbm::getinfo(dtrain, "label")
> lightgbm::setinfo(dtrain, "label", 1 - labels)
> 
> labels2 <- lightgbm::getinfo(dtrain, "label")
> stopifnot(all.equal(labels2, 1 - labels))
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("setinfo", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()
> nameEx("slice")
> ### * slice
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: slice
> ### Title: Slice a dataset
> ### Aliases: slice slice.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> 
> dsub <- lightgbm::slice(dtrain, seq_len(42L))
> lgb.Dataset.construct(dsub)
> labels <- lightgbm::getinfo(dsub, "label")
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("slice", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> ### * <FOOTER>
> ###
> cleanEx()
> options(digits = 7L)
> base::cat("Time elapsed: ", proc.time() - base::get("ptime", pos = 'CheckExEnv'),"\n")
Time elapsed:  12.38 1.34 13.51 NA NA 
> grDevices::dev.off()
null device 
          1 
> ###
> ### Local variables: ***
> ### mode: outline-minor ***
> ### outline-regexp: "\\(> \\)?### [*]+" ***
> ### End: ***
> quit('no')
Test logs on i386:

R version 4.0.0 alpha (2020-03-26 r78078)
Copyright (C) 2020 The R Foundation for Statistical Computing
Platform: i386-w64-mingw32/i386 (32-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> pkgname <- "lightgbm"
> source(file.path(R.home("share"), "R", "examples-header.R"))
> options(warn = 1)
> options(pager = "console")
> base::assign(".ExTimings", "lightgbm-Ex.timings", pos = 'CheckExEnv')
> base::cat("name\tuser\tsystem\telapsed\n", file=base::get(".ExTimings", pos = 'CheckExEnv'))
> base::assign(".format_ptime",
+ function(x) {
+   if(!is.na(x[4L])) x[1L] <- x[1L] + x[4L]
+   if(!is.na(x[5L])) x[2L] <- x[2L] + x[5L]
+   options(OutDec = '.')
+   format(x[1L:3L], digits = 7L)
+ },
+ pos = 'CheckExEnv')
> 
> ### * </HEADER>
> library('lightgbm')
Loading required package: R6
> 
> base::assign(".oldSearch", base::search(), pos = 'CheckExEnv')
> base::assign(".old_wd", base::getwd(), pos = 'CheckExEnv')
> cleanEx()
> nameEx("dim")
> ### * dim
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: dim.lgb.Dataset
> ### Title: Dimensions of an 'lgb.Dataset'
> ### Aliases: dim.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
Loading required package: Matrix
> 
> stopifnot(nrow(dtrain) == nrow(train$data))
> stopifnot(ncol(dtrain) == ncol(train$data))
> stopifnot(all(dim(dtrain) == dim(train$data)))
> 
> 
> 
> 
> base::assign(".dptime", (proc.time() - get(".ptime", pos = "CheckExEnv")), pos = "CheckExEnv")
> base::cat("dim", base::get(".format_ptime", pos = 'CheckExEnv')(get(".dptime", pos = "CheckExEnv")), "\n", file=base::get(".ExTimings", pos = 'CheckExEnv'), append=TRUE, sep="\t")
> cleanEx()

detaching 'package:Matrix'

> nameEx("dimnames.lgb.Dataset")
> ### * dimnames.lgb.Dataset
> 
> flush(stderr()); flush(stdout())
> 
> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
> ### Name: dimnames.lgb.Dataset
> ### Title: Handling of column names of 'lgb.Dataset'
> ### Aliases: dimnames.lgb.Dataset dimnames<-.lgb.Dataset
> 
> ### ** Examples
> 
> library(lightgbm)
> data(agaricus.train, package = "lightgbm")
> train <- agaricus.train
> dtrain <- lgb.Dataset(train$data, label = train$label)
> lgb.Dataset.construct(dtrain)
Error in dataset$construct() : 
  lgb.Dataset.construct: cannot create Dataset handle
Calls: lgb.Dataset.construct -> <Anonymous>
Execution halted

@jameslamb (Collaborator, Author)
Closing this now that there is an open pull request for it: #3188
