XGB in kdb+/q

XGB is a distributed, greedy gradient boosting library for kdb+/q designed to be highly efficient. It is a native implementation in kdb+/q and does not depend on any other module. It implements machine learning algorithms under the gradient boosting framework, providing native distributed tree boosting that solves many data science problems quickly and accurately. You can use xgb whether the data is held in memory or parted on disk.
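To make the idea concrete, here is a minimal sketch of greedy gradient boosting with one-feature regression stumps under a squared-error objective, written in plain q. The names fitStump, predStump and boost, the stump learner, and the squared-error loss are all illustrative assumptions; this is not the xgb library's API.

```q
/ illustrative sketch only; fitStump, predStump and boost are hypothetical names

/ fit a regression stump on feature x against residuals r:
/ choose the split that minimises the total squared error of the two leaf means
fitStump:{[x;r]
  splits:1_distinct asc x;
  sse:{[x;r;s] l:r where x<s; h:r where not x<s;
    sum[(l-avg l) xexp 2]+sum (h-avg h) xexp 2}[x;r] each splits;
  s:splits sse?min sse;
  `split`lo`hi!(s;avg r where x<s;avg r where not x<s)}

/ evaluate a stump on feature values
predStump:{[m;x] ?[x<m`split;m`lo;m`hi]}

/ greedy boosting: each round fits a stump to the current residuals
/ and adds a damped copy of its predictions (eta = learning rate, n = rounds)
boost:{[x;y;eta;n]
  step:{[x;eta;st]
    m:fitStump[x;st[`y]-st`p];
    st[`p]+:eta*predStump[m;x];
    st}[x;eta];
  step/[n;`y`p!(y;count[y]#avg y)]}

/ toy example on in-memory vectors
x:100?10f; y:(3*x)+100?1f;
fit:boost[x;y;0.1;50];
```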

Why reimplement an existing framework in kdb+/q instead of just using xgboost from Python/R inside kdb+/q?

Python/R require the data to fit into memory, so data size becomes the limitation. With xgb, as long as the data is parted on disk you can apply it to your dataset. I am using it to participate in the Kaggle Bosch competition.
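As a rough sketch of why parted-on-disk data sidesteps the memory limit, the following generic q walks a hypothetical date-partitioned table one partition at a time and combines per-partition sufficient statistics. The table t, the column x, and the statistics chosen are assumptions for illustration only, not xgb internals.

```q
/ assumes a date-partitioned database has already been loaded (e.g. \l /path/to/db)
/ and contains a table t with a numeric column x; generic q, not the xgb API
perPart:{[d] exec n:count i, sx:sum x, sxx:sum x*x from t where date=d} each .Q.pv;
tot:sum perPart;                / per-partition sufficient statistics simply add up
m:tot[`sx]%tot`n;               / global mean without loading the full column at once
v:(tot[`sxx]%tot`n)-m*m;        / global variance, likewise
```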

Reference
