Add Dart booster. #1220

Merged: 1 commit merged into dmlc:master, Jun 8, 2016
Conversation

marugari (Contributor)

@@ -313,8 +352,9 @@ class GBTree : public GradientBooster {
}
}
// commit new trees all at once
inline void CommitModel(std::vector<std::unique_ptr<RegTree> >&& new_trees,
int bst_group) {
inline virtual void

A Member commented on this diff:
Remove inline; inline and virtual conflict with each other.

tqchen (Member) commented May 22, 2016

@marugari Thanks for the updates. Normally you do not need to open another PR; simply updating the contents in your branch will update this PR.

I have made a few more comments. Please also fix the lint error as indicated in https://travis-ci.org/dmlc/xgboost/jobs/132091950

You can reproduce the style check locally by typing make lint

tqchen (Member) commented May 29, 2016

Any updates on this?

@marugari marugari force-pushed the prototype_dart branch 7 times, most recently from da6b5c4 to a99b7e8, June 5, 2016 15:56
marugari (Contributor, Author) commented Jun 5, 2016

Updated. Sorry for the slow pace of my work.

- type of sampling algorithm.
- "uniform": dropped trees are selected uniformly.
- "weighted": dropped trees are selected in proportion to weight.
* normalize_type [default=0]

A Member commented on this line:
default= "original"
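
(For context, the sampling and normalization options reviewed above would be supplied as booster parameters. The sketch below is hedged: the parameter `rate_drop` and the `xgb.train` call are assumed from the standard Python API, not quoted from this PR.)

```python
# Hypothetical DART parameter set, based on the options discussed in this PR.
# "rate_drop" (dropout fraction) is an assumption here, not quoted above.
params = {
    "booster": "dart",             # select the DART booster added by this PR
    "objective": "binary:logistic",
    "eta": 0.1,                    # learning rate
    "max_depth": 4,
    "sample_type": "uniform",      # or "weighted": drop in proportion to weight
    "normalize_type": "original",  # naming still under discussion in this thread
    "rate_drop": 0.1,              # fraction of trees dropped each iteration
}

# Training would then proceed as with any other booster, e.g.:
#   import xgboost as xgb
#   bst = xgb.train(params, dtrain, num_boost_round=50)
```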

marugari (Contributor, Author) commented Jun 7, 2016

I fixed it.

tqchen (Member) commented Jun 7, 2016

One last thing. The names "original gbtree" and "learning rate" for normalize type can be a bit confusing, as they are terms used elsewhere. Any ideas for better terms? What are the terms used by the paper?
On Mon, Jun 6, 2016 at 5:06 PM Yoshinori Nakano notifications@github.com
wrote:

I fixed it.



marugari (Contributor, Author) commented Jun 7, 2016

In the paper, the authors propose one normalization type ("original" in my implementation). It is applicable when learning_rate = 1. For learning_rate < 1, I added "learning_rate".

I will consider appropriate names.
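
(For reference, a hedged sketch of the two normalizations, with k dropped trees and learning rate η; this is one reading of the discussion, consistent with the normalize_type="tree" formula in later XGBoost documentation, and the exact scaling in the code may differ:)

```latex
% "original" (paper) normalization, intended for \eta = 1:
% the new tree enters with weight 1/(k+1) and the k dropped trees are
% scaled by k/(k+1), so their combined contribution is preserved:
\tilde{f}_{\text{new}}  = \frac{1}{k+1}\, f_{\text{new}}, \qquad
\tilde{f}_{\text{drop}} = \frac{k}{k+1}\, f_{\text{drop}}

% "learning_rate" normalization, accounting for \eta < 1
% (reduces to the scheme above when \eta = 1):
\tilde{f}_{\text{new}}  = \frac{1}{k+\eta}\, f_{\text{new}}, \qquad
\tilde{f}_{\text{drop}} = \frac{k}{k+\eta}\, f_{\text{drop}}
```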

tqchen (Member) commented Jun 7, 2016

Does that mean we should automatically choose the current "learning_rate" by default?

marugari (Contributor, Author) commented Jun 8, 2016

If my calculation is correct, we should choose "learning_rate" as the normalize type.

tqchen (Member) commented Jun 8, 2016

Let us remove normalize type for now, and go with your implementation as the default then :)

tqchen (Member) commented Jun 8, 2016

Please make the update and I will merge the changes in.

@tqchen tqchen merged commit 949d1e3 into dmlc:master Jun 8, 2016
tqchen (Member) commented Jun 8, 2016

Thanks, this is merged!

tqchen (Member) commented Jun 8, 2016

@marugari Thanks for the great work bringing in the DART trainer.

To make it better known and widely accessible to users, I would like to invite you to write a markdown guest blogpost introducing what DART is and how to use it in xgboost, with a few code examples. We can post it on the DMLC website as well as in the XGBoost documentation. Please let me know what you think, and let me know if you need help reviewing it.

You can first create a PR to https://github.com/dmlc/xgboost/tree/master/doc/tutorials
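
(For a blogpost of that kind, the dropout step itself is easy to illustrate. Below is a hedged, self-contained plain-Python sketch of how DART-style tree dropout and the two normalizations discussed in this thread could work; the function names and details are illustrative, not taken from this PR's C++ code.)

```python
import random

def choose_dropped_trees(tree_weights, rate_drop, sample_type="uniform", rng=None):
    """Pick which trees to drop this boosting round (illustrative sketch).

    sample_type="uniform":  each tree is dropped with probability rate_drop.
    sample_type="weighted": a tree's drop probability is proportional to its weight.
    """
    rng = rng or random.Random()
    n = len(tree_weights)
    if sample_type == "uniform":
        probs = [rate_drop] * n
    elif sample_type == "weighted":
        total = sum(tree_weights)
        probs = [min(1.0, n * rate_drop * w / total) for w in tree_weights]
    else:
        raise ValueError("unknown sample_type: %r" % sample_type)
    return [i for i, p in enumerate(probs) if rng.random() < p]

def normalize_weights(k, eta, normalize_type="original"):
    """Return (new_tree_weight, dropped_tree_scale) for k dropped trees.

    "original":      the paper's scheme, intended for learning rate eta = 1.
    "learning_rate": the variant from this thread, accounting for eta < 1.
    """
    if normalize_type == "original":
        return 1.0 / (k + 1), k / (k + 1.0)
    elif normalize_type == "learning_rate":
        return 1.0 / (k + eta), k / (k + eta)
    raise ValueError("unknown normalize_type: %r" % normalize_type)
```

With rate_drop=0 no trees are ever dropped and DART reduces to ordinary gradient boosting; with rate_drop=1 every tree is dropped each round, which behaves like random-forest-style bagging, matching the DART paper's description.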

marugari (Contributor, Author) commented Jun 8, 2016

Thank you for your great support.

I'm writing a Japanese blog post. Documents will be submitted afterward.

marugari (Contributor, Author) commented Jun 9, 2016

I found some mistakes in my implementation:
marugari@17bafbe
https://github.com/marugari/Notebooks/blob/master/test_dart.ipynb

I apologize for the trouble.

tqchen (Member) commented Jun 9, 2016

No problem, please feel free to open a PR to update the code.

tqchen (Member) commented Jun 15, 2016

@marugari Any updates on possible guest blogpost?

@tqchen tqchen mentioned this pull request Jul 29, 2016
@lock lock bot locked as resolved and limited conversation to collaborators Jan 19, 2019