Added Non Parametric DML with the weighting trick and ForestDML, ForestDRLearner with CIs #170
Merged
Conversation
…in bootstrap is fixed.
Is there a reason to combine both of these changes into one PR? It makes for a lot of code to review at once...
…me minor changes to the class hierarchy to remove code duplication.
kbattocchi approved these changes on Nov 28, 2019
LGTM
…lue when set to 'auto', so the next call to the estimator wouldn't recalculate it. Now a new attribute is created at training time and the original property remains 'auto'.
…on of attribute subsample_fr_
moprescu approved these changes on Dec 3, 2019
…conML into vasilis/forest_dml
Added a `SubsampledHonestForest` scikit-learn extension, which is a regression forest that implements honesty and, instead of bootstrap, performs subsampling to construct each tree. It also offers `predict_interval` via the bootstrap-of-little-bags approach and the asymptotic normal characterization of the prediction estimate.
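A hypothetical usage sketch of the new forest follows; the `econml.sklearn_extensions.ensemble` import path, the `subsample_fr` keyword, and the exact `predict_interval` signature are assumptions inferred from this description and the review comments, not confirmed API.

```python
# Hypothetical sketch: SubsampledHonestForest as a drop-in scikit-learn regressor
# that also provides prediction intervals. Import path and keyword names are
# assumptions; consult the merged code for the actual API.
import numpy as np
from econml.sklearn_extensions.ensemble import SubsampledHonestForest  # assumed path

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 5))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=2000)

forest = SubsampledHonestForest(n_estimators=500, subsample_fr='auto', random_state=0)
forest.fit(X, y)                          # standard scikit-learn fit/predict interface
y_hat = forest.predict(X[:10])
# Intervals via the bootstrap of little bags / asymptotic normal approximation.
lower, upper = forest.predict_interval(X[:10], alpha=0.05)
```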
Added `NonParamDMLCateEstimator`, which is essentially another meta-learner that has an arbitrary final stage supporting `fit` and `predict` (albeit `fit` must accept `sample_weight`). This is based on the observation that, when the treatment is single-dimensional or binary, one can view the RLearner problem as a weighted regression.
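To make the weighting observation concrete, here is a minimal sketch using only plain scikit-learn components (the data-generating process and model choices are illustrative, not part of the PR): after residualizing the outcome and treatment, minimizing the RLearner loss E[(Y_res - theta(X) * T_res)^2] is equivalent to regressing Y_res / T_res on X with sample weights T_res^2, so any final-stage regressor that accepts `sample_weight` can estimate the CATE.

```python
# Sketch of the weighting trick behind the non-parametric DML estimator
# (illustrative only; the real class adds cross-fitting, discrete treatments,
# featurization, etc.).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                       # heterogeneity features
W = rng.normal(size=(n, 2))                       # confounders
T = X[:, 0] + W[:, 0] + rng.normal(size=n)        # continuous treatment
theta = 1 + 2 * X[:, 0]                           # true CATE, used only to simulate Y
Y = theta * T + W[:, 1] + rng.normal(size=n)

XW = np.hstack([X, W])
# Residualize outcome and treatment with cross-fitted nuisance models.
Y_res = Y - cross_val_predict(GradientBoostingRegressor(), XW, Y, cv=2)
T_res = T - cross_val_predict(GradientBoostingRegressor(), XW, T, cv=2)

# Weighted regression: regress Y_res / T_res on X with weights T_res**2.
final = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
final.fit(X, Y_res / T_res, sample_weight=T_res ** 2)
cate_hat = final.predict(X)                       # approximates theta(X)
```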
Added `ForestDMLCateEstimator` (essentially a causal forest, implemented slightly differently by viewing it as a weighted non-parametric regression and piggybacking on scikit-learn tree construction), which has bootstrap-of-little-bags based inference. This is essentially a `NonParamDMLCateEstimator` with a `SubsampledHonestForest` final model.
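A hypothetical end-to-end sketch of the forest DML estimator; the constructor arguments, the `'blb'` inference string, and the import path are assumptions based on the description above rather than confirmed API.

```python
# Hypothetical sketch of ForestDMLCateEstimator usage (keyword names, the import
# path, and the 'blb' inference flag are assumptions; see the merged docs/code).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from econml.dml import ForestDMLCateEstimator  # assumed import path

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 4))
W = rng.normal(size=(n, 3))
T = rng.binomial(1, 0.5, size=n)                        # binary treatment
Y = (1 + X[:, 0]) * T + W[:, 0] + rng.normal(size=n)

est = ForestDMLCateEstimator(model_y=GradientBoostingRegressor(),
                             model_t=GradientBoostingClassifier(),
                             discrete_treatment=True,
                             n_estimators=500)
est.fit(Y, T, X=X, W=W, inference='blb')                # bootstrap-of-little-bags CIs
te = est.effect(X[:5])
lb, ub = est.effect_interval(X[:5], alpha=0.05)
```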
Also added `ForestDRLearner`, which uses the doubly robust approach with an honest forest for each pseudo-outcome regression. This brings non-parametric confidence intervals to the doubly robust estimation classes. This is essentially a `DRLearner` with a `SubsampledHonestForest` final model.
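For intuition, a rough sketch of the doubly robust pseudo-outcome that the final forest regresses on X, written with plain scikit-learn stand-ins (the real class adds cross-fitting and the honest-forest final stage; everything named here is illustrative):

```python
# Illustrative sketch of the doubly robust pseudo-outcome regression underlying
# a forest-based DR learner (plain scikit-learn stand-ins, no cross-fitting).
import numpy as np
from sklearn.ensemble import (GradientBoostingRegressor, GradientBoostingClassifier,
                              RandomForestRegressor)

rng = np.random.default_rng(2)
n = 3000
X = rng.normal(size=(n, 4))
T = rng.binomial(1, 0.5, size=n)
Y = (1 + X[:, 0]) * T + X[:, 1] + rng.normal(size=n)

# Nuisance models: per-arm outcome regressions and a propensity model.
g0 = GradientBoostingRegressor().fit(X[T == 0], Y[T == 0])
g1 = GradientBoostingRegressor().fit(X[T == 1], Y[T == 1])
p = GradientBoostingClassifier().fit(X, T).predict_proba(X)[:, 1]

# Doubly robust pseudo-outcome for the effect of T=1 vs T=0.
psi = (g1.predict(X) - g0.predict(X)
       + T * (Y - g1.predict(X)) / p
       - (1 - T) * (Y - g0.predict(X)) / (1 - p))

# Final stage: regress the pseudo-outcome on X (an honest forest in the PR).
cate_model = RandomForestRegressor(n_estimators=300, min_samples_leaf=20, random_state=0)
cate_model.fit(X, psi)
cate_hat = cate_model.predict(X[:5])
```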
Side additions: `LinearDMLCateEstimator` and `SubsampledHonestForest` …