
Add exogenous regressor support to sklearn models. #128

Merged · 2 commits merged into main from aadyot/exog_upgrade on Oct 26, 2022

Conversation

@aadyotb (Contributor) commented Oct 20, 2022

This pull request upgrades the SKLearnForecasterBase class to support exogenous regressors. This extends exogenous regressor support to tree-based models, and it should provide a template for adding the feature to deep learning models in the future.
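For readers outside the codebase, here is a minimal sketch of the general pattern (not the actual SKLearnForecasterBase implementation): build a design matrix by concatenating lagged target values with the aligned exogenous regressors at each time step, then fit any sklearn regressor on it. The make_design_matrix helper and all variable names below are hypothetical.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def make_design_matrix(target, exog, maxlags):
    """Hypothetical helper: stack lagged target values with exogenous columns."""
    X, y = [], []
    for t in range(maxlags, len(target)):
        lag_feats = target[t - maxlags:t]                # autoregressive window
        X.append(np.concatenate([lag_feats, exog[t]]))   # append exogenous values at time t
        y.append(target[t])
    return np.asarray(X), np.asarray(y)

rng = np.random.default_rng(0)
target = rng.normal(size=200)                            # univariate target series
exog = rng.normal(size=(200, 2))                         # two exogenous regressors

X, y = make_design_matrix(target, exog, maxlags=5)
model = GradientBoostingRegressor().fit(X, y)            # any sklearn tree model works here
```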

Also update length to be the number of batches, rather than the number of data points.
@aadyotb aadyotb requested a review from Emerald01 October 25, 2022 20:36
@@ -278,4 +298,4 @@ def _get_immedidate_forecasting_prior(self, data):
if isinstance(data, TimeSeries):
data = data.to_pd()
data = data.values
return data[-self.maxlags :].reshape(-1, order="F")
@Emerald01 (Collaborator) commented Oct 26, 2022

I used order="F" so that the flattened array places the data from the same sequence together, i.e. [first sequence, second sequence, ...]. This makes debugging easier, since I can locate those sequences in the rolling window much more easily than if the data were flattened along the time axis. Since we have confirmed that the manipulation is correct, it should be fine as long as the RollingWindow class and sklearn_base use the same reshape order.

Do we run a unit test to confirm the values are the same with the same parameter settings?
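(For illustration only, not the PR's actual data: a minimal numpy example of the point above, where rows are time steps and columns are sequences, so order="F" groups the flattened values by sequence while the default C order interleaves them along the time axis.)

```python
import numpy as np

window = np.array([[10, 40],    # t=0: sequence A, sequence B
                   [20, 50],    # t=1
                   [30, 60]])   # t=2

print(window.reshape(-1, order="F"))  # [10 20 30 40 50 60] -> [sequence A, sequence B]
print(window.reshape(-1, order="C"))  # [10 40 20 50 30 60] -> interleaved along time
```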

@aadyotb (Contributor, Author)

I've confirmed that the values are the same with equivalent parameter settings. I was actually running into bugs when I called x.reshape(..., order="F") multiple times on a single array (I think each subsequent call changes the underlying data order), which is why I changed it. I also found the logic of this method easier to understand.
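(A minimal numpy sketch of the kind of mismatch described above, not necessarily the exact bug in this PR: reshape does not mutate the array, but a default C-order reshape does not undo an earlier order="F" flatten, so mixing orders across calls scrambles the values.)

```python
import numpy as np

x = np.array([[0, 1],
              [2, 3],
              [4, 5]])

flat = x.reshape(-1, order="F")          # [0 2 4 1 3 5]
wrong = flat.reshape(3, 2)               # default C order -> [[0 2] [4 1] [3 5]], not x
right = flat.reshape(3, 2, order="F")    # [[0 1] [2 3] [4 5]], recovers x

assert not np.array_equal(wrong, x)
assert np.array_equal(right, x)
```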

@aadyotb aadyotb merged commit 707f9e1 into main Oct 26, 2022
@aadyotb aadyotb deleted the aadyot/exog_upgrade branch October 26, 2022 21:29