
Future features #1757

Open · wants to merge 5 commits into base: dev

Conversation

mbohlkeschneider (Contributor) commented:

Issue #, if available:

Addresses issue #1657.

Description of changes:

Modified DeepAR to take optional past_feat_dynamic_real features that are only known in the training range. This required some changes in the InstanceSplitter to accommodate that feature as a separate field. Additionally, some changes in the ForecastGenerator and the SelectFields transformation were required to support optional fields.
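As a rough illustration of the data side of this change, a dataset entry carrying a past-only feature might look like the sketch below. Field names follow `gluonts.dataset.field_names.FieldName`; the shapes, lengths, and the plain-string `start` value are assumptions for illustration, not the PR's API.

```python
import numpy as np

# Hedged sketch of a dataset entry with a past-only dynamic feature.
# Shapes and the string "start" value are illustrative assumptions.
prediction_length = 24
target = np.random.randn(200)

entry = {
    "start": "2021-01-01",
    "target": target,
    # Known over the full range (training + prediction horizon):
    "feat_dynamic_real": np.random.randn(1, len(target) + prediction_length),
    # Known only over the training range -- no values for the prediction horizon:
    "past_feat_dynamic_real": np.random.randn(1, len(target)),
}
```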

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

Please tag this PR with at least one of these labels to make our release process faster: BREAKING, new feature, bug fix, other change, dev setup

@mbohlkeschneider mbohlkeschneider added the new feature (one of pr required labels) label Nov 5, 2021
@dcmaddix dcmaddix requested a review from lovvge November 9, 2021 18:27
dcmaddix (Contributor) commented Nov 9, 2021:

Nice, thanks @mbohlkeschneider! @lovvge Michael has extended DeepAR to also support past dynamic features, similar to MQ-CNN.

```python
)
self.rnn.add(cell)
self.rnn.cast(dtype=dtype)
self.rnn = self.rnn_block(
```
rshyamsundar (Contributor) commented Nov 18, 2021:

Maybe rename this as self.encoder for better readability? Then self.decoder is either a reference to it or an entirely different block, depending on the past-only features.

mbohlkeschneider (Author) replied:

Thank you. Yes, I struggled with the naming a bit. In the canonical DeepAR, encoder would sound odd (although I think it would still be fine).
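The encoder/decoder suggestion above could be sketched roughly as follows. `DummyRNN` and `use_past_only_features` are illustrative stand-ins, not the actual GluonTS classes or PR code:

```python
# Hedged sketch of the suggested encoder/decoder naming. DummyRNN and
# use_past_only_features are stand-ins for illustration only.
class DummyRNN:
    """Placeholder for an RNN block (e.g. a sequential RNN cell stack)."""

use_past_only_features = True  # assumed configuration flag

encoder = DummyRNN()
# Past-only features are unavailable in the prediction range, so the decoder
# must then be a separate block; otherwise it can simply alias the encoder.
decoder = DummyRNN() if use_past_only_features else encoder
```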

```python
future_time_feat: Tensor,
future_target: Tensor,
future_observed_values: Tensor,
past_past_feat_dynamic_real: Optional[Tensor] = None,
```
Contributor commented:

past two times? Do you mean past_only_feat_dynamic_real or past_only_time_feat? Also, the docstring needs to be updated for this new argument.

mbohlkeschneider (Author) replied:

Yes, this is weird as well. My goal was to use the built-in field naming from FieldName for the hybrid_forward call (https://github.com/awslabs/gluon-ts/blob/master/src/gluonts/dataset/field_names.py#L30), which would be past_feat_dynamic_real. This then gets turned into past_past_feat_dynamic_real in the InstanceSplitter: https://github.com/awslabs/gluon-ts/blob/master/src/gluonts/transform/split.py#L179.

One way to circumvent this would be a separate channel in the InstanceSplitter that does not apply the field-name modification to that channel (the past-only time series channel)?

Do you have other suggestions?
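The double-prefix issue being discussed can be illustrated with a minimal stand-in for the InstanceSplitter (this is not the real implementation, just a sketch of its field-prefixing behavior):

```python
import numpy as np

# Minimal illustration of how the "past_"/"future_" prefixing produces the
# awkward name: the splitter prepends a prefix to every time-series field it
# splits, so a field already named "past_feat_dynamic_real" comes out as
# "past_past_feat_dynamic_real". Not the actual InstanceSplitter code.
def split_instance(entry, ts_fields, split_point):
    out = {}
    for field in ts_fields:
        series = entry[field]
        out["past_" + field] = series[..., :split_point]
        out["future_" + field] = series[..., split_point:]
    return out

entry = {"past_feat_dynamic_real": np.arange(10.0).reshape(1, 10)}
split = split_instance(entry, ["past_feat_dynamic_real"], split_point=7)
```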

```diff
@@ -1140,6 +1209,8 @@ def hybrid_forward(
         Tensor
             Predicted samples
         """
+        if past_past_feat_dynamic_real is not None:
+            past_past_feat_dynamic_real.mean()
```
Contributor commented:

Why is this required?

mbohlkeschneider (Author) replied:

Thanks for spotting this. Will remove it; this was just to test whether the data actually gets passed through.

```diff
@@ -127,4 +127,4 @@ def __init__(self, input_fields: List[str]) -> None:
         self.input_fields = input_fields

     def map_transform(self, data: DataEntry, is_train: bool) -> DataEntry:
-        return {f: data[f] for f in self.input_fields}
+        return {f: data[f] for f in self.input_fields if f in data.keys()}
```
Contributor commented:

Maybe this results in silent errors in other cases (required fields being ignored simply because they are not in the data)?

mbohlkeschneider (Author) replied:

Good point. I don't think the error would be silent in this case (at least MXNet models will complain for sure!). This change is required to support optional fields in the network forward passes, though. I think the fundamental question we have to ask is:

Do we want to have optional inputs for the networks or not?
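One possible middle ground for that question, sketched below as an assumption rather than part of this PR: declare optional fields explicitly, so genuinely required fields still fail loudly while declared-optional ones may be absent. The class name and `optional_fields` parameter are hypothetical.

```python
from typing import Any, Dict, List, Sequence

# Hypothetical variant of SelectFields (not the PR's implementation):
# required fields raise immediately when missing, while fields declared
# optional may be absent from the data entry.
class SelectFieldsWithOptional:
    def __init__(
        self, input_fields: List[str], optional_fields: Sequence[str] = ()
    ) -> None:
        self.input_fields = input_fields
        self.optional_fields = set(optional_fields)

    def map_transform(self, data: Dict[str, Any], is_train: bool) -> Dict[str, Any]:
        out = {}
        for f in self.input_fields:
            if f in data:
                out[f] = data[f]
            elif f not in self.optional_fields:
                # Loud failure for genuinely required fields.
                raise KeyError(f"required field {f!r} is missing")
        return out
```

This would avoid the silent-error concern while still allowing optional network inputs such as past_feat_dynamic_real.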

DushyantSahoo commented:

Is there a timeline for this feature?
