Add iTransformer multivariate forecaster #3017
Conversation
@lostella I realize that the …
@kashif yes, if I read the code correctly you should be able to set …
examples/iTransformer.ipynb
I think we should remove this examples folder eventually. How about keeping this notebook close to the model, i.e. in its subpackage? pip install should ignore it anyway.
this examples folder is nice, as it's the only one with a multivariate model (showing the grouping and evaluation), as well as plotting some of the dimensions of the multivariate output...
@kashif maybe for tests, we introduce a …
ready!
Thanks @kashif! 🚀
Trying to use this sample with my own dataset; when using the estimator I now get the following error: … Where does it infer the target dim of 2, and how can I change that?
you cannot change it, as iTransformer is an inherently multivariate model... so the dataset has to be of shape: [batch_size, context_length, multivariate_dim > 1]
recall that the model takes a representation of each variate, makes a set of vectors out of them, and passes them to an unmasked transformer. The transformer, as you know, is a set-equivariant layer; since there is no ordering of the variates in a multivariate dataset, this layer makes perfect sense. However, iTransformer doesn't make sense in the univariate setting, as there would be just a single vector as input to the transformer, which then reduces to an MLP...
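The inverted-tokenization idea described above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the implementation in this PR; the class and parameter names are made up:

```python
import torch
import torch.nn as nn


class InvertedEncoder(nn.Module):
    """Each variate's full history becomes one token; an unmasked
    transformer encoder then mixes information across variates."""

    def __init__(self, context_length: int, d_model: int = 64, nhead: int = 4):
        super().__init__()
        # embed the whole time axis of one variate into a single vector
        self.embed = nn.Linear(context_length, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, context_length, num_variates] with num_variates > 1
        tokens = self.embed(x.transpose(1, 2))  # [batch, num_variates, d_model]
        # no causal mask: attention runs over the (unordered) set of variates
        return self.encoder(tokens)


out = InvertedEncoder(context_length=96)(torch.randn(8, 96, 7))
print(out.shape)  # torch.Size([8, 7, 64])
```

With a single variate the token set has size one, so the attention layer contributes nothing and only the feed-forward part remains, which matches the "just an MLP" remark above.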
Thanks! One more thing: how do I define the training set length? I am wondering if the dataloader is sampling it somehow, since no matter how long a time series I put in, the training time is the same.
right, that is because when training a shared (global) deep learning model, where the sizes of the time series can vary, the scheme is to randomly sample some fixed context-length window and the subsequent prediction-length window within the training dataset... this way, over time, the model learns from all the data. Use the …
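The random-window scheme described above can be sketched as follows. This is a simplified illustration with made-up function names; GluonTS uses its own instance splitters and samplers internally:

```python
import numpy as np


def sample_training_window(series, context_length, prediction_length, rng):
    """Pick a random context window and the prediction window right after it."""
    total = context_length + prediction_length
    # series: [time, num_variates]; any series of length >= total can be sampled
    start = rng.integers(0, len(series) - total + 1)
    past = series[start : start + context_length]
    future = series[start + context_length : start + total]
    return past, future


series = np.arange(200.0).reshape(100, 2)  # 100 time steps, 2 variates
past, future = sample_training_window(
    series, context_length=24, prediction_length=12, rng=np.random.default_rng(0)
)
print(past.shape, future.shape)  # (24, 2) (12, 2)
```

Because each training batch draws fresh windows, one pass over "the dataset" takes a fixed number of steps regardless of how long the underlying series are, which is why the training time does not change.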
Okay, and how do I make it learn more over time?
do more steps of gradient descent and hope it doesn't overfit...
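Concretely, "more steps of gradient descent" usually means raising the epoch count on the estimator. A hypothetical configuration fragment (the estimator class name and exact argument names here are assumptions; PyTorch-based GluonTS estimators generally forward `trainer_kwargs` to the Lightning trainer):

```python
# hypothetical names -- check the estimator's actual signature
estimator = ITransformerEstimator(
    prediction_length=12,
    context_length=24,
    trainer_kwargs={"max_epochs": 100},  # more epochs = more gradient steps
)
```

Watch validation loss while increasing `max_epochs`; past some point the extra steps only overfit.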
Thanks so far! Training works well, but I get an error when I want to evaluate it at the step …

`IndexError: too many indices for array: array is 1-dimensional, but 2 were indexed`

I am thinking that the way I create the dataset might be wrong. For example, my dataset looks like:

Date | ObjectName | Consumption_Elec | Temperature | Consumption_Heat
-- | -- | -- | -- | --
2018-01-01 01:00:00 | Object1 | 17.16 | 0.0 | 0.1155

And I am trying to create the dataset by …
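That `IndexError` is consistent with handing the model a 1-D target where a 2-D (multivariate) one is expected. Since the original dataset-construction code was truncated, here is only a hedged sketch of the pivot step, using the column names from the snippet above and made-up values for the second row:

```python
import pandas as pd

# long-format rows: one set of measurements per (Date, ObjectName)
df = pd.DataFrame({
    "Date": pd.to_datetime(["2018-01-01 01:00:00", "2018-01-01 02:00:00"]),
    "ObjectName": ["Object1", "Object1"],
    "Consumption_Elec": [17.16, 18.02],   # second row is invented for the example
    "Temperature": [0.0, -0.5],
    "Consumption_Heat": [0.1155, 0.1203],
})

# select one object and keep only the measurement columns, indexed by time
wide = df[df["ObjectName"] == "Object1"].set_index("Date")[
    ["Consumption_Elec", "Temperature", "Consumption_Heat"]
]

# multivariate GluonTS targets are 2-D: [num_variates, time]
target = wide.to_numpy().T
print(target.shape)  # (3, 2)
```

If the target ends up 1-D (e.g. a single column per entry), the multivariate evaluation path will index it with two indices and raise exactly this error.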
Implemented the probabilistic iTransformer model from the paper: https://arxiv.org/pdf/2310.06625.pdf