Error in forecasting code #12
Comments
Hi, I have run the code with the sample electricity dataset and there is no problem.
Hi, I have the same error as follows: TypeError: embedding(): argument 'indices' (position 2) must be Tensor, not NoneType. I use the electricity dataset here.
I am having the same problem, using the electricity dataset provided.
Hi all, sorry for the inconvenience.
@y-tashi I can confirm that it is working now. However, I am getting weird results when predicting: I get inf, and all the computed losses are nan. I looked a bit into the results and they seem to blow up more and more from sample to sample. I am running the model on the 'mps' device, which is the GPU provided by Apple on their M-series MacBooks, and I suspect that is where the error is coming from. If anyone has encountered this problem and identified any possible solutions, that would be a life saver, as I am trying to write my Master's thesis and I don't have an NVIDIA GPU. I am also getting this kind of result when predicting with the TimeGrad model as well.
OK, so I ran it on CPU and indeed I stopped getting inf and nan values. Any idea what (maybe a particular PyTorch module like LayerNorm or something) is causing mps to blow up? I am thinking I could rewrite that module and maybe then it will work.
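In case it helps to narrow this down, here is a small debugging sketch (not code from this repo) that registers a forward hook on every submodule and prints the ones whose output contains inf or nan; running a single prediction pass on 'mps' with these hooks attached should reveal which layer (LayerNorm or otherwise) first produces non-finite values. The names `model` and `batch` are placeholders, not names from this repository.

```python
import torch
import torch.nn as nn

def find_nonfinite_modules(model: nn.Module):
    """Attach forward hooks that print which submodules produce
    inf/nan outputs. Returns the hook handles so they can be removed."""
    handles = []

    def make_hook(name):
        def hook(module, inputs, output):
            tensors = output if isinstance(output, (tuple, list)) else (output,)
            for t in tensors:
                if torch.is_tensor(t) and not torch.isfinite(t).all():
                    print(f"non-finite output from {name} ({module.__class__.__name__})")
                    break
        return hook

    for name, module in model.named_modules():
        if name:  # skip the root module itself
            handles.append(module.register_forward_hook(make_hook(name)))
    return handles

# usage sketch ('model' and 'batch' are placeholders):
# model = model.to("mps")
# handles = find_nonfinite_modules(model)
# model(batch)                     # one forward pass; watch stdout for culprits
# for h in handles:
#     h.remove()
```

Comparing which module is reported first on 'mps' versus CPU would at least tell us whether it is a single op misbehaving on the MPS backend or an accumulation issue.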
Hi,
when I run exe_forecasting.py it gives me a type error.
The type error is:
TypeError: embedding(): argument 'indices' (position 2) must be Tensor, not NoneType
It is from this part of the code:
main_model.py", line 356, in get_side_info
feature_embed = self.embed_layer(feature_id).unsqueeze(1).expand(-1,L,-1,-1)
It would be great if you could fix this.
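For anyone hitting this before a fix is released: the traceback says `feature_id` is `None` by the time it reaches `self.embed_layer`, which is an `nn.Embedding` and therefore needs an index tensor. Below is a minimal, self-contained sketch that reproduces the failure mode and shows one possible guard (building default ids 0..K-1 per batch element). The sizes and the guard are assumptions for illustration, not the repository's actual fix.

```python
import torch
import torch.nn as nn

B, K, L, emb_dim = 8, 370, 24, 16            # toy sizes, not the repo's config
embed_layer = nn.Embedding(num_embeddings=K, embedding_dim=emb_dim)

feature_id = None                             # this is what triggers the TypeError

if feature_id is None:
    # fall back to feature ids 0..K-1 for every sample in the batch
    feature_id = torch.arange(K).unsqueeze(0).expand(B, -1)   # shape (B, K)

feature_embed = embed_layer(feature_id).unsqueeze(1).expand(-1, L, -1, -1)
print(feature_embed.shape)                    # torch.Size([8, 24, 370, 16])
```

In the real code such a guard would have to live where get_side_info is called (or inside it), so that feature_id is always an index tensor when the embedding is evaluated.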