When training a model in PyTorch using the practical examples, I found that there are 9 fewer samples in the train_batches than in the original dataset. How can I find the positions of the missing samples in the original dataset? #129
-
Replies: 5 comments
-
Hi @zhuyuhang4 , this happens when you use a model with a receptive field greater than 1. In your example, the `offset10-model` uses a receptive field of 10, and `neural_model.get_offset()` will tell you the number of samples that are removed from the `.left` and `.right` side of the data. Does that clarify the question?
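To make the edge effect concrete, here is a minimal sketch of where the dropped samples sit. A model with receptive field r can only produce outputs for positions where the full window fits (n - r + 1 outputs for n inputs), so r - 1 samples are lost at the edges; the left/right split is what `get_offset()` reports. The helper function and the concrete left/right numbers below are illustrative, not taken from the library.

```python
def dropped_indices(n_samples: int, left: int, right: int) -> list[int]:
    """Original-dataset indices that never appear in the batches.

    `left` and `right` are the edge sample counts removed by the model's
    receptive field (what an offset's .left / .right would report).
    """
    head = list(range(left))                          # first `left` samples
    tail = list(range(n_samples - right, n_samples))  # last `right` samples
    return head + tail


# A receptive field of 10 drops 10 - 1 = 9 samples in total, matching the
# question; here assumed (hypothetically) to split as 5 left / 4 right:
print(dropped_indices(20, 5, 4))
# -> [0, 1, 2, 3, 4, 16, 17, 18, 19]
```

In other words, the "missing" data are not scattered through the dataset: they are exactly the first `.left` and last `.right` samples.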
-
Thank you!
-
(The high level API also uses padding, so you might want to have a look at the
-
If this addresses your question, please click on "Mark as answer" above :) Otherwise happy to discuss further!