
RuntimeError: The size of tensor a (7) must match the size of tensor b (343) at non-singleton dimension 1 #3

Open
shuoshanz opened this issue Jun 6, 2023 · 2 comments

Comments

@shuoshanz

I used the parameters:
--data ETTh1
--method fsnet
--test_bsz 1
--seq_len 60
--pred_len 1
The training process works fine, but the testing error is as follows:
“RuntimeError: The size of tensor a (7) must match the size of tensor b (343) at non-singleton dimension 1”
Does anyone know how to fix this? Thanks!

@techzzt

techzzt commented Aug 15, 2023

The same problem occurred for me during the test process. Have you solved it?

@HappyWalkers

Replace the _ol_one_batch method in exp/exp_fsnet.py with the following one (note that it uses rearrange from einops):

    def _ol_one_batch(self, dataset_object, batch_x, batch_y, batch_x_mark, batch_y_mark):
        batch_y = batch_y.float()
        f_dim = -1 if self.args.features == 'MS' else 0
        batch_y = batch_y[:, -self.args.pred_len:, f_dim:].to(self.device)
        # keep the batch dimension and flatten (time, feature) per sample,
        # so the target matches the model output for any test batch size
        true = rearrange(batch_y, 'b t d -> b (t d)').float().to(self.device)
        criterion = self._select_criterion()

        x = torch.cat([batch_x.float(), batch_x_mark.float()], dim=-1).to(self.device)
        for _ in range(self.n_inner):
            if self.args.use_amp:
                with torch.cuda.amp.autocast():
                    outputs = self.model(x)
            else:
                outputs = self.model(x)

            loss = criterion(outputs, true)
            loss.backward()
            self.opt.step()
            self.model.store_grad()
            self.opt.zero_grad()

        return outputs, true

I found that the issue is related to test_bsz. If the test batch size equals one, the original code works fine; if it is larger than one, the updated code is needed.
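The shape logic behind the fix can be sketched as follows. This is a minimal NumPy illustration, not code from the repository; the batch size of 4 is an arbitrary assumption, and the 7 comes from ETTh1's seven variables, which matches the "size of tensor a (7)" in the error message.

```python
import numpy as np

# Hypothetical shapes: test batch of 4, pred_len = 1, 7 ETTh1 variables.
b, t, d = 4, 1, 7
batch_y = np.zeros((b, t, d))   # ground-truth window
outputs = np.zeros((b, t * d))  # model output, flattened per sample

# einops' rearrange(batch_y, 'b t d -> b (t d)') keeps the batch axis and
# flattens (time, feature) per sample, equivalent to this reshape:
true = batch_y.reshape(b, t * d)
assert true.shape == outputs.shape == (4, 7)

# A target flattened without keeping the batch axis only lines up with the
# output when b == 1, which is why the error appears only for test_bsz > 1:
flat = batch_y.reshape(-1)      # shape (28,) here; (7,) only when b == 1
assert flat.shape != outputs.shape
```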
