
I would like to put some tensors in a list, and I know that if I want to put `nn.Module` instances into a list, I must wrap that list in `ModuleList`. So, is there anything like a 'TensorList' in PyTorch that I must use to wrap a list containing tensors?


1 Answer


What are these tensors? Are they parameters of your `nn.Module`? If so, you need to use the proper container, for example `nn.ParameterList`. This way, calling your module's `.parameters()` method will yield these tensors as well. Otherwise you'll get errors like this one.
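
For example, a minimal sketch of this pattern (the module name, number of blocks, and shapes are made up):

```python
import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self, num_blocks, dim):
        super().__init__()
        # Each tensor is wrapped in nn.Parameter and stored in an
        # nn.ParameterList, so it is registered as a trainable parameter.
        self.weights = nn.ParameterList(
            [nn.Parameter(torch.randn(dim, dim)) for _ in range(num_blocks)]
        )

    def forward(self, xs):
        # xs is assumed to be a list of input tensors, one per weight.
        return [torch.mm(x, w) for x, w in zip(xs, self.weights)]

model = MyModule(num_blocks=3, dim=4)
# The wrapped tensors show up in .parameters(), so the optimizer sees them.
print(len(list(model.parameters())))  # 3
```

A plain Python list of `nn.Parameter`s would not be registered, and `model.parameters()` would silently skip them.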

Shai
  • If I would like to use `torch.eye` or `torch.zeros` in the `nn.Module`, do I need to wrap those tensors with `self.register_buffer` or not? – Wu Shiauthie Jan 07 '22 at 08:00
  • I use `torch.chunk` to split a big tensor into many small tensors and store them in a list. When I need to compute with them, I use a list comprehension like `[torch.mm(x, self.W) for x in model_input]`. Is this a correct way to handle those tensors (the parameters and model inputs) of my `nn.Module`? – Wu Shiauthie Jan 07 '22 at 08:09
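
A minimal sketch of the setup described in the comments (the module name, shapes, and chunk count are illustrative): constant tensors such as `torch.eye` go through `register_buffer`, the trainable `self.W` is an `nn.Parameter`, and the list produced by `torch.chunk` needs no special wrapper, because it holds inputs/activations rather than module state.

```python
import torch
import torch.nn as nn

class ChunkedModule(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Trainable weight: must be an nn.Parameter to be optimized.
        self.W = nn.Parameter(torch.randn(dim, dim))
        # Constant tensor: register_buffer makes it follow .to(device)
        # and appear in state_dict without being trained.
        self.register_buffer("identity", torch.eye(dim))

    def forward(self, big_input):
        # Splitting into chunks yields a plain list of tensors; no
        # container is needed here since these are just activations.
        model_input = torch.chunk(big_input, chunks=4, dim=0)
        return [torch.mm(x, self.W + self.identity) for x in model_input]

m = ChunkedModule(dim=8)
out = m(torch.randn(16, 8))  # list of 4 tensors, each of shape (4, 8)
```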