
I created the following simple linear decoder class:

import torch.nn as nn
import torch.nn.functional as F

class Decoder(nn.Module):
    def __init__(self, K, h=()):
        super().__init__()
        h = (K,)+h+(K,)
        self.layers = [nn.Linear(h1,h2) for h1,h2 in zip(h, h[1:])]

    def forward(self, x):
        for layer in self.layers[:-1]:
            x = F.relu(layer(x))
        return self.layers[-1](x)

However, when I try to pass its parameters to an optimizer, I get the error `ValueError: optimizer got an empty parameter list`:

decoder = Decoder(4)
LR = 1e-3
opt = optim.Adam(decoder.parameters(), lr=LR)

Is there something I'm doing obviously wrong with the class definition?

sachinruk
  • Possible duplicate of [Pytorch ValueError: optimizer got an empty parameter list](https://stackoverflow.com/questions/54678896/pytorch-valueerror-optimizer-got-an-empty-parameter-list) – Shai Aug 02 '19 at 08:27
  • Looking at the other answer now, it is certainly the same issue. However, I think my code is a lot simpler to read. So upto the editors if they want to leave this in or delete this. – sachinruk Aug 05 '19 at 05:30

1 Answer


Since you store your layers in a regular Python list inside your Decoder, PyTorch has no way of telling that the members of this list are actually sub-modules. Convert the list into PyTorch's nn.ModuleList and your problem will be solved:

class Decoder(nn.Module):
    def __init__(self, K, h=()):
        super().__init__()
        h = (K,)+h+(K,)
        self.layers = nn.ModuleList(nn.Linear(h1,h2) for h1,h2 in zip(h, h[1:]))
Shai
  • @NEERAJSWARNKAR if you have a new question, post it as such, not as a cryptic comment – Shai Apr 25 '21 at 19:20