How can I jointly optimize the parameters of a model comprising two distinct neural networks with a single optimizer? This is what I tried:

optim_global = optim.Adam(zip(model1.parameters(), model2.parameters()))

but I get this error:

TypeError: optimizer can only optimize Tensors, but one of the params is tuple
`parameters()` returns a generator; `zip` pairs their elements into tuples, which the optimizer rejects. You can combine the two generators either with the unpacking operator *:

>>> optim.Adam([*model1.parameters(), *model2.parameters()])

Or by using itertools.chain:

>>> optim.Adam(chain(model1.parameters(), model2.parameters()))
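A minimal runnable sketch of the second approach, using two hypothetical toy networks (the layer sizes and learning rate are illustrative assumptions, not from the question):

```python
import itertools

import torch
import torch.nn as nn
import torch.optim as optim

model1 = nn.Linear(4, 8)  # hypothetical first network
model2 = nn.Linear(8, 1)  # hypothetical second network

# chain() concatenates the two parameter generators into one flat
# iterable of Tensors, which is what the optimizer expects.
optim_global = optim.Adam(
    itertools.chain(model1.parameters(), model2.parameters()), lr=1e-3
)

x = torch.randn(16, 4)
target = torch.randn(16, 1)

# One training step updates the parameters of both models at once.
optim_global.zero_grad()
loss = nn.functional.mse_loss(model2(model1(x)), target)
loss.backward()
optim_global.step()
```

Both approaches produce the same result; `chain` avoids materializing the parameter list, while `[*gen1, *gen2]` builds an explicit list, which can be handy if you also want per-group options via `param_groups`.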