0

I am trying to replicate a TensorFlow subclassed model, but I'm having problems accessing the weights of a layer included in the model. Here's a summarized definition of the model:

class model():

    def __init__(self, dims, size):
        self.dims = dims
        self.size = size

        self.autoencoder = None
        self.encoder = None
        self.decoder = None
        self.model = None

    def initialize(self):
        self.autoencoder, self.encoder, self.decoder = mlp_autoencoder(self.dims)
        output = MyLayer(self.size, name='MyLayer')(self.encoder.output)

        self.model = Model(inputs=self.autoencoder.input,
                           outputs=[self.autoencoder.output, output])

mlp_autoencoder builds as many encoder and decoder layers as specified in dims. MyLayer's trainable weights are learnt in the encoder's latent space and are then used to compute the second output.

There are no issues accessing the autoencoder weights; the problem appears when trying to get MyLayer's weights. The first place the code crashes is here:

@property
def layer_weights(self):
    return self.model.get_layer(name='MyLayer').get_weights()

# ValueError: No such layer: MyLayer.

By building the model this way, a different TFOpLambda layer is created for each transformation applied to encoder.output inside the custom layer. I tried getting the weights through the last TFOpLambda layer (the second output of the model), but get_weights returns an empty list. In short, these weights are never stored in the model.

I checked that MyLayer is well defined by using it separately: it creates and stores its variables just fine, and I had no issues accessing them. The problem only appears when the layer is used inside the model.

Can someone more knowledgeable in subclassing tell me if there is something wrong in the definition of the model? I've considered using build and call, as that seems to be the 'standard' way, but there's got to be a simpler one...
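For context, here is the build/call route I'm referring to, as a sketch only: the weight shape and the expand/subtract/square ops are reconstructed from the layer's described behavior, not the real MyLayer. Defined this way, the layer is registered under its own name and its weights are reachable through get_layer:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

class MyLayer(layers.Layer):
    def __init__(self, size, **kwargs):
        super().__init__(**kwargs)
        self.size = size

    def build(self, input_shape):
        # One trainable weight matrix of shape (size, latent_dim).
        self.clusters = self.add_weight(
            name='clusters',
            shape=(self.size, input_shape[-1]),
            initializer='glorot_uniform',
            trainable=True)

    def call(self, inputs):
        # All tensor ops happen inside call, so Keras registers a single
        # 'MyLayer' instead of a chain of TFOpLambda layers.
        diff = tf.expand_dims(inputs, axis=1) - self.clusters
        return tf.reduce_sum(tf.square(diff), axis=2)

inp = tf.keras.Input(shape=(4,))
out = MyLayer(3, name='MyLayer')(inp)
model = Model(inputs=inp, outputs=out)
print(model.get_layer('MyLayer').get_weights()[0].shape)  # (3, 4)
```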

I can provide more details of the program if needed.

Thanks in advance!

g herb
  • This question has already been answered. Refer to this link [here](https://stackoverflow.com/questions/43715047/how-do-i-get-the-weights-of-a-layer-in-keras) – insanely_a_ Oct 23 '22 at 00:43
  • hi zannyrt, I did check all the layer's weights with layer.get_config() and layer.get_weights(), but no weights for MyLayer are stored. All the Lambda layers return empty lists. – g herb Oct 23 '22 at 00:59
  • From the ***ValueError*** you posted, it seems like *MyLayer* is not considered a layer of model. What do you get from `self.model.layers`? – learner Oct 23 '22 at 14:25
  • Yes, *MyLayer* is not properly incorporated in the model. Instead of creating a layer with accessible weights it creates 3 lambda layers with no stored weights. The model has an input layer; 3 encoder layers (0,1,2); *decoder_3*; a lambda layer (expands its input dims); *decoder_2*; another lambda layer (subtracts inputs and weights); *decoder_1*; a lambda layer (squares the subtraction result); and the two outputs, *decoder_0* and a final lambda layer that returns a distance. After training I get both outputs, but I can't access the weights of *MyLayer* from the architecture defined this way. – g herb Oct 23 '22 at 15:54

1 Answer

0

A (not very elegant) way to solve it is calling the custom layer in the `__init__` method. By doing this, the layer is created as a model attribute, making its weights accessible.

    def __init__(self, dims, size):
        self.dims = dims
        self.size = size

        self.autoencoder = None
        self.encoder = None
        self.decoder = None
        self.model = None
        self.custom_layer = MyLayer(self.size, name='MyLayer')

    def initialize(self):
        self.autoencoder, self.encoder, self.decoder = mlp_autoencoder(self.dims)
        h = self.custom_layer(self.encoder.output)

        self.model = Model(inputs=self.autoencoder.input,
                           outputs=[self.autoencoder.output, h])

Getting weights:

    @property
    def layer_weights(self):
        return self.custom_layer.get_weights()[0]
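As a quick check of the instance-reference approach, with hypothetical stand-ins (a Dense layer playing the role of MyLayer, and a one-layer encoder in place of mlp_autoencoder), the weights stay reachable through the kept attribute:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Stand-ins, for illustration only: Dense replaces MyLayer,
# and a single Dense layer replaces the mlp_autoencoder encoder.
inp = tf.keras.Input(shape=(8,))
latent = layers.Dense(2, name='latent')(inp)
custom_layer = layers.Dense(3, name='MyLayer')
h = custom_layer(latent)
model = Model(inputs=inp, outputs=h)

# Weights are reachable through the kept instance, independently of
# how the layer ended up registered inside the model.
kernel = custom_layer.get_weights()[0]
print(kernel.shape)  # (2, 3)
```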
g herb