Let's say we have a custom layer in Keras like this:

import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.layers import Layer


class Custom_Layer(Layer):
    def __init__(self, **kwargs):
        # Fixed: the original called super(ProbabilisticActivation, ...),
        # which references an undefined class name.
        super(Custom_Layer, self).__init__(**kwargs)
        self.params_1 = 0
        self.params_2 = 0

    def build(self, input_shape):
        # Parameters shaped like the input, minus the batch dimension
        self.params_1 = K.variable(np.zeros(shape=input_shape[1:]))
        self.params_2 = K.variable(np.zeros(shape=input_shape[1:]))
        super(Custom_Layer, self).build(input_shape)

    def call(self, x, training=None):
        # DO SOMETHING

How can I access the values of the parameters (params_1, params_2) during training? I tried `model.get_layer('Name of Custom Layer').params_1`, but that gives me the tensor itself, not its value.

Here is the model architecture:

from keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Flatten, Dense, Activation
from keras.models import Model


def get_model(img_height, img_width):
    input_layer = Input(shape=(img_height, img_width, 3))
    x = Conv2D(32, (3, 3), padding='same', name='conv2d_1', activation='relu')(input_layer)
    x = Custom_Layer()(x)
    x = MaxPooling2D(pool_size=(2, 2))(x)
    x = Dropout(0.25)(x)
    x = Conv2D(64, kernel_size=(3, 3), name='conv2d_2', activation='relu')(x)
    x = Conv2D(64, (3, 3), name='conv2d_4', activation='relu')(x)
    x = MaxPooling2D(pool_size=(2, 2))(x)
    x = Dropout(0.25)(x)
    x = Flatten()(x)
    x = Dense(512)(x)
    x = Activation('relu')(x)
    x = Dropout(0.5)(x)
    x = Dense(10)(x)
    x = Activation('softmax')(x)
    model = Model(inputs=[input_layer], outputs=[x])
    model.summary()

    return model
Panda

1 Answer

Note that params_1 and params_2 are TensorFlow variables. To get their values, you should evaluate them within a tf.Session. You could do something along the lines of:

from keras import backend as K

# ... train model

sess = K.get_session()
params_1 = model.get_layer('Name of Custom Layer').params_1
values_1 = sess.run(params_1)
print(values_1)

NOTE: Not tested.
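Alternatively, since params_1 and params_2 are Keras variables, `K.get_value` can read them without feeding any input data, which makes it convenient inside a training callback. A sketch (untested; the callback class and the layer name here are assumptions, not part of the original code):

```python
import numpy as np
from keras import backend as K
from keras.callbacks import Callback


class ParamLogger(Callback):  # hypothetical helper, name it as you like
    def __init__(self, layer_name):
        super(ParamLogger, self).__init__()
        self.layer_name = layer_name

    def on_epoch_end(self, epoch, logs=None):
        # self.model is set by Keras when the callback is attached to fit()
        layer = self.model.get_layer(self.layer_name)
        values_1 = K.get_value(layer.params_1)  # NumPy array
        values_2 = K.get_value(layer.params_2)
        print('epoch %d: params_1 mean=%.4f, params_2 mean=%.4f'
              % (epoch, values_1.mean(), values_2.mean()))
```

Usage would be something like `model.fit(x, y, callbacks=[ParamLogger('Name of Custom Layer')])`.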

rvinas
  • Thanks for your response. I am getting this error: (0) Invalid argument: You must feed a value for placeholder tensor 'input_1' with dtype float and shape [?,32,32,3] [[{{node input_1}}]] [[act_1/add/_95]] (1) Invalid argument: You must feed a value for placeholder tensor 'input_1' with dtype float and shape [?,32,32,3] [[{{node input_1}}]] – Panda Aug 22 '19 at 17:16
  • Ah, does this happen because the shape of the parameters is dependent on the input? If that's the case, you can either hardcode the weights' shape or do `sess.run(params_1, feed_dict={your_input_tensor: your_input_value})` – rvinas Aug 22 '19 at 20:27
  • I do not feed the network directly. I want to get params_1 during training, so I wrote a training callback, but I could not pass anything to it. – Panda Aug 24 '19 at 01:13
  • You should be able to pass custom arguments to your callback. In any case, I do not understand why your parameters' tensors depend on your inputs. Could you maybe try defining them using a hardcoded shape? – rvinas Aug 24 '19 at 06:53
  • @Panda Any update? I am happy to help you further :) – rvinas Aug 26 '19 at 15:16
  • Thanks a lot :). My goal is to print or visualize params_1 during training; in other words, I want to see params_1 after each iteration. In that case I don't have direct access to the batch data, so I cannot feed data to the command you sent. – Panda Aug 26 '19 at 15:58
  • Alright. What is the value of input shape in `build`? Also, would it be possible for you to share the model definition? I still find it very weird that you need to feed data to check the value of the parameters. – rvinas Aug 26 '19 at 16:33
  • @rvinas Any solution for this? I am having the same problem, https://stackoverflow.com/questions/62254883/how-to-store-result-of-an-operation-like-topk-per-epoch-in-keras Could you please let me know how did you fix it? – sariii Jul 01 '20 at 20:49