
How can I initialize a bias with a pre-defined vector (neither constant nor random)? For example, I would like to spread the values evenly over the range [-1, 1]. Something like this:
tf.linspace(-1, 1 , shape, name="linspace")

What I have done so far, which is not working, is:

def b_init(shape):
    return tf.keras.initializers.Constant(tf.linspace(-1, 1 , shape,  name="linspace"))

I am looking for something I can call in the same way as the predefined classes, for example:

b_initializer = tf.keras.initializers.HeNormal(seed=initialization_number)
deijany91

1 Answer


You can follow the guide in the Keras documentation: Creating custom initializers.

Simple callables

You can pass a custom callable as an initializer. It must take the arguments shape (shape of the variable to initialize) and dtype (dtype of generated values):

def my_init(shape, dtype=None):
    return tf.random.normal(shape, dtype=dtype)

layer = Dense(64, kernel_initializer=my_init)

In your case, you can use a similar function. Note that tf.linspace takes the number of points rather than a shape tuple, and has no dtype argument, so reshape and cast explicitly:

def linspace_init(shape, dtype=None):
    # tf.linspace expects the number of points, not a shape tuple
    values = tf.reshape(tf.linspace(-1.0, 1.0, tf.reduce_prod(shape)), shape)
    return tf.cast(values, dtype) if dtype is not None else values

And use it as the bias initializer in your layers:

layer = Dense(64, bias_initializer=linspace_init)
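As a quick sanity check (TensorFlow 2.x assumed), a minimal sketch calling such a function directly; the reshape and cast are needed because tf.linspace takes a point count rather than a shape and has no dtype argument:

```python
import tensorflow as tf

def linspace_init(shape, dtype=None):
    # tf.linspace expects the number of points, not a shape tuple
    values = tf.reshape(tf.linspace(-1.0, 1.0, tf.reduce_prod(shape)), shape)
    return tf.cast(values, dtype) if dtype is not None else values

# For a bias of 5 units: evenly spaced values -1, -0.5, 0, 0.5, 1
print(linspace_init((5,)).numpy())
```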

If you absolutely want to have an object rather than a simple function, you just have to subclass the Initializer class:

class LinSpace(tf.keras.initializers.Initializer):

    def __call__(self, shape, dtype=None):
        values = tf.reshape(tf.linspace(-1.0, 1.0, tf.reduce_prod(shape)), shape)
        return tf.cast(values, dtype) if dtype is not None else values
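For completeness, here is a minimal end-to-end sketch (assuming TensorFlow 2.x) that uses the class-based initializer for the bias of a Dense layer and inspects the resulting values:

```python
import tensorflow as tf

class LinSpace(tf.keras.initializers.Initializer):
    """Spread the initial values evenly over [-1, 1]."""
    def __call__(self, shape, dtype=None):
        # tf.linspace takes the number of points, not a shape tuple
        values = tf.reshape(tf.linspace(-1.0, 1.0, tf.reduce_prod(shape)), shape)
        return tf.cast(values, dtype) if dtype is not None else values

layer = tf.keras.layers.Dense(5, bias_initializer=LinSpace())
layer.build(input_shape=(None, 3))  # create the weights without running data
print(layer.bias.numpy())  # five evenly spaced values from -1 to 1
```

This can then be called in the same way as the predefined classes, e.g. bias_initializer=LinSpace().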
Lescurel