
I just read about the Keras weight initializers here. In the documentation, only the different initializers are introduced, such as:

model.add(Dense(64, kernel_initializer='random_normal'))

I want to know what the default weight initializer is when I don't specify the kernel_initializer argument. Is there a way to access it?

flyingduck92
  • The default is the Glorot uniform initializer. It draws samples from a uniform distribution within [-limit, limit], where limit is sqrt(6 / (fan_in + fan_out)), fan_in is the number of input units in the weight tensor, and fan_out is the number of output units in the weight tensor. – Amir Jan 02 '19 at 18:22
  • Already answered here: https://stackoverflow.com/questions/46883606/what-is-the-default-kernel-initializer-in-keras – hafiz031 Nov 13 '20 at 22:29

1 Answer


Each layer has its own default value for initializing the weights. For most layers, such as Dense, convolution and RNN layers, the default kernel initializer is 'glorot_uniform' and the default bias initializer is 'zeros' (you can find this by going to the related section for each layer in the documentation; for example, here is the Dense layer doc). You can find the definition of the glorot_uniform initializer here in the Keras documentation.
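
If you want to confirm this programmatically, here is a minimal sketch (assuming TensorFlow 2.x with tf.keras) that builds a Dense layer without specifying any initializer and reads the defaults back from its config; the glorot_uniform limit computation is included only for illustration:

# Minimal sketch (TensorFlow 2.x / tf.keras assumed): inspect the defaults
# of a Dense layer built without an explicit initializer.
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dense(64)
config = layer.get_config()
print(config['kernel_initializer'])  # {'class_name': 'GlorotUniform', ...}
print(config['bias_initializer'])    # {'class_name': 'Zeros', ...}

# glorot_uniform samples from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)).
# Example for a hypothetical 100 -> 64 Dense layer:
fan_in, fan_out = 100, 64
limit = np.sqrt(6.0 / (fan_in + fan_out))
print(limit)  # ~0.19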

As for accessing the weights of each layer, it has already been answered here.
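
For example, a short sketch (again assuming tf.keras; the model and shapes below are made up for illustration) of reading the initialized weights back with get_weights():

# Short sketch (tf.keras assumed): read back the weights a layer was
# initialized with; the model and shapes are illustrative only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(100,)),  # defaults: glorot_uniform / zeros
])

kernel, bias = model.layers[0].get_weights()
print(kernel.shape)  # (100, 64), drawn from glorot_uniform
print(bias.shape)    # (64,), all zeros by default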

today
  • I think this article is very interesting and it shows roughly that for `"tanh"` activations you should use `'glorot_uniform'` and for `"relu"` layers you should use `"he_uniform"`: https://towardsdatascience.com/weight-initialization-in-neural-networks-a-journey-from-the-basics-to-kaiming-954fb9b47c79 – Daniel Möller Dec 27 '19 at 17:31
  • For the Embedding layer it says 'uniform', but which uniform? Glorot or He? I cannot find the answer in the documentation. – haneulkim Aug 22 '22 at 16:17
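
Following up on Daniel Möller's comment above about activation-specific initializers, here is a hedged sketch (tf.keras assumed; the layer sizes are arbitrary) of overriding the default and setting the initializer explicitly:

# Sketch (tf.keras assumed): pass kernel_initializer explicitly instead of
# relying on the 'glorot_uniform' default; layer sizes are arbitrary examples.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu',
                          kernel_initializer='he_uniform', input_shape=(100,)),
    tf.keras.layers.Dense(10, activation='tanh',
                          kernel_initializer='glorot_uniform'),
])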