
I'm running a simple feed-forward network in Keras. Since there is just one hidden layer, I would like to draw some inferences about the relevance of each input to each output, and for that I need to extract the weights.

This is the model:

from keras.models import Sequential
from keras.layers import Dense, Dropout

def build_model(input_dim, output_dim):
    n_output_layer_1 = 150
    n_output = output_dim
    model = Sequential()
    model.add(Dense(n_output_layer_1, input_dim=input_dim, activation='relu'))
    model.add(Dropout(0.25))
    model.add(Dense(n_output))
    return model

To extract the weights I wrote:

import numpy as np

for layer in model.layers:
    weights = layer.get_weights()  # overwritten each iteration, so this ends up holding the last layer's parameters

weights = np.array(weights[0])  # hidden to output
first = model.layers[0].get_weights()  # input to hidden
first = np.array(first[0])

Unfortunately, I don't see the bias columns in these matrices, even though I know Keras adds biases automatically.

Do you know how to retrieve the bias weights?

Thank you in advance for your help!

Tommaso Guerrini

1 Answer


get_weights() for a Dense layer returns a list of two elements: the first element contains the weights (the kernel matrix) and the second contains the biases. So you can simply do:

weights = model.layers[0].get_weights()[0]
biases = model.layers[0].get_weights()[1]

Note that weights and biases are already numpy arrays.
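To make the shapes concrete, here is a minimal sketch of the same idea (the sizes `input_dim=20` and `output_dim=10` are made up for illustration; the model mirrors the one in the question):

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(150, input_dim=20, activation='relu'))
model.add(Dropout(0.25))
model.add(Dense(10))

for layer in model.layers:
    params = layer.get_weights()  # [] for Dropout, [kernel, bias] for Dense
    if params:
        kernel, bias = params
        print(layer.name, kernel.shape, bias.shape)

# Expected output (layer names vary by Keras version):
# dense_1 (20, 150) (150,)   <- one bias per hidden neuron
# dense_2 (150, 10) (10,)    <- one bias per output neuron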

Dr. Snoopy
  • Sorry for bothering you, what kind of activation function does the output have in my model? Is there a default choice, or is it just a weighted sum of the hidden neurons' activations? – Tommaso Guerrini Feb 24 '17 at 18:56
  • By default, Keras selects the 'linear' activation. Check out the [Dense layer documentation](https://keras.io/layers/core/#dense): the default is `activation=None`, meaning `a(x) = x`. – sahdeV Mar 18 '18 at 03:31
  • 3
    Does each neuron have its own bias weight? – Ben Sep 25 '19 at 07:18
  • The above gives numpy arrays. If you want tensors, use the following instead: `weights = model.layers[0].weights[0]`, `biases = model.layers[0].weights[1]` – lalitm Jan 22 '22 at 06:16
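Following up on the comments above, a small hedged sketch (Keras 2-style API; the layer sizes are illustrative) of the default linear output activation and of the tensor-level access mentioned in the last comment:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(10, input_dim=150))  # no activation given -> linear, a(x) = x

print(model.layers[0].get_config()['activation'])  # 'linear'

# layer.weights returns the backing variables (tensors), whereas
# get_weights() returns detached numpy copies of the same values.
kernel_var, bias_var = model.layers[0].weights
print(kernel_var.shape, bias_var.shape)  # (150, 10) (10,)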