
I have a list containing the name of the activation function to apply to each hidden layer of a neural network.

I want to modify the following code so that, instead of relu, it calls the function named in activation[i]:

#activation = ['relu', 'softplus', 'softsign']
for i in range(1, hidden_layers + 1):
    W_dmd[i] = tf.Variable(tf.initializers.GlorotUniform()(shape=[input_shape, code_length]))
    b_encoder_dmd[i] = tf.Variable(tf.initializers.Zeros()(shape=[code_length]))
    x_dmd[i] = tf.compat.v1.nn.relu(tf.compat.v1.nn.xw_plus_b(x_dmd[i-1], W_dmd[i], b_encoder_dmd[i]))  # replace relu with the function named in activation[i]

With activation = ['relu', 'softplus', 'softsign'] and hidden_layers = 3, the executed commands should then be:

x_dmd[1] = tf.compat.v1.nn.relu(tf.compat.v1.nn.xw_plus_b(x_dmd[0], W_dmd[1], b_encoder_dmd[1]))
x_dmd[2] = tf.compat.v1.nn.softplus(tf.compat.v1.nn.xw_plus_b(x_dmd[1], W_dmd[2], b_encoder_dmd[2]))
x_dmd[3] = tf.compat.v1.nn.softsign(tf.compat.v1.nn.xw_plus_b(x_dmd[2], W_dmd[3], b_encoder_dmd[3]))

Based on this post, I came up with:

x_dmd[i] = getattr(tf.compat.v1.nn, activation[i])(tf.compat.v1.nn.xw_plus_b(x_dmd[i-1], W_dmd[i], b_encoder_dmd[i]))
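
As a sanity check, here is a minimal standalone snippet I used to confirm that getattr resolves each name in the list to a callable in tf.compat.v1.nn (my own test, assuming TensorFlow 2.x with eager execution):

import tensorflow as tf

activation = ['relu', 'softplus', 'softsign']
x = tf.constant([[-1.0, 0.5, 2.0]])

for name in activation:
    fn = getattr(tf.compat.v1.nn, name)  # e.g. tf.compat.v1.nn.relu
    print(name, fn(x).numpy())           # apply the looked-up activation to x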

Is this correct? Thanks!
