
I have a simple model like so:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Activation, MaxPooling2D, Flatten, Dense
from tensorflow.keras.callbacks import TensorBoard

model = Sequential()

model.add(Conv2D(64, (3, 3), input_shape=X.shape[1:]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(64, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())  # this converts our 3D feature maps to 1D feature vectors
model.add(Dense(64))
model.add(Activation('relu'))

model.add(Dense(1))
model.add(Activation('sigmoid'))

tensorboard = TensorBoard(log_dir="logs/{}".format(NAME))

model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'],
              )

model.fit(X, y,
          batch_size=32,
          epochs=3,
          validation_split=0.3,
          callbacks=[tensorboard])

I want to get the logits from the last Dense layer, so I can add weighting for both classes:

import tensorflow as tf  # TF 1.x-style API

weights = tf.placeholder(name="loss_weights", shape=[None], dtype=tf.float32)
# Note: sparse_softmax_cross_entropy_with_logits must be called with named arguments
loss_per_example = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
loss = tf.reduce_mean(weights * loss_per_example)
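For context, the snippet above just computes a per-example cross-entropy and then takes a weights-scaled mean. A minimal NumPy sketch of the same arithmetic for the binary/sigmoid case used by this model (all names here are illustrative, not part of the original code):

```python
import numpy as np

def weighted_binary_ce(probs, labels, weights):
    """Per-example binary cross-entropy, scaled by per-example weights, then averaged."""
    probs = np.clip(probs, 1e-7, 1 - 1e-7)  # avoid log(0)
    loss_per_example = -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))
    return np.mean(weights * loss_per_example)

probs = np.array([0.9, 0.2, 0.6])     # sigmoid outputs
labels = np.array([1.0, 0.0, 1.0])
weights = np.array([1.0, 2.0, 1.0])   # e.g. up-weight one class's examples
loss = weighted_binary_ce(probs, labels, weights)
```

If the goal is only to weight the two classes, note that Keras also accepts a `class_weight` dict directly in `model.fit` (e.g. `class_weight={0: 1.0, 1: 2.0}`), which avoids touching the logits at all.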

How can I get the logits from this model?
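For reference, since the model ends in Dense(1) followed by a separate sigmoid Activation, the pre-activation tensor is the output of the second-to-last layer (in Keras, a sub-model like `Model(inputs=model.input, outputs=model.layers[-2].output)` reads it directly). Alternatively, given only the sigmoid probabilities, the logit is the log-odds and can be recovered numerically. A sketch of that inverse relationship (names illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p, eps=1e-7):
    """Inverse of the sigmoid: recover the pre-activation (log-odds) from a probability."""
    p = np.clip(p, eps, 1 - eps)
    return np.log(p / (1 - p))

z = np.array([-2.0, 0.0, 3.5])   # hypothetical logits
p = sigmoid(z)                   # what the sigmoid Activation would output
z_recovered = logit(p)           # matches z up to floating-point error
```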

MRDJR97
    The duplicate applied to this question: counting from the beginning of the model, the logits are in layer #9: `model.layers[9].output` – E_net4 Mar 06 '19 at 14:36
  • This post is duplicated. Please go to [link](https://stackoverflow.com/questions/53266350/how-to-tell-pytorch-to-not-use-the-gpu) – Álvaro H.G Nov 10 '21 at 20:51

0 Answers