I'm trying to extract the logits of a trained model whose output layer has the softmax activation fused into the Dense layer, i.e. something along these lines:
outputs = keras.layers.Dense(num_labels, activation='softmax')(x)
I've tried recompiling the model with loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
but that didn't work: the model still outputs probabilities, not logits.
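To make the problem concrete, here is a minimal sketch of what I tried (layer sizes and the hidden layer are placeholders, not my real architecture):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Toy stand-in for the real model; shapes and sizes are assumptions.
num_labels = 3
inputs = keras.Input(shape=(4,))
x = keras.layers.Dense(8, activation='relu')(inputs)
outputs = keras.layers.Dense(num_labels, activation='softmax')(x)
model = keras.Model(inputs, outputs)

# Recompiling with from_logits=True only changes how the *loss* interprets
# the model's outputs during training; it does not remove the softmax from
# the forward pass.
model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)

probs = model(np.random.rand(2, 4).astype('float32'))
# Each row still sums to 1 -- these are softmax probabilities, not logits.
```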