I am trying to fine-tune the Universal Sentence Encoder and reuse the tuned encoder layer for something else.
```python
import tensorflow as tf
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import Dense, Dropout
import tensorflow_hub as hub

module_url = "universal-sentence-encoder"
model = Sequential([
    hub.KerasLayer(module_url, input_shape=[], dtype=tf.string, trainable=True, name="use"),
    Dropout(0.5, name="dropout"),
    Dense(256, activation="relu", name="dense"),
    Dense(y.shape[1], activation="sigmoid", name="activation"),  # one unit per label column
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, batch_size=256, epochs=30, validation_split=0.25)
```
This worked: the loss went down and the accuracy was decent. Now I want to extract just the Universal Sentence Encoder layer from the trained model. However, here is what I get.
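To make what I mean by "extract" concrete, here is a self-contained sketch of the pattern, with a plain `Dense` layer standing in for the USE layer so it runs without downloading the hub module (names and shapes are illustrative, not my real model):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Dropout

# Stand-in pipeline: a Dense layer plays the role of the (tuned) USE layer.
inputs = tf.keras.Input(shape=(4,))
x = Dense(8, activation="relu", name="use")(inputs)
x = Dropout(0.5, name="dropout")(x)
outputs = Dense(2, activation="sigmoid", name="activation")(x)
demo = Model(inputs, outputs)

# Pull the tuned layer out by name and wrap it in a new model that stops
# at that layer's output, i.e. a standalone encoder.
tuned_use = demo.get_layer("use")
encoder = Model(inputs=demo.input, outputs=tuned_use.output)

vectors = encoder.predict(np.ones((3, 4), dtype="float32"), verbose=0)
print(vectors.shape)  # one 8-d encoding per input row
```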
- Do you know how I can fix this `nan` issue? I expected to see an encoding of numeric values.
- Is it only possible to save the `tuned_use` layer as a model, as this post recommends? Ideally, I want to save the `tuned_use` layer just like the Universal Sentence Encoder, so that I can open and use it exactly the same way as `hub.KerasLayer(tuned_use_location, input_shape=[], dtype=tf.string)`.
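For context, my understanding is that `hub.KerasLayer` accepts a path to any TF SavedModel directory, so what I am hoping for is something along these lines (a toy `tf.Module` stands in for the tuned encoder so the sketch runs anywhere; the path and names are made up):

```python
import tensorflow as tf

# Hypothetical stand-in for the tuned encoder: a tf.Module with one weight
# and an explicit input signature, playing the role of the tuned_use layer.
class Encoder(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.ones([4, 8]))

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

# Export as a TF SavedModel directory; hub.KerasLayer(path, ...) should then
# accept the same directory, assuming the signatures line up.
path = "/tmp/tuned_use_demo"
tf.saved_model.save(Encoder(), path)

restored = tf.saved_model.load(path)
out = restored(tf.ones((2, 4)))
print(out.shape)
```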