
I have created a BERT model for classifying a user-generated text string as FAQ or not FAQ. I have saved my model using the export_savedmodel() function. I wish to write a function that takes a list of new strings as input and predicts the output for each of them.

I tried using the predictor.from_saved_model() method, but it requires passing a dictionary of key-value pairs for input_ids, segment_ids, label_ids, and input_mask. I am a beginner and do not fully understand what to pass here.

Exporting or saving the model

def serving_input_fn():
    label_ids = tf.placeholder(tf.int32, [None], name='label_ids')
    input_ids = tf.placeholder(tf.int32, [None, MAX_SEQ_LENGTH], name='input_ids')
    input_mask = tf.placeholder(tf.int32, [None, MAX_SEQ_LENGTH], name='input_mask')
    segment_ids = tf.placeholder(tf.int32, [None, MAX_SEQ_LENGTH], name='segment_ids')
    input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn({
        'label_ids': label_ids,
        'input_ids': input_ids,
        'input_mask': input_mask,
        'segment_ids': segment_ids,
    })()
    return input_fn

export_dir = "..."
estimator._export_to_tpu = False
estimator.export_savedmodel(export_dir, serving_input_fn)
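
One detail worth noting: export_savedmodel() does not write the SavedModel into export_dir itself; it creates a timestamped subdirectory (e.g. export_dir/1594825391/), and that inner path is what predictor.from_saved_model() expects. A minimal sketch for locating the newest export (latest_export is a hypothetical helper, not part of TensorFlow):

```python
import glob
import os

def latest_export(export_dir):
    """Return the most recent timestamped subdirectory that
    export_savedmodel() created inside export_dir."""
    subdirs = sorted(glob.glob(os.path.join(export_dir, '*')))
    if not subdirs:
        raise ValueError('No SavedModel exports found in %s' % export_dir)
    return subdirs[-1]  # timestamps sort lexicographically, so last = newest
```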

#Predicting
with tf.Session() as sess:
    predict_fn = predictor.from_saved_model(...)
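
predict_fn is a callable that expects a dict whose keys match the placeholder names declared in serving_input_fn: input_ids, input_mask, segment_ids, and label_ids. A sketch of building that dict, assuming the strings have already been converted to token IDs with the same BERT vocabulary used in training (e.g. via bert.tokenization.FullTokenizer); to_features is a hypothetical helper, not a library function:

```python
def to_features(token_ids_batch, max_seq_length):
    """Pad/truncate tokenized examples into the four arrays the
    serving_input_fn placeholders expect."""
    feed = {'input_ids': [], 'input_mask': [], 'segment_ids': [], 'label_ids': []}
    for ids in token_ids_batch:
        ids = list(ids)[:max_seq_length]
        pad = max_seq_length - len(ids)
        feed['input_ids'].append(ids + [0] * pad)
        feed['input_mask'].append([1] * len(ids) + [0] * pad)  # 1 = real token
        feed['segment_ids'].append([0] * max_seq_length)       # single-sentence input
        feed['label_ids'].append(0)                            # dummy label at inference
    return feed

# features = to_features(tokenized_strings, MAX_SEQ_LENGTH)
# predictions = predict_fn(features)  # keys must match the placeholder names
```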

#Data description
My data is a simple table with one column for the input string and another for the output label.

#Error
ValueError: Got unexpected keys in input_dict: {'pred'}
expected: {'label_ids', 'input_mask', 'segment_ids', 'input_ids'}

#Thank you for any help!
  • Refer: https://stackoverflow.com/questions/61310331/performing-inference-with-a-bert-tf-1-x-saved-model/61788285#61788285 and https://towardsdatascience.com/3-ways-to-optimize-and-export-bert-model-for-online-serving-8f49d774a501 This may be useful. – rajeshkumargp Jul 14 '20 at 15:23
