
I'm trying to build a custom Experiment in TensorFlow, but I don't understand what the export_strategy argument is for. Also, how do you build the serving_input_fn?

Thank you!

Nicolas REY

1 Answer


Answers inspired by the CloudML example

Question 1: What is the use of the export_strategy (source)?

See also the answer to question two, but the export strategy is (as the name suggests) a way to make changes to the graph when it is exported. In the example below, the serving input functions that will be used when serving the model are attached at export time.

  learn_runner.run(
      generate_experiment_fn(
          min_eval_frequency=args.min_eval_frequency,
          eval_delay_secs=args.eval_delay_secs,
          train_steps=args.train_steps,
          eval_steps=args.eval_steps,
          export_strategies=[saved_model_export_utils.make_export_strategy(
              model.SERVING_FUNCTIONS[args.export_format],
              exports_to_keep=1
          )]
      ),
      run_config=tf.contrib.learn.RunConfig(model_dir=args.job_dir),
      hparams=hparam.HParams(**args.__dict__)
  )
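For intuition, the export strategy object itself is little more than a named callable: a pair of (name, export_fn), where the export function receives the trained estimator and an export path after training. This is a framework-free sketch of that idea, not the real TensorFlow implementation; the names and the returned path are illustrative only:

```python
import collections

# Hedged sketch: an export strategy is essentially a named pair of
# (name, export_fn); the Experiment invokes export_fn with the trained
# estimator and an export directory once training/evaluation finishes.
ExportStrategy = collections.namedtuple('ExportStrategy', ['name', 'export_fn'])

def make_export_strategy(serving_input_fn, exports_to_keep=1):
    """Illustrative stand-in for saved_model_export_utils.make_export_strategy."""
    def export_fn(estimator, export_path):
        # In real TensorFlow this would call something like
        # estimator.export_savedmodel(export_path, serving_input_fn);
        # here we only record where the model would be written.
        return '{}/saved_model'.format(export_path)
    return ExportStrategy(name='Servo', export_fn=export_fn)

strategy = make_export_strategy(serving_input_fn=None)
print(strategy.name)                            # Servo
print(strategy.export_fn(None, '/tmp/export'))  # /tmp/export/saved_model
```

The point is that export_strategies just tells the experiment *how* to write out the trained model, and the serving input function it wraps decides what inputs the exported graph will accept.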

Question 2: How do you build the serving_input_fn (source) ?

When you save the model and prepare it for serving, the graph needs input placeholders that match how clients will feed it. Depending on the desired input format (JSON, CSV, tf.Example, ...), the serving input function adds the corresponding inputs to the graph; without them it would not be possible to feed the graph at serving time.

def csv_serving_input_fn():
  """Build the serving inputs."""
  csv_row = tf.placeholder(
      shape=[None],
      dtype=tf.string
  )
  features = parse_csv(csv_row)
  # Ignore label column
  features.pop(LABEL_COLUMN)
  return tf.estimator.export.ServingInputReceiver(
      features, {'csv_row': csv_row})


def example_serving_input_fn():
  """Build the serving inputs."""
  example_bytestring = tf.placeholder(
      shape=[None],
      dtype=tf.string,
  )
  features = tf.parse_example(
      example_bytestring,
      tf.feature_column.make_parse_example_spec(INPUT_COLUMNS)
  )
  return tf.estimator.export.ServingInputReceiver(
      features, {'example_proto': example_bytestring})


def json_serving_input_fn():
  """Build the serving inputs."""
  inputs = {}
  for feat in INPUT_COLUMNS:
    inputs[feat.name] = tf.placeholder(shape=[None], dtype=feat.dtype)
  return tf.estimator.export.ServingInputReceiver(inputs, inputs)


SERVING_FUNCTIONS = {
    'JSON': json_serving_input_fn,
    'EXAMPLE': example_serving_input_fn,
    'CSV': csv_serving_input_fn
}
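To see how these pieces fit together without running TensorFlow, here is a framework-free sketch of the CSV flow: the client feeds a raw csv_row string (the receiver tensor), the serving input function parses it into named features and drops the label, and the model consumes the parsed features. The column names and schema below are invented for illustration:

```python
# Framework-free sketch of csv_serving_input_fn's data flow (illustrative
# names; real code builds tf.placeholder inputs and returns a
# tf.estimator.export.ServingInputReceiver).
CSV_COLUMNS = ['age', 'income', 'label']   # assumed schema, for illustration
LABEL_COLUMN = 'label'

def parse_csv(csv_row):
    """Split one CSV row into a dict of named features."""
    values = csv_row.split(',')
    return dict(zip(CSV_COLUMNS, values))

def csv_serving_input_fn(csv_row):
    features = parse_csv(csv_row)
    features.pop(LABEL_COLUMN)   # serving inputs never include the label
    # A ServingInputReceiver pairs the parsed features (what the model
    # consumes) with the raw receiver tensors (what the client feeds).
    return features, {'csv_row': csv_row}

features, receiver = csv_serving_input_fn('42,50000,1')
print(features)   # {'age': '42', 'income': '50000'}
print(receiver)   # {'csv_row': '42,50000,1'}
```

The same split applies to the JSON and tf.Example variants above: only the parsing step between receiver tensors and features changes.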

This question is also related to Example of tensorflow.contrib.learn.ExportStrategy

amo-ej1
  • Thanks for your reply. In tf.estimator.EstimatorSpec, there is the export_outputs argument. Do you know how it is related to the export_strategy? And simply, what is exporting a model used for? – Nicolas REY Aug 11 '17 at 21:14
  • The most obvious use-case could be using when using Google cloud ML (ref https://cloud.google.com/ml-engine/docs/how-tos/getting-started-training-prediction ) where you have both the 'training' job where you train a model, the trained model gets stored and will later be used to 'serve' the model. – amo-ej1 Aug 11 '17 at 21:22
  • Great, thank you! Last question: do you have any idea why the following serving_fn does work when used in the ? `feature_spec = {"output": tf.placeholder(dtype=tf.float32, shape=[1, None])}` `serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_spec)` then using `learn.Experiment( ...arguments... export_strategies = saved_model_export_utils.make_export_strategy(serving_input_fn = serving_input_fn) )` – Nicolas REY Aug 11 '17 at 21:35
  • Do I need an export_strategy with a learn_runner to run train_and_eval and then predict? – Nicolas REY Aug 15 '17 at 21:40