
I'm getting a weird error when trying to call predict for a particular model in tensorflow-serving:

grpc.framework.interfaces.face.face.AbortionError: 
AbortionError(code=StatusCode.INVALID_ARGUMENT, details="ConcatOp : Expected 
concatenating dimensions in the range [-1, 1), but got 1
 [[Node: lys_conc/concat = ConcatV2[N=4, T=DT_FLOAT, Tidx=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_lys_in_0_4, _arg_lyb_in_0_0, flatten_1/Reshape, batch_normalization_2/batchnorm_1/add_1, lyt_conc/concat/axis)]]")

Background: I set up a tensorflow-serving container, moved some models in, and checked that I could get client responses (I could).

I made my models by building and training them in keras, then exporting them and loading them into tf-serving, as per the answer in this post.
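
For context, the export step looked roughly like this (a sketch of the pattern from that post; the paths, signature names, and the `model` variable are placeholders for my actual model):

from keras import backend as K
import tensorflow as tf

# model: the trained Keras model; export_dir: a versioned folder tf-serving will load
export_dir = './exported/1'
builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
signature = tf.saved_model.signature_def_utils.predict_signature_def(
    inputs={t.name.split(':')[0]: t for t in model.inputs},
    outputs={'prediction': model.output})
with K.get_session() as sess:
    builder.add_meta_graph_and_variables(
        sess,
        tags=[tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature})
    builder.save()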

The node it is failing on is meant to concatenate four sources (two inputs, one flattened embedding, and the output of a dropout). The dropout has been stripped out by the export process, which is why we see the (preceding) batchnorm there instead.
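
To give a sense of the structure, the relevant part of the model was built roughly like this (layer sizes and the intermediate dense layer are illustrative, not the exact model):

from keras.layers import (Input, Embedding, Flatten, Dense, Dropout,
                          BatchNormalization, Concatenate)
from keras.models import Model

lys_in = Input(shape=(10,), name='lys_in')
lyb_in = Input(shape=(5,), name='lyb_in')
cat_in = Input(shape=(1,), name='cat_in')

emb = Flatten(name='flatten_1')(Embedding(input_dim=100, output_dim=8)(cat_in))

x = Dense(32)(lyb_in)
x = BatchNormalization(name='batch_normalization_2')(x)
x = Dropout(0.5, name='dropout_2')(x)  # stripped at export, leaving the batchnorm output

# 4 sources: 2 inputs, the flattened embedding, the dropout output.
# Keras' default concat axis is -1, which is axis 1 for these 2D tensors.
merged = Concatenate(name='lys_conc')([lys_in, lyb_in, emb, x])
out = Dense(1)(merged)
model = Model(inputs=[lys_in, lyb_in, cat_in], outputs=out)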

Other points:

  • I have another model with a similar concat of just two inputs (an embedding + a dropout) which works fine on the same tf-serving instance.
  • In Keras I didn't specify an axis for the concat, but I can see in the graph def (before export - see below) that it defaulted (correctly) to 1.
  • I've noticed that the axis variable being referenced in the error is not the one that belongs to this concat layer (lyt instead of lys). But the one being referenced should also be set to 1. I'm wondering if this is related to the error, or if it's just some little optimization happening in the export process.
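
For reference, this is roughly how I pulled the graph def before export to check the concat axis (a sketch; run against the live Keras session):

from keras import backend as K

graph_def = K.get_session().graph.as_graph_def()
for node in graph_def.node:
    if 'conc' in node.name:  # print the concat ops and their axis constants
        print(node)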

Relevant graph def:

node {
  name: "lys_conc/concat/axis"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
        }
        int_val: 1
      }
    }
  }
}
node {
  name: "lys_conc/concat"
  op: "ConcatV2"
  input: "lys_in"
  input: "lyb_in"
  input: "flatten_1/Reshape"
  input: "dropout_2/Identity"
  input: "lys_conc/concat/axis"
  attr {
    key: "N"
    value {
      i: 4
    }
  }
  attr {
    key: "T"
    value {
      type: DT_FLOAT
    }
  }
  attr {
    key: "Tidx"
    value {
      type: DT_INT32
    }
  }
}

Any help or advice on how to debug this would be appreciated!

1 Answer


First, it might be a good idea to separate potential issues in tensorflow serving from issues with the data input format to the prediction itself. Most likely there is an input dimensionality issue, such that an error is thrown when the embeddings are concatenated.

Try creating a tensorflow predictor object from the model you exported, and see if you can get a valid output from the prediction:

from tensorflow.contrib import predictor

# export_dir: the directory the SavedModel was written to
predictor_obj = predictor.from_saved_model(export_dir)
# inputs_to_model: dict mapping the signature's input names to numpy arrays
y = predictor_obj(inputs_to_model)

If this runs, then your input should have the correct dimensionality. Make sure you try batched inputs as well as a single input.
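
For example, something like the following (input names and shapes are illustrative; use the names from your exported signature):

import numpy as np

# Single example: shape (1, n_features) for each input
single = {'lys_in': np.zeros((1, 10), dtype=np.float32),
          'lyb_in': np.zeros((1, 5), dtype=np.float32),
          'cat_in': np.zeros((1, 1), dtype=np.float32)}
print(predictor_obj(single))

# Batch: shape (batch_size, n_features) for each input
batch = {k: np.repeat(v, 8, axis=0) for k, v in single.items()}
print(predictor_obj(batch))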

If this fails, you probably need to reshape the inputs.
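
In particular, the range [-1, 1) in the error suggests the tensors reaching the concat are rank 1, i.e. missing the batch dimension, so adding a leading batch dimension may be enough (a sketch, with raw_input_vector as a placeholder for one of your inputs):

import numpy as np

x = np.asarray(raw_input_vector, dtype=np.float32)
if x.ndim == 1:
    x = np.expand_dims(x, axis=0)  # e.g. (10,) -> (1, 10)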
