
I find it hard to use simple tf operations when building a tf.keras model. As a toy example, say I want to stack two tensors from previous layers into one. Keras doesn't have a stack layer, but tf has tf.stack, and in order to use it I have to do something like:

t1 = ...
t2 = ...
t_stack = tf.keras.layers.Lambda(lambda x: tf.stack(x, axis=-1))([t1, t2])

I'm just using tf.stack as a toy example; it could be any tf operation that keras doesn't have (such as tf.image.resize, lots of tf.math operations, etc.).

Is there an easy way to use arbitrary tf operations in keras? What about using tf.keras.backend operations? I reckon it is probably better to keep every operation as a keras layer. Will using backend operations break that rule?
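
For concreteness, this is roughly the kind of model I have in mind (just a minimal sketch; the input shape, conv sizes, and the tf.image.resize target size are placeholders):

import tensorflow as tf

# Sketch: every plain tf operation has to be wrapped in a Lambda layer.
inputs = tf.keras.Input(shape=(64, 64, 3))

x = tf.keras.layers.Conv2D(8, 3, padding="same")(inputs)
y = tf.keras.layers.Conv2D(8, 3, padding="same")(inputs)

# Plain tf ops wrapped in Lambda so they behave like layers in the graph.
x = tf.keras.layers.Lambda(lambda t: tf.image.resize(t, (32, 32)))(x)
y = tf.keras.layers.Lambda(lambda t: tf.image.resize(t, (32, 32)))(y)

# tf.stack wrapped the same way; the Lambda receives a list of tensors.
t_stack = tf.keras.layers.Lambda(lambda ts: tf.stack(ts, axis=-1))([x, y])

model = tf.keras.Model(inputs, t_stack)
model.summary()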

Hongtao Yang
  • I added a toy example. – Hongtao Yang Aug 25 '20 at 13:25
  • In general, there's nothing stopping you from doing `tf.stack([t1, t2])`, but if you want to have it as a proper layer (for example to see it with `summary()`) you need to wrap it with `Lambda`, which shouldn't be an issue anyway. Backend operations are just wrappers for TF operations so that doesn't make a difference. You can see [this guide](https://www.tensorflow.org/guide/keras/custom_layers_and_models) about custom layers and models for how to use custom code more generally (a sketch of such a custom layer follows this comment thread). – jdehesa Aug 25 '20 at 13:49
  • Thanks @jdehesa, model subclassing feels "unnecessarily complicated" when you just want to use one or two simple tf operations. It is more suited towards grouping several operations (and layers) together into one logical custom layer IMO. The problem with just using tf operations without `Lambda` is that tf will throw an error when saving/loading the keras model, because it expects everything to be a keras layer. – Hongtao Yang Aug 25 '20 at 14:41
  • Well you can group many operations in a single `Lambda` layer, just define a function that does all the operations and pass it to the `Lambda` constructor. Note that layers that do not have weights also have a "functional" interface, e.g. [`multiply`](https://www.tensorflow.org/api_docs/python/tf/keras/layers/multiply), which can be useful if you are used to Keras syntax but want to group several operations into a single `Lambda` layer. – jdehesa Aug 25 '20 at 16:17
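
Following up on the custom-layers suggestion, a bare-bones subclassed layer wrapping tf.stack might look like this (only a sketch; the class name and the `axis` argument are arbitrary choices):

import tensorflow as tf

class StackLayer(tf.keras.layers.Layer):
    def __init__(self, axis=-1, **kwargs):
        super().__init__(**kwargs)
        self.axis = axis

    def call(self, inputs):
        # `inputs` is expected to be a list of tensors with identical shapes.
        return tf.stack(inputs, axis=self.axis)

    def get_config(self):
        # Lets Keras re-create the layer from its config; when loading a saved
        # model the class typically still has to be passed via `custom_objects`.
        config = super().get_config()
        config.update({"axis": self.axis})
        return config

# Usage in a functional model:
t1 = tf.keras.Input(shape=(16,))
t2 = tf.keras.Input(shape=(16,))
t_stack = StackLayer(axis=-1)([t1, t2])
model = tf.keras.Model([t1, t2], t_stack)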

1 Answer


Keras does have that operation; what you are actually looking for is called the Concatenate() layer.

You can have a look here : https://www.tensorflow.org/api_docs/python/tf/keras/layers/concatenate

You may also want to have a look here: How to concatenate two layers in keras?
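
A minimal usage sketch (the input shapes here are arbitrary):

import tensorflow as tf

t1 = tf.keras.Input(shape=(32,))
t2 = tf.keras.Input(shape=(32,))

# Concatenate joins the tensors along an existing axis: result shape is (None, 64).
merged = tf.keras.layers.Concatenate(axis=-1)([t1, t2])

model = tf.keras.Model([t1, t2], merged)
model.summary()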

Timbus Calin
  • Yeah I know about concatenate, but it requires an extra dim on the tensor to achieve what stack does (see the sketch below). Also, it is just an example for illustration purposes; there are many others that don't have a keras equivalent (or even a keras workaround). – Hongtao Yang Aug 25 '20 at 14:30
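
For reference, the extra-dim workaround mentioned in that comment might look roughly like this (only a sketch using built-in Keras layers; the shapes are arbitrary):

import tensorflow as tf

t1 = tf.keras.Input(shape=(32,))
t2 = tf.keras.Input(shape=(32,))

# Reshape adds the extra trailing dimension that Concatenate needs.
e1 = tf.keras.layers.Reshape((32, 1))(t1)
e2 = tf.keras.layers.Reshape((32, 1))(t2)

# Same result as tf.stack([t1, t2], axis=-1): output shape is (None, 32, 2).
stacked = tf.keras.layers.Concatenate(axis=-1)([e1, e2])

model = tf.keras.Model([t1, t2], stacked)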