
I want to compare the even and odd elements of the batch and swap them if needed. I have ended up with two tensors that I want to interleave:

import tensorflow as tf

def tf_oplu(x, name=None):

    even = x[:, ::2]  # slice the batch into its even and odd parts
    odd = x[:, 1::2]

    # flatten both tensors in row-major order
    # so the function can be applied across them
    even_flatten = tf.reshape(even, [-1])
    odd_flatten = tf.reshape(odd, [-1])

    compare = tf.to_float(even_flatten < odd_flatten)
    compare_not = tf.to_float(even_flatten >= odd_flatten)

    # def oplu(x, y):  # the trivial function being vectorized
    #     if x < y:       # (x<y)==1
    #         return y, x
    #     else:
    #         return x, y  # (x<y)==0

    even_flatten_new = odd_flatten * compare + even_flatten * compare_not
    odd_flatten_new = odd_flatten * compare_not + even_flatten * compare

    # reshape back to the original [batch, width/2] halves

    even_new = tf.reshape(even_flatten_new, [100, 128])
    odd_new = tf.reshape(odd_flatten_new, [100, 128])

Now I want to get back a $[100,256]$ tensor with the even and odd places filled. In numpy I would of course do:

import numpy as np

y = np.empty((even_new.shape[0], even_new.shape[1] + odd_new.shape[1]), dtype=even_new.dtype)
y[:, 0::2] = even_new
y[:, 1::2] = odd_new

return y

But such a thing is not possible in TensorFlow, as tensors are not modifiable. I suppose it is possible with either a sparse tensor or tf.gather_nd, but both require generating an array of indices, which is again a non-trivial task for me. One more note: I do not want to use any Python functions via tf.py_func, as I checked that they run on the CPU only. Maybe lambda and tf.map_fn may help somehow? Thanks!
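
For concreteness, something like the following is roughly what I have in mind for the gather-based route (an untested sketch: the column permutation is built in numpy, and plain `tf.gather`, which reorders along the first axis, is applied to the transposed tensor):

import numpy as np
import tensorflow as tf

cat = tf.concat([even_new, odd_new], axis=1)  # [100, 256]: all evens, then all odds

# permutation mapping output column 2k -> k and output column 2k+1 -> 128 + k
perm = np.empty(256, dtype=np.int64)
perm[0::2] = np.arange(128)
perm[1::2] = np.arange(128, 256)

# tf.gather works along the first axis, so transpose, reorder rows, transpose back
y = tf.transpose(tf.gather(tf.transpose(cat), perm))  # [100, 256], interleaved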


3 Answers


To interleave two matrices vertically, you do not need big guns such as gather or map_fn. You can simply interleave them as follows:

tf.reshape(
  tf.stack([even_new, odd_new], axis=1),
  [-1, tf.shape(even_new)[1]])

EDIT

To interleave them horizontally:

tf.reshape(
  tf.concat([even_new[...,tf.newaxis], odd_new[...,tf.newaxis]], axis=-1), 
  [tf.shape(even_new)[0],-1])

The idea is to use stack to interleave the two tensors in memory. The axis along which you stack determines the granularity of the interleaving: stacking on the last axis pairs up individual elements, so after the reshape the columns alternate, while stacking at axis=1 keeps each input row contiguous, so whole rows alternate. (Stacking at axis=0 would merely place one whole tensor after the other.)
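
As a quick sanity check of both variants on small 2×3 matrices:

import tensorflow as tf

even_new = tf.constant([[0, 2, 4], [6, 8, 10]])
odd_new = tf.constant([[1, 3, 5], [7, 9, 11]])

# whole rows alternate: [4, 3]
rows = tf.reshape(tf.stack([even_new, odd_new], axis=1),
                  [-1, tf.shape(even_new)[1]])

# elements alternate within each row: [2, 6]
cols = tf.reshape(tf.concat([even_new[..., tf.newaxis],
                             odd_new[..., tf.newaxis]], axis=-1),
                  [tf.shape(even_new)[0], -1])

with tf.Session() as sess:
    print(sess.run(rows))  # [[0 2 4] [1 3 5] [6 8 10] [7 9 11]]
    print(sess.run(cols))  # [[0 1 2 3 4 5] [6 7 8 9 10 11]]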

P-Gn
  • Thanks very much! But this code produces a [200,128] tensor instead of [100,256]. I've changed it to `y = tf.reshape(tf.stack([even_new, odd_new], axis=0), [tf.shape(even_new)[0],-1])` so the output is as expected. Could you please explain why it indeed places the even and odd elements where needed? – Slowpoke Jul 06 '17 at 15:56
  • As far as I understand, it stacks them vertically and then does a reshape that should place the elements lying under each other together horizontally. – Slowpoke Jul 06 '17 at 16:06
  • Yes, I followed your numpy example, which also stacks tensors vertically (along the first dimension). Your modification to stack them horizontally is right. – P-Gn Jul 06 '17 at 16:12
  • Oh, sorry - that was typo, I meant `y[:, ::2]` and `y[:,1::2]` ! – Slowpoke Jul 06 '17 at 16:19
  • I am working with 100 batches each with 256 elements – Slowpoke Jul 06 '17 at 16:20
  • After spending some time trying to understand why my network does not train, I found that this code is still incorrect and needs additional transposing, please try this: ` x = tf.Variable([[1,2,3],[4,5,6],[7,8,9]]); y = tf.Variable([[0,0,0],[8,8,8],[9,9,9]]); w = tf.reshape(tf.stack([x, y], axis=0), [tf.shape(x)[0],-1]); z = tf.transpose(tf.stack([tf.transpose(x), tf.transpose(y)], axis=0)); zz = tf.reshape(z,[tf.shape(x)[0],-1]); init = tf.global_variables_initializer(); with tf.Session() as sess: sess.run(init); print(w.eval()); print(zz.eval());` – Slowpoke Jul 10 '17 at 19:26
  • @Slowpoke Sorry, you are right, I should have checked your formula better. I propose a new formula instead that is not based on `transpose`, which is somewhat of a heavyweight operation. Tell me how it goes. – P-Gn Jul 10 '17 at 19:38

You can use tf.dynamic_stitch, which takes as its first argument a list of tensors of indices (one per tensor to interleave) and as its second argument the list of tensors to interleave. The tensors are interleaved along the first dimension, so we need to transpose them and then transpose back. Here is the code:

even_new = tf.transpose(even_new, perm=[1, 0])  # [128, 100]
odd_new = tf.transpose(odd_new, perm=[1, 0])    # [128, 100]
even_pos = tf.convert_to_tensor(list(range(0, 256, 2)), dtype=tf.int32)  # target rows 0, 2, ..., 254
odd_pos = tf.convert_to_tensor(list(range(1, 256, 2)), dtype=tf.int32)   # target rows 1, 3, ..., 255
interleaved = tf.dynamic_stitch([even_pos, odd_pos], [even_new, odd_new])  # [256, 100]
interleaved = tf.transpose(interleaved, perm=[1, 0])  # back to [100, 256]
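
A quick check on toy data (2×3 matrices and indices 0..5 in place of 0..255) confirms the interleaving:

import tensorflow as tf

x = tf.constant([[0, 2, 4], [6, 8, 10]])
y = tf.constant([[1, 3, 5], [7, 9, 11]])

xt = tf.transpose(x, perm=[1, 0])  # [3, 2]
yt = tf.transpose(y, perm=[1, 0])
even_pos = tf.constant([0, 2, 4], dtype=tf.int32)
odd_pos = tf.constant([1, 3, 5], dtype=tf.int32)
# merged[even_pos[i]] = xt[i] and merged[odd_pos[i]] = yt[i]
merged = tf.dynamic_stitch([even_pos, odd_pos], [xt, yt])  # [6, 2]
result = tf.transpose(merged, perm=[1, 0])  # [2, 6]

with tf.Session() as sess:
    print(sess.run(result))  # [[0 1 2 3 4 5] [6 7 8 9 10 11]]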
Dema

You can use `assign` to write into slices of a variable.

import tensorflow as tf

odd_new = tf.constant([1, 3, 5])
even_new = tf.constant([2, 4, 6])
y = tf.Variable(tf.zeros(6, dtype=tf.int32))

sess = tf.InteractiveSession()
sess.run(tf.global_variables_initializer())
y[0::2].assign(odd_new).eval()
y[1::2].assign(even_new).eval()
print(y.eval())  # [1 2 3 4 5 6]
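
Note that slice assignment like this only works on a tf.Variable, not on an ordinary tensor, which is why y is created as a variable and initialized before the assignments.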
Manolo Santos
  • Thanks very much! I initially tried to construct my network using assign operators, [but faced some problems and was discouraged from doing it](https://stackoverflow.com/questions/44737322/) (however, I think that code was correct), so now I prefer using built-in TensorFlow functions, and the first answer suits me. But thanks nevertheless! – Slowpoke Jul 06 '17 at 16:27