
I need help creating the last layer of my model as the cartesian product of three previous layers.

I have three input layers of shape (None, 2), and the model produces three intermediate outputs of shape (None, 2).

So my final layer should be their cartesian product, with shape (None, 8).

I have tried the suggestions in:

Cartesian Product in Tensorflow

and

https://github.com/keras-team/keras/issues/12608

with no success.

I am using Keras. I tried the multiply merge layer, but that is just the element-wise product, not the cartesian product.

My model is the following:

from keras.layers import Dense, Input, Flatten, Lambda
from keras.models import Model
from keras.layers.merge import concatenate, multiply, add
from keras import backend as K
import tensorflow as tf


left_input = Input(shape=(2, ), name='alice')
left_branch = Dense(128, activation='relu', name='left_branch')(left_input)
a = Dense(2, activation='softmax', name='soft_left_branch')(left_branch)


middle_input = Input(shape=(2,), name='bob')
middle_branch = Dense(128, activation='relu', name='middle_branch')(middle_input)
b = Dense(2, activation='softmax', name='soft_middle_branch')(middle_branch)

right_input = Input(shape=(2,), name='charlie')
right_branch = Dense(128, activation='relu', name='right_branch')(right_input)
c = Dense(2, activation='softmax', name='soft_right_branch')(right_branch)

x = multiply([a, b, c])

predictions = Dense(8, activation='softmax', name='main_output')(x)
model = Model(inputs=[left_input, middle_input, right_input], outputs=predictions)

So instead of "multiply" for "x" I need the cartesian product (a × b × c).

If a = [a1, a2], b = [b1, b2] and c = [c1, c2],

I need x = [a1*b1*c1, a1*b1*c2, a1*b2*c1, a1*b2*c2, a2*b1*c1, a2*b1*c2, a2*b2*c1, a2*b2*c2], with shape (None, 8).
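For concreteness, this is just the flattened outer product of the three vectors. A minimal NumPy sketch of the target layout (the numbers are only illustrative):

import numpy as np

a = np.array([0.3, 0.7])   # example softmax outputs for one sample
b = np.array([0.6, 0.4])
c = np.array([0.9, 0.1])

# outer product over all three axes, then flatten row-major:
# order is [a1*b1*c1, a1*b1*c2, a1*b2*c1, a1*b2*c2, a2*b1*c1, ...]
x = np.einsum('i,j,k->ijk', a, b, c).reshape(-1)
print(x.shape)  # (8,)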

Thanks in advance for any help.

1 Answer


Inspired by this answer: Slice tensor with variable indexes with Lambda Layer in Keras,

I ended up with the following solution; maybe it will help someone.

from keras.layers import Dense, Input, Lambda
from keras.models import Model
from keras.layers.merge import concatenate, multiply


left_input = Input(shape=(2, ), name='alice')
left_branch = Dense(128, activation='relu', name='left_branch')(left_input)
a = Dense(2, activation='softmax', name='soft_left_branch')(left_branch)


middle_input = Input(shape=(2,), name='bob')
middle_branch = Dense(128, activation='relu', name='middle_branch')(middle_input)
b = Dense(2, activation='softmax', name='soft_middle_branch')(middle_branch)

right_input = Input(shape=(2,), name='charlie')
right_branch = Dense(128, activation='relu', name='right_branch')(right_input)
c = Dense(2, activation='softmax', name='soft_right_branch')(right_branch)


outs_a = []
outs_b = []
outs_c = []
len_outs = a.get_shape().as_list()[1]  # a, b and c have the same number of columns; here len_outs = 2
for i in range(0, len_outs):
    # bind i via a default argument so each Lambda keeps its own slice index
    outs_a.append(Lambda(lambda x, i=i: x[:, i:i+1], output_shape=(1,), name="a"+str(i))(a))
    outs_b.append(Lambda(lambda x, i=i: x[:, i:i+1], output_shape=(1,), name="b"+str(i))(b))
    outs_c.append(Lambda(lambda x, i=i: x[:, i:i+1], output_shape=(1,), name="c"+str(i))(c))

cp_l = []
for i in range(0, len_outs):
    for j in range(0, len_outs):
        for k in range(0, len_outs):
            cp_l.append(multiply([outs_a[i], outs_b[j], outs_c[k]]))

cp = concatenate(cp_l)  # cartesian product as the concatenation of the terms in cp_l

predictions = cp
model = Model(inputs=[left_input, middle_input, right_input], outputs=predictions)
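As a quick sanity check (a sketch with random dummy inputs, not part of the original answer): the concatenated output has shape (None, 8), and because each branch ends in a softmax, every row of the cartesian product should sum to 1, since (a1+a2)*(b1+b2)*(c1+c2) = 1.

import numpy as np

dummy = [np.random.rand(4, 2) for _ in range(3)]  # a batch of 4 samples for alice, bob and charlie
out = model.predict(dummy)
print(out.shape)        # (4, 8)
print(out.sum(axis=1))  # each row sums to ~1.0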