
I have a model in Keras that uses binary cross-entropy (log loss) as its loss function. However, I want to define a custom binary cross-entropy loss for it. Here is my model:

    import tensorflow as tf
    from keras import backend as K
    from keras import initializers
    from keras.layers import Input, Lambda, Reshape, Dense, concatenate
    from keras.models import Model

    def get_model(train, num_users, num_items, layers=[20, 10, 5, 2]):
        num_layer = len(layers)  # Number of layers in the MLP
        user_matrix = K.constant(getTrainMatrix(train))
        item_matrix = K.constant(getTrainMatrix(train).T)

        # Input variables
        user_input = Input(shape=(1,), dtype='int32', name='user_input')
        item_input = Input(shape=(1,), dtype='int32', name='item_input')

        # Note: tf.to_int32(x) is TF 1.x; in TF 2.x use tf.cast(x, 'int32')
        user_rating = Lambda(lambda x: tf.gather(user_matrix, tf.to_int32(x)))(user_input)
        item_rating = Lambda(lambda x: tf.gather(item_matrix, tf.to_int32(x)))(item_input)
        user_rating = Reshape((num_items, ))(user_rating)
        item_rating = Reshape((num_users, ))(item_rating)
        MLP_Embedding_User = Dense(layers[0]//2, activation="linear" , name='user_embedding')
        MLP_Embedding_Item  = Dense(layers[0]//2, activation="linear" , name='item_embedding')
        user_latent = MLP_Embedding_User(user_rating)
        item_latent = MLP_Embedding_Item(item_rating)

        # The 0-th layer is the concatenation of embedding layers
        vector = concatenate([user_latent, item_latent])

        # Final prediction layer
        prediction = Dense(1, activation='sigmoid', kernel_initializer=initializers.lecun_normal(),
                       name='prediction')(vector)

        model_ = Model(inputs=[user_input, item_input],
                   outputs=prediction)

        return model_

Here is the call to the compile function:

    model.compile(optimizer=Adam(lr=learning_rate), loss='binary_crossentropy')

Now my question is: how do I define a custom binary cross-entropy loss for it?

  • 1
    How exactly do you want to customize your loss? There are other questions that cover how to implement custom losses, for example https://stackoverflow.com/questions/43818584/custom-loss-function-in-keras and https://stackoverflow.com/questions/45961428/make-a-custom-loss-function-in-keras – Dr. Snoopy Oct 18 '19 at 09:59
  • Actually, all I want is to write this loss function: `y * tf.log(self.y_) + (1 - y) * tf.log(1 - self.y_)`. How exactly would I do this? – tauhid ullah Oct 18 '19 at 12:21
  • What is `y` and `self.y_`? This seems exactly a normal binary crossentropy and it doesn't seem you need to make it any more complicated than `loss='binary_crossentropy'`. – Daniel Möller Oct 18 '19 at 12:50
  • `self.y_` is the predicted rating and I need it. Actually, I want to apply a normalized binary cross-entropy to it. – tauhid ullah Oct 19 '19 at 02:31
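For reference, the expression in the comments is the standard binary cross-entropy, and a custom loss in Keras is just a function of `(y_true, y_pred)` passed to `compile`. Below is a minimal sketch; the function name and the clipping with `keras.backend.epsilon()` are illustrative choices, not from the original post:

```python
import tensorflow as tf
from tensorflow import keras

def custom_binary_crossentropy(y_true, y_pred):
    # Clip predictions away from 0 and 1 to avoid log(0);
    # keras.backend.epsilon() is ~1e-7 by default
    eps = keras.backend.epsilon()
    y_pred = tf.clip_by_value(y_pred, eps, 1.0 - eps)
    # Elementwise negative log-likelihood: -(y*log(p) + (1-y)*log(1-p))
    bce = -(y_true * tf.math.log(y_pred)
            + (1.0 - y_true) * tf.math.log(1.0 - y_pred))
    # Reduce over the batch; any normalization or per-sample weighting
    # ("normalized BCE") can be applied here before reducing
    return tf.reduce_mean(bce)

# Pass the function object instead of the string 'binary_crossentropy':
# model.compile(optimizer=Adam(lr=learning_rate), loss=custom_binary_crossentropy)
```

Inside the function, `y_pred` plays the role of `self.y_` (the model's predicted rating) and `y_true` plays the role of `y`.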

0 Answers