
I'm trying to do something similar to Make a custom loss function in keras, but I'm struggling with the implementation.

I have some data that relates age to failures:

# make some data
import pandas as pd

times = pd.array([10, 15, 22, 30, 4, 17, 38, 12, 17, 22])
events = pd.array([0, 1, 1, 1, 0, 1, 1, 0, 0, 1])
data = pd.DataFrame({'age': times, 'failure': events})

I have a parameterized function that is used to make predictions:

# this gives the y_pred values
import math

def calc_prob(param1, param2, param3, age):
    # probability of failure at a given age from the three parameters
    prob = (((100 * param1 * pow(100 / param3, -pow(age / param2, param1)))
             * pow(age / param2, param1) * math.log(100 / param3)) / age) / 100
    return prob

and I have a metric I'd like to use in the cost function. I want the neural net to estimate parameters that minimize this function:

import statistics

# this is the metric to minimize
def brier_score(y_true, y_pred):
    # the Brier score acts as an MSE metric
    brier_score = statistics.mean(pow((y_pred - y_true), 2))
    return brier_score


from keras import models
from keras import layers

def build_model():
    model = models.Sequential()
    model.add(layers.Dense(1, activation='relu', input_shape=(data.shape[1],)))
    model.add(layers.Dense(5, activation='relu'))
    model.add(layers.Dense(3))
    model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])
    return model

The output of the model is three parameters that I would like to pass to calc_prob() for use within the loss function. For example, if the parameters at a particular iteration are the values below, the loss function should use calc_prob() to generate the predicted values:

params = [2.64, 30, 40]
y_pred = calc_prob(params[0], params[1], params[2], data['age'])
y_true = data['failure']

The cost function would be

brier_score(y_true, y_pred)
0.5672474859914267

I'm not sure how to properly wrap these functions so that I can use them similar to:

model.compile(loss='brier_score', optimizer='adam', metrics=['accuracy'])
coolhand

1 Answer


I'm not sure I understood you correctly.

Put "age" into labels

import tensorflow as tf

x = ...
# carry age into the loss through the labels, alongside the true failure value
y = {"age": data['age'], "prob": data['failure']}

def brier_score(y_true, y_pred):
    # the network's three outputs are the parameters; age comes in via y_true
    prob = calc_prob(y_pred[:, 0], y_pred[:, 1], y_pred[:, 2], y_true["age"])
    brier_score = tf.reduce_mean((prob - y_true["prob"]) ** 2)
    return brier_score

model.compile(loss=brier_score, optimizer='adam', metrics=['accuracy'])

model.fit(x=x, y=y, ...)

In calc_prob, just replace pow by tf.math.pow and math.log by tf.math.log.
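For reference, a tensor version might look roughly like this (calc_prob_tf is just an illustrative name for this sketch):

import tensorflow as tf

def calc_prob_tf(param1, param2, param3, age):
    # same formula as calc_prob, written with TensorFlow ops so gradients
    # can flow through the three predicted parameters
    age = tf.cast(age, tf.float32)
    return ((100 * param1
             * tf.math.pow(100 / param3, -tf.math.pow(age / param2, param1))
             * tf.math.pow(age / param2, param1)
             * tf.math.log(100 / param3)) / age) / 100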

In order to get positive network outputs for calc_prob, you can add an activation function to the last layer. softplus would probably be a better choice than relu.
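For example, the last layer of build_model could become (just a sketch):

model.add(layers.Dense(3, activation='softplus'))  # keeps the three parameters strictly positive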

Alexey Tochin
  • How would this implement the parameterized `calc_prob()` function? – coolhand Feb 07 '22 at 13:07
  • Oh, did not get the problem, see the update. – Alexey Tochin Feb 07 '22 at 20:28
  • Do I need to modify my cost function to get around this error? `ValueError: Found unexpected keys that do not correspond to any Model output: dict_keys(['age', 'prob']). Expected: ['dense_122']` – coolhand Feb 07 '22 at 22:43
  • In order to fix the error, the entire code is needed. It seems that you did not pass the labels into the model's `fit` method properly. The point is that the labels are a dictionary of tensors, not a single tensor as usual, like `y` in the code above. The cost function is specified in `model.compile(loss=brier_score, ...)`. – Alexey Tochin Feb 08 '22 at 21:01
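
One common way around that ValueError (not part of the original answer, just a sketch) is to skip the dictionary and instead pack age and failure into a single two-column label array, slicing it apart inside the loss so the targets line up with the model's single output:

import tensorflow as tf
from keras import models, layers

def brier_score(y_true, y_pred):
    # y_true packs both targets: column 0 = age, column 1 = failure
    age = y_true[:, 0]
    failure = y_true[:, 1]
    # calc_prob_tf is the TensorFlow version of calc_prob sketched above
    prob = calc_prob_tf(y_pred[:, 0], y_pred[:, 1], y_pred[:, 2], age)
    return tf.reduce_mean(tf.square(prob - failure))

# placeholder features; use whatever inputs you actually train on
x = data[['age']].to_numpy(dtype='float32')
# pack age and failure together so the loss can unpack them from y_true
y = data[['age', 'failure']].to_numpy(dtype='float32')

model = models.Sequential([
    layers.Dense(5, activation='relu', input_shape=(x.shape[1],)),
    layers.Dense(3, activation='softplus'),  # keeps the three predicted parameters positive
])
model.compile(loss=brier_score, optimizer='adam')
model.fit(x, y, epochs=10, batch_size=5)

The layer sizes and features here are placeholders; the key point is only that whatever extra information the loss needs (age, in this case) travels through y_true rather than as a separate dictionary key.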