This is related to "How do you create a custom activation function with Keras?"
I have implemented my own custom activation function:
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras import backend as K

def custom_activation(x):
    return 2 * K.sigmoid(x) - 1

x_train = np.random.uniform(low=-1, high=1, size=(200, 2))

model = Sequential([
    Dense(20, input_shape=(2,)),
    Activation(custom_activation),
    Dense(2),
    Activation('linear')
])
model.compile(optimizer='adam', loss='mean_squared_error')
model.fit(x_train, x_train, epochs=20, validation_split=0.1)
Instead of letting Keras automatically take the derivative of my activation function, can I supply the derivative myself?
Please note that this is only an example. My real custom_activation is way more complicated.
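One possible approach, assuming Keras is running on the TensorFlow backend (this is a sketch, not the only way to do it): TensorFlow's `tf.custom_gradient` decorator lets you define both the forward pass and its gradient by hand, bypassing automatic differentiation for that op. For the example activation `2*sigmoid(x) - 1`, the hand-written derivative is `2*sigmoid(x)*(1 - sigmoid(x))`:

```python
import tensorflow as tf

@tf.custom_gradient
def custom_activation(x):
    # forward pass: 2*sigmoid(x) - 1
    y = 2.0 * tf.sigmoid(x) - 1.0

    def grad(dy):
        # hand-written derivative: d/dx [2*sigmoid(x) - 1] = 2*s*(1 - s)
        s = tf.sigmoid(x)
        return dy * 2.0 * s * (1.0 - s)

    # return the output together with the function computing its gradient
    return y, grad
```

A function decorated this way can be passed to an `Activation` layer just like before; during backpropagation TensorFlow will call `grad` instead of differentiating the forward expression symbolically. For a more complicated real activation, you would replace both the forward expression and the body of `grad` with your own formulas.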