
I am trying to create my own custom activation function in TensorFlow, mostly because I have to test out some sigmoid approximations. The thing is, I only need it for inference. I found a solution in a post here, but since I only want to use the custom function at inference time, that approach seems pointlessly overcomplicated for my case.

I'd be grateful for any suggestions.
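A minimal sketch of what an inference-only custom activation might look like, assuming `tf.keras` and using a piecewise-linear "hard sigmoid" purely as a stand-in for the real approximation:

    import tensorflow as tf

    def approx_sigmoid(x):
        # Placeholder approximation: a piecewise-linear "hard sigmoid".
        # Swap in whatever approximation is actually being tested.
        return tf.clip_by_value(0.25 * x + 0.5, 0.0, 1.0)

    # Because the model is only used for inference, no custom gradient is
    # needed: any callable built from TF ops can be passed directly as a
    # layer activation.
    inference_layer = tf.keras.layers.Dense(10, activation=approx_sigmoid)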

  • You mean you want to use regular sigmoids for training but this custom sigmoid approximation for inference? Then you would have to, on the one hand, compute the loss using the actual sigmoid and, on the other hand, compute the inference output using the same variables but with the sigmoid approximation instead (you could also have a "training model output" that uses the real sigmoid alongside the loss, if you wanted it for evaluation or something). – jdehesa Feb 15 '19 at 17:25
  • Yes, exactly, that's what I meant. I currently have a model with a training variable that is true while training; when doing inference, I use the other activation function. – Karol Charlie Marso Feb 17 '19 at 20:00
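A hedged sketch of the approach described in the comments above, assuming `tf.keras`: a small custom layer whose `call()` switches on the `training` flag, so the exact sigmoid is used for the loss and gradients while the approximation (again a placeholder "hard sigmoid") is used at inference with the same trained weights.

    import tensorflow as tf

    class SwitchableSigmoid(tf.keras.layers.Layer):
        """Exact sigmoid while training, approximation at inference time."""

        def call(self, inputs, training=None):
            # Keras passes training=True inside fit() and training=False (or
            # None) inside predict()/evaluate(), so gradients flow through the
            # exact sigmoid while inference uses the approximation with the
            # same weights.
            if training:
                return tf.sigmoid(inputs)
            # Placeholder piecewise-linear approximation; replace as needed.
            return tf.clip_by_value(0.25 * inputs + 0.5, 0.0, 1.0)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(10),
        SwitchableSigmoid(),
    ])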

0 Answers